Dataset schema (field name, feature type, length or value range):

| Field | Type | Lengths / values |
|---|---|---|
| sha | null | |
| last_modified | null | |
| library_name | stringclasses | 154 values |
| text | stringlengths | 1 to 900k |
| metadata | stringlengths | 2 to 348k |
| pipeline_tag | stringclasses | 45 values |
| id | stringlengths | 5 to 122 |
| tags | sequencelengths | 1 to 1.84k |
| created_at | stringlengths | 25 to 25 |
| arxiv | sequencelengths | 0 to 201 |
| languages | sequencelengths | 0 to 1.83k |
| tags_str | stringlengths | 17 to 9.34k |
| text_str | stringlengths | 0 to 389k |
| text_lists | sequencelengths | 0 to 722 |
| processed_texts | sequencelengths | 1 to 723 |
| tokens_length | sequencelengths | 1 to 723 |
| input_texts | sequencelengths | 1 to 61 |
| embeddings | sequencelengths | 768 to 768 |
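If this dump corresponds to a dataset hosted on the Hugging Face Hub, the fields listed above could be inspected roughly as in the sketch below. The dataset path is a placeholder, not a real identifier, and the snippet only illustrates how the schema maps onto records.

```python
from datasets import load_dataset

# "<user>/<dataset>" is a placeholder for wherever this dump is actually hosted.
ds = load_dataset("<user>/<dataset>", split="train")
print(ds.features)                  # field names and types, as in the table above
record = ds[0]
print(record["id"], record["created_at"])
print(len(record["embeddings"]))    # each record carries a 768-dimensional embedding
```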
sha: null
last_modified: null
library_name: transformers
text:
# MultiBERTs Seed 3 Checkpoint 80k (uncased)

MultiBERTs (pretrained BERT) model for seed 3, intermediate checkpoint at 80k steps, pretrained on English using a masked language modeling (MLM) objective. It was introduced in [this paper](https://arxiv.org/pdf/2106.16163.pdf) and first released in [this repository](https://github.com/google-research/language/tree/master/language/multiberts). This is an intermediate checkpoint; the final checkpoint can be found at [multiberts-seed-3](https://hf.co/multberts-seed-3). This model is uncased: it does not make a difference between english and English.

Disclaimer: The team releasing MultiBERTs did not write a model card for this model, so this model card has been written by [gchhablani](https://hf.co/gchhablani).

## Model description

MultiBERTs models are transformer models pretrained on a large corpus of English data in a self-supervised fashion. This means they were pretrained on the raw texts only, with no humans labelling them in any way (which is why they can use lots of publicly available data), with an automatic process to generate inputs and labels from those texts. More precisely, each model was pretrained with two objectives:

- Masked language modeling (MLM): taking a sentence, the model randomly masks 15% of the words in the input, then runs the entire masked sentence through the model and has to predict the masked words. This is different from traditional recurrent neural networks (RNNs), which usually see the words one after the other, and from autoregressive models like GPT, which internally mask the future tokens. It allows the model to learn a bidirectional representation of the sentence.
- Next sentence prediction (NSP): the model concatenates two masked sentences as inputs during pretraining. Sometimes they correspond to sentences that were next to each other in the original text, sometimes not. The model then has to predict whether the two sentences followed each other or not.

This way, the model learns an inner representation of the English language that can then be used to extract features useful for downstream tasks: if you have a dataset of labeled sentences, for instance, you can train a standard classifier using the features produced by the MultiBERTs model as inputs.

## Intended uses & limitations

You can use the raw model for either masked language modeling or next sentence prediction, but it's mostly intended to be fine-tuned on a downstream task. See the [model hub](https://huggingface.co/models?filter=multiberts) to look for fine-tuned versions on a task that interests you.

Note that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked) to make decisions, such as sequence classification, token classification or question answering. For tasks such as text generation you should look at models like GPT-2.

### How to use

Here is how to use this model to get the features of a given text in PyTorch:

```python
from transformers import BertTokenizer, BertModel

tokenizer = BertTokenizer.from_pretrained('multiberts-seed-3-80k')
model = BertModel.from_pretrained("multiberts-seed-3-80k")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='pt')
output = model(**encoded_input)
# output.last_hidden_state holds the contextual embedding of each token
```

### Limitations and bias

Even if the training data used for this model could be characterized as fairly neutral, this model can have biased predictions. This bias will also affect all fine-tuned versions of this model. For an understanding of the bias of this particular checkpoint, please try out this checkpoint with the snippet present in the [Limitations and bias section](https://huggingface.co/bert-base-uncased#limitations-and-bias) of the [bert-base-uncased](https://huggingface.co/bert-base-uncased) checkpoint.
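As a concrete way to follow that suggestion, a fill-mask probe along the lines of the bert-base-uncased snippet might look like the sketch below. This is illustrative only: it assumes the pretraining checkpoint's MLM head can be loaded by the `fill-mask` pipeline (a warning about the unused NSP head is expected), and the model identifier simply follows the card's own usage example.

```python
from transformers import pipeline

# Illustrative probe, mirroring the bert-base-uncased bias snippet; the model
# identifier follows the card's usage example above.
unmasker = pipeline('fill-mask', model='multiberts-seed-3-80k')
print(unmasker("The man worked as a [MASK]."))
print(unmasker("The woman worked as a [MASK]."))
```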
## Training data

The MultiBERTs models were pretrained on [BookCorpus](https://yknzhu.wixsite.com/mbweb), a dataset consisting of 11,038 unpublished books, and [English Wikipedia](https://en.wikipedia.org/wiki/English_Wikipedia) (excluding lists, tables and headers).

## Training procedure

### Preprocessing

The texts are lowercased and tokenized using WordPiece and a vocabulary size of 30,000. The inputs of the model are then of the form:

```
[CLS] Sentence A [SEP] Sentence B [SEP]
```

With probability 0.5, sentence A and sentence B correspond to two consecutive sentences in the original corpus; in the other cases, sentence B is another random sentence in the corpus. Note that what is considered a sentence here is a consecutive span of text usually longer than a single sentence. The only constraint is that the two "sentences" have a combined length of less than 512 tokens.

The details of the masking procedure for each sentence are the following:

- 15% of the tokens are masked.
- In 80% of the cases, the masked tokens are replaced by `[MASK]`.
- In 10% of the cases, the masked tokens are replaced by a random token (different from the one they replace).
- In the remaining 10% of the cases, the masked tokens are left as is.

### Pretraining

The full model was trained on 16 Cloud TPU v2 chips for two million steps with a batch size of 256. The sequence length was set to 512 throughout. The optimizer used is Adam with a learning rate of 1e-4, \\(\beta_{1} = 0.9\\) and \\(\beta_{2} = 0.999\\), a weight decay of 0.01, learning rate warmup for 10,000 steps and linear decay of the learning rate after.

### BibTeX entry and citation info

```bibtex
@article{DBLP:journals/corr/abs-2106-16163,
  author     = {Thibault Sellam and Steve Yadlowsky and Jason Wei and Naomi Saphra and
                Alexander D'Amour and Tal Linzen and Jasmijn Bastings and Iulia Turc and
                Jacob Eisenstein and Dipanjan Das and Ian Tenney and Ellie Pavlick},
  title      = {The MultiBERTs: {BERT} Reproductions for Robustness Analysis},
  journal    = {CoRR},
  volume     = {abs/2106.16163},
  year       = {2021},
  url        = {https://arxiv.org/abs/2106.16163},
  eprinttype = {arXiv},
  eprint     = {2106.16163},
  timestamp  = {Mon, 05 Jul 2021 15:15:50 +0200},
  biburl     = {https://dblp.org/rec/journals/corr/abs-2106-16163.bib},
  bibsource  = {dblp computer science bibliography, https://dblp.org}
}
```

<a href="https://huggingface.co/exbert/?model=multiberts">
	<img width="300px" src="https://cdn-media.huggingface.co/exbert/button.png">
</a>
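The Pretraining section of the card above specifies Adam with a 1e-4 peak learning rate, 10,000 warmup steps and linear decay over two million steps. A rough PyTorch sketch of such a schedule is given below; it is not the authors' original TF/TPU training code, and `AdamW` stands in for Adam with decoupled weight decay.

```python
import torch
from transformers import BertForPreTraining, get_linear_schedule_with_warmup

# Sketch of the optimizer and schedule described above, not the original setup.
model = BertForPreTraining.from_pretrained('multiberts-seed-3-80k')
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4,
                              betas=(0.9, 0.999), weight_decay=0.01)
scheduler = get_linear_schedule_with_warmup(
    optimizer, num_warmup_steps=10_000, num_training_steps=2_000_000)

# Inside the training loop, after loss.backward() on a batch of 256 sequences:
#   optimizer.step(); scheduler.step(); optimizer.zero_grad()
```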
{"language": "en", "license": "apache-2.0", "tags": ["exbert", "multiberts", "multiberts-seed-3"], "datasets": ["bookcorpus", "wikipedia"]}
null
MultiBertGunjanPatrick/multiberts-seed-3-80k
[ "transformers", "pytorch", "bert", "pretraining", "exbert", "multiberts", "multiberts-seed-3", "en", "dataset:bookcorpus", "dataset:wikipedia", "arxiv:2106.16163", "license:apache-2.0", "endpoints_compatible", "region:us" ]
2022-03-02T23:29:04+00:00
[ "2106.16163" ]
[ "en" ]
TAGS #transformers #pytorch #bert #pretraining #exbert #multiberts #multiberts-seed-3 #en #dataset-bookcorpus #dataset-wikipedia #arxiv-2106.16163 #license-apache-2.0 #endpoints_compatible #region-us
sha: null
last_modified: null
library_name: transformers
text:
# MultiBERTs Seed 3 Checkpoint 900k (uncased)

MultiBERTs (pretrained BERT) model for seed 3, intermediate checkpoint at 900k steps, pretrained on English using a masked language modeling (MLM) objective. It was introduced in [this paper](https://arxiv.org/pdf/2106.16163.pdf) and first released in [this repository](https://github.com/google-research/language/tree/master/language/multiberts). This is an intermediate checkpoint; the final checkpoint can be found at [multiberts-seed-3](https://hf.co/multberts-seed-3). This model is uncased: it does not make a difference between english and English.

Disclaimer: The team releasing MultiBERTs did not write a model card for this model, so this model card has been written by [gchhablani](https://hf.co/gchhablani).

## Model description

MultiBERTs models are transformer models pretrained on a large corpus of English data in a self-supervised fashion. This means they were pretrained on the raw texts only, with no humans labelling them in any way (which is why they can use lots of publicly available data), with an automatic process to generate inputs and labels from those texts. More precisely, each model was pretrained with two objectives:

- Masked language modeling (MLM): taking a sentence, the model randomly masks 15% of the words in the input, then runs the entire masked sentence through the model and has to predict the masked words. This is different from traditional recurrent neural networks (RNNs), which usually see the words one after the other, and from autoregressive models like GPT, which internally mask the future tokens. It allows the model to learn a bidirectional representation of the sentence.
- Next sentence prediction (NSP): the model concatenates two masked sentences as inputs during pretraining. Sometimes they correspond to sentences that were next to each other in the original text, sometimes not. The model then has to predict whether the two sentences followed each other or not.

This way, the model learns an inner representation of the English language that can then be used to extract features useful for downstream tasks: if you have a dataset of labeled sentences, for instance, you can train a standard classifier using the features produced by the MultiBERTs model as inputs.

## Intended uses & limitations

You can use the raw model for either masked language modeling or next sentence prediction, but it's mostly intended to be fine-tuned on a downstream task. See the [model hub](https://huggingface.co/models?filter=multiberts) to look for fine-tuned versions on a task that interests you.

Note that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked) to make decisions, such as sequence classification, token classification or question answering. For tasks such as text generation you should look at models like GPT-2.

### How to use

Here is how to use this model to get the features of a given text in PyTorch:

```python
from transformers import BertTokenizer, BertModel

tokenizer = BertTokenizer.from_pretrained('multiberts-seed-3-900k')
model = BertModel.from_pretrained("multiberts-seed-3-900k")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='pt')
output = model(**encoded_input)
# output.last_hidden_state holds the contextual embedding of each token
```

### Limitations and bias

Even if the training data used for this model could be characterized as fairly neutral, this model can have biased predictions. This bias will also affect all fine-tuned versions of this model. For an understanding of the bias of this particular checkpoint, please try out this checkpoint with the snippet present in the [Limitations and bias section](https://huggingface.co/bert-base-uncased#limitations-and-bias) of the [bert-base-uncased](https://huggingface.co/bert-base-uncased) checkpoint.

## Training data

The MultiBERTs models were pretrained on [BookCorpus](https://yknzhu.wixsite.com/mbweb), a dataset consisting of 11,038 unpublished books, and [English Wikipedia](https://en.wikipedia.org/wiki/English_Wikipedia) (excluding lists, tables and headers).

## Training procedure

### Preprocessing

The texts are lowercased and tokenized using WordPiece and a vocabulary size of 30,000. The inputs of the model are then of the form:

```
[CLS] Sentence A [SEP] Sentence B [SEP]
```

With probability 0.5, sentence A and sentence B correspond to two consecutive sentences in the original corpus; in the other cases, sentence B is another random sentence in the corpus. Note that what is considered a sentence here is a consecutive span of text usually longer than a single sentence. The only constraint is that the two "sentences" have a combined length of less than 512 tokens.

The details of the masking procedure for each sentence are the following:

- 15% of the tokens are masked.
- In 80% of the cases, the masked tokens are replaced by `[MASK]`.
- In 10% of the cases, the masked tokens are replaced by a random token (different from the one they replace).
- In the remaining 10% of the cases, the masked tokens are left as is.
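A minimal sketch of that 80/10/10 masking scheme, written in the style of the `transformers` masked-LM data collator, is shown below. It is illustrative only: it assumes a single 1-D tensor of token ids and is not the original TensorFlow preprocessing used for MultiBERTs.

```python
import torch

def mask_tokens(input_ids, tokenizer, mlm_probability=0.15):
    """Sketch of the masking described above: 15% of tokens are selected;
    of those, 80% become [MASK], 10% a random token, 10% stay unchanged."""
    labels = input_ids.clone()
    prob = torch.full(labels.shape, mlm_probability)
    special = torch.tensor(
        tokenizer.get_special_tokens_mask(labels.tolist(), already_has_special_tokens=True),
        dtype=torch.bool)
    prob.masked_fill_(special, 0.0)       # never mask [CLS]/[SEP]/[PAD]
    masked = torch.bernoulli(prob).bool()
    labels[~masked] = -100                # loss is only computed on masked positions

    # 80% of the masked positions: replace with [MASK]
    idx_replace = torch.bernoulli(torch.full(labels.shape, 0.8)).bool() & masked
    input_ids[idx_replace] = tokenizer.mask_token_id

    # 10%: replace with a random token (half of the remaining 20%)
    idx_random = torch.bernoulli(torch.full(labels.shape, 0.5)).bool() & masked & ~idx_replace
    input_ids[idx_random] = torch.randint(len(tokenizer), labels.shape, dtype=torch.long)[idx_random]

    # remaining 10%: keep the original token
    return input_ids, labels
```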
### Pretraining

The full model was trained on 16 Cloud TPU v2 chips for two million steps with a batch size of 256. The sequence length was set to 512 throughout. The optimizer used is Adam with a learning rate of 1e-4, \\(\beta_{1} = 0.9\\) and \\(\beta_{2} = 0.999\\), a weight decay of 0.01, learning rate warmup for 10,000 steps and linear decay of the learning rate after.

### BibTeX entry and citation info

```bibtex
@article{DBLP:journals/corr/abs-2106-16163,
  author     = {Thibault Sellam and Steve Yadlowsky and Jason Wei and Naomi Saphra and
                Alexander D'Amour and Tal Linzen and Jasmijn Bastings and Iulia Turc and
                Jacob Eisenstein and Dipanjan Das and Ian Tenney and Ellie Pavlick},
  title      = {The MultiBERTs: {BERT} Reproductions for Robustness Analysis},
  journal    = {CoRR},
  volume     = {abs/2106.16163},
  year       = {2021},
  url        = {https://arxiv.org/abs/2106.16163},
  eprinttype = {arXiv},
  eprint     = {2106.16163},
  timestamp  = {Mon, 05 Jul 2021 15:15:50 +0200},
  biburl     = {https://dblp.org/rec/journals/corr/abs-2106-16163.bib},
  bibsource  = {dblp computer science bibliography, https://dblp.org}
}
```

<a href="https://huggingface.co/exbert/?model=multiberts">
	<img width="300px" src="https://cdn-media.huggingface.co/exbert/button.png">
</a>
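Since the checkpoint is published with its pretraining heads, a hedged sketch of running a sentence pair through both objectives described in the card above (MLM and NSP) could look like the following. It assumes the checkpoint loads as `BertForPreTraining` and reuses the model identifier from the card's own snippet.

```python
import torch
from transformers import BertTokenizer, BertForPreTraining

# Assumes both pretraining heads are present; identifier as in the card's snippet.
tokenizer = BertTokenizer.from_pretrained('multiberts-seed-3-900k')
model = BertForPreTraining.from_pretrained('multiberts-seed-3-900k')

encoding = tokenizer("The cat sat on the mat.", "It was raining outside.", return_tensors='pt')
with torch.no_grad():
    outputs = model(**encoding)

print(outputs.prediction_logits.shape)        # MLM logits: (1, sequence_length, vocab_size)
print(outputs.seq_relationship_logits.shape)  # NSP logits over {is next, not next}: (1, 2)
```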
{"language": "en", "license": "apache-2.0", "tags": ["exbert", "multiberts", "multiberts-seed-3"], "datasets": ["bookcorpus", "wikipedia"]}
null
MultiBertGunjanPatrick/multiberts-seed-3-900k
[ "transformers", "pytorch", "bert", "pretraining", "exbert", "multiberts", "multiberts-seed-3", "en", "dataset:bookcorpus", "dataset:wikipedia", "arxiv:2106.16163", "license:apache-2.0", "endpoints_compatible", "region:us" ]
2022-03-02T23:29:04+00:00
[ "2106.16163" ]
[ "en" ]
TAGS #transformers #pytorch #bert #pretraining #exbert #multiberts #multiberts-seed-3 #en #dataset-bookcorpus #dataset-wikipedia #arxiv-2106.16163 #license-apache-2.0 #endpoints_compatible #region-us
For an understanding of bias of this particular\ncheckpoint, please try out this checkpoint with the snippet present in the Limitation and bias section of the bert-base-uncased checkpoint.", "## Training data\nThe MultiBERTs models were pretrained on BookCorpus, a dataset consisting of 11,038\nunpublished books and English Wikipedia (excluding lists, tables and\nheaders).", "## Training procedure", "### Preprocessing\nThe texts are lowercased and tokenized using WordPiece and a vocabulary size of 30,000. The inputs of the model are\nthen of the form:\n\nWith probability 0.5, sentence A and sentence B correspond to two consecutive sentences in the original corpus and in\nthe other cases, it's another random sentence in the corpus. Note that what is considered a sentence here is a\nconsecutive span of text usually longer than a single sentence. The only constrain is that the result with the two\n\"sentences\" has a combined length of less than 512 tokens.\nThe details of the masking procedure for each sentence are the following:\n- 15% of the tokens are masked.\n- In 80% of the cases, the masked tokens are replaced by '[MASK]'.\n- In 10% of the cases, the masked tokens are replaced by a random token (different) from the one they replace.\n- In the 10% remaining cases, the masked tokens are left as is.", "### Pretraining\nThe full model was trained on 16 Cloud TPU v2 chips for two million steps with a batch size\nof 256. The sequence length was set to 512 throughout. The optimizer\nused is Adam with a learning rate of 1e-4, \\\\(\\beta_{1} = 0.9\\\\) and \\\\(\\beta_{2} = 0.999\\\\), a weight decay of 0.01,\nlearning rate warmup for 10,000 steps and linear decay of the learning rate after.", "### BibTeX entry and citation info\n\n<a href=\"URL\n\t<img width=\"300px\" src=\"URL\n</a>" ]
[ 71, 145, 335, 134, 25, 95, 48, 3, 222, 111, 34 ]
[ "passage: TAGS\n#transformers #pytorch #bert #pretraining #exbert #multiberts #multiberts-seed-3 #en #dataset-bookcorpus #dataset-wikipedia #arxiv-2106.16163 #license-apache-2.0 #endpoints_compatible #region-us \n# MultiBERTs Seed 3 Checkpoint 900k (uncased)\nSeed 3 intermediate checkpoint 900k MultiBERTs (pretrained BERT) model on English language using a masked language modeling (MLM) objective. It was introduced in\nthis paper and first released in\nthis repository. This is an intermediate checkpoint.\nThe final checkpoint can be found at multiberts-seed-3. This model is uncased: it does not make a difference\nbetween english and English.\n\nDisclaimer: The team releasing MultiBERTs did not write a model card for this model so this model card has been written by gchhablani.", "passage: ## Model description\nMultiBERTs models are transformers model pretrained on a large corpus of English data in a self-supervised fashion. This means it\nwas pretrained on the raw texts only, with no humans labelling them in any way (which is why it can use lots of\npublicly available data) with an automatic process to generate inputs and labels from those texts. More precisely, it\nwas pretrained with two objectives:\n- Masked language modeling (MLM): taking a sentence, the model randomly masks 15% of the words in the input then run\n the entire masked sentence through the model and has to predict the masked words. This is different from traditional\n recurrent neural networks (RNNs) that usually see the words one after the other, or from autoregressive models like\n GPT which internally mask the future tokens. It allows the model to learn a bidirectional representation of the\n sentence.\n- Next sentence prediction (NSP): the models concatenates two masked sentences as inputs during pretraining. Sometimes\n they correspond to sentences that were next to each other in the original text, sometimes not. The model then has to\n predict if the two sentences were following each other or not.\nThis way, the model learns an inner representation of the English language that can then be used to extract features\nuseful for downstream tasks: if you have a dataset of labeled sentences for instance, you can train a standard\nclassifier using the features produced by the MultiBERTs model as inputs.## Intended uses & limitations\nYou can use the raw model for either masked language modeling or next sentence prediction, but it's mostly intended to\nbe fine-tuned on a downstream task. See the model hub to look for\nfine-tuned versions on a task that interests you.\nNote that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked)\nto make decisions, such as sequence classification, token classification or question answering. For tasks such as text\ngeneration you should look at model like GPT2.### How to use\nHere is how to use this model to get the features of a given text in PyTorch:### Limitations and bias\nEven if the training data used for this model could be characterized as fairly neutral, this model can have biased\npredictions. This bias will also affect all fine-tuned versions of this model. 
For an understanding of bias of this particular\ncheckpoint, please try out this checkpoint with the snippet present in the Limitation and bias section of the bert-base-uncased checkpoint.## Training data\nThe MultiBERTs models were pretrained on BookCorpus, a dataset consisting of 11,038\nunpublished books and English Wikipedia (excluding lists, tables and\nheaders).## Training procedure" ]
[ -0.08143794536590576, -0.004555530846118927, -0.0020912487525492907, 0.06764812767505646, 0.08582507818937302, 0.0022417809814214706, 0.11432460695505142, 0.04976816102862358, -0.03147760406136513, 0.023065051063895226, 0.0934152603149414, 0.03195204213261604, 0.04360140115022659, 0.06528615951538086, 0.09685984998941422, -0.25731247663497925, 0.05240368843078613, -0.06303417682647705, 0.06111994385719299, 0.0761318951845169, 0.1008247658610344, -0.0712416023015976, 0.062236059457063675, 0.0351845845580101, -0.08101536333560944, -0.015720760449767113, -0.01939615048468113, -0.035614971071481705, 0.09836728870868683, 0.06925266981124878, 0.06131282448768616, 0.0015827082097530365, 0.05812768638134003, -0.09078727662563324, 0.01586971990764141, 0.04419585317373276, -0.0007392261177301407, 0.022316712886095047, -0.008194232359528542, 0.016281144693493843, 0.10815439373254776, 0.040451500564813614, 0.08029116690158844, 0.03366836905479431, -0.09409371018409729, -0.10883268713951111, -0.08156367391347885, 0.10938043892383575, 0.057344358414411545, 0.041108764708042145, -0.004688453860580921, 0.0751345157623291, -0.03400561958551407, 0.0745716392993927, 0.10773874819278717, -0.25041794776916504, -0.00939851626753807, 0.06919603049755096, 0.04576797038316727, 0.040238603949546814, 0.014072378166019917, 0.02547314576804638, 0.004121650010347366, 0.043418530374765396, 0.029760699719190598, -0.02325424924492836, 0.11650604009628296, -0.04785793274641037, -0.1525195688009262, -0.04258997365832329, 0.11793868988752365, -0.0053222160786390305, -0.12468539923429489, -0.102211594581604, -0.02812143601477146, 0.11377216875553131, -0.0028365720063447952, -0.019205404445528984, -0.004194974899291992, 0.00972210057079792, 0.02515106275677681, -0.09406489133834839, -0.08536688983440399, -0.029813528060913086, -0.03624112904071808, 0.13183699548244476, 0.046943940222263336, 0.05096600204706192, -0.03319794684648514, 0.08892719447612762, -0.11717154830694199, -0.03873870521783829, -0.053872864693403244, -0.084737628698349, -0.01708778366446495, 0.008009240962564945, -0.026150785386562347, -0.08528504520654678, -0.05954962968826294, 0.11749105155467987, 0.04170321300625801, 0.030936721712350845, 0.0025701120030134916, 0.0412479043006897, 0.07413351535797119, 0.0969684049487114, -0.03999258577823639, 0.04938298463821411, 0.0338791087269783, -0.017395194619894028, 0.05989215895533562, -0.05098111927509308, -0.10079464316368103, 0.07857606559991837, 0.0018077241256833076, 0.04109633341431618, 0.027162715792655945, 0.03415431082248688, -0.010437815450131893, -0.07280078530311584, 0.16891202330589294, -0.07649552822113037, -0.011572462506592274, -0.019220490008592606, 0.011308984830975533, 0.049684323370456696, 0.03178403154015541, -0.005324453115463257, -0.04721236228942871, -0.0033441148698329926, -0.0546419657766819, -0.026449818164110184, -0.056207410991191864, -0.11667966097593307, -0.0005476037040352821, -0.041043005883693695, -0.03210017830133438, -0.14264580607414246, -0.21772164106369019, -0.020152553915977478, 0.06351933628320694, -0.0011536222882568836, -0.00863468274474144, 0.024436000734567642, 0.016919201239943504, -0.021343976259231567, 0.008018601685762405, -0.047862689942121506, -0.0011316966265439987, -0.0067098550498485565, -0.03292535990476608, 0.055533431470394135, -0.04277803748846054, 0.023567963391542435, -0.06988771259784698, 0.023205358535051346, -0.2123691886663437, 0.08880902826786041, -0.033950261771678925, 0.004555031657218933, -0.03653198480606079, -0.044732965528964996, 
0.009460832923650742, 0.047179196029901505, -0.00815266277641058, 0.11602772027254105, -0.13735294342041016, -0.047868143767118454, 0.17640522122383118, -0.1598421186208725, -0.004510227590799332, 0.098218634724617, -0.04748997837305069, 0.054747726768255234, 0.13288331031799316, 0.10008180886507034, 0.08675751090049744, -0.07551471143960953, 0.009535457007586956, 0.06119602173566818, -0.07005393505096436, 0.051324084401130676, 0.0890146940946579, -0.026819417253136635, -0.13492938876152039, 0.029827039688825607, -0.07742076367139816, -0.007131887599825859, -0.024296121671795845, -0.021516265347599983, 0.008947564288973808, -0.039330363273620605, 0.022414851933717728, 0.005084779113531113, 0.01731201447546482, -0.04027886316180229, -0.07931215316057205, 0.032459910959005356, 0.07606591284275055, -0.07111326605081558, 0.04463117569684982, -0.06946985423564911, 0.06368646770715714, -0.07384102791547775, -0.004883861169219017, -0.1647433042526245, -0.024446291849017143, 0.04517223685979843, -0.05157199874520302, 0.04854632914066315, 0.08482568711042404, 0.001574067398905754, 0.12119650095701218, -0.039837546646595, 0.0011754152365028858, -0.0058202240616083145, -0.008510123938322067, -0.0497908741235733, -0.11921977996826172, -0.08362243324518204, -0.06803171336650848, 0.09343548864126205, -0.0669950544834137, 0.028967412188649178, -0.07245385646820068, -0.02060471847653389, -0.01099332608282566, -0.05900581181049347, -0.0038658156991004944, 0.011520301923155785, -0.028821513056755066, -0.04727863520383835, 0.04906538873910904, 0.04947381839156151, -0.06173326075077057, 0.07463096082210541, -0.10500648617744446, -0.06034759059548378, 0.054852910339832306, 0.016083063557744026, -0.08272199332714081, 0.08809173107147217, -0.0197705440223217, -0.012879700399935246, -0.05562940239906311, -0.04595557600259781, 0.1935793161392212, -0.022233154624700546, 0.09930354356765747, -0.0923912525177002, 0.0008025779388844967, 0.026012243703007698, -0.04646805673837662, -0.02146150916814804, 0.056000273674726486, 0.050481945276260376, -0.18788839876651764, 0.013208417221903801, 0.05491220951080322, 0.075755774974823, 0.1133207231760025, 0.028167007490992546, -0.023448236286640167, -0.046164169907569885, -0.012057514861226082, 0.004369000904262066, 0.05447994917631149, -0.02450171858072281, -0.008412276394665241, 0.029611682519316673, 0.059109196066856384, 0.019466150552034378, -0.07958405464887619, 0.03407345712184906, 0.06586362421512604, -0.014813901856541634, -0.039150793105363846, -0.023915762081742287, -0.06141612306237221, 0.06132373958826065, 0.05581269785761833, 0.034970689564943314, 0.026561317965388298, -0.01581255905330181, -0.13512033224105835, 0.18929672241210938, -0.11314653605222702, -0.25871312618255615, -0.10877370089292526, -0.0582914873957634, -0.028788236901164055, 0.0400354228913784, 0.057530440390110016, -0.030204208567738533, -0.04358267784118652, -0.11805218458175659, 0.06088978052139282, -0.06389425694942474, -0.030714605003595352, -0.010404905304312706, -0.05265721678733826, -0.020433906465768814, -0.12776026129722595, -0.012913376092910767, -0.03258557245135307, -0.07309259474277496, 0.004494990222156048, -0.03592408820986748, 0.031080324202775955, 0.13708126544952393, 0.03660603612661362, -0.019549014046788216, -0.018393762409687042, 0.19195833802223206, 0.01157555915415287, 0.05869688466191292, 0.11424997448921204, -0.02807564288377762, 0.054947882890701294, 0.04597170650959015, 0.024420637637376785, -0.04860018193721771, 0.013578237034380436, -0.016360346227884293, 
-0.12204152345657349, -0.16942381858825684, -0.07075845450162888, -0.0035279402509331703, 0.003753114026039839, 0.018917083740234375, 0.03536376357078552, 0.021688129752874374, 0.040548257529735565, -0.029517676681280136, 0.022861359640955925, -0.013694621622562408, 0.08258141577243805, 0.02509375289082527, -0.07411522418260574, 0.09327234327793121, -0.06015866994857788, 0.016971731558442116, 0.10869932919740677, -0.058262307196855545, 0.18497416377067566, 0.023124776780605316, 0.04958049952983856, 0.10380546748638153, 0.01972566545009613, 0.05310555547475815, 0.08964479714632034, -0.04772872477769852, 0.004790754988789558, -0.06051979213953018, -0.05153588205575943, -0.03695619851350784, 0.047497592866420746, 0.03335203230381012, 0.01975177600979805, -0.12069404125213623, 0.019362542778253555, -0.0008821450173854828, 0.13949066400527954, 0.04553251340985298, -0.11849327385425568, -0.11980210244655609, 0.035950541496276855, -0.04662599787116051, -0.06123676896095276, 0.030645214021205902, 0.05296215042471886, -0.15445521473884583, 0.04865220561623573, -0.002333497628569603, 0.06492754817008972, -0.09220331907272339, 0.014395860023796558, -0.043861012905836105, -0.00047779176384210587, 0.005228156689554453, 0.06827044486999512, -0.13370932638645172, 0.1046128198504448, 0.020249124616384506, 0.051545120775699615, -0.07863865792751312, 0.0156819187104702, -0.010569249279797077, 0.10618853569030762, 0.11639247834682465, 0.04284125566482544, -0.046028025448322296, -0.021676935255527496, -0.04752889275550842, 0.019060032442212105, 0.05917566642165184, -0.07602004706859589, 0.06065240502357483, 0.00915971677750349, 0.008299453184008598, -0.023351557552814484, 0.017588764429092407, -0.1318107396364212, -0.12216376513242722, 0.06151674687862396, -0.07954136282205582, -0.09963815659284592, -0.05825250595808029, -0.06368368864059448, -0.052278876304626465, 0.2084806114435196, -0.11360080540180206, -0.09098774194717407, -0.09708380699157715, -0.020415008068084717, 0.047015998512506485, -0.06544354557991028, 0.046727925539016724, -0.03901217505335808, 0.08876962214708328, -0.04748766869306564, -0.10997167229652405, 0.03433351218700409, -0.11533740907907486, -0.1133987158536911, -0.043206632137298584, 0.10775698721408844, 0.11307866126298904, 0.03799901530146599, 0.012485522776842117, 0.010887270793318748, 0.0013161450624465942, -0.11974561214447021, 0.013536045327782631, 0.13220782577991486, -0.0053166840225458145, 0.0755961537361145, -0.05953696370124817, 0.024438925087451935, -0.018126685172319412, 0.00013326853513717651, 0.13192231953144073, 0.187887504696846, -0.06150489300489426, 0.1743467152118683, 0.20512820780277252, -0.10442409664392471, -0.1916818618774414, -0.054520998150110245, -0.0017516547814011574, 0.043401092290878296, 0.04777352139353752, -0.17959484457969666, 0.0892709493637085, 0.03665483742952347, -0.032461702823638916, 0.012867826968431473, -0.23477977514266968, -0.11300037801265717, 0.09250868856906891, 0.05850783362984657, 0.18421629071235657, -0.08072250336408615, -0.03810448944568634, -0.017291419208049774, -0.03761947900056839, 0.04985218495130539, -0.03194192796945572, 0.09156659245491028, 0.004203053191304207, -0.03537977486848831, 0.0022820597514510155, -0.032794289290905, 0.09502513706684113, 0.040380191057920456, 0.023602822795510292, -0.07217425853013992, -0.006007073447108269, 0.11176122725009918, -0.03992199897766113, 0.10063166916370392, 0.04471109062433243, 0.07440772652626038, -0.0962371751666069, -0.060143325477838516, -0.07702743262052536, 0.04401400685310364, 
-0.04164427891373634, -0.05591123551130295, -0.06476821005344391, 0.06002061441540718, 0.03892733156681061, 0.010032868012785912, 0.001890968531370163, -0.03849225118756294, 0.04367368295788765, 0.08807680755853653, 0.08353519439697266, -0.0360456146299839, -0.07335712015628815, -0.05105755105614662, -0.050159718841314316, 0.06555348634719849, -0.08687514066696167, 0.016044041141867638, 0.026930617168545723, 0.010558093897998333, 0.08802966773509979, 0.03425568342208862, -0.13724133372306824, 0.011554375290870667, 0.035262785851955414, -0.12213465571403503, -0.10598991066217422, -0.01918385736644268, 0.028572041541337967, -0.039443377405405045, 0.055874478071928024, 0.14260965585708618, -0.03466041386127472, -0.03162921592593193, -0.04809371381998062, 0.03866133466362953, -0.020838450640439987, 0.050516217947006226, 0.06571704149246216, 0.029722342267632484, -0.07215946912765503, 0.07599365711212158, 0.03536399081349373, -0.03532329201698303, 0.04255551099777222, 0.042159512639045715, -0.09452205896377563, -0.07690184563398361, -0.062447477132081985, 0.08796572685241699, -0.02673574723303318, -0.045893266797065735, -0.0005224309861660004, -0.08318279683589935, 0.06738658994436264, 0.07622046023607254, 0.04965696856379509, 0.03582172095775604, -0.08772741258144379, 0.016098082065582275, -0.05301845446228981, 0.0329715870320797, -0.030774017795920372, -0.004636801779270172, -0.05692290514707565, 0.06273537874221802, 0.06515809893608093, 0.09758037328720093, -0.03424715995788574, -0.07540200650691986, -0.08483893424272537, -0.012613601982593536, -0.06069479137659073, -0.03642932325601578, -0.07929722964763641, -0.0040494888089597225, 0.0008097067475318909, -0.0018942449241876602, 0.02146761305630207, 0.03639762103557587, -0.04175470396876335, -0.019090840592980385, -0.035476721823215485, 0.03690185397863388, -0.060668643563985825, 0.00667334720492363, 0.01497364416718483, -0.035605642944574356, 0.09214356541633606, 0.03449634090065956, -0.009937901049852371, 0.046811942011117935, -0.02406768873333931, 0.033724550157785416, -0.019676975905895233, 0.001061118207871914, -0.02449202910065651, -0.10746815800666809, -0.005927306599915028, 0.006048059090971947, -0.02520514465868473, 0.011955523863434792, 0.05701438710093498, -0.07326263189315796, 0.08597202599048615, 0.046289145946502686, -0.02996470406651497, -0.0704135149717331, 0.04241958633065224, -0.010326327756047249, 0.03077266924083233, 0.07206780463457108, -0.03489796817302704, 0.053242817521095276, -0.09581209719181061, -0.027900613844394684, 0.003360771806910634, -0.004684071987867355, -0.011114019900560379, -0.05155543237924576, -0.004201142117381096, 0.008460341952741146, 0.17717736959457397, -0.025404009968042374, 0.036680012941360474, 0.014746447093784809, 0.010619808919727802, 0.05262036249041557, -0.0138147734105587, 0.07323966920375824, -0.0076360804960131645, -0.026331467553973198, -0.011863325722515583, 0.03815840184688568, 0.005111638456583023, 0.00016025640070438385, 0.1411931812763214, 0.0462300144135952, 0.0886395052075386, 0.07699568569660187, 0.01844830997288227, 0.01822442002594471, -0.12860065698623657, -0.09615495800971985, 0.00865347869694233, 0.0566333569586277, -0.01765892095863819, 0.01167486421763897, 0.089539535343647, -0.08768169581890106, 0.06844084709882736, 0.05031583830714226, -0.048178791999816895, -0.12487170100212097, -0.1905999630689621, -0.022758029401302338, -0.028313277289271355, -0.009903186932206154, -0.09046930074691772, 0.014834782108664513, 0.08880283683538437, 0.024809736758470535, 
-0.010252379812300205, 0.0977875292301178, -0.10576118528842926, -0.029905380681157112, 0.045474059879779816, -0.0283864364027977, 0.014283807948231697, 0.04874829202890396, 0.02408856526017189, -0.006930584087967873, 0.04217899218201637, 0.04066520184278488, 0.043630462139844894, 0.023936662822961807, 0.04989909380674362, -0.02309870347380638, -0.074022077023983, -0.03204130381345749, -0.003767118090763688, 0.056341320276260376, 0.13851508498191833, 0.023794250562787056, -0.06937722861766815, 0.006453561130911112, 0.10644082725048065, -0.03149113431572914, -0.05045534297823906, -0.10656388103961945, 0.24052917957305908, 0.02453972026705742, 0.0027708776760846376, -0.005112756043672562, -0.04643368348479271, 0.0019705817103385925, 0.2109236866235733, 0.22332531213760376, 0.0036837183870375156, -0.008925946429371834, 0.009163867682218552, -0.01067386008799076, 0.03630444407463074, 0.14631612598896027, 0.006139824166893959, 0.2525777816772461, -0.04634387791156769, 0.04256670922040939, -0.040940552949905396, -0.03971795365214348, -0.0977664440870285, 0.06865542382001877, -0.006311000790446997, 0.00785383302718401, -0.03271261975169182, 0.0720125213265419, -0.04316249117255211, -0.16864901781082153, -0.0011357422918081284, -0.00249536894261837, -0.06230893358588219, 0.010996237397193909, -0.0028661973774433136, 0.01982467994093895, 0.08406845480203629, -0.01595863327383995, -0.005714536178857088, 0.12644746899604797, 0.019116681069135666, -0.0981961190700531, -0.0635354295372963, 0.11648992449045181, 0.02494100108742714, 0.14275123178958893, 0.011458730325102806, 0.07877369225025177, 0.08728685975074768, 0.020779838785529137, -0.09586082398891449, 0.04596036672592163, -0.01885215938091278, -0.028811519965529442, 0.008353222161531448, 0.10729385167360306, -0.007612945977598429, 0.06162283197045326, 0.026403294876217842, -0.09174834191799164, 0.06232548505067825, 0.009590614587068558, -0.03428466618061066, -0.08336402475833893, 0.08670960366725922, -0.0917711853981018, 0.15820351243019104, 0.12276995927095413, -0.015126903541386127, -0.04831787943840027, -0.02822919934988022, 0.01993974670767784, -0.0007620695978403091, 0.05692518129944801, -0.02652590349316597, -0.13444039225578308, 0.02009243704378605, -0.09020599722862244, 0.025663994252681732, -0.24740606546401978, -0.08968581259250641, 0.029773283749818802, -0.018693797290325165, -0.019865985959768295, 0.05116983503103256, 0.043334443122148514, 0.028271552175283432, -0.03548876941204071, 0.021916110068559647, -0.03964284434914589, 0.057894833385944366, -0.11283251643180847, -0.09356582164764404 ]
null
null
transformers
# MultiBERTs Seed 0 (uncased) Seed 0 MultiBERTs (pretrained BERT) model on English language using a masked language modeling (MLM) objective. It was introduced in [this paper](https://arxiv.org/pdf/2106.16163.pdf) and first released in [this repository](https://github.com/google-research/language/tree/master/language/multiberts). This model is uncased: it does not make a difference between english and English. Disclaimer: The team releasing MultiBERTs did not write a model card for this model so this model card has been written by [gchhablani](https://hf.co/gchhablani). ## Model description MultiBERTs models are transformers model pretrained on a large corpus of English data in a self-supervised fashion. This means it was pretrained on the raw texts only, with no humans labelling them in any way (which is why it can use lots of publicly available data) with an automatic process to generate inputs and labels from those texts. More precisely, it was pretrained with two objectives: - Masked language modeling (MLM): taking a sentence, the model randomly masks 15% of the words in the input then run the entire masked sentence through the model and has to predict the masked words. This is different from traditional recurrent neural networks (RNNs) that usually see the words one after the other, or from autoregressive models like GPT which internally mask the future tokens. It allows the model to learn a bidirectional representation of the sentence. - Next sentence prediction (NSP): the models concatenates two masked sentences as inputs during pretraining. Sometimes they correspond to sentences that were next to each other in the original text, sometimes not. The model then has to predict if the two sentences were following each other or not. This way, the model learns an inner representation of the English language that can then be used to extract features useful for downstream tasks: if you have a dataset of labeled sentences for instance, you can train a standard classifier using the features produced by the MultiBERTs model as inputs. ## Intended uses & limitations You can use the raw model for either masked language modeling or next sentence prediction, but it's mostly intended to be fine-tuned on a downstream task. See the [model hub](https://huggingface.co/models?filter=multiberts) to look for fine-tuned versions on a task that interests you. Note that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked) to make decisions, such as sequence classification, token classification or question answering. For tasks such as text generation you should look at model like GPT2. ### How to use Here is how to use this model to get the features of a given text in PyTorch: ```python from transformers import BertTokenizer, BertModel tokenizer = BertTokenizer.from_pretrained('multiberts-seed-0') model = BertModel.from_pretrained("multiberts-seed-0") text = "Replace me by any text you'd like." encoded_input = tokenizer(text, return_tensors='pt') output = model(**encoded_input) ``` ### Limitations and bias Even if the training data used for this model could be characterized as fairly neutral, this model can have biased predictions. This bias will also affect all fine-tuned versions of this model. 
For an understanding of bias of this particular checkpoint, please try out this checkpoint with the snippet present in the [Limitation and bias section](https://huggingface.co/bert-base-uncased#limitations-and-bias) of the [bert-base-uncased](https://huggingface.co/bert-base-uncased) checkpoint. ## Training data The MultiBERTs models were pretrained on [BookCorpus](https://yknzhu.wixsite.com/mbweb), a dataset consisting of 11,038 unpublished books and [English Wikipedia](https://en.wikipedia.org/wiki/English_Wikipedia) (excluding lists, tables and headers). ## Training procedure ### Preprocessing The texts are lowercased and tokenized using WordPiece and a vocabulary size of 30,000. The inputs of the model are then of the form: ``` [CLS] Sentence A [SEP] Sentence B [SEP] ``` With probability 0.5, sentence A and sentence B correspond to two consecutive sentences in the original corpus and in the other cases, it's another random sentence in the corpus. Note that what is considered a sentence here is a consecutive span of text usually longer than a single sentence. The only constraint is that the result with the two "sentences" has a combined length of less than 512 tokens. The details of the masking procedure for each sentence are the following: - 15% of the tokens are masked. - In 80% of the cases, the masked tokens are replaced by `[MASK]`. - In 10% of the cases, the masked tokens are replaced by a random token (different from the one they replace). - In the 10% remaining cases, the masked tokens are left as is. ### Pretraining The model was trained on 16 Cloud TPU v2 chips for two million steps with a batch size of 256. The sequence length was set to 512 throughout. The optimizer used is Adam with a learning rate of 1e-4, \\(\beta_{1} = 0.9\\) and \\(\beta_{2} = 0.999\\), a weight decay of 0.01, learning rate warmup for 10,000 steps and linear decay of the learning rate after. ### BibTeX entry and citation info ```bibtex @article{DBLP:journals/corr/abs-2106-16163, author = {Thibault Sellam and Steve Yadlowsky and Jason Wei and Naomi Saphra and Alexander D'Amour and Tal Linzen and Jasmijn Bastings and Iulia Turc and Jacob Eisenstein and Dipanjan Das and Ian Tenney and Ellie Pavlick}, title = {The MultiBERTs: {BERT} Reproductions for Robustness Analysis}, journal = {CoRR}, volume = {abs/2106.16163}, year = {2021}, url = {https://arxiv.org/abs/2106.16163}, eprinttype = {arXiv}, eprint = {2106.16163}, timestamp = {Mon, 05 Jul 2021 15:15:50 +0200}, biburl = {https://dblp.org/rec/journals/corr/abs-2106-16163.bib}, bibsource = {dblp computer science bibliography, https://dblp.org} } ``` <a href="https://huggingface.co/exbert/?model=multiberts"> <img width="300px" src="https://cdn-media.huggingface.co/exbert/button.png"> </a>
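The warm-up-then-linear-decay schedule mentioned in the pretraining section can be written out explicitly. The helper below is a minimal sketch under the hyperparameters quoted above (base rate 1e-4, 10,000 warm-up steps, two million total steps); the function name is ours and is not taken from the MultiBERTs training code.

```python
def learning_rate(step, base_lr=1e-4, warmup_steps=10_000, total_steps=2_000_000):
    """Linear warm-up for `warmup_steps`, then linear decay to zero."""
    if step < warmup_steps:
        return base_lr * step / warmup_steps
    return base_lr * max(0.0, (total_steps - step) / (total_steps - warmup_steps))

# learning_rate(5_000)     -> 5e-05 (halfway through warm-up)
# learning_rate(10_000)    -> 1e-04 (peak rate)
# learning_rate(2_000_000) -> 0.0   (fully decayed)
```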
{"language": "en", "license": "apache-2.0", "tags": ["exbert", "multiberts"], "datasets": ["bookcorpus", "wikipedia"]}
null
MultiBertGunjanPatrick/multiberts-seed-3
[ "transformers", "pytorch", "bert", "pretraining", "exbert", "multiberts", "en", "dataset:bookcorpus", "dataset:wikipedia", "arxiv:2106.16163", "license:apache-2.0", "endpoints_compatible", "region:us" ]
2022-03-02T23:29:04+00:00
[ "2106.16163" ]
[ "en" ]
TAGS #transformers #pytorch #bert #pretraining #exbert #multiberts #en #dataset-bookcorpus #dataset-wikipedia #arxiv-2106.16163 #license-apache-2.0 #endpoints_compatible #region-us
# MultiBERTs Seed 0 (uncased) Seed 0 MultiBERTs (pretrained BERT) model on English language using a masked language modeling (MLM) objective. It was introduced in this paper and first released in this repository. This model is uncased: it does not make a difference between english and English. Disclaimer: The team releasing MultiBERTs did not write a model card for this model so this model card has been written by gchhablani. ## Model description MultiBERTs models are transformers model pretrained on a large corpus of English data in a self-supervised fashion. This means it was pretrained on the raw texts only, with no humans labelling them in any way (which is why it can use lots of publicly available data) with an automatic process to generate inputs and labels from those texts. More precisely, it was pretrained with two objectives: - Masked language modeling (MLM): taking a sentence, the model randomly masks 15% of the words in the input then run the entire masked sentence through the model and has to predict the masked words. This is different from traditional recurrent neural networks (RNNs) that usually see the words one after the other, or from autoregressive models like GPT which internally mask the future tokens. It allows the model to learn a bidirectional representation of the sentence. - Next sentence prediction (NSP): the models concatenates two masked sentences as inputs during pretraining. Sometimes they correspond to sentences that were next to each other in the original text, sometimes not. The model then has to predict if the two sentences were following each other or not. This way, the model learns an inner representation of the English language that can then be used to extract features useful for downstream tasks: if you have a dataset of labeled sentences for instance, you can train a standard classifier using the features produced by the MultiBERTs model as inputs. ## Intended uses & limitations You can use the raw model for either masked language modeling or next sentence prediction, but it's mostly intended to be fine-tuned on a downstream task. See the model hub to look for fine-tuned versions on a task that interests you. Note that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked) to make decisions, such as sequence classification, token classification or question answering. For tasks such as text generation you should look at model like GPT2. ### How to use Here is how to use this model to get the features of a given text in PyTorch: ### Limitations and bias Even if the training data used for this model could be characterized as fairly neutral, this model can have biased predictions. This bias will also affect all fine-tuned versions of this model. For an understanding of bias of this particular checkpoint, please try out this checkpoint with the snippet present in the Limitation and bias section of the bert-base-uncased checkpoint. ## Training data The MultiBERTs models were pretrained on BookCorpus, a dataset consisting of 11,038 unpublished books and English Wikipedia (excluding lists, tables and headers). ## Training procedure ### Preprocessing The texts are lowercased and tokenized using WordPiece and a vocabulary size of 30,000. The inputs of the model are then of the form: With probability 0.5, sentence A and sentence B correspond to two consecutive sentences in the original corpus and in the other cases, it's another random sentence in the corpus. 
Note that what is considered a sentence here is a consecutive span of text usually longer than a single sentence. The only constraint is that the result with the two "sentences" has a combined length of less than 512 tokens. The details of the masking procedure for each sentence are the following: - 15% of the tokens are masked. - In 80% of the cases, the masked tokens are replaced by '[MASK]'. - In 10% of the cases, the masked tokens are replaced by a random token (different from the one they replace). - In the 10% remaining cases, the masked tokens are left as is. ### Pretraining The model was trained on 16 Cloud TPU v2 chips for two million steps with a batch size of 256. The sequence length was set to 512 throughout. The optimizer used is Adam with a learning rate of 1e-4, \\(\beta_{1} = 0.9\\) and \\(\beta_{2} = 0.999\\), a weight decay of 0.01, learning rate warmup for 10,000 steps and linear decay of the learning rate after. ### BibTeX entry and citation info <a href="URL"> <img width="300px" src="URL"> </a>
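The sentence-pair construction described in the preprocessing section can also be illustrated with a short sketch. It assumes the corpus has already been split into token-id segments; the function name and the truncation strategy are assumptions made for illustration, chosen only to mirror the 0.5 next-segment probability and the 512-token limit stated above.

```python
import random

def make_sentence_pair(segments, index, max_tokens=512):
    """Build one (sentence A, sentence B, is_next) example from tokenized segments."""
    sent_a = list(segments[index])
    if random.random() < 0.5 and index + 1 < len(segments):
        sent_b, is_next = list(segments[index + 1]), True        # actual next segment
    else:
        sent_b, is_next = list(random.choice(segments)), False   # random segment from the corpus
    budget = max_tokens - 3           # leave room for [CLS] ... [SEP] ... [SEP]
    while len(sent_a) + len(sent_b) > budget:
        longer = sent_a if len(sent_a) > len(sent_b) else sent_b
        longer.pop()                  # truncate the longer side until the pair fits
    return sent_a, sent_b, is_next
```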
[ "# MultiBERTs Seed 0 (uncased)\n\nSeed 0 MultiBERTs (pretrained BERT) model on English language using a masked language modeling (MLM) objective. It was introduced in\nthis paper and first released in\nthis repository. This model is uncased: it does not make a difference\nbetween english and English.\n\nDisclaimer: The team releasing MultiBERTs did not write a model card for this model so this model card has been written by gchhablani.", "## Model description\n\nMultiBERTs models are transformers model pretrained on a large corpus of English data in a self-supervised fashion. This means it\nwas pretrained on the raw texts only, with no humans labelling them in any way (which is why it can use lots of\npublicly available data) with an automatic process to generate inputs and labels from those texts. More precisely, it\nwas pretrained with two objectives:\n\n- Masked language modeling (MLM): taking a sentence, the model randomly masks 15% of the words in the input then run\n the entire masked sentence through the model and has to predict the masked words. This is different from traditional\n recurrent neural networks (RNNs) that usually see the words one after the other, or from autoregressive models like\n GPT which internally mask the future tokens. It allows the model to learn a bidirectional representation of the\n sentence.\n- Next sentence prediction (NSP): the models concatenates two masked sentences as inputs during pretraining. Sometimes\n they correspond to sentences that were next to each other in the original text, sometimes not. The model then has to\n predict if the two sentences were following each other or not.\n This way, the model learns an inner representation of the English language that can then be used to extract features\n useful for downstream tasks: if you have a dataset of labeled sentences for instance, you can train a standard\n classifier using the features produced by the MultiBERTs model as inputs.", "## Intended uses & limitations\n\nYou can use the raw model for either masked language modeling or next sentence prediction, but it's mostly intended to\nbe fine-tuned on a downstream task. See the model hub to look for\nfine-tuned versions on a task that interests you.\nNote that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked)\nto make decisions, such as sequence classification, token classification or question answering. For tasks such as text\ngeneration you should look at model like GPT2.", "### How to use\n\nHere is how to use this model to get the features of a given text in PyTorch:", "### Limitations and bias\n\nEven if the training data used for this model could be characterized as fairly neutral, this model can have biased\npredictions. This bias will also affect all fine-tuned versions of this model. For an understanding of bias of this particular\ncheckpoint, please try out this checkpoint with the snippet present in the Limitation and bias section of the bert-base-uncased checkpoint.", "## Training data\n\nThe MultiBERTs models were pretrained on BookCorpus, a dataset consisting of 11,038\nunpublished books and English Wikipedia (excluding lists, tables and\nheaders).", "## Training procedure", "### Preprocessing\n\nThe texts are lowercased and tokenized using WordPiece and a vocabulary size of 30,000. 
The inputs of the model are\nthen of the form:\n\n\n\nWith probability 0.5, sentence A and sentence B correspond to two consecutive sentences in the original corpus and in\nthe other cases, it's another random sentence in the corpus. Note that what is considered a sentence here is a\nconsecutive span of text usually longer than a single sentence. The only constrain is that the result with the two\n\"sentences\" has a combined length of less than 512 tokens.\nThe details of the masking procedure for each sentence are the following:\n\n- 15% of the tokens are masked.\n- In 80% of the cases, the masked tokens are replaced by '[MASK]'.\n- In 10% of the cases, the masked tokens are replaced by a random token (different) from the one they replace.\n- In the 10% remaining cases, the masked tokens are left as is.", "### Pretraining\n\nThe model was trained on 16 Cloud TPU v2 chips for two million steps with a batch size\nof 256. The sequence length was set to 512 throughout. The optimizer\nused is Adam with a learning rate of 1e-4, \\\\(\\beta_{1} = 0.9\\\\) and \\\\(\\beta_{2} = 0.999\\\\), a weight decay of 0.01,\nlearning rate warmup for 10,000 steps and linear decay of the learning rate after.", "### BibTeX entry and citation info\n\n\n\n<a href=\"URL\n\t<img width=\"300px\" src=\"URL\n</a>" ]
[ "TAGS\n#transformers #pytorch #bert #pretraining #exbert #multiberts #en #dataset-bookcorpus #dataset-wikipedia #arxiv-2106.16163 #license-apache-2.0 #endpoints_compatible #region-us \n", "# MultiBERTs Seed 0 (uncased)\n\nSeed 0 MultiBERTs (pretrained BERT) model on English language using a masked language modeling (MLM) objective. It was introduced in\nthis paper and first released in\nthis repository. This model is uncased: it does not make a difference\nbetween english and English.\n\nDisclaimer: The team releasing MultiBERTs did not write a model card for this model so this model card has been written by gchhablani.", "## Model description\n\nMultiBERTs models are transformers model pretrained on a large corpus of English data in a self-supervised fashion. This means it\nwas pretrained on the raw texts only, with no humans labelling them in any way (which is why it can use lots of\npublicly available data) with an automatic process to generate inputs and labels from those texts. More precisely, it\nwas pretrained with two objectives:\n\n- Masked language modeling (MLM): taking a sentence, the model randomly masks 15% of the words in the input then run\n the entire masked sentence through the model and has to predict the masked words. This is different from traditional\n recurrent neural networks (RNNs) that usually see the words one after the other, or from autoregressive models like\n GPT which internally mask the future tokens. It allows the model to learn a bidirectional representation of the\n sentence.\n- Next sentence prediction (NSP): the models concatenates two masked sentences as inputs during pretraining. Sometimes\n they correspond to sentences that were next to each other in the original text, sometimes not. The model then has to\n predict if the two sentences were following each other or not.\n This way, the model learns an inner representation of the English language that can then be used to extract features\n useful for downstream tasks: if you have a dataset of labeled sentences for instance, you can train a standard\n classifier using the features produced by the MultiBERTs model as inputs.", "## Intended uses & limitations\n\nYou can use the raw model for either masked language modeling or next sentence prediction, but it's mostly intended to\nbe fine-tuned on a downstream task. See the model hub to look for\nfine-tuned versions on a task that interests you.\nNote that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked)\nto make decisions, such as sequence classification, token classification or question answering. For tasks such as text\ngeneration you should look at model like GPT2.", "### How to use\n\nHere is how to use this model to get the features of a given text in PyTorch:", "### Limitations and bias\n\nEven if the training data used for this model could be characterized as fairly neutral, this model can have biased\npredictions. This bias will also affect all fine-tuned versions of this model. For an understanding of bias of this particular\ncheckpoint, please try out this checkpoint with the snippet present in the Limitation and bias section of the bert-base-uncased checkpoint.", "## Training data\n\nThe MultiBERTs models were pretrained on BookCorpus, a dataset consisting of 11,038\nunpublished books and English Wikipedia (excluding lists, tables and\nheaders).", "## Training procedure", "### Preprocessing\n\nThe texts are lowercased and tokenized using WordPiece and a vocabulary size of 30,000. 
The inputs of the model are\nthen of the form:\n\n\n\nWith probability 0.5, sentence A and sentence B correspond to two consecutive sentences in the original corpus and in\nthe other cases, it's another random sentence in the corpus. Note that what is considered a sentence here is a\nconsecutive span of text usually longer than a single sentence. The only constrain is that the result with the two\n\"sentences\" has a combined length of less than 512 tokens.\nThe details of the masking procedure for each sentence are the following:\n\n- 15% of the tokens are masked.\n- In 80% of the cases, the masked tokens are replaced by '[MASK]'.\n- In 10% of the cases, the masked tokens are replaced by a random token (different) from the one they replace.\n- In the 10% remaining cases, the masked tokens are left as is.", "### Pretraining\n\nThe model was trained on 16 Cloud TPU v2 chips for two million steps with a batch size\nof 256. The sequence length was set to 512 throughout. The optimizer\nused is Adam with a learning rate of 1e-4, \\\\(\\beta_{1} = 0.9\\\\) and \\\\(\\beta_{2} = 0.999\\\\), a weight decay of 0.01,\nlearning rate warmup for 10,000 steps and linear decay of the learning rate after.", "### BibTeX entry and citation info\n\n\n\n<a href=\"URL\n\t<img width=\"300px\" src=\"URL\n</a>" ]
[ 63, 111, 335, 134, 25, 95, 48, 3, 222, 110, 34 ]
[ "passage: TAGS\n#transformers #pytorch #bert #pretraining #exbert #multiberts #en #dataset-bookcorpus #dataset-wikipedia #arxiv-2106.16163 #license-apache-2.0 #endpoints_compatible #region-us \n# MultiBERTs Seed 0 (uncased)\n\nSeed 0 MultiBERTs (pretrained BERT) model on English language using a masked language modeling (MLM) objective. It was introduced in\nthis paper and first released in\nthis repository. This model is uncased: it does not make a difference\nbetween english and English.\n\nDisclaimer: The team releasing MultiBERTs did not write a model card for this model so this model card has been written by gchhablani.", "passage: ## Model description\n\nMultiBERTs models are transformers model pretrained on a large corpus of English data in a self-supervised fashion. This means it\nwas pretrained on the raw texts only, with no humans labelling them in any way (which is why it can use lots of\npublicly available data) with an automatic process to generate inputs and labels from those texts. More precisely, it\nwas pretrained with two objectives:\n\n- Masked language modeling (MLM): taking a sentence, the model randomly masks 15% of the words in the input then run\n the entire masked sentence through the model and has to predict the masked words. This is different from traditional\n recurrent neural networks (RNNs) that usually see the words one after the other, or from autoregressive models like\n GPT which internally mask the future tokens. It allows the model to learn a bidirectional representation of the\n sentence.\n- Next sentence prediction (NSP): the models concatenates two masked sentences as inputs during pretraining. Sometimes\n they correspond to sentences that were next to each other in the original text, sometimes not. The model then has to\n predict if the two sentences were following each other or not.\n This way, the model learns an inner representation of the English language that can then be used to extract features\n useful for downstream tasks: if you have a dataset of labeled sentences for instance, you can train a standard\n classifier using the features produced by the MultiBERTs model as inputs.## Intended uses & limitations\n\nYou can use the raw model for either masked language modeling or next sentence prediction, but it's mostly intended to\nbe fine-tuned on a downstream task. See the model hub to look for\nfine-tuned versions on a task that interests you.\nNote that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked)\nto make decisions, such as sequence classification, token classification or question answering. For tasks such as text\ngeneration you should look at model like GPT2.### How to use\n\nHere is how to use this model to get the features of a given text in PyTorch:### Limitations and bias\n\nEven if the training data used for this model could be characterized as fairly neutral, this model can have biased\npredictions. This bias will also affect all fine-tuned versions of this model. For an understanding of bias of this particular\ncheckpoint, please try out this checkpoint with the snippet present in the Limitation and bias section of the bert-base-uncased checkpoint.## Training data\n\nThe MultiBERTs models were pretrained on BookCorpus, a dataset consisting of 11,038\nunpublished books and English Wikipedia (excluding lists, tables and\nheaders).## Training procedure" ]
[ -0.06825247406959534, 0.027449268847703934, -0.0021626802626997232, 0.09413602948188782, 0.07635393738746643, 0.026495488360524178, 0.15437674522399902, 0.029963307082653046, -0.03573239967226982, 0.021267801523208618, 0.10619504749774933, 0.03782356157898903, 0.03388210013508797, 0.035308390855789185, 0.066785529255867, -0.2578813433647156, 0.07567903399467468, -0.05793163925409317, 0.040864333510398865, 0.059090327471494675, 0.10602577030658722, -0.07069262117147446, 0.07895290851593018, 0.04403890669345856, -0.0756942480802536, -0.027663996443152428, -0.005503433756530285, -0.034674178808927536, 0.07060743123292923, 0.09438986331224442, 0.05877054110169411, -0.008264455944299698, 0.05975931137800217, -0.087635338306427, 0.019257638603448868, 0.024562222883105278, -0.007006383966654539, 0.036696210503578186, 0.025804642587900162, -0.009673221036791801, 0.11283443868160248, 0.02619457244873047, 0.08560121059417725, 0.04041407257318497, -0.08754345774650574, -0.09977805614471436, -0.0694802924990654, 0.09317219257354736, 0.02764834463596344, 0.04353900998830795, -0.0063711777329444885, 0.07313166558742523, -0.006663286592811346, 0.058924756944179535, 0.08212147653102875, -0.23674309253692627, -0.023082595318555832, 0.05118638277053833, 0.04846370965242386, 0.04278615117073059, 0.013536407612264156, 0.031959742307662964, 0.005570597946643829, 0.04724816232919693, 0.006345914676785469, -0.028150685131549835, 0.13924768567085266, -0.053803253918886185, -0.13665056228637695, -0.03023041971027851, 0.15811696648597717, 0.02479265071451664, -0.11351540684700012, -0.11277355998754501, 0.0016996730118989944, 0.1693311333656311, -0.0019645756110548973, -0.007584595121443272, -0.009904063306748867, -0.0030730916187167168, 0.024124154821038246, -0.1230793297290802, -0.08302900195121765, -0.02286745235323906, -0.06280194967985153, 0.15275688469409943, 0.047940537333488464, 0.07110750675201416, -0.06045709177851677, 0.04197261482477188, -0.14955590665340424, -0.036801956593990326, -0.04978496953845024, -0.09940676391124725, 0.017188318073749542, 0.02796531654894352, -0.044329117983579636, -0.11630523204803467, -0.03652356192469597, 0.0725361704826355, 0.038227953016757965, 0.03685189411044121, -0.005693042650818825, 0.029456961899995804, 0.10580474138259888, 0.10501816868782043, -0.0562795028090477, 0.07449519634246826, 0.020974641665816307, -0.020636841654777527, 0.03971032053232193, -0.05628065764904022, -0.12330584228038788, 0.0744452103972435, -0.034096408635377884, 0.018313465639948845, 0.023749854415655136, 0.04198585823178291, -0.012982374057173729, -0.0767536610364914, 0.14133483171463013, -0.09305756539106369, 0.0004417812451720238, -0.0035654937382787466, 0.016869794577360153, 0.08157093822956085, 0.02621583268046379, 0.0021266003604978323, -0.059168532490730286, -0.03080003336071968, -0.06315429508686066, -0.027340907603502274, -0.06021827086806297, -0.13162744045257568, 0.0013580089434981346, -0.020953699946403503, -0.014699130319058895, -0.10742536187171936, -0.17884144186973572, -0.01402769424021244, 0.07123412191867828, -0.014155296608805656, 0.011412929743528366, -0.0021266068797558546, 0.012132527306675911, -0.004981525242328644, 0.032173626124858856, -0.03745890408754349, 0.00908223818987608, -0.012201073579490185, -0.06731266528367996, 0.039806246757507324, -0.12071730941534042, 0.04209677502512932, -0.05578748881816864, 0.011489223688840866, -0.19638846814632416, 0.10738702118396759, -0.02783583477139473, -0.04278886318206787, -0.04810495674610138, -0.05834455043077469, 
0.0188974030315876, 0.045517146587371826, -0.015527524054050446, 0.10550028085708618, -0.12357760965824127, -0.0512409433722496, 0.15865573287010193, -0.1566506326198578, 0.016810515895485878, 0.10513904690742493, -0.06748288869857788, 0.042335763573646545, 0.14426475763320923, 0.07841357588768005, 0.07015632092952728, -0.04069618880748749, 0.017828572541475296, 0.060336943715810776, -0.0458533950150013, 0.0799841359257698, 0.10583654791116714, -0.015437023714184761, -0.13057377934455872, 0.030710875988006592, -0.06833602488040924, -0.03600694239139557, -0.022659340873360634, -0.024447504431009293, 0.014145502820611, -0.052795182913541794, 0.05715940147638321, -0.010484781116247177, 0.006331292912364006, -0.0232611745595932, -0.07422537356615067, 0.07731874287128448, 0.07671873271465302, -0.08619971573352814, 0.018436623737215996, -0.0909656435251236, 0.03130660206079483, -0.06597552448511124, -0.005088436417281628, -0.14390107989311218, -0.04274594411253929, 0.031965915113687515, -0.0805630162358284, 0.09851419925689697, 0.11271710693836212, 0.008409101516008377, 0.11310183256864548, -0.04617488384246826, 0.02628052979707718, -0.012368079274892807, -0.006386269349604845, -0.044110074639320374, -0.14293555915355682, -0.06652771681547165, -0.06382939964532852, 0.0834670290350914, -0.059091683477163315, 0.020797124132514, -0.08205804973840714, -0.041816260665655136, -0.0250774584710598, -0.04668354615569115, 0.005325498059391975, 0.00811565201729536, -0.013542650267481804, -0.030526084825396538, 0.04050645977258682, 0.027077049016952515, -0.0918835997581482, 0.08847370743751526, -0.1236613318324089, -0.0576145313680172, 0.06846176087856293, -0.0069316960871219635, -0.04083865508437157, 0.09554298222064972, 0.011831864714622498, -0.01123481709510088, -0.057707928121089935, -0.04657518118619919, 0.22045092284679413, -0.020844273269176483, 0.08364406228065491, -0.11240328848361969, 0.004931592382490635, 0.03506753221154213, -0.06102532893419266, -0.05918964743614197, 0.07589934766292572, 0.038565460592508316, -0.2161455750465393, 0.024600330740213394, 0.07306224852800369, 0.061481211334466934, 0.1421050727367401, 0.02417578175663948, -0.02878376469016075, -0.06042608246207237, -0.017261460423469543, -0.012187670916318893, 0.05919060483574867, -0.04688645899295807, 0.0030246214009821415, 0.0510857030749321, 0.05463946610689163, 0.018327711150050163, -0.06600221991539001, 0.02497151307761669, 0.05208776146173477, -0.017216674983501434, -0.06310763210058212, -0.05255124717950821, -0.03947900980710983, 0.0736318975687027, 0.041184503585100174, 0.0495072677731514, 0.0537080317735672, -0.019612858071923256, -0.1381978541612625, 0.16529735922813416, -0.13489660620689392, -0.2240476906299591, -0.12759706377983093, -0.07904494553804398, -0.07838001847267151, 0.039492446929216385, 0.0373598076403141, -0.03468242287635803, -0.05113789439201355, -0.10579567402601242, 0.06591805815696716, -0.11658145487308502, -0.057194799184799194, 0.014129210263490677, -0.056258611381053925, -0.005652858875691891, -0.1268719583749771, -0.010539324954152107, -0.026957646012306213, -0.07912764698266983, 0.004068336449563503, -0.04539388418197632, 0.010077799670398235, 0.13516394793987274, 0.008290649391710758, -0.009709829464554787, -0.015056753531098366, 0.19663433730602264, 0.0314871110022068, 0.04356053099036217, 0.12803813815116882, -0.06543856859207153, 0.05768699571490288, 0.02060154639184475, 0.037481535226106644, -0.04913286864757538, -0.0007067807018756866, -0.027622418478131294, -0.11730992794036865, 
-0.207548126578331, -0.06663559377193451, 0.007457428611814976, 0.008368045091629028, 0.01904660277068615, 0.015689538791775703, 0.024972863495349884, 0.05414750799536705, -0.031031470745801926, 0.03179151564836502, 0.033982276916503906, 0.05688050761818886, 0.06225617602467537, -0.06120002269744873, 0.09507381916046143, -0.07100313901901245, 0.027307022362947464, 0.10875560343265533, -0.07062242925167084, 0.16170385479927063, 0.04285769164562225, 0.05423576757311821, 0.09659373760223389, 0.0006577670574188232, 0.0585428811609745, 0.10273323953151703, -0.06317441910505295, 0.019947808235883713, -0.07513642311096191, -0.05752627179026604, -0.04452991858124733, 0.060025766491889954, 0.037611961364746094, -0.000131998211145401, -0.10182826220989227, 0.03220826014876366, -0.036235980689525604, 0.07729616016149521, 0.06343917548656464, -0.10670174658298492, -0.10046673566102982, 0.045665811747312546, -0.04038289934396744, -0.08793723583221436, 0.03426353633403778, 0.08077984303236008, -0.14119762182235718, 0.06124391779303551, 0.018283551558852196, 0.07126335799694061, -0.09752818942070007, 0.01132874470204115, -0.06905651092529297, 0.016318362206220627, 0.005033754277974367, 0.0913831889629364, -0.1432204693555832, 0.10583388805389404, 0.02708813175559044, 0.04597454518079758, -0.09043684601783752, 0.01613154262304306, -0.01261853240430355, 0.07669144868850708, 0.12108297646045685, 0.04203776270151138, -0.05836430937051773, -0.018112843856215477, -0.06768153607845306, 0.034427788108587265, 0.07278922200202942, -0.04098799079656601, 0.038899462670087814, 0.0012810318730771542, 0.016169004142284393, -0.008310851640999317, 0.020610321313142776, -0.13600048422813416, -0.14560562372207642, 0.0705970749258995, -0.06633393466472626, -0.08288760483264923, -0.03709196671843529, -0.06633897125720978, -0.0868702232837677, 0.15359032154083252, -0.0773216113448143, -0.1108812615275383, -0.10497688502073288, 0.004697326570749283, 0.06842926889657974, -0.06570008397102356, 0.05184205248951912, -0.05175790935754776, 0.09120817482471466, -0.03778978809714317, -0.10993549227714539, 0.017024382948875427, -0.09169412404298782, -0.11230003088712692, -0.030281051993370056, 0.09025070071220398, 0.15063974261283875, 0.05137326568365097, 0.024738965556025505, 0.016462495550513268, 0.0016304273158311844, -0.12906411290168762, 0.004929570481181145, 0.143439382314682, 0.01773710548877716, 0.0976557806134224, -0.06279069185256958, -0.02821265161037445, -0.012585094198584557, -0.0009578559547662735, 0.13525930047035217, 0.1579957902431488, -0.06031216308474541, 0.15296214818954468, 0.227834090590477, -0.10105094313621521, -0.19415637850761414, -0.07397069036960602, 0.0032560182735323906, 0.04487091302871704, 0.045912403613328934, -0.19948574900627136, 0.09972882270812988, 0.04975741356611252, -0.013423530384898186, -0.03354128822684288, -0.18906579911708832, -0.1023210883140564, 0.1062556803226471, 0.06369950622320175, 0.19807088375091553, -0.06803785264492035, -0.04169449210166931, -0.04189038649201393, -0.05597612261772156, 0.09557583183050156, -0.011712346225976944, 0.0822327509522438, 0.01643332466483116, 0.014923296868801117, -0.0019287541508674622, -0.008046919479966164, 0.11012726277112961, 0.04542766511440277, 0.018416037783026695, -0.07320156693458557, -0.0423104427754879, 0.10889390110969543, -0.03202357143163681, 0.12254303693771362, 0.03122953698039055, 0.05849093571305275, -0.0764583870768547, -0.06015930324792862, -0.08313038945198059, 0.012603376060724258, -0.04008830338716507, -0.05228453874588013, 
-0.051481351256370544, 0.03643445670604706, 0.02559221349656582, 0.013383354060351849, -0.010037007741630077, -0.0581706240773201, 0.009901179000735283, 0.0659501925110817, 0.15930500626564026, -0.013111893087625504, -0.06732219457626343, -0.07006201148033142, -0.060269180685281754, 0.04847850278019905, -0.10283331573009491, 0.0321035273373127, 0.020586064085364342, -0.0036565132904797792, 0.11348927021026611, 0.03316955640912056, -0.11396678537130356, 0.013628019951283932, 0.005912423133850098, -0.09849600493907928, -0.1485224962234497, -0.016377072781324387, 0.05456313490867615, -0.0583408921957016, 0.03962210938334465, 0.1586087942123413, -0.02749052457511425, -0.033682480454444885, -0.05674935132265091, 0.032430585473775864, -0.034874096512794495, 0.03596019372344017, 0.08030854165554047, 0.016163216903805733, -0.08148041367530823, 0.06100435554981232, 0.04497561603784561, -0.01565445587038994, 0.06611718982458115, 0.01751827821135521, -0.07064318656921387, -0.08515681326389313, -0.06657058000564575, 0.11521587520837784, -0.04193677753210068, -0.06614658236503601, 0.0494990199804306, -0.10936599224805832, 0.06512928009033203, 0.09400998800992966, 0.03727183863520622, 0.046071093529462814, -0.08464010059833527, 0.006473809480667114, -0.037655625492334366, 0.03303447365760803, -0.03967699408531189, -0.03299032896757126, -0.04207788407802582, 0.02865336276590824, 0.0594131164252758, 0.09625885635614395, -0.03653799742460251, -0.07748300582170486, -0.08829360455274582, -0.013138281181454659, -0.10569687932729721, -0.006850461475551128, -0.06914658099412918, 0.00014194706454873085, 0.007000140380114317, -0.02822837233543396, 0.030307123437523842, 0.033606212586164474, -0.0512661337852478, -0.008813504129648209, -0.02892981842160225, 0.05861987918615341, -0.07071447372436523, 0.012725180014967918, 0.015199657529592514, -0.01911322958767414, 0.09222348034381866, 0.047224029898643494, -0.03322954475879669, 0.05148611217737198, -0.03994745388627052, 0.03518182411789894, -0.04691552743315697, 0.007639196235686541, -0.02100628986954689, -0.11349901556968689, -0.021261068060994148, 0.010819608345627785, -0.023444410413503647, 0.01614448055624962, 0.07291702181100845, -0.051247432827949524, 0.0827048048377037, 0.06047651544213295, -0.049000177532434464, -0.055763885378837585, 0.04004162549972534, 0.0009079426527023315, 0.017973260954022408, 0.0793890655040741, 0.0011681190226227045, 0.053140703588724136, -0.08328671008348465, 0.0013423850759863853, 0.0043635861948132515, -0.016782283782958984, -0.019065728411078453, -0.07158057391643524, -0.000623882282525301, 0.009545178152620792, 0.17526990175247192, -0.004971030168235302, -0.019934196025133133, 0.005758095532655716, 0.06719693541526794, 0.033424317836761475, 0.004426124505698681, 0.08463965356349945, -0.018342992290854454, -0.01793844997882843, -0.017587680369615555, 0.026691239327192307, -0.01080797053873539, 0.016537122428417206, 0.1315390020608902, 0.04961226135492325, 0.11255703866481781, 0.07479852437973022, 0.05499632656574249, 0.052345164120197296, -0.10784098505973816, -0.06925129890441895, 0.03605833277106285, 0.05536176264286041, -0.034931864589452744, 0.02555268630385399, 0.05937255546450615, -0.09513229876756668, 0.0820266455411911, 0.046595025807619095, -0.05803784728050232, -0.1295481026172638, -0.2191641926765442, -0.042123790830373764, -0.010218853130936623, -0.020777955651283264, -0.10785381495952606, 0.027329251170158386, 0.0930030569434166, 0.03945063054561615, -0.02234741672873497, 0.0657259151339531, -0.15022647380828857, 
-0.03686198964715004, 0.03966449946165085, -0.014821960590779781, 0.022462747991085052, 0.048782214522361755, 0.01900356635451317, 0.014281739480793476, 0.0744381994009018, 0.051359422504901886, 0.043146438896656036, 0.054591625928878784, 0.02954341098666191, -0.04896369203925133, -0.08800899237394333, -0.04467042535543442, 0.0032379510812461376, 0.058675315231084824, 0.12987293303012848, 0.010792074725031853, -0.06998851895332336, 0.0024203723296523094, 0.06055322289466858, -0.01847190037369728, -0.08398778736591339, -0.11259135603904724, 0.21841737627983093, -0.022776726633310318, 0.011702751740813255, -0.0013669170439243317, -0.03545460104942322, 0.020076904445886612, 0.20618940889835358, 0.26152077317237854, -0.02222667820751667, -0.01586262136697769, 0.010568449273705482, 0.0001846584491431713, 0.03695659339427948, 0.12577201426029205, -0.02777884714305401, 0.22359472513198853, -0.046777449548244476, 0.06737222522497177, -0.05537553131580353, -0.014299402013421059, -0.07450424134731293, 0.061424657702445984, -0.001578204333782196, -0.01836337149143219, -0.014155775308609009, 0.06984956562519073, -0.04071302339434624, -0.12650424242019653, -0.029551919549703598, 0.005514103919267654, -0.058359190821647644, 0.011046874336898327, 0.0020564431324601173, 0.03376493230462074, 0.07748642563819885, -0.01588892936706543, -0.0020990539342164993, 0.13050198554992676, 0.01098928228020668, -0.10912102460861206, -0.037600722163915634, 0.12838557362556458, 0.018519911915063858, 0.1340782791376114, 0.04876743629574776, 0.08712469041347504, 0.07130827009677887, 0.015149479731917381, -0.06677284836769104, 0.03636588156223297, -0.028407320380210876, 0.019770564511418343, 0.004539488349109888, 0.10587862133979797, -0.010519773699343204, 0.07475674152374268, 0.016607699915766716, -0.0808752030134201, 0.05683104693889618, 0.008673112839460373, -0.07627810537815094, -0.03255736455321312, 0.1042289137840271, -0.11158230900764465, 0.14271792769432068, 0.13774631917476654, -0.005030146799981594, -0.07176224142313004, -0.012138426303863525, 0.027100618928670883, -0.008060954511165619, 0.04774492606520653, -0.029893167316913605, -0.13074781000614166, 0.00018004095181822777, -0.09478544443845749, 0.04576292634010315, -0.24173954129219055, -0.06664414703845978, 0.016213994473218918, -0.000884735956788063, -0.028645452111959457, 0.030585195869207382, 0.061639197170734406, -0.0040400829166173935, -0.03497268259525299, 0.029452037066221237, -0.028589975088834763, 0.03562405705451965, -0.07439378648996353, -0.0681467354297638 ]
null
null
transformers
# MultiBERTs Seed 4 Checkpoint 0k (uncased) Seed 4 intermediate checkpoint 0k MultiBERTs (pretrained BERT) model on the English language using a masked language modeling (MLM) objective. It was introduced in [this paper](https://arxiv.org/pdf/2106.16163.pdf) and first released in [this repository](https://github.com/google-research/language/tree/master/language/multiberts). This is an intermediate checkpoint. The final checkpoint can be found at [multiberts-seed-4](https://hf.co/multiberts-seed-4). This model is uncased: it does not make a difference between english and English. Disclaimer: The team releasing MultiBERTs did not write a model card for this model so this model card has been written by [gchhablani](https://hf.co/gchhablani). ## Model description MultiBERTs models are transformer models pretrained on a large corpus of English data in a self-supervised fashion. This means it was pretrained on the raw texts only, with no humans labelling them in any way (which is why it can use lots of publicly available data) with an automatic process to generate inputs and labels from those texts. More precisely, it was pretrained with two objectives: - Masked language modeling (MLM): taking a sentence, the model randomly masks 15% of the words in the input, then runs the entire masked sentence through the model and has to predict the masked words. This is different from traditional recurrent neural networks (RNNs) that usually see the words one after the other, or from autoregressive models like GPT which internally mask the future tokens. It allows the model to learn a bidirectional representation of the sentence. - Next sentence prediction (NSP): the model concatenates two masked sentences as inputs during pretraining. Sometimes they correspond to sentences that were next to each other in the original text, sometimes not. The model then has to predict if the two sentences were following each other or not. This way, the model learns an inner representation of the English language that can then be used to extract features useful for downstream tasks: if you have a dataset of labeled sentences for instance, you can train a standard classifier using the features produced by the MultiBERTs model as inputs. ## Intended uses & limitations You can use the raw model for either masked language modeling or next sentence prediction, but it's mostly intended to be fine-tuned on a downstream task. See the [model hub](https://huggingface.co/models?filter=multiberts) to look for fine-tuned versions on a task that interests you. Note that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked) to make decisions, such as sequence classification, token classification or question answering. For tasks such as text generation you should look at models like GPT2. ### How to use Here is how to use this model to get the features of a given text in PyTorch: ```python from transformers import BertTokenizer, BertModel tokenizer = BertTokenizer.from_pretrained('multiberts-seed-4-0k') model = BertModel.from_pretrained("multiberts-seed-4-0k") text = "Replace me by any text you'd like." encoded_input = tokenizer(text, return_tensors='pt') output = model(**encoded_input) ``` ### Limitations and bias Even if the training data used for this model could be characterized as fairly neutral, this model can have biased predictions. This bias will also affect all fine-tuned versions of this model. 
For an understanding of bias of this particular checkpoint, please try out this checkpoint with the snippet present in the [Limitations and bias section](https://huggingface.co/bert-base-uncased#limitations-and-bias) of the [bert-base-uncased](https://huggingface.co/bert-base-uncased) checkpoint. ## Training data The MultiBERTs models were pretrained on [BookCorpus](https://yknzhu.wixsite.com/mbweb), a dataset consisting of 11,038 unpublished books and [English Wikipedia](https://en.wikipedia.org/wiki/English_Wikipedia) (excluding lists, tables and headers). ## Training procedure ### Preprocessing The texts are lowercased and tokenized using WordPiece and a vocabulary size of 30,000. The inputs of the model are then of the form: ``` [CLS] Sentence A [SEP] Sentence B [SEP] ``` With probability 0.5, sentence A and sentence B correspond to two consecutive sentences in the original corpus and in the other cases, it's another random sentence in the corpus. Note that what is considered a sentence here is a consecutive span of text usually longer than a single sentence. The only constraint is that the result with the two "sentences" has a combined length of less than 512 tokens. The details of the masking procedure for each sentence are the following: - 15% of the tokens are masked. - In 80% of the cases, the masked tokens are replaced by `[MASK]`. - In 10% of the cases, the masked tokens are replaced by a random token (different from the one they replace). - In the 10% remaining cases, the masked tokens are left as is. ### Pretraining The full model was trained on 16 Cloud TPU v2 chips for two million steps with a batch size of 256. The sequence length was set to 512 throughout. The optimizer used is Adam with a learning rate of 1e-4, \\(\beta_{1} = 0.9\\) and \\(\beta_{2} = 0.999\\), a weight decay of 0.01, learning rate warmup for 10,000 steps and linear decay of the learning rate after. ### BibTeX entry and citation info ```bibtex @article{DBLP:journals/corr/abs-2106-16163, author = {Thibault Sellam and Steve Yadlowsky and Jason Wei and Naomi Saphra and Alexander D'Amour and Tal Linzen and Jasmijn Bastings and Iulia Turc and Jacob Eisenstein and Dipanjan Das and Ian Tenney and Ellie Pavlick}, title = {The MultiBERTs: {BERT} Reproductions for Robustness Analysis}, journal = {CoRR}, volume = {abs/2106.16163}, year = {2021}, url = {https://arxiv.org/abs/2106.16163}, eprinttype = {arXiv}, eprint = {2106.16163}, timestamp = {Mon, 05 Jul 2021 15:15:50 +0200}, biburl = {https://dblp.org/rec/journals/corr/abs-2106-16163.bib}, bibsource = {dblp computer science bibliography, https://dblp.org} } ``` <a href="https://huggingface.co/exbert/?model=multiberts"> <img width="300px" src="https://cdn-media.huggingface.co/exbert/button.png"> </a>
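As a complement to the feature-extraction snippet in the card above, the sketch below shows how masked-token predictions could be obtained from the same checkpoint. This is a minimal illustration and not part of the original card: it reuses the checkpoint identifier from the card's snippet (`multiberts-seed-4-0k`) and the standard `BertForMaskedLM` head from `transformers`; since this is the 0k-step checkpoint, its predictions are expected to be close to random.

```python
import torch
from transformers import BertTokenizer, BertForMaskedLM

# Checkpoint identifier follows the card's own snippet; the fully qualified
# hub id may differ depending on where the checkpoint is hosted.
tokenizer = BertTokenizer.from_pretrained('multiberts-seed-4-0k')
model = BertForMaskedLM.from_pretrained('multiberts-seed-4-0k')

text = "The capital of France is [MASK]."
inputs = tokenizer(text, return_tensors='pt')

with torch.no_grad():
    logits = model(**inputs).logits

# Locate the [MASK] position and take the highest-scoring vocabulary entry for it.
mask_pos = (inputs['input_ids'][0] == tokenizer.mask_token_id).nonzero(as_tuple=True)[0]
predicted_id = logits[0, mask_pos].argmax(dim=-1)
print(tokenizer.decode(predicted_id))
```

Swapping in a later checkpoint of the same seed (for example `multiberts-seed-4-1000k`) should yield increasingly sensible predictions as pretraining progresses.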
{"language": "en", "license": "apache-2.0", "tags": ["exbert", "multiberts", "multiberts-seed-4"], "datasets": ["bookcorpus", "wikipedia"]}
null
MultiBertGunjanPatrick/multiberts-seed-4-0k
[ "transformers", "pytorch", "bert", "pretraining", "exbert", "multiberts", "multiberts-seed-4", "en", "dataset:bookcorpus", "dataset:wikipedia", "arxiv:2106.16163", "license:apache-2.0", "endpoints_compatible", "region:us" ]
2022-03-02T23:29:04+00:00
[ "2106.16163" ]
[ "en" ]
TAGS #transformers #pytorch #bert #pretraining #exbert #multiberts #multiberts-seed-4 #en #dataset-bookcorpus #dataset-wikipedia #arxiv-2106.16163 #license-apache-2.0 #endpoints_compatible #region-us
# MultiBERTs Seed 4 Checkpoint 0k (uncased) Seed 4 intermediate checkpoint 0k MultiBERTs (pretrained BERT) model on English language using a masked language modeling (MLM) objective. It was introduced in this paper and first released in this repository. This is an intermediate checkpoint. The final checkpoint can be found at multiberts-seed-4. This model is uncased: it does not make a difference between english and English. Disclaimer: The team releasing MultiBERTs did not write a model card for this model so this model card has been written by gchhablani. ## Model description MultiBERTs models are transformers model pretrained on a large corpus of English data in a self-supervised fashion. This means it was pretrained on the raw texts only, with no humans labelling them in any way (which is why it can use lots of publicly available data) with an automatic process to generate inputs and labels from those texts. More precisely, it was pretrained with two objectives: - Masked language modeling (MLM): taking a sentence, the model randomly masks 15% of the words in the input then run the entire masked sentence through the model and has to predict the masked words. This is different from traditional recurrent neural networks (RNNs) that usually see the words one after the other, or from autoregressive models like GPT which internally mask the future tokens. It allows the model to learn a bidirectional representation of the sentence. - Next sentence prediction (NSP): the models concatenates two masked sentences as inputs during pretraining. Sometimes they correspond to sentences that were next to each other in the original text, sometimes not. The model then has to predict if the two sentences were following each other or not. This way, the model learns an inner representation of the English language that can then be used to extract features useful for downstream tasks: if you have a dataset of labeled sentences for instance, you can train a standard classifier using the features produced by the MultiBERTs model as inputs. ## Intended uses & limitations You can use the raw model for either masked language modeling or next sentence prediction, but it's mostly intended to be fine-tuned on a downstream task. See the model hub to look for fine-tuned versions on a task that interests you. Note that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked) to make decisions, such as sequence classification, token classification or question answering. For tasks such as text generation you should look at model like GPT2. ### How to use Here is how to use this model to get the features of a given text in PyTorch: ### Limitations and bias Even if the training data used for this model could be characterized as fairly neutral, this model can have biased predictions. This bias will also affect all fine-tuned versions of this model. For an understanding of bias of this particular checkpoint, please try out this checkpoint with the snippet present in the Limitation and bias section of the bert-base-uncased checkpoint. ## Training data The MultiBERTs models were pretrained on BookCorpus, a dataset consisting of 11,038 unpublished books and English Wikipedia (excluding lists, tables and headers). ## Training procedure ### Preprocessing The texts are lowercased and tokenized using WordPiece and a vocabulary size of 30,000. 
The inputs of the model are then of the form: With probability 0.5, sentence A and sentence B correspond to two consecutive sentences in the original corpus and in the other cases, it's another random sentence in the corpus. Note that what is considered a sentence here is a consecutive span of text usually longer than a single sentence. The only constrain is that the result with the two "sentences" has a combined length of less than 512 tokens. The details of the masking procedure for each sentence are the following: - 15% of the tokens are masked. - In 80% of the cases, the masked tokens are replaced by '[MASK]'. - In 10% of the cases, the masked tokens are replaced by a random token (different) from the one they replace. - In the 10% remaining cases, the masked tokens are left as is. ### Pretraining The full model was trained on 16 Cloud TPU v2 chips for two million steps with a batch size of 256. The sequence length was set to 512 throughout. The optimizer used is Adam with a learning rate of 1e-4, \\(\beta_{1} = 0.9\\) and \\(\beta_{2} = 0.999\\), a weight decay of 0.01, learning rate warmup for 10,000 steps and linear decay of the learning rate after. ### BibTeX entry and citation info <a href="URL <img width="300px" src="URL </a>
[ "# MultiBERTs Seed 4 Checkpoint 0k (uncased)\nSeed 4 intermediate checkpoint 0k MultiBERTs (pretrained BERT) model on English language using a masked language modeling (MLM) objective. It was introduced in\nthis paper and first released in\nthis repository. This is an intermediate checkpoint.\nThe final checkpoint can be found at multiberts-seed-4. This model is uncased: it does not make a difference\nbetween english and English.\n\nDisclaimer: The team releasing MultiBERTs did not write a model card for this model so this model card has been written by gchhablani.", "## Model description\nMultiBERTs models are transformers model pretrained on a large corpus of English data in a self-supervised fashion. This means it\nwas pretrained on the raw texts only, with no humans labelling them in any way (which is why it can use lots of\npublicly available data) with an automatic process to generate inputs and labels from those texts. More precisely, it\nwas pretrained with two objectives:\n- Masked language modeling (MLM): taking a sentence, the model randomly masks 15% of the words in the input then run\n the entire masked sentence through the model and has to predict the masked words. This is different from traditional\n recurrent neural networks (RNNs) that usually see the words one after the other, or from autoregressive models like\n GPT which internally mask the future tokens. It allows the model to learn a bidirectional representation of the\n sentence.\n- Next sentence prediction (NSP): the models concatenates two masked sentences as inputs during pretraining. Sometimes\n they correspond to sentences that were next to each other in the original text, sometimes not. The model then has to\n predict if the two sentences were following each other or not.\nThis way, the model learns an inner representation of the English language that can then be used to extract features\nuseful for downstream tasks: if you have a dataset of labeled sentences for instance, you can train a standard\nclassifier using the features produced by the MultiBERTs model as inputs.", "## Intended uses & limitations\nYou can use the raw model for either masked language modeling or next sentence prediction, but it's mostly intended to\nbe fine-tuned on a downstream task. See the model hub to look for\nfine-tuned versions on a task that interests you.\nNote that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked)\nto make decisions, such as sequence classification, token classification or question answering. For tasks such as text\ngeneration you should look at model like GPT2.", "### How to use\nHere is how to use this model to get the features of a given text in PyTorch:", "### Limitations and bias\nEven if the training data used for this model could be characterized as fairly neutral, this model can have biased\npredictions. This bias will also affect all fine-tuned versions of this model. For an understanding of bias of this particular\ncheckpoint, please try out this checkpoint with the snippet present in the Limitation and bias section of the bert-base-uncased checkpoint.", "## Training data\nThe MultiBERTs models were pretrained on BookCorpus, a dataset consisting of 11,038\nunpublished books and English Wikipedia (excluding lists, tables and\nheaders).", "## Training procedure", "### Preprocessing\nThe texts are lowercased and tokenized using WordPiece and a vocabulary size of 30,000. 
The inputs of the model are\nthen of the form:\n\nWith probability 0.5, sentence A and sentence B correspond to two consecutive sentences in the original corpus and in\nthe other cases, it's another random sentence in the corpus. Note that what is considered a sentence here is a\nconsecutive span of text usually longer than a single sentence. The only constrain is that the result with the two\n\"sentences\" has a combined length of less than 512 tokens.\nThe details of the masking procedure for each sentence are the following:\n- 15% of the tokens are masked.\n- In 80% of the cases, the masked tokens are replaced by '[MASK]'.\n- In 10% of the cases, the masked tokens are replaced by a random token (different) from the one they replace.\n- In the 10% remaining cases, the masked tokens are left as is.", "### Pretraining\nThe full model was trained on 16 Cloud TPU v2 chips for two million steps with a batch size\nof 256. The sequence length was set to 512 throughout. The optimizer\nused is Adam with a learning rate of 1e-4, \\\\(\\beta_{1} = 0.9\\\\) and \\\\(\\beta_{2} = 0.999\\\\), a weight decay of 0.01,\nlearning rate warmup for 10,000 steps and linear decay of the learning rate after.", "### BibTeX entry and citation info\n\n<a href=\"URL\n\t<img width=\"300px\" src=\"URL\n</a>" ]
[ "TAGS\n#transformers #pytorch #bert #pretraining #exbert #multiberts #multiberts-seed-4 #en #dataset-bookcorpus #dataset-wikipedia #arxiv-2106.16163 #license-apache-2.0 #endpoints_compatible #region-us \n", "# MultiBERTs Seed 4 Checkpoint 0k (uncased)\nSeed 4 intermediate checkpoint 0k MultiBERTs (pretrained BERT) model on English language using a masked language modeling (MLM) objective. It was introduced in\nthis paper and first released in\nthis repository. This is an intermediate checkpoint.\nThe final checkpoint can be found at multiberts-seed-4. This model is uncased: it does not make a difference\nbetween english and English.\n\nDisclaimer: The team releasing MultiBERTs did not write a model card for this model so this model card has been written by gchhablani.", "## Model description\nMultiBERTs models are transformers model pretrained on a large corpus of English data in a self-supervised fashion. This means it\nwas pretrained on the raw texts only, with no humans labelling them in any way (which is why it can use lots of\npublicly available data) with an automatic process to generate inputs and labels from those texts. More precisely, it\nwas pretrained with two objectives:\n- Masked language modeling (MLM): taking a sentence, the model randomly masks 15% of the words in the input then run\n the entire masked sentence through the model and has to predict the masked words. This is different from traditional\n recurrent neural networks (RNNs) that usually see the words one after the other, or from autoregressive models like\n GPT which internally mask the future tokens. It allows the model to learn a bidirectional representation of the\n sentence.\n- Next sentence prediction (NSP): the models concatenates two masked sentences as inputs during pretraining. Sometimes\n they correspond to sentences that were next to each other in the original text, sometimes not. The model then has to\n predict if the two sentences were following each other or not.\nThis way, the model learns an inner representation of the English language that can then be used to extract features\nuseful for downstream tasks: if you have a dataset of labeled sentences for instance, you can train a standard\nclassifier using the features produced by the MultiBERTs model as inputs.", "## Intended uses & limitations\nYou can use the raw model for either masked language modeling or next sentence prediction, but it's mostly intended to\nbe fine-tuned on a downstream task. See the model hub to look for\nfine-tuned versions on a task that interests you.\nNote that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked)\nto make decisions, such as sequence classification, token classification or question answering. For tasks such as text\ngeneration you should look at model like GPT2.", "### How to use\nHere is how to use this model to get the features of a given text in PyTorch:", "### Limitations and bias\nEven if the training data used for this model could be characterized as fairly neutral, this model can have biased\npredictions. This bias will also affect all fine-tuned versions of this model. 
For an understanding of bias of this particular\ncheckpoint, please try out this checkpoint with the snippet present in the Limitation and bias section of the bert-base-uncased checkpoint.", "## Training data\nThe MultiBERTs models were pretrained on BookCorpus, a dataset consisting of 11,038\nunpublished books and English Wikipedia (excluding lists, tables and\nheaders).", "## Training procedure", "### Preprocessing\nThe texts are lowercased and tokenized using WordPiece and a vocabulary size of 30,000. The inputs of the model are\nthen of the form:\n\nWith probability 0.5, sentence A and sentence B correspond to two consecutive sentences in the original corpus and in\nthe other cases, it's another random sentence in the corpus. Note that what is considered a sentence here is a\nconsecutive span of text usually longer than a single sentence. The only constrain is that the result with the two\n\"sentences\" has a combined length of less than 512 tokens.\nThe details of the masking procedure for each sentence are the following:\n- 15% of the tokens are masked.\n- In 80% of the cases, the masked tokens are replaced by '[MASK]'.\n- In 10% of the cases, the masked tokens are replaced by a random token (different) from the one they replace.\n- In the 10% remaining cases, the masked tokens are left as is.", "### Pretraining\nThe full model was trained on 16 Cloud TPU v2 chips for two million steps with a batch size\nof 256. The sequence length was set to 512 throughout. The optimizer\nused is Adam with a learning rate of 1e-4, \\\\(\\beta_{1} = 0.9\\\\) and \\\\(\\beta_{2} = 0.999\\\\), a weight decay of 0.01,\nlearning rate warmup for 10,000 steps and linear decay of the learning rate after.", "### BibTeX entry and citation info\n\n<a href=\"URL\n\t<img width=\"300px\" src=\"URL\n</a>" ]
[ 71, 145, 335, 134, 25, 95, 48, 3, 222, 111, 34 ]
[ "passage: TAGS\n#transformers #pytorch #bert #pretraining #exbert #multiberts #multiberts-seed-4 #en #dataset-bookcorpus #dataset-wikipedia #arxiv-2106.16163 #license-apache-2.0 #endpoints_compatible #region-us \n# MultiBERTs Seed 4 Checkpoint 0k (uncased)\nSeed 4 intermediate checkpoint 0k MultiBERTs (pretrained BERT) model on English language using a masked language modeling (MLM) objective. It was introduced in\nthis paper and first released in\nthis repository. This is an intermediate checkpoint.\nThe final checkpoint can be found at multiberts-seed-4. This model is uncased: it does not make a difference\nbetween english and English.\n\nDisclaimer: The team releasing MultiBERTs did not write a model card for this model so this model card has been written by gchhablani.", "passage: ## Model description\nMultiBERTs models are transformers model pretrained on a large corpus of English data in a self-supervised fashion. This means it\nwas pretrained on the raw texts only, with no humans labelling them in any way (which is why it can use lots of\npublicly available data) with an automatic process to generate inputs and labels from those texts. More precisely, it\nwas pretrained with two objectives:\n- Masked language modeling (MLM): taking a sentence, the model randomly masks 15% of the words in the input then run\n the entire masked sentence through the model and has to predict the masked words. This is different from traditional\n recurrent neural networks (RNNs) that usually see the words one after the other, or from autoregressive models like\n GPT which internally mask the future tokens. It allows the model to learn a bidirectional representation of the\n sentence.\n- Next sentence prediction (NSP): the models concatenates two masked sentences as inputs during pretraining. Sometimes\n they correspond to sentences that were next to each other in the original text, sometimes not. The model then has to\n predict if the two sentences were following each other or not.\nThis way, the model learns an inner representation of the English language that can then be used to extract features\nuseful for downstream tasks: if you have a dataset of labeled sentences for instance, you can train a standard\nclassifier using the features produced by the MultiBERTs model as inputs.## Intended uses & limitations\nYou can use the raw model for either masked language modeling or next sentence prediction, but it's mostly intended to\nbe fine-tuned on a downstream task. See the model hub to look for\nfine-tuned versions on a task that interests you.\nNote that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked)\nto make decisions, such as sequence classification, token classification or question answering. For tasks such as text\ngeneration you should look at model like GPT2.### How to use\nHere is how to use this model to get the features of a given text in PyTorch:### Limitations and bias\nEven if the training data used for this model could be characterized as fairly neutral, this model can have biased\npredictions. This bias will also affect all fine-tuned versions of this model. 
For an understanding of bias of this particular\ncheckpoint, please try out this checkpoint with the snippet present in the Limitation and bias section of the bert-base-uncased checkpoint.## Training data\nThe MultiBERTs models were pretrained on BookCorpus, a dataset consisting of 11,038\nunpublished books and English Wikipedia (excluding lists, tables and\nheaders).## Training procedure" ]
[ -0.08583009988069534, 0.0024968422949314117, -0.002280358225107193, 0.06972841918468475, 0.08494341373443604, 0.003432560246437788, 0.11763110011816025, 0.05066392570734024, -0.030043646693229675, 0.025402946397662163, 0.0929570347070694, 0.03246824070811272, 0.04051504656672478, 0.06386493891477585, 0.09426029771566391, -0.25561341643333435, 0.05084068700671196, -0.061410315334796906, 0.057235755026340485, 0.07439006865024567, 0.09788890182971954, -0.07303821295499802, 0.06258408725261688, 0.03352736681699753, -0.08412591367959976, -0.018051665276288986, -0.016078541055321693, -0.03351776674389839, 0.0989726185798645, 0.07005434483289719, 0.060401156544685364, 0.0029522720724344254, 0.05481862649321556, -0.08722589910030365, 0.016483843326568604, 0.04434656351804733, -0.0009333193302154541, 0.024794282391667366, -0.0076481010764837265, 0.012982381507754326, 0.10569870471954346, 0.03611141815781593, 0.07604870200157166, 0.035155944526195526, -0.0954185426235199, -0.1210893988609314, -0.0786626785993576, 0.10166610032320023, 0.05103756859898567, 0.04391004145145416, -0.007228889502584934, 0.07421495020389557, -0.03125672787427902, 0.07374000549316406, 0.1035519540309906, -0.2609347403049469, -0.00865844264626503, 0.06703434884548187, 0.045889198780059814, 0.045411549508571625, 0.01035517081618309, 0.029172057285904884, 0.0054595619440078735, 0.04461156576871872, 0.029781952500343323, -0.02434820681810379, 0.12228474020957947, -0.04435713589191437, -0.14860473573207855, -0.042399801313877106, 0.12381404638290405, -0.005762744694948196, -0.12568676471710205, -0.10276580601930618, -0.027873490005731583, 0.11736482381820679, -0.003880692645907402, -0.014404794201254845, -0.00289501016959548, 0.010419592261314392, 0.022883666679263115, -0.09195564687252045, -0.08630505204200745, -0.025300361216068268, -0.03400779888033867, 0.12812507152557373, 0.046707987785339355, 0.05199403688311577, -0.033512163907289505, 0.08510389924049377, -0.11701811105012894, -0.04088133946061134, -0.0533779114484787, -0.082497239112854, -0.017825987190008163, 0.011917068623006344, -0.03008969873189926, -0.07922467589378357, -0.057103779166936874, 0.11914916336536407, 0.03425713628530502, 0.03179018199443817, -0.0010883542709052563, 0.03968498483300209, 0.07324521988630295, 0.09419315308332443, -0.038015253841876984, 0.05089093744754791, 0.03478018566966057, -0.021641451865434647, 0.05755464360117912, -0.05284348875284195, -0.10133945941925049, 0.07841164618730545, -0.000377018004655838, 0.03852987661957741, 0.02572767063975334, 0.035218220204114914, -0.010758505202829838, -0.07279462367296219, 0.16496925055980682, -0.07817666232585907, -0.007016417570412159, -0.013011776842176914, 0.0111265629529953, 0.05004039406776428, 0.030904389917850494, -0.008923767134547234, -0.05008222907781601, -0.009308643639087677, -0.05647721514105797, -0.0267715435475111, -0.05441293492913246, -0.12033115327358246, -0.0003926618956029415, -0.039258167147636414, -0.03129878267645836, -0.13984264433383942, -0.21104393899440765, -0.02048569917678833, 0.06570236384868622, -0.004170589614659548, -0.009959589689970016, 0.023758314549922943, 0.01229199580848217, -0.020690346136689186, 0.010836801491677761, -0.044983524829149246, -0.0011208532378077507, -0.0064375512301921844, -0.030223151668906212, 0.05616515129804611, -0.04309021681547165, 0.023376919329166412, -0.06953627616167068, 0.020716242492198944, -0.21244807541370392, 0.0896337702870369, -0.033384162932634354, -0.00007862783968448639, -0.038336701691150665, -0.04573681205511093, 
0.012915540486574173, 0.046419017016887665, -0.010029060766100883, 0.11723772436380386, -0.13529232144355774, -0.05155844986438751, 0.18059270083904266, -0.15856993198394775, -0.0008340217173099518, 0.10002025216817856, -0.048638809472322464, 0.05191807821393013, 0.1341620683670044, 0.10128294676542282, 0.08408650010824203, -0.07019923627376556, 0.007123477756977081, 0.059858761727809906, -0.06506168097257614, 0.05601492524147034, 0.08934114873409271, -0.023713968694210052, -0.13550014793872833, 0.031055234372615814, -0.07258136570453644, -0.013721657916903496, -0.025262106209993362, -0.020634667947888374, 0.006731132045388222, -0.03643517941236496, 0.030893605202436447, 0.006420977413654327, 0.01638616994023323, -0.04010431468486786, -0.08468036353588104, 0.02921735867857933, 0.07475730031728745, -0.0751410648226738, 0.042003728449344635, -0.07090748846530914, 0.06706375628709793, -0.07888559997081757, -0.0041445461101830006, -0.1662488877773285, -0.028640320524573326, 0.045046500861644745, -0.046255942434072495, 0.05253373831510544, 0.0937163457274437, 0.003700329689309001, 0.12278302013874054, -0.03959139436483383, 0.0016733710654079914, -0.006753686815500259, -0.010365931317210197, -0.050168536603450775, -0.12503303587436676, -0.08318692445755005, -0.06805400550365448, 0.10335409641265869, -0.08013726770877838, 0.02790497988462448, -0.06815817952156067, -0.02163677290081978, -0.009513290598988533, -0.05928875878453255, -0.0037529990077018738, 0.011546120047569275, -0.027072997763752937, -0.04644174128770828, 0.05065038800239563, 0.049537401646375656, -0.0609336718916893, 0.07490766048431396, -0.10843557119369507, -0.05921829119324684, 0.05768483877182007, 0.01311891246587038, -0.07755953073501587, 0.09203729778528214, -0.018742404878139496, -0.013385976664721966, -0.05395601689815521, -0.040659647434949875, 0.19642095267772675, -0.02328575775027275, 0.10083615779876709, -0.09108757972717285, 0.0015562420012429357, 0.029608838260173798, -0.04571933671832085, -0.016145331785082817, 0.06021781265735626, 0.04677831381559372, -0.18201029300689697, 0.01457209512591362, 0.05061939358711243, 0.07750852406024933, 0.11175644397735596, 0.026441233232617378, -0.024643272161483765, -0.046334631741046906, -0.010739361867308617, 0.005327500402927399, 0.05845489352941513, -0.028147578239440918, -0.007944709621369839, 0.0337577760219574, 0.057752951979637146, 0.018893783912062645, -0.0806434154510498, 0.03682958707213402, 0.06454655528068542, -0.017444457858800888, -0.04102114960551262, -0.025119148194789886, -0.05935769900679588, 0.061717256903648376, 0.05343156307935715, 0.035660453140735626, 0.02700243890285492, -0.01383036095649004, -0.13649839162826538, 0.18845567107200623, -0.11523333936929703, -0.2559252083301544, -0.10698933899402618, -0.057350173592567444, -0.029717853292822838, 0.03977155312895775, 0.05825062841176987, -0.029251812025904655, -0.04328024014830589, -0.11665569990873337, 0.056207481771707535, -0.06838113814592361, -0.03165137395262718, -0.01272750273346901, -0.05332769453525543, -0.017950717359781265, -0.12608212232589722, -0.012395208701491356, -0.02879515290260315, -0.07727400958538055, 0.00918172299861908, -0.0336480438709259, 0.027233239263296127, 0.13615204393863678, 0.035458676517009735, -0.01904972828924656, -0.018023785203695297, 0.19407641887664795, 0.01052788458764553, 0.06053024157881737, 0.11347444355487823, -0.02831747569143772, 0.05245310813188553, 0.04620170220732689, 0.025521472096443176, -0.04930264502763748, 0.009086678735911846, -0.014438307844102383, 
-0.11930664628744125, -0.1747739315032959, -0.07055733352899551, -0.004874871578067541, 0.006887249648571014, 0.01867470145225525, 0.0362522266805172, 0.024100597947835922, 0.0392630472779274, -0.028793394565582275, 0.02804931253194809, -0.01026088371872902, 0.08177611231803894, 0.03160949796438217, -0.07402757555246353, 0.09441475570201874, -0.06199447065591812, 0.02002405747771263, 0.1092451885342598, -0.059138987213373184, 0.18932834267616272, 0.02432074397802353, 0.05901148542761803, 0.10335005819797516, 0.02089696004986763, 0.0545135959982872, 0.08612117916345596, -0.04724399745464325, 0.0058042556047439575, -0.060456834733486176, -0.052905578166246414, -0.03450758010149002, 0.04831947013735771, 0.03110685385763645, 0.016321729868650436, -0.12237972021102905, 0.01525205746293068, -0.0013724996242672205, 0.13460564613342285, 0.052771080285310745, -0.12423278391361237, -0.1251239776611328, 0.034363798797130585, -0.04432559758424759, -0.06208992004394531, 0.029107604175806046, 0.05916469171643257, -0.15203902125358582, 0.04687824472784996, -0.005366249941289425, 0.06586951017379761, -0.09266010671854019, 0.016387518495321274, -0.04582204669713974, -0.00007756054401397705, 0.005744350608438253, 0.07107862085103989, -0.1350628286600113, 0.10197584331035614, 0.021054478362202644, 0.05128907784819603, -0.08025246113538742, 0.01534704677760601, -0.012169169262051582, 0.10957828164100647, 0.11453105509281158, 0.043285075575113297, -0.055375777184963226, -0.017594071105122566, -0.04603616148233414, 0.02431461401283741, 0.05941825732588768, -0.07703576982021332, 0.06309191882610321, 0.00919754896312952, 0.006531731225550175, -0.024130254983901978, 0.018947117030620575, -0.13294830918312073, -0.12360112369060516, 0.06299101561307907, -0.07453715801239014, -0.09784765541553497, -0.05545752868056297, -0.06265214830636978, -0.05040190368890762, 0.21565578877925873, -0.11314502358436584, -0.09120626747608185, -0.09975332766771317, -0.015032533556222916, 0.047046273946762085, -0.06449389457702637, 0.04588259011507034, -0.039055634289979935, 0.09174750000238419, -0.04450637847185135, -0.10996031761169434, 0.033662158995866776, -0.11235576122999191, -0.11620207130908966, -0.041774116456508636, 0.10677288472652435, 0.11404487490653992, 0.037239160388708115, 0.012957770377397537, 0.013174939900636673, -0.001416277140378952, -0.11758995801210403, 0.017001045867800713, 0.13361075520515442, 0.004066655412316322, 0.07100875675678253, -0.06032826006412506, 0.026642251759767532, -0.015416936948895454, 0.001536959782242775, 0.12935678660869598, 0.18522882461547852, -0.06278273463249207, 0.17512276768684387, 0.20072494447231293, -0.10458427667617798, -0.18935881555080414, -0.054790280759334564, -0.0037529626861214638, 0.04482454061508179, 0.045266564935445786, -0.18748344480991364, 0.09021341800689697, 0.03046848438680172, -0.029832053929567337, 0.02194126322865486, -0.23188555240631104, -0.10957054793834686, 0.08533063530921936, 0.056936487555503845, 0.1923338770866394, -0.08172313123941422, -0.039852261543273926, -0.01607890985906124, -0.0380636528134346, 0.04739275574684143, -0.041992977261543274, 0.09127160906791687, 0.005765702575445175, -0.028041694313287735, 0.001023084856569767, -0.030460696667432785, 0.09769348800182343, 0.041703417897224426, 0.021937226876616478, -0.06836797297000885, -0.004115480929613113, 0.10882076621055603, -0.03955640643835068, 0.09793109446763992, 0.03883000463247299, 0.0739208310842514, -0.10058233886957169, -0.05891919508576393, -0.0778980627655983, 0.04292374849319458, 
-0.042373161762952805, -0.052695076912641525, -0.06264326721429825, 0.05705918371677399, 0.03682272136211395, 0.01253941934555769, -0.0015568304806947708, -0.041069019585847855, 0.04642077535390854, 0.08991631865501404, 0.0840289369225502, -0.03348630666732788, -0.07635164260864258, -0.051262155175209045, -0.05019713565707207, 0.06736958026885986, -0.09431237727403641, 0.0184917189180851, 0.02919340319931507, 0.009309486486017704, 0.08992108702659607, 0.03520979359745979, -0.13634361326694489, 0.01151360385119915, 0.032300859689712524, -0.12429078668355942, -0.10857090353965759, -0.017702510580420494, 0.021186713129281998, -0.036881037056446075, 0.05671999976038933, 0.15108296275138855, -0.03613879159092903, -0.03211984783411026, -0.048303984105587006, 0.035814281553030014, -0.02233550325036049, 0.04951276630163193, 0.06300632655620575, 0.030980469658970833, -0.07443682849407196, 0.076125867664814, 0.038520388305187225, -0.03968827798962593, 0.044291120022535324, 0.04139697551727295, -0.0966954380273819, -0.08172353357076645, -0.06016671285033226, 0.09238198399543762, -0.018902244046330452, -0.048707515001297, -0.0017561577260494232, -0.08226408064365387, 0.06933452188968658, 0.0725959911942482, 0.04592696949839592, 0.0366671159863472, -0.08643613755702972, 0.015084514394402504, -0.05617909133434296, 0.035541072487831116, -0.0302435290068388, -0.00540524534881115, -0.05648638308048248, 0.06615270674228668, 0.06382957100868225, 0.09677144885063171, -0.03415291756391525, -0.07573065161705017, -0.08609703183174133, -0.014082692563533783, -0.06556687504053116, -0.03117998316884041, -0.07536547631025314, -0.006948062218725681, 0.0019597129430621862, -0.0018957611173391342, 0.023060588166117668, 0.03613435477018356, -0.04376755654811859, -0.018930818885564804, -0.03389656916260719, 0.03982039541006088, -0.06417305767536163, 0.006656919606029987, 0.012579272501170635, -0.03819725662469864, 0.09479829668998718, 0.039176907390356064, -0.014696079306304455, 0.04578786715865135, -0.0215293075889349, 0.033215973526239395, -0.02116316370666027, -0.00045576784759759903, -0.02375815063714981, -0.11157530546188354, -0.00564698688685894, 0.005469471216201782, -0.024135950952768326, 0.011371832340955734, 0.06030210107564926, -0.07103244960308075, 0.08439572155475616, 0.045819900929927826, -0.029817838221788406, -0.07056087255477905, 0.04243668168783188, -0.016201984137296677, 0.02855745516717434, 0.06900693476200104, -0.03441966325044632, 0.053814586251974106, -0.09930627048015594, -0.028891671448946, 0.003077121451497078, -0.006564103066921234, -0.011301020160317421, -0.053537603467702866, -0.0012463368475437164, 0.004621749743819237, 0.1717594414949417, -0.02379613369703293, 0.03692037612199783, 0.012536571361124516, 0.005548994988203049, 0.046528302133083344, -0.010339850559830666, 0.07352566719055176, -0.006396599113941193, -0.02538902685046196, -0.012536055408418179, 0.037202946841716766, 0.0026401784271001816, 0.007785473018884659, 0.13920116424560547, 0.04744626209139824, 0.09216534346342087, 0.07653069496154785, 0.014970270916819572, 0.016124214977025986, -0.13615615665912628, -0.08375485241413116, 0.007932082749903202, 0.06392250955104828, -0.019213365390896797, 0.012460405007004738, 0.09219475835561752, -0.08747070282697678, 0.07108433544635773, 0.04804326593875885, -0.049607694149017334, -0.12492680549621582, -0.1955639123916626, -0.02601378969848156, -0.03066405840218067, -0.012073085643351078, -0.09149432927370071, 0.015711655840277672, 0.09058457612991333, 0.025356950238347054, 
-0.01118597760796547, 0.09211936593055725, -0.10176786780357361, -0.02919798344373703, 0.04423367977142334, -0.027728106826543808, 0.013591806404292583, 0.051809508353471756, 0.026199249550700188, -0.004318078979849815, 0.04225485771894455, 0.040886037051677704, 0.045260459184646606, 0.028763096779584885, 0.05305345356464386, -0.02425723895430565, -0.07566326856613159, -0.03184213489294052, -0.0011067935265600681, 0.05450185760855675, 0.13786883652210236, 0.023203169927001, -0.07222184538841248, 0.006522655487060547, 0.10906386375427246, -0.03256780654191971, -0.049964576959609985, -0.10858580470085144, 0.24636223912239075, 0.018435951322317123, -0.0002626837231218815, -0.004980903118848801, -0.0449543297290802, 0.005785537883639336, 0.21225781738758087, 0.2256810963153839, 0.003075521904975176, -0.00949881412088871, 0.012288110330700874, -0.011807480826973915, 0.037222810089588165, 0.1469975858926773, 0.002043396234512329, 0.2536199688911438, -0.04837116599082947, 0.03863418102264404, -0.04153576120734215, -0.03824487328529358, -0.10190373659133911, 0.07222182303667068, -0.00976092554628849, 0.004951454699039459, -0.03082115389406681, 0.07012545317411423, -0.041034359484910965, -0.1725473403930664, 0.00626214686781168, 0.00015700189396739006, -0.06288035213947296, 0.009095396846532822, -0.0010065780952572823, 0.019757121801376343, 0.08447378128767014, -0.019164390861988068, -0.007988679222762585, 0.13706445693969727, 0.017046663910150528, -0.09561128169298172, -0.0550650954246521, 0.11445078253746033, 0.015415547415614128, 0.13844266533851624, 0.011666855774819851, 0.08175616711378098, 0.08877111971378326, 0.022339165210723877, -0.09609325230121613, 0.04153329133987427, -0.02006419375538826, -0.029954230412840843, 0.00851504411548376, 0.11101154237985611, -0.00835445523262024, 0.0616777129471302, 0.027822455391287804, -0.08813208341598511, 0.06172271445393562, 0.013050228357315063, -0.037053607404232025, -0.07743149250745773, 0.08315502107143402, -0.09179313480854034, 0.15433508157730103, 0.12243130803108215, -0.0149527033790946, -0.047047048807144165, -0.02836688607931137, 0.02166520245373249, -0.0003392365761101246, 0.05464290827512741, -0.025903373956680298, -0.13400858640670776, 0.020228052511811256, -0.08223185688257217, 0.02780570462346077, -0.25229611992836, -0.08675047755241394, 0.031224247068166733, -0.01668536104261875, -0.020618855953216553, 0.05152556672692299, 0.04379649832844734, 0.025105340406298637, -0.038300253450870514, 0.020431682467460632, -0.03828834742307663, 0.05948762595653534, -0.10993394255638123, -0.09355085343122482 ]
null
null
transformers
# MultiBERTs Seed 4 Checkpoint 1000k (uncased) Seed 4 intermediate checkpoint 1000k MultiBERTs (pretrained BERT) model on the English language using a masked language modeling (MLM) objective. It was introduced in [this paper](https://arxiv.org/pdf/2106.16163.pdf) and first released in [this repository](https://github.com/google-research/language/tree/master/language/multiberts). This is an intermediate checkpoint. The final checkpoint can be found at [multiberts-seed-4](https://hf.co/multiberts-seed-4). This model is uncased: it does not make a difference between english and English. Disclaimer: The team releasing MultiBERTs did not write a model card for this model so this model card has been written by [gchhablani](https://hf.co/gchhablani). ## Model description MultiBERTs models are transformer models pretrained on a large corpus of English data in a self-supervised fashion. This means it was pretrained on the raw texts only, with no humans labelling them in any way (which is why it can use lots of publicly available data) with an automatic process to generate inputs and labels from those texts. More precisely, it was pretrained with two objectives: - Masked language modeling (MLM): taking a sentence, the model randomly masks 15% of the words in the input, then runs the entire masked sentence through the model and has to predict the masked words. This is different from traditional recurrent neural networks (RNNs) that usually see the words one after the other, or from autoregressive models like GPT which internally mask the future tokens. It allows the model to learn a bidirectional representation of the sentence. - Next sentence prediction (NSP): the model concatenates two masked sentences as inputs during pretraining. Sometimes they correspond to sentences that were next to each other in the original text, sometimes not. The model then has to predict if the two sentences were following each other or not. This way, the model learns an inner representation of the English language that can then be used to extract features useful for downstream tasks: if you have a dataset of labeled sentences for instance, you can train a standard classifier using the features produced by the MultiBERTs model as inputs. ## Intended uses & limitations You can use the raw model for either masked language modeling or next sentence prediction, but it's mostly intended to be fine-tuned on a downstream task. See the [model hub](https://huggingface.co/models?filter=multiberts) to look for fine-tuned versions on a task that interests you. Note that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked) to make decisions, such as sequence classification, token classification or question answering. For tasks such as text generation you should look at models like GPT2. ### How to use Here is how to use this model to get the features of a given text in PyTorch: ```python from transformers import BertTokenizer, BertModel tokenizer = BertTokenizer.from_pretrained('multiberts-seed-4-1000k') model = BertModel.from_pretrained("multiberts-seed-4-1000k") text = "Replace me by any text you'd like." encoded_input = tokenizer(text, return_tensors='pt') output = model(**encoded_input) ``` ### Limitations and bias Even if the training data used for this model could be characterized as fairly neutral, this model can have biased predictions. This bias will also affect all fine-tuned versions of this model. 
For an understanding of bias of this particular checkpoint, please try out this checkpoint with the snippet present in the [Limitations and bias section](https://huggingface.co/bert-base-uncased#limitations-and-bias) of the [bert-base-uncased](https://huggingface.co/bert-base-uncased) checkpoint. ## Training data The MultiBERTs models were pretrained on [BookCorpus](https://yknzhu.wixsite.com/mbweb), a dataset consisting of 11,038 unpublished books and [English Wikipedia](https://en.wikipedia.org/wiki/English_Wikipedia) (excluding lists, tables and headers). ## Training procedure ### Preprocessing The texts are lowercased and tokenized using WordPiece and a vocabulary size of 30,000. The inputs of the model are then of the form: ``` [CLS] Sentence A [SEP] Sentence B [SEP] ``` With probability 0.5, sentence A and sentence B correspond to two consecutive sentences in the original corpus and in the other cases, it's another random sentence in the corpus. Note that what is considered a sentence here is a consecutive span of text usually longer than a single sentence. The only constraint is that the result with the two "sentences" has a combined length of less than 512 tokens. The details of the masking procedure for each sentence are the following: - 15% of the tokens are masked. - In 80% of the cases, the masked tokens are replaced by `[MASK]`. - In 10% of the cases, the masked tokens are replaced by a random token (different from the one they replace). - In the 10% remaining cases, the masked tokens are left as is. ### Pretraining The full model was trained on 16 Cloud TPU v2 chips for two million steps with a batch size of 256. The sequence length was set to 512 throughout. The optimizer used is Adam with a learning rate of 1e-4, \\(\beta_{1} = 0.9\\) and \\(\beta_{2} = 0.999\\), a weight decay of 0.01, learning rate warmup for 10,000 steps and linear decay of the learning rate after. ### BibTeX entry and citation info ```bibtex @article{DBLP:journals/corr/abs-2106-16163, author = {Thibault Sellam and Steve Yadlowsky and Jason Wei and Naomi Saphra and Alexander D'Amour and Tal Linzen and Jasmijn Bastings and Iulia Turc and Jacob Eisenstein and Dipanjan Das and Ian Tenney and Ellie Pavlick}, title = {The MultiBERTs: {BERT} Reproductions for Robustness Analysis}, journal = {CoRR}, volume = {abs/2106.16163}, year = {2021}, url = {https://arxiv.org/abs/2106.16163}, eprinttype = {arXiv}, eprint = {2106.16163}, timestamp = {Mon, 05 Jul 2021 15:15:50 +0200}, biburl = {https://dblp.org/rec/journals/corr/abs-2106-16163.bib}, bibsource = {dblp computer science bibliography, https://dblp.org} } ``` <a href="https://huggingface.co/exbert/?model=multiberts"> <img width="300px" src="https://cdn-media.huggingface.co/exbert/button.png"> </a>
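To make the masking procedure described in the Preprocessing section concrete, here is a minimal Python sketch of the 15% selection and 80/10/10 replacement scheme. The function name, toy vocabulary, and use of Python's `random` module are illustrative assumptions; this is not the original preprocessing code, which operates on WordPiece tokens over the full pretraining corpus.

```python
import random

def mask_tokens(tokens, vocab, mask_prob=0.15):
    """Sketch of the masking scheme described above: each token is selected
    with 15% probability; a selected token becomes [MASK] 80% of the time,
    a random vocabulary token 10% of the time, and stays unchanged 10% of
    the time. Selected positions are the prediction targets."""
    masked, labels = [], []
    for token in tokens:
        if random.random() < mask_prob:
            labels.append(token)          # the model must recover the original token
            roll = random.random()
            if roll < 0.8:
                masked.append("[MASK]")
            elif roll < 0.9:
                masked.append(random.choice(vocab))  # ideally different from `token`
            else:
                masked.append(token)      # left unchanged
        else:
            labels.append(None)           # not selected: no prediction target here
            masked.append(token)
    return masked, labels

example = "the quick brown fox jumps over the lazy dog".split()
print(mask_tokens(example, vocab=example))
```

The `None` labels stand in for positions that carry no prediction loss; only the roughly 15% of selected positions contribute to the MLM objective.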
{"language": "en", "license": "apache-2.0", "tags": ["exbert", "multiberts", "multiberts-seed-4"], "datasets": ["bookcorpus", "wikipedia"]}
null
MultiBertGunjanPatrick/multiberts-seed-4-1000k
[ "transformers", "pytorch", "bert", "pretraining", "exbert", "multiberts", "multiberts-seed-4", "en", "dataset:bookcorpus", "dataset:wikipedia", "arxiv:2106.16163", "license:apache-2.0", "endpoints_compatible", "region:us" ]
2022-03-02T23:29:04+00:00
[ "2106.16163" ]
[ "en" ]
TAGS #transformers #pytorch #bert #pretraining #exbert #multiberts #multiberts-seed-4 #en #dataset-bookcorpus #dataset-wikipedia #arxiv-2106.16163 #license-apache-2.0 #endpoints_compatible #region-us
# MultiBERTs Seed 4 Checkpoint 1000k (uncased) Seed 4 intermediate checkpoint 1000k MultiBERTs (pretrained BERT) model on English language using a masked language modeling (MLM) objective. It was introduced in this paper and first released in this repository. This is an intermediate checkpoint. The final checkpoint can be found at multiberts-seed-4. This model is uncased: it does not make a difference between english and English. Disclaimer: The team releasing MultiBERTs did not write a model card for this model so this model card has been written by gchhablani. ## Model description MultiBERTs models are transformers model pretrained on a large corpus of English data in a self-supervised fashion. This means it was pretrained on the raw texts only, with no humans labelling them in any way (which is why it can use lots of publicly available data) with an automatic process to generate inputs and labels from those texts. More precisely, it was pretrained with two objectives: - Masked language modeling (MLM): taking a sentence, the model randomly masks 15% of the words in the input then run the entire masked sentence through the model and has to predict the masked words. This is different from traditional recurrent neural networks (RNNs) that usually see the words one after the other, or from autoregressive models like GPT which internally mask the future tokens. It allows the model to learn a bidirectional representation of the sentence. - Next sentence prediction (NSP): the models concatenates two masked sentences as inputs during pretraining. Sometimes they correspond to sentences that were next to each other in the original text, sometimes not. The model then has to predict if the two sentences were following each other or not. This way, the model learns an inner representation of the English language that can then be used to extract features useful for downstream tasks: if you have a dataset of labeled sentences for instance, you can train a standard classifier using the features produced by the MultiBERTs model as inputs. ## Intended uses & limitations You can use the raw model for either masked language modeling or next sentence prediction, but it's mostly intended to be fine-tuned on a downstream task. See the model hub to look for fine-tuned versions on a task that interests you. Note that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked) to make decisions, such as sequence classification, token classification or question answering. For tasks such as text generation you should look at model like GPT2. ### How to use Here is how to use this model to get the features of a given text in PyTorch: ### Limitations and bias Even if the training data used for this model could be characterized as fairly neutral, this model can have biased predictions. This bias will also affect all fine-tuned versions of this model. For an understanding of bias of this particular checkpoint, please try out this checkpoint with the snippet present in the Limitation and bias section of the bert-base-uncased checkpoint. ## Training data The MultiBERTs models were pretrained on BookCorpus, a dataset consisting of 11,038 unpublished books and English Wikipedia (excluding lists, tables and headers). ## Training procedure ### Preprocessing The texts are lowercased and tokenized using WordPiece and a vocabulary size of 30,000. 
The inputs of the model are then of the form: With probability 0.5, sentence A and sentence B correspond to two consecutive sentences in the original corpus and in the other cases, it's another random sentence in the corpus. Note that what is considered a sentence here is a consecutive span of text usually longer than a single sentence. The only constrain is that the result with the two "sentences" has a combined length of less than 512 tokens. The details of the masking procedure for each sentence are the following: - 15% of the tokens are masked. - In 80% of the cases, the masked tokens are replaced by '[MASK]'. - In 10% of the cases, the masked tokens are replaced by a random token (different) from the one they replace. - In the 10% remaining cases, the masked tokens are left as is. ### Pretraining The full model was trained on 16 Cloud TPU v2 chips for two million steps with a batch size of 256. The sequence length was set to 512 throughout. The optimizer used is Adam with a learning rate of 1e-4, \\(\beta_{1} = 0.9\\) and \\(\beta_{2} = 0.999\\), a weight decay of 0.01, learning rate warmup for 10,000 steps and linear decay of the learning rate after. ### BibTeX entry and citation info <a href="URL <img width="300px" src="URL </a>
[ "# MultiBERTs Seed 4 Checkpoint 1000k (uncased)\nSeed 4 intermediate checkpoint 1000k MultiBERTs (pretrained BERT) model on English language using a masked language modeling (MLM) objective. It was introduced in\nthis paper and first released in\nthis repository. This is an intermediate checkpoint.\nThe final checkpoint can be found at multiberts-seed-4. This model is uncased: it does not make a difference\nbetween english and English.\n\nDisclaimer: The team releasing MultiBERTs did not write a model card for this model so this model card has been written by gchhablani.", "## Model description\nMultiBERTs models are transformers model pretrained on a large corpus of English data in a self-supervised fashion. This means it\nwas pretrained on the raw texts only, with no humans labelling them in any way (which is why it can use lots of\npublicly available data) with an automatic process to generate inputs and labels from those texts. More precisely, it\nwas pretrained with two objectives:\n- Masked language modeling (MLM): taking a sentence, the model randomly masks 15% of the words in the input then run\n the entire masked sentence through the model and has to predict the masked words. This is different from traditional\n recurrent neural networks (RNNs) that usually see the words one after the other, or from autoregressive models like\n GPT which internally mask the future tokens. It allows the model to learn a bidirectional representation of the\n sentence.\n- Next sentence prediction (NSP): the models concatenates two masked sentences as inputs during pretraining. Sometimes\n they correspond to sentences that were next to each other in the original text, sometimes not. The model then has to\n predict if the two sentences were following each other or not.\nThis way, the model learns an inner representation of the English language that can then be used to extract features\nuseful for downstream tasks: if you have a dataset of labeled sentences for instance, you can train a standard\nclassifier using the features produced by the MultiBERTs model as inputs.", "## Intended uses & limitations\nYou can use the raw model for either masked language modeling or next sentence prediction, but it's mostly intended to\nbe fine-tuned on a downstream task. See the model hub to look for\nfine-tuned versions on a task that interests you.\nNote that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked)\nto make decisions, such as sequence classification, token classification or question answering. For tasks such as text\ngeneration you should look at model like GPT2.", "### How to use\nHere is how to use this model to get the features of a given text in PyTorch:", "### Limitations and bias\nEven if the training data used for this model could be characterized as fairly neutral, this model can have biased\npredictions. This bias will also affect all fine-tuned versions of this model. For an understanding of bias of this particular\ncheckpoint, please try out this checkpoint with the snippet present in the Limitation and bias section of the bert-base-uncased checkpoint.", "## Training data\nThe MultiBERTs models were pretrained on BookCorpus, a dataset consisting of 11,038\nunpublished books and English Wikipedia (excluding lists, tables and\nheaders).", "## Training procedure", "### Preprocessing\nThe texts are lowercased and tokenized using WordPiece and a vocabulary size of 30,000. 
The inputs of the model are\nthen of the form:\n\nWith probability 0.5, sentence A and sentence B correspond to two consecutive sentences in the original corpus and in\nthe other cases, it's another random sentence in the corpus. Note that what is considered a sentence here is a\nconsecutive span of text usually longer than a single sentence. The only constrain is that the result with the two\n\"sentences\" has a combined length of less than 512 tokens.\nThe details of the masking procedure for each sentence are the following:\n- 15% of the tokens are masked.\n- In 80% of the cases, the masked tokens are replaced by '[MASK]'.\n- In 10% of the cases, the masked tokens are replaced by a random token (different) from the one they replace.\n- In the 10% remaining cases, the masked tokens are left as is.", "### Pretraining\nThe full model was trained on 16 Cloud TPU v2 chips for two million steps with a batch size\nof 256. The sequence length was set to 512 throughout. The optimizer\nused is Adam with a learning rate of 1e-4, \\\\(\\beta_{1} = 0.9\\\\) and \\\\(\\beta_{2} = 0.999\\\\), a weight decay of 0.01,\nlearning rate warmup for 10,000 steps and linear decay of the learning rate after.", "### BibTeX entry and citation info\n\n<a href=\"URL\n\t<img width=\"300px\" src=\"URL\n</a>" ]
[ "TAGS\n#transformers #pytorch #bert #pretraining #exbert #multiberts #multiberts-seed-4 #en #dataset-bookcorpus #dataset-wikipedia #arxiv-2106.16163 #license-apache-2.0 #endpoints_compatible #region-us \n", "# MultiBERTs Seed 4 Checkpoint 1000k (uncased)\nSeed 4 intermediate checkpoint 1000k MultiBERTs (pretrained BERT) model on English language using a masked language modeling (MLM) objective. It was introduced in\nthis paper and first released in\nthis repository. This is an intermediate checkpoint.\nThe final checkpoint can be found at multiberts-seed-4. This model is uncased: it does not make a difference\nbetween english and English.\n\nDisclaimer: The team releasing MultiBERTs did not write a model card for this model so this model card has been written by gchhablani.", "## Model description\nMultiBERTs models are transformers model pretrained on a large corpus of English data in a self-supervised fashion. This means it\nwas pretrained on the raw texts only, with no humans labelling them in any way (which is why it can use lots of\npublicly available data) with an automatic process to generate inputs and labels from those texts. More precisely, it\nwas pretrained with two objectives:\n- Masked language modeling (MLM): taking a sentence, the model randomly masks 15% of the words in the input then run\n the entire masked sentence through the model and has to predict the masked words. This is different from traditional\n recurrent neural networks (RNNs) that usually see the words one after the other, or from autoregressive models like\n GPT which internally mask the future tokens. It allows the model to learn a bidirectional representation of the\n sentence.\n- Next sentence prediction (NSP): the models concatenates two masked sentences as inputs during pretraining. Sometimes\n they correspond to sentences that were next to each other in the original text, sometimes not. The model then has to\n predict if the two sentences were following each other or not.\nThis way, the model learns an inner representation of the English language that can then be used to extract features\nuseful for downstream tasks: if you have a dataset of labeled sentences for instance, you can train a standard\nclassifier using the features produced by the MultiBERTs model as inputs.", "## Intended uses & limitations\nYou can use the raw model for either masked language modeling or next sentence prediction, but it's mostly intended to\nbe fine-tuned on a downstream task. See the model hub to look for\nfine-tuned versions on a task that interests you.\nNote that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked)\nto make decisions, such as sequence classification, token classification or question answering. For tasks such as text\ngeneration you should look at model like GPT2.", "### How to use\nHere is how to use this model to get the features of a given text in PyTorch:", "### Limitations and bias\nEven if the training data used for this model could be characterized as fairly neutral, this model can have biased\npredictions. This bias will also affect all fine-tuned versions of this model. 
For an understanding of bias of this particular\ncheckpoint, please try out this checkpoint with the snippet present in the Limitation and bias section of the bert-base-uncased checkpoint.", "## Training data\nThe MultiBERTs models were pretrained on BookCorpus, a dataset consisting of 11,038\nunpublished books and English Wikipedia (excluding lists, tables and\nheaders).", "## Training procedure", "### Preprocessing\nThe texts are lowercased and tokenized using WordPiece and a vocabulary size of 30,000. The inputs of the model are\nthen of the form:\n\nWith probability 0.5, sentence A and sentence B correspond to two consecutive sentences in the original corpus and in\nthe other cases, it's another random sentence in the corpus. Note that what is considered a sentence here is a\nconsecutive span of text usually longer than a single sentence. The only constrain is that the result with the two\n\"sentences\" has a combined length of less than 512 tokens.\nThe details of the masking procedure for each sentence are the following:\n- 15% of the tokens are masked.\n- In 80% of the cases, the masked tokens are replaced by '[MASK]'.\n- In 10% of the cases, the masked tokens are replaced by a random token (different) from the one they replace.\n- In the 10% remaining cases, the masked tokens are left as is.", "### Pretraining\nThe full model was trained on 16 Cloud TPU v2 chips for two million steps with a batch size\nof 256. The sequence length was set to 512 throughout. The optimizer\nused is Adam with a learning rate of 1e-4, \\\\(\\beta_{1} = 0.9\\\\) and \\\\(\\beta_{2} = 0.999\\\\), a weight decay of 0.01,\nlearning rate warmup for 10,000 steps and linear decay of the learning rate after.", "### BibTeX entry and citation info\n\n<a href=\"URL\n\t<img width=\"300px\" src=\"URL\n</a>" ]
[ 71, 145, 335, 134, 25, 95, 48, 3, 222, 111, 34 ]
[ "passage: TAGS\n#transformers #pytorch #bert #pretraining #exbert #multiberts #multiberts-seed-4 #en #dataset-bookcorpus #dataset-wikipedia #arxiv-2106.16163 #license-apache-2.0 #endpoints_compatible #region-us \n# MultiBERTs Seed 4 Checkpoint 1000k (uncased)\nSeed 4 intermediate checkpoint 1000k MultiBERTs (pretrained BERT) model on English language using a masked language modeling (MLM) objective. It was introduced in\nthis paper and first released in\nthis repository. This is an intermediate checkpoint.\nThe final checkpoint can be found at multiberts-seed-4. This model is uncased: it does not make a difference\nbetween english and English.\n\nDisclaimer: The team releasing MultiBERTs did not write a model card for this model so this model card has been written by gchhablani.", "passage: ## Model description\nMultiBERTs models are transformers model pretrained on a large corpus of English data in a self-supervised fashion. This means it\nwas pretrained on the raw texts only, with no humans labelling them in any way (which is why it can use lots of\npublicly available data) with an automatic process to generate inputs and labels from those texts. More precisely, it\nwas pretrained with two objectives:\n- Masked language modeling (MLM): taking a sentence, the model randomly masks 15% of the words in the input then run\n the entire masked sentence through the model and has to predict the masked words. This is different from traditional\n recurrent neural networks (RNNs) that usually see the words one after the other, or from autoregressive models like\n GPT which internally mask the future tokens. It allows the model to learn a bidirectional representation of the\n sentence.\n- Next sentence prediction (NSP): the models concatenates two masked sentences as inputs during pretraining. Sometimes\n they correspond to sentences that were next to each other in the original text, sometimes not. The model then has to\n predict if the two sentences were following each other or not.\nThis way, the model learns an inner representation of the English language that can then be used to extract features\nuseful for downstream tasks: if you have a dataset of labeled sentences for instance, you can train a standard\nclassifier using the features produced by the MultiBERTs model as inputs.## Intended uses & limitations\nYou can use the raw model for either masked language modeling or next sentence prediction, but it's mostly intended to\nbe fine-tuned on a downstream task. See the model hub to look for\nfine-tuned versions on a task that interests you.\nNote that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked)\nto make decisions, such as sequence classification, token classification or question answering. For tasks such as text\ngeneration you should look at model like GPT2.### How to use\nHere is how to use this model to get the features of a given text in PyTorch:### Limitations and bias\nEven if the training data used for this model could be characterized as fairly neutral, this model can have biased\npredictions. This bias will also affect all fine-tuned versions of this model. 
For an understanding of bias of this particular\ncheckpoint, please try out this checkpoint with the snippet present in the Limitation and bias section of the bert-base-uncased checkpoint.## Training data\nThe MultiBERTs models were pretrained on BookCorpus, a dataset consisting of 11,038\nunpublished books and English Wikipedia (excluding lists, tables and\nheaders).## Training procedure" ]
[ -0.08504369854927063, 0.0011562175350263715, -0.0020209671929478645, 0.06993548572063446, 0.08667478710412979, 0.003726717084646225, 0.11636780202388763, 0.04873600974678993, -0.028178535401821136, 0.024620437994599342, 0.09466047585010529, 0.03055722266435623, 0.04247233644127846, 0.06587651371955872, 0.09723348915576935, -0.259542316198349, 0.05014923959970474, -0.06204937398433685, 0.05731743574142456, 0.07399433106184006, 0.09732979536056519, -0.07461108267307281, 0.06429936736822128, 0.03411661833524704, -0.08424708247184753, -0.017322521656751633, -0.017548998817801476, -0.036350712180137634, 0.1012568473815918, 0.06705302000045776, 0.0614432767033577, 0.0011591482907533646, 0.059892237186431885, -0.08500328660011292, 0.01583343744277954, 0.044116269797086716, 0.0002151327207684517, 0.02475258894264698, -0.010265115648508072, 0.01860082522034645, 0.10454273223876953, 0.035824503749608994, 0.07675030827522278, 0.03251994028687477, -0.09574738144874573, -0.1205400750041008, -0.08068327605724335, 0.1034603863954544, 0.05241372436285019, 0.04130047559738159, -0.00738137774169445, 0.07190559804439545, -0.030764367431402206, 0.07324035465717316, 0.09931464493274689, -0.2591273784637451, -0.007999526336789131, 0.06754986196756363, 0.048236873000860214, 0.04519528150558472, 0.010246637277305126, 0.02736271172761917, 0.006834037601947784, 0.04313062131404877, 0.02795737236738205, -0.02477937936782837, 0.11688688397407532, -0.04397343471646309, -0.15037401020526886, -0.04127289354801178, 0.12224894762039185, -0.006790744140744209, -0.1255524754524231, -0.10025794804096222, -0.027909433469176292, 0.11309164017438889, -0.002621840685606003, -0.01769685372710228, -0.0031846556812524796, 0.009298804216086864, 0.021851737052202225, -0.09102873504161835, -0.08475629985332489, -0.027033139020204544, -0.03683285787701607, 0.1259968876838684, 0.04779553413391113, 0.05234586447477341, -0.033718813210725784, 0.08715535700321198, -0.12216183543205261, -0.03974908962845802, -0.05101386085152626, -0.08251635730266571, -0.018544793128967285, 0.010401050560176373, -0.026727912947535515, -0.07567130029201508, -0.05883055180311203, 0.11837592720985413, 0.03965471684932709, 0.030522745102643967, 0.0025354931131005287, 0.03980960696935654, 0.07214309275150299, 0.09645464271306992, -0.04038674011826515, 0.04523809999227524, 0.035393327474594116, -0.022048238664865494, 0.05650530755519867, -0.0513874888420105, -0.10246304422616959, 0.07625890523195267, -0.0014785565435886383, 0.03871436417102814, 0.02284451201558113, 0.03871915116906166, -0.008199495263397694, -0.06939363479614258, 0.1631532907485962, -0.07817991077899933, -0.008160595782101154, -0.017055636271834373, 0.009514790028333664, 0.04583853483200073, 0.02989744022488594, -0.007629703264683485, -0.04865378141403198, -0.006710514426231384, -0.05624011158943176, -0.02904011309146881, -0.05691348388791084, -0.12159749865531921, -0.0012687710113823414, -0.04562819004058838, -0.03071785531938076, -0.13994625210762024, -0.21571692824363708, -0.021115750074386597, 0.06227068230509758, -0.001172363292425871, -0.010903233662247658, 0.022232171148061752, 0.012918250635266304, -0.0211420189589262, 0.010248837992548943, -0.04209237918257713, -0.0011423910036683083, -0.007886525243520737, -0.030656542629003525, 0.059599339962005615, -0.04165026172995567, 0.022214291617274284, -0.06926235556602478, 0.022152306511998177, -0.2099648416042328, 0.08722507208585739, -0.033078115433454514, 0.00319121778011322, -0.036597587168216705, -0.04551691934466362, 
0.011523686349391937, 0.04616715759038925, -0.00911764893680811, 0.1160258799791336, -0.13846808671951294, -0.05016378313302994, 0.18131235241889954, -0.1569267213344574, -0.002535764127969742, 0.10044058412313461, -0.04971151426434517, 0.057486142963171005, 0.13663649559020996, 0.09895884990692139, 0.08511493355035782, -0.07081284373998642, 0.007468320429325104, 0.05955978110432625, -0.06780038774013519, 0.05430883914232254, 0.08813197910785675, -0.023279478773474693, -0.13568542897701263, 0.030952315777540207, -0.07148991525173187, -0.01211127731949091, -0.025177346542477608, -0.02052184008061886, 0.008531754836440086, -0.03768247365951538, 0.029888417571783066, 0.005583237390965223, 0.01721883751451969, -0.04141230508685112, -0.08201418817043304, 0.024752389639616013, 0.07422997057437897, -0.07269545644521713, 0.04045846685767174, -0.07114653289318085, 0.06435684859752655, -0.0777989849448204, -0.0033118282444775105, -0.16458868980407715, -0.024996772408485413, 0.04646853357553482, -0.04836401715874672, 0.05122695118188858, 0.09151144325733185, 0.0037346300669014454, 0.1236000582575798, -0.039847999811172485, 0.001411627745255828, -0.005341237410902977, -0.009199526160955429, -0.05248656123876572, -0.12123137712478638, -0.08163303136825562, -0.0681455060839653, 0.09941123425960541, -0.07797349244356155, 0.028725475072860718, -0.06710605323314667, -0.0239743459969759, -0.009697377681732178, -0.05675707757472992, -0.0055601708590984344, 0.011834255419671535, -0.02674614079296589, -0.047218531370162964, 0.0488455668091774, 0.0490337535738945, -0.05988125503063202, 0.0757555365562439, -0.10533671826124191, -0.06397014856338501, 0.05747208744287491, 0.015156098641455173, -0.08040040731430054, 0.09415475279092789, -0.018837779760360718, -0.013632334768772125, -0.055377356708049774, -0.043785374611616135, 0.19837845861911774, -0.024793170392513275, 0.0999772697687149, -0.09135107696056366, 0.0003366490709595382, 0.02939411625266075, -0.04481419920921326, -0.013674729503691196, 0.059054963290691376, 0.04851372912526131, -0.18253397941589355, 0.014202598482370377, 0.050882503390312195, 0.07534593343734741, 0.11217369139194489, 0.02609121799468994, -0.023289119824767113, -0.04713239520788193, -0.01390000432729721, 0.004438765347003937, 0.05765015631914139, -0.028818916529417038, -0.0115418191999197, 0.03176793456077576, 0.056478165090084076, 0.018079958856105804, -0.08170227706432343, 0.03725793585181236, 0.06429866701364517, -0.015444994904100895, -0.044979196041822433, -0.024313125759363174, -0.05999033525586128, 0.062237448990345, 0.05436135455965996, 0.036406680941581726, 0.02581791952252388, -0.014524589292705059, -0.1345827281475067, 0.1886696219444275, -0.11420124769210815, -0.25256454944610596, -0.10680010914802551, -0.05790846049785614, -0.02492258884012699, 0.041489917784929276, 0.056283339858055115, -0.03203783184289932, -0.04324828460812569, -0.11521941423416138, 0.05943647027015686, -0.06454936414957047, -0.02856690250337124, -0.013046510517597198, -0.05198230594396591, -0.02285216934978962, -0.1260281205177307, -0.011962950229644775, -0.03006903827190399, -0.07607892155647278, 0.0072855036705732346, -0.03288382291793823, 0.029426025226712227, 0.13616977632045746, 0.0344621017575264, -0.01914220303297043, -0.018376313149929047, 0.1925843507051468, 0.008966682478785515, 0.063336580991745, 0.11090167611837387, -0.028096646070480347, 0.0528828427195549, 0.04849996417760849, 0.024612784385681152, -0.04829857870936394, 0.011132386513054371, -0.01667538285255432, -0.12024037539958954, 
-0.17292505502700806, -0.06831355392932892, -0.0032344013452529907, 0.006042334251105785, 0.020119672641158104, 0.0343104749917984, 0.02004208043217659, 0.043450064957141876, -0.028351344168186188, 0.032744355499744415, -0.01323927566409111, 0.08027049899101257, 0.026933152228593826, -0.07572150230407715, 0.09262502193450928, -0.06263241171836853, 0.014202616177499294, 0.10800947993993759, -0.05893979221582413, 0.19283683598041534, 0.025160973891615868, 0.05899793654680252, 0.10213624686002731, 0.02222767099738121, 0.05541248992085457, 0.09140567481517792, -0.046247631311416626, 0.005932520143687725, -0.06008559465408325, -0.05158141255378723, -0.0320112407207489, 0.04850807785987854, 0.029474705457687378, 0.018277421593666077, -0.12078872323036194, 0.016698189079761505, -0.0022334938403218985, 0.13295313715934753, 0.05068475380539894, -0.12578988075256348, -0.12339353561401367, 0.03448975458741188, -0.043810516595840454, -0.06337933987379074, 0.029120415449142456, 0.062039438635110855, -0.1535467952489853, 0.04656285420060158, -0.004841328598558903, 0.06508271396160126, -0.0947275459766388, 0.015968983992934227, -0.04412857070565224, 0.000059864483773708344, 0.004377778619527817, 0.07079513370990753, -0.1306312084197998, 0.10432354360818863, 0.020861145108938217, 0.05265596881508827, -0.08255022764205933, 0.015867026522755623, -0.010828354395925999, 0.10773611068725586, 0.11554709076881409, 0.04299027472734451, -0.05786623805761337, -0.022185485810041428, -0.04457762837409973, 0.022275950759649277, 0.06070934236049652, -0.07762990891933441, 0.06436969339847565, 0.010503685101866722, 0.007293996401131153, -0.024400558322668076, 0.015272468328475952, -0.1322643905878067, -0.12268228828907013, 0.060987263917922974, -0.07393064349889755, -0.0962786003947258, -0.05489957332611084, -0.06272479891777039, -0.052323222160339355, 0.20831382274627686, -0.11809574067592621, -0.09081655740737915, -0.09735120832920074, -0.015711814165115356, 0.049504101276397705, -0.0659417137503624, 0.04744336009025574, -0.036628950387239456, 0.09028699994087219, -0.04579583555459976, -0.10781726241111755, 0.033092841506004333, -0.11213795840740204, -0.11621686816215515, -0.04287540540099144, 0.10594329237937927, 0.11333869397640228, 0.03687155619263649, 0.013816758058965206, 0.014490161091089249, -0.0023148618638515472, -0.11864408850669861, 0.01658068783581257, 0.12970209121704102, 0.0026904363185167313, 0.07087118178606033, -0.06163395941257477, 0.030053678900003433, -0.016453398391604424, 0.0006111040711402893, 0.13088423013687134, 0.18473456799983978, -0.06375551223754883, 0.17141741514205933, 0.20351994037628174, -0.10619713366031647, -0.19244426488876343, -0.05382408946752548, -0.00259192381054163, 0.046758830547332764, 0.04229077696800232, -0.18389400839805603, 0.09220531582832336, 0.031041743233799934, -0.02996128238737583, 0.023308828473091125, -0.23150309920310974, -0.11247946321964264, 0.08856655657291412, 0.056449998170137405, 0.1891503930091858, -0.07933317124843597, -0.038543619215488434, -0.018812984228134155, -0.03241186589002609, 0.04533419758081436, -0.04170507937669754, 0.09120722115039825, 0.00700782798230648, -0.025862447917461395, 0.0012156590819358826, -0.031213846057653427, 0.09620866924524307, 0.04412297159433365, 0.02374417334794998, -0.07199142873287201, -0.00966421328485012, 0.11479312181472778, -0.038661226630210876, 0.09898028522729874, 0.04355279356241226, 0.07328243553638458, -0.09747421741485596, -0.058834508061409, -0.07605794072151184, 0.04311434179544449, -0.0414414182305336, 
-0.054571762681007385, -0.06500983238220215, 0.05794738233089447, 0.036972954869270325, 0.009976603090763092, -0.006258165463805199, -0.0355386845767498, 0.04264850914478302, 0.08507202565670013, 0.08343363553285599, -0.031968872994184494, -0.07160869240760803, -0.04728369042277336, -0.04928135499358177, 0.06726714968681335, -0.09066504240036011, 0.0159007478505373, 0.02925783023238182, 0.01029577013105154, 0.08929508924484253, 0.03499715402722359, -0.1375473141670227, 0.011344085447490215, 0.034325066953897476, -0.12460744380950928, -0.10793960839509964, -0.017359737306833267, 0.019839689135551453, -0.03749948740005493, 0.05590295046567917, 0.14506563544273376, -0.037072841078042984, -0.03214912861585617, -0.05016312748193741, 0.037311140447854996, -0.02048078179359436, 0.05412007123231888, 0.06535654515028, 0.030840659514069557, -0.0742327868938446, 0.07555480301380157, 0.039701491594314575, -0.04092348366975784, 0.04245733469724655, 0.04219155013561249, -0.09778144955635071, -0.07961520552635193, -0.05936025455594063, 0.09287978708744049, -0.015588192269206047, -0.04841862991452217, -0.0008786618709564209, -0.08223411440849304, 0.06780406087636948, 0.07320930808782578, 0.04609474167227745, 0.03631306439638138, -0.08709725737571716, 0.015574565157294273, -0.056402191519737244, 0.03714259713888168, -0.028740858659148216, -0.004514813423156738, -0.056008100509643555, 0.0706447958946228, 0.06241194158792496, 0.09696883708238602, -0.034225258976221085, -0.07364413887262344, -0.08439546823501587, -0.013766387477517128, -0.05848470330238342, -0.031442780047655106, -0.0748446136713028, -0.006864948198199272, 0.0003730286844074726, -0.0019038915634155273, 0.02228066325187683, 0.0345645397901535, -0.04398607462644577, -0.01843046396970749, -0.033245984464883804, 0.03869789093732834, -0.060036055743694305, 0.0056672850623726845, 0.015334511175751686, -0.036524608731269836, 0.09325753152370453, 0.036306172609329224, -0.011829828843474388, 0.04781079664826393, -0.018590200692415237, 0.03510599210858345, -0.019621416926383972, 0.0011389744468033314, -0.02346084639430046, -0.11114488542079926, -0.004519705194979906, 0.003361346200108528, -0.02266993559896946, 0.011756226420402527, 0.05884901434183121, -0.07142944633960724, 0.08371713757514954, 0.047714538872241974, -0.028917372226715088, -0.0721144899725914, 0.04248817265033722, -0.015425270423293114, 0.028429049998521805, 0.0707411989569664, -0.03466538339853287, 0.05228070914745331, -0.09761697798967361, -0.028795544058084488, 0.002341652289032936, -0.008835554122924805, -0.008392104879021645, -0.053132861852645874, -0.0008284859359264374, 0.005507958121597767, 0.1758674681186676, -0.023504290729761124, 0.03501766920089722, 0.011267581023275852, 0.00991484709084034, 0.04387480765581131, -0.010620813816785812, 0.07539811730384827, -0.005092564970254898, -0.027919935062527657, -0.01796010136604309, 0.03905216604471207, 0.002963511273264885, 0.0049284473061561584, 0.13701826333999634, 0.0436272993683815, 0.09516666829586029, 0.07659764587879181, 0.01810254529118538, 0.017552541568875313, -0.12789325416088104, -0.08664240688085556, 0.004785435274243355, 0.060874778777360916, -0.02014554850757122, 0.014672305434942245, 0.08941268920898438, -0.08664554357528687, 0.07078935205936432, 0.048616982996463776, -0.05055195093154907, -0.12447026371955872, -0.19678017497062683, -0.025305284187197685, -0.026390129700303078, -0.010883314535021782, -0.09093351662158966, 0.014317578636109829, 0.09015688300132751, 0.024264030158519745, -0.01215512678027153, 
0.0957581102848053, -0.10672695189714432, -0.03053709678351879, 0.045917071402072906, -0.02618665248155594, 0.013203266076743603, 0.05084283649921417, 0.024129806086421013, -0.005990048870444298, 0.0411614254117012, 0.04024818167090416, 0.04344887658953667, 0.02376178652048111, 0.05142843723297119, -0.023253601044416428, -0.07213842123746872, -0.032932061702013016, -0.0028640907257795334, 0.055682070553302765, 0.1326257735490799, 0.022744212299585342, -0.07259555906057358, 0.006934585049748421, 0.1050555482506752, -0.03197603300213814, -0.05156279355287552, -0.10729187726974487, 0.23719942569732666, 0.021670222282409668, 0.000057021621614694595, -0.0045767053961753845, -0.0458989292383194, 0.006207648664712906, 0.21377234160900116, 0.2274121344089508, 0.006022746674716473, -0.010863374918699265, 0.0096486397087574, -0.011704666540026665, 0.03607792407274246, 0.14777137339115143, 0.003507738932967186, 0.24917413294315338, -0.04742196202278137, 0.04055430367588997, -0.04052750766277313, -0.03904440999031067, -0.10097500681877136, 0.07373476028442383, -0.009431609883904457, 0.005814702250063419, -0.03270836919546127, 0.07091115415096283, -0.039135973900556564, -0.1732134371995926, 0.003137059509754181, -0.0011479819659143686, -0.062487754970788956, 0.009952004998922348, -0.0021820934489369392, 0.02130548283457756, 0.08351515978574753, -0.017184972763061523, -0.006232894957065582, 0.1308061182498932, 0.018101749941706657, -0.09746283292770386, -0.057877443730831146, 0.11410059779882431, 0.01768261194229126, 0.1392536163330078, 0.010948577895760536, 0.08352639526128769, 0.08893423527479172, 0.020165475085377693, -0.09630493819713593, 0.04313504695892334, -0.01980883628129959, -0.02744022011756897, 0.008675497956573963, 0.10948145389556885, -0.008001040667295456, 0.0533546507358551, 0.026197517290711403, -0.08648170530796051, 0.06325486302375793, 0.010027587413787842, -0.03535681962966919, -0.08279187232255936, 0.08524239808320999, -0.09181523323059082, 0.1572643220424652, 0.12301318347454071, -0.015471267513930798, -0.04414494335651398, -0.028882469981908798, 0.0173820611089468, -0.00005962653085589409, 0.05100685730576515, -0.026263296604156494, -0.13274073600769043, 0.017594290897250175, -0.08594808727502823, 0.027689099311828613, -0.24856990575790405, -0.08645892143249512, 0.02807653322815895, -0.01798861101269722, -0.019231360405683517, 0.05153639614582062, 0.04581649228930473, 0.026139022782444954, -0.03680593892931938, 0.017189545556902885, -0.038340043276548386, 0.059064511209726334, -0.11244003474712372, -0.09569942206144333 ]
null
null
transformers
# MultiBERTs Seed 4 Checkpoint 100k (uncased) Seed 4 intermediate checkpoint 100k of the MultiBERTs (pretrained BERT) model, pretrained on the English language using a masked language modeling (MLM) objective. It was introduced in [this paper](https://arxiv.org/pdf/2106.16163.pdf) and first released in [this repository](https://github.com/google-research/language/tree/master/language/multiberts). This is an intermediate checkpoint. The final checkpoint can be found at [multiberts-seed-4](https://hf.co/multiberts-seed-4). This model is uncased: it does not make a difference between english and English. Disclaimer: The team releasing MultiBERTs did not write a model card for this model, so this model card has been written by [gchhablani](https://hf.co/gchhablani). ## Model description MultiBERTs models are transformer models pretrained on a large corpus of English data in a self-supervised fashion. This means they were pretrained on the raw texts only, with no humans labelling them in any way (which is why they can use lots of publicly available data), with an automatic process to generate inputs and labels from those texts. More precisely, each model was pretrained with two objectives: - Masked language modeling (MLM): taking a sentence, the model randomly masks 15% of the words in the input, then runs the entire masked sentence through the model and has to predict the masked words. This is different from traditional recurrent neural networks (RNNs), which usually see the words one after the other, and from autoregressive models like GPT, which internally mask the future tokens. It allows the model to learn a bidirectional representation of the sentence. - Next sentence prediction (NSP): the model concatenates two masked sentences as inputs during pretraining. Sometimes they correspond to sentences that were next to each other in the original text, sometimes not. The model then has to predict whether the two sentences followed each other or not. This way, the model learns an inner representation of the English language that can then be used to extract features useful for downstream tasks: if you have a dataset of labeled sentences, for instance, you can train a standard classifier using the features produced by the MultiBERTs model as inputs. ## Intended uses & limitations You can use the raw model for either masked language modeling or next sentence prediction, but it's mostly intended to be fine-tuned on a downstream task. See the [model hub](https://huggingface.co/models?filter=multiberts) to look for fine-tuned versions on a task that interests you. Note that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked) to make decisions, such as sequence classification, token classification or question answering. For tasks such as text generation you should look at a model like GPT-2. ### How to use Here is how to use this model to get the features of a given text in PyTorch: ```python from transformers import BertTokenizer, BertModel tokenizer = BertTokenizer.from_pretrained('multiberts-seed-4-100k') model = BertModel.from_pretrained("multiberts-seed-4-100k") text = "Replace me with any text you'd like." encoded_input = tokenizer(text, return_tensors='pt') output = model(**encoded_input) ```
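To complement the snippet above, the sketch below shows how a pair of sentences is packed into a single input of the kind used for the next sentence prediction objective described earlier. The checkpoint identifier `multiberts-seed-4-100k` and the example sentences are assumptions made for illustration; adjust the identifier to your actual model path.

```python
from transformers import BertTokenizer

# Assumed identifier, matching the snippet above.
tokenizer = BertTokenizer.from_pretrained('multiberts-seed-4-100k')

sentence_a = "MultiBERTs provides several pretraining seeds."
sentence_b = "Each seed follows the same training recipe."

# Passing a text pair yields [CLS] A [SEP] B [SEP], with token_type_ids
# distinguishing the two segments (0 = sentence A, 1 = sentence B).
encoded = tokenizer(sentence_a, sentence_b, return_tensors='pt')
print(tokenizer.decode(encoded['input_ids'][0]))
print(encoded['token_type_ids'])
```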
### Limitations and bias Even if the training data used for this model could be characterized as fairly neutral, this model can have biased predictions. This bias will also affect all fine-tuned versions of this model. For an understanding of the bias of this particular checkpoint, please try out this checkpoint with the snippet present in the [Limitations and bias section](https://huggingface.co/bert-base-uncased#limitations-and-bias) of the [bert-base-uncased](https://huggingface.co/bert-base-uncased) checkpoint. ## Training data The MultiBERTs models were pretrained on [BookCorpus](https://yknzhu.wixsite.com/mbweb), a dataset consisting of 11,038 unpublished books, and [English Wikipedia](https://en.wikipedia.org/wiki/English_Wikipedia) (excluding lists, tables and headers). ## Training procedure ### Preprocessing The texts are lowercased and tokenized using WordPiece and a vocabulary size of 30,000. The inputs of the model are then of the form: ``` [CLS] Sentence A [SEP] Sentence B [SEP] ``` With probability 0.5, sentence A and sentence B correspond to two consecutive sentences in the original corpus; in the other cases, sentence B is another random sentence from the corpus. Note that what is considered a sentence here is a consecutive span of text usually longer than a single sentence; the only constraint is that the two "sentences" have a combined length of less than 512 tokens. The details of the masking procedure for each sentence are the following: - 15% of the tokens are masked. - In 80% of the cases, the masked tokens are replaced by `[MASK]`. - In 10% of the cases, the masked tokens are replaced by a random token (different from the one they replace). - In the remaining 10% of cases, the masked tokens are left as is. ### Pretraining The full model was trained on 16 Cloud TPU v2 chips for two million steps with a batch size of 256. The sequence length was set to 512 throughout. The optimizer used is Adam with a learning rate of 1e-4, \\(\beta_{1} = 0.9\\) and \\(\beta_{2} = 0.999\\), a weight decay of 0.01, learning rate warmup for 10,000 steps and linear decay of the learning rate after. ### BibTeX entry and citation info ```bibtex @article{DBLP:journals/corr/abs-2106-16163, author = {Thibault Sellam and Steve Yadlowsky and Jason Wei and Naomi Saphra and Alexander D'Amour and Tal Linzen and Jasmijn Bastings and Iulia Turc and Jacob Eisenstein and Dipanjan Das and Ian Tenney and Ellie Pavlick}, title = {The MultiBERTs: {BERT} Reproductions for Robustness Analysis}, journal = {CoRR}, volume = {abs/2106.16163}, year = {2021}, url = {https://arxiv.org/abs/2106.16163}, eprinttype = {arXiv}, eprint = {2106.16163}, timestamp = {Mon, 05 Jul 2021 15:15:50 +0200}, biburl = {https://dblp.org/rec/journals/corr/abs-2106-16163.bib}, bibsource = {dblp computer science bibliography, https://dblp.org} } ``` <a href="https://huggingface.co/exbert/?model=multiberts"> <img width="300px" src="https://cdn-media.huggingface.co/exbert/button.png"> </a>
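The pretraining section above mentions a warmup of 10,000 steps followed by linear decay of the learning rate over the two-million-step run. The small sketch below restates that schedule for illustration; the exact optimizer implementation used for MultiBERTs may differ in minor details, so treat this function as an assumption rather than the released training code.

```python
def learning_rate(step, base_lr=1e-4, warmup_steps=10_000, total_steps=2_000_000):
    """Linear warmup to base_lr over warmup_steps, then linear decay to 0."""
    if step < warmup_steps:
        return base_lr * step / warmup_steps
    progress = (step - warmup_steps) / max(1, total_steps - warmup_steps)
    return base_lr * max(0.0, 1.0 - progress)

# A few sample points along the two-million-step schedule
for s in (0, 5_000, 10_000, 1_000_000, 2_000_000):
    print(s, learning_rate(s))
```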
{"language": "en", "license": "apache-2.0", "tags": ["exbert", "multiberts", "multiberts-seed-4"], "datasets": ["bookcorpus", "wikipedia"]}
null
MultiBertGunjanPatrick/multiberts-seed-4-100k
[ "transformers", "pytorch", "bert", "pretraining", "exbert", "multiberts", "multiberts-seed-4", "en", "dataset:bookcorpus", "dataset:wikipedia", "arxiv:2106.16163", "license:apache-2.0", "endpoints_compatible", "region:us" ]
2022-03-02T23:29:04+00:00
[ "2106.16163" ]
[ "en" ]
TAGS #transformers #pytorch #bert #pretraining #exbert #multiberts #multiberts-seed-4 #en #dataset-bookcorpus #dataset-wikipedia #arxiv-2106.16163 #license-apache-2.0 #endpoints_compatible #region-us
# MultiBERTs Seed 4 Checkpoint 100k (uncased) Seed 4 intermediate checkpoint 100k MultiBERTs (pretrained BERT) model on English language using a masked language modeling (MLM) objective. It was introduced in this paper and first released in this repository. This is an intermediate checkpoint. The final checkpoint can be found at multiberts-seed-4. This model is uncased: it does not make a difference between english and English. Disclaimer: The team releasing MultiBERTs did not write a model card for this model so this model card has been written by gchhablani. ## Model description MultiBERTs models are transformers model pretrained on a large corpus of English data in a self-supervised fashion. This means it was pretrained on the raw texts only, with no humans labelling them in any way (which is why it can use lots of publicly available data) with an automatic process to generate inputs and labels from those texts. More precisely, it was pretrained with two objectives: - Masked language modeling (MLM): taking a sentence, the model randomly masks 15% of the words in the input then run the entire masked sentence through the model and has to predict the masked words. This is different from traditional recurrent neural networks (RNNs) that usually see the words one after the other, or from autoregressive models like GPT which internally mask the future tokens. It allows the model to learn a bidirectional representation of the sentence. - Next sentence prediction (NSP): the models concatenates two masked sentences as inputs during pretraining. Sometimes they correspond to sentences that were next to each other in the original text, sometimes not. The model then has to predict if the two sentences were following each other or not. This way, the model learns an inner representation of the English language that can then be used to extract features useful for downstream tasks: if you have a dataset of labeled sentences for instance, you can train a standard classifier using the features produced by the MultiBERTs model as inputs. ## Intended uses & limitations You can use the raw model for either masked language modeling or next sentence prediction, but it's mostly intended to be fine-tuned on a downstream task. See the model hub to look for fine-tuned versions on a task that interests you. Note that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked) to make decisions, such as sequence classification, token classification or question answering. For tasks such as text generation you should look at model like GPT2. ### How to use Here is how to use this model to get the features of a given text in PyTorch: ### Limitations and bias Even if the training data used for this model could be characterized as fairly neutral, this model can have biased predictions. This bias will also affect all fine-tuned versions of this model. For an understanding of bias of this particular checkpoint, please try out this checkpoint with the snippet present in the Limitation and bias section of the bert-base-uncased checkpoint. ## Training data The MultiBERTs models were pretrained on BookCorpus, a dataset consisting of 11,038 unpublished books and English Wikipedia (excluding lists, tables and headers). ## Training procedure ### Preprocessing The texts are lowercased and tokenized using WordPiece and a vocabulary size of 30,000. 
The inputs of the model are then of the form: With probability 0.5, sentence A and sentence B correspond to two consecutive sentences in the original corpus and in the other cases, it's another random sentence in the corpus. Note that what is considered a sentence here is a consecutive span of text usually longer than a single sentence. The only constrain is that the result with the two "sentences" has a combined length of less than 512 tokens. The details of the masking procedure for each sentence are the following: - 15% of the tokens are masked. - In 80% of the cases, the masked tokens are replaced by '[MASK]'. - In 10% of the cases, the masked tokens are replaced by a random token (different) from the one they replace. - In the 10% remaining cases, the masked tokens are left as is. ### Pretraining The full model was trained on 16 Cloud TPU v2 chips for two million steps with a batch size of 256. The sequence length was set to 512 throughout. The optimizer used is Adam with a learning rate of 1e-4, \\(\beta_{1} = 0.9\\) and \\(\beta_{2} = 0.999\\), a weight decay of 0.01, learning rate warmup for 10,000 steps and linear decay of the learning rate after. ### BibTeX entry and citation info <a href="URL <img width="300px" src="URL </a>
[ "# MultiBERTs Seed 4 Checkpoint 100k (uncased)\nSeed 4 intermediate checkpoint 100k MultiBERTs (pretrained BERT) model on English language using a masked language modeling (MLM) objective. It was introduced in\nthis paper and first released in\nthis repository. This is an intermediate checkpoint.\nThe final checkpoint can be found at multiberts-seed-4. This model is uncased: it does not make a difference\nbetween english and English.\n\nDisclaimer: The team releasing MultiBERTs did not write a model card for this model so this model card has been written by gchhablani.", "## Model description\nMultiBERTs models are transformers model pretrained on a large corpus of English data in a self-supervised fashion. This means it\nwas pretrained on the raw texts only, with no humans labelling them in any way (which is why it can use lots of\npublicly available data) with an automatic process to generate inputs and labels from those texts. More precisely, it\nwas pretrained with two objectives:\n- Masked language modeling (MLM): taking a sentence, the model randomly masks 15% of the words in the input then run\n the entire masked sentence through the model and has to predict the masked words. This is different from traditional\n recurrent neural networks (RNNs) that usually see the words one after the other, or from autoregressive models like\n GPT which internally mask the future tokens. It allows the model to learn a bidirectional representation of the\n sentence.\n- Next sentence prediction (NSP): the models concatenates two masked sentences as inputs during pretraining. Sometimes\n they correspond to sentences that were next to each other in the original text, sometimes not. The model then has to\n predict if the two sentences were following each other or not.\nThis way, the model learns an inner representation of the English language that can then be used to extract features\nuseful for downstream tasks: if you have a dataset of labeled sentences for instance, you can train a standard\nclassifier using the features produced by the MultiBERTs model as inputs.", "## Intended uses & limitations\nYou can use the raw model for either masked language modeling or next sentence prediction, but it's mostly intended to\nbe fine-tuned on a downstream task. See the model hub to look for\nfine-tuned versions on a task that interests you.\nNote that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked)\nto make decisions, such as sequence classification, token classification or question answering. For tasks such as text\ngeneration you should look at model like GPT2.", "### How to use\nHere is how to use this model to get the features of a given text in PyTorch:", "### Limitations and bias\nEven if the training data used for this model could be characterized as fairly neutral, this model can have biased\npredictions. This bias will also affect all fine-tuned versions of this model. For an understanding of bias of this particular\ncheckpoint, please try out this checkpoint with the snippet present in the Limitation and bias section of the bert-base-uncased checkpoint.", "## Training data\nThe MultiBERTs models were pretrained on BookCorpus, a dataset consisting of 11,038\nunpublished books and English Wikipedia (excluding lists, tables and\nheaders).", "## Training procedure", "### Preprocessing\nThe texts are lowercased and tokenized using WordPiece and a vocabulary size of 30,000. 
The inputs of the model are\nthen of the form:\n\nWith probability 0.5, sentence A and sentence B correspond to two consecutive sentences in the original corpus and in\nthe other cases, it's another random sentence in the corpus. Note that what is considered a sentence here is a\nconsecutive span of text usually longer than a single sentence. The only constrain is that the result with the two\n\"sentences\" has a combined length of less than 512 tokens.\nThe details of the masking procedure for each sentence are the following:\n- 15% of the tokens are masked.\n- In 80% of the cases, the masked tokens are replaced by '[MASK]'.\n- In 10% of the cases, the masked tokens are replaced by a random token (different) from the one they replace.\n- In the 10% remaining cases, the masked tokens are left as is.", "### Pretraining\nThe full model was trained on 16 Cloud TPU v2 chips for two million steps with a batch size\nof 256. The sequence length was set to 512 throughout. The optimizer\nused is Adam with a learning rate of 1e-4, \\\\(\\beta_{1} = 0.9\\\\) and \\\\(\\beta_{2} = 0.999\\\\), a weight decay of 0.01,\nlearning rate warmup for 10,000 steps and linear decay of the learning rate after.", "### BibTeX entry and citation info\n\n<a href=\"URL\n\t<img width=\"300px\" src=\"URL\n</a>" ]
[ "TAGS\n#transformers #pytorch #bert #pretraining #exbert #multiberts #multiberts-seed-4 #en #dataset-bookcorpus #dataset-wikipedia #arxiv-2106.16163 #license-apache-2.0 #endpoints_compatible #region-us \n", "# MultiBERTs Seed 4 Checkpoint 100k (uncased)\nSeed 4 intermediate checkpoint 100k MultiBERTs (pretrained BERT) model on English language using a masked language modeling (MLM) objective. It was introduced in\nthis paper and first released in\nthis repository. This is an intermediate checkpoint.\nThe final checkpoint can be found at multiberts-seed-4. This model is uncased: it does not make a difference\nbetween english and English.\n\nDisclaimer: The team releasing MultiBERTs did not write a model card for this model so this model card has been written by gchhablani.", "## Model description\nMultiBERTs models are transformers model pretrained on a large corpus of English data in a self-supervised fashion. This means it\nwas pretrained on the raw texts only, with no humans labelling them in any way (which is why it can use lots of\npublicly available data) with an automatic process to generate inputs and labels from those texts. More precisely, it\nwas pretrained with two objectives:\n- Masked language modeling (MLM): taking a sentence, the model randomly masks 15% of the words in the input then run\n the entire masked sentence through the model and has to predict the masked words. This is different from traditional\n recurrent neural networks (RNNs) that usually see the words one after the other, or from autoregressive models like\n GPT which internally mask the future tokens. It allows the model to learn a bidirectional representation of the\n sentence.\n- Next sentence prediction (NSP): the models concatenates two masked sentences as inputs during pretraining. Sometimes\n they correspond to sentences that were next to each other in the original text, sometimes not. The model then has to\n predict if the two sentences were following each other or not.\nThis way, the model learns an inner representation of the English language that can then be used to extract features\nuseful for downstream tasks: if you have a dataset of labeled sentences for instance, you can train a standard\nclassifier using the features produced by the MultiBERTs model as inputs.", "## Intended uses & limitations\nYou can use the raw model for either masked language modeling or next sentence prediction, but it's mostly intended to\nbe fine-tuned on a downstream task. See the model hub to look for\nfine-tuned versions on a task that interests you.\nNote that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked)\nto make decisions, such as sequence classification, token classification or question answering. For tasks such as text\ngeneration you should look at model like GPT2.", "### How to use\nHere is how to use this model to get the features of a given text in PyTorch:", "### Limitations and bias\nEven if the training data used for this model could be characterized as fairly neutral, this model can have biased\npredictions. This bias will also affect all fine-tuned versions of this model. 
For an understanding of bias of this particular\ncheckpoint, please try out this checkpoint with the snippet present in the Limitation and bias section of the bert-base-uncased checkpoint.", "## Training data\nThe MultiBERTs models were pretrained on BookCorpus, a dataset consisting of 11,038\nunpublished books and English Wikipedia (excluding lists, tables and\nheaders).", "## Training procedure", "### Preprocessing\nThe texts are lowercased and tokenized using WordPiece and a vocabulary size of 30,000. The inputs of the model are\nthen of the form:\n\nWith probability 0.5, sentence A and sentence B correspond to two consecutive sentences in the original corpus and in\nthe other cases, it's another random sentence in the corpus. Note that what is considered a sentence here is a\nconsecutive span of text usually longer than a single sentence. The only constrain is that the result with the two\n\"sentences\" has a combined length of less than 512 tokens.\nThe details of the masking procedure for each sentence are the following:\n- 15% of the tokens are masked.\n- In 80% of the cases, the masked tokens are replaced by '[MASK]'.\n- In 10% of the cases, the masked tokens are replaced by a random token (different) from the one they replace.\n- In the 10% remaining cases, the masked tokens are left as is.", "### Pretraining\nThe full model was trained on 16 Cloud TPU v2 chips for two million steps with a batch size\nof 256. The sequence length was set to 512 throughout. The optimizer\nused is Adam with a learning rate of 1e-4, \\\\(\\beta_{1} = 0.9\\\\) and \\\\(\\beta_{2} = 0.999\\\\), a weight decay of 0.01,\nlearning rate warmup for 10,000 steps and linear decay of the learning rate after.", "### BibTeX entry and citation info\n\n<a href=\"URL\n\t<img width=\"300px\" src=\"URL\n</a>" ]
[ 71, 145, 335, 134, 25, 95, 48, 3, 222, 111, 34 ]
[ "passage: TAGS\n#transformers #pytorch #bert #pretraining #exbert #multiberts #multiberts-seed-4 #en #dataset-bookcorpus #dataset-wikipedia #arxiv-2106.16163 #license-apache-2.0 #endpoints_compatible #region-us \n# MultiBERTs Seed 4 Checkpoint 100k (uncased)\nSeed 4 intermediate checkpoint 100k MultiBERTs (pretrained BERT) model on English language using a masked language modeling (MLM) objective. It was introduced in\nthis paper and first released in\nthis repository. This is an intermediate checkpoint.\nThe final checkpoint can be found at multiberts-seed-4. This model is uncased: it does not make a difference\nbetween english and English.\n\nDisclaimer: The team releasing MultiBERTs did not write a model card for this model so this model card has been written by gchhablani.", "passage: ## Model description\nMultiBERTs models are transformers model pretrained on a large corpus of English data in a self-supervised fashion. This means it\nwas pretrained on the raw texts only, with no humans labelling them in any way (which is why it can use lots of\npublicly available data) with an automatic process to generate inputs and labels from those texts. More precisely, it\nwas pretrained with two objectives:\n- Masked language modeling (MLM): taking a sentence, the model randomly masks 15% of the words in the input then run\n the entire masked sentence through the model and has to predict the masked words. This is different from traditional\n recurrent neural networks (RNNs) that usually see the words one after the other, or from autoregressive models like\n GPT which internally mask the future tokens. It allows the model to learn a bidirectional representation of the\n sentence.\n- Next sentence prediction (NSP): the models concatenates two masked sentences as inputs during pretraining. Sometimes\n they correspond to sentences that were next to each other in the original text, sometimes not. The model then has to\n predict if the two sentences were following each other or not.\nThis way, the model learns an inner representation of the English language that can then be used to extract features\nuseful for downstream tasks: if you have a dataset of labeled sentences for instance, you can train a standard\nclassifier using the features produced by the MultiBERTs model as inputs.## Intended uses & limitations\nYou can use the raw model for either masked language modeling or next sentence prediction, but it's mostly intended to\nbe fine-tuned on a downstream task. See the model hub to look for\nfine-tuned versions on a task that interests you.\nNote that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked)\nto make decisions, such as sequence classification, token classification or question answering. For tasks such as text\ngeneration you should look at model like GPT2.### How to use\nHere is how to use this model to get the features of a given text in PyTorch:### Limitations and bias\nEven if the training data used for this model could be characterized as fairly neutral, this model can have biased\npredictions. This bias will also affect all fine-tuned versions of this model. 
For an understanding of bias of this particular\ncheckpoint, please try out this checkpoint with the snippet present in the Limitation and bias section of the bert-base-uncased checkpoint.## Training data\nThe MultiBERTs models were pretrained on BookCorpus, a dataset consisting of 11,038\nunpublished books and English Wikipedia (excluding lists, tables and\nheaders).## Training procedure" ]
[ -0.08635948598384857, -0.00017425674013793468, -0.0021400649566203356, 0.06916087865829468, 0.08580747246742249, 0.0031118858605623245, 0.118490070104599, 0.048983436077833176, -0.026735801249742508, 0.025536105036735535, 0.09311854839324951, 0.0328366719186306, 0.04223816841840744, 0.0664328932762146, 0.09636947512626648, -0.257389098405838, 0.05035216361284256, -0.061499424278736115, 0.057493843138217926, 0.07528450340032578, 0.09713810682296753, -0.07446454465389252, 0.06348250806331635, 0.0341736264526844, -0.08462607860565186, -0.01733173429965973, -0.01638351008296013, -0.03612165525555611, 0.10149173438549042, 0.06786787509918213, 0.061436496675014496, 0.00015953555703163147, 0.0588669590651989, -0.08745881915092468, 0.016190582886338234, 0.04470205307006836, -0.001840334851294756, 0.025728173553943634, -0.00967186689376831, 0.01724538952112198, 0.10447134077548981, 0.0365944467484951, 0.07707469910383224, 0.03466544672846794, -0.09637182205915451, -0.12046436220407486, -0.08138178288936615, 0.10130109637975693, 0.052821237593889236, 0.0424429327249527, -0.007605137303471565, 0.07387072592973709, -0.031074024736881256, 0.07404402643442154, 0.10567179322242737, -0.260991632938385, -0.008388618007302284, 0.06854248046875, 0.048588573932647705, 0.04442103952169418, 0.008601844310760498, 0.027865692973136902, 0.0063773393630981445, 0.04454415291547775, 0.03052225336432457, -0.024711299687623978, 0.12069528549909592, -0.04313952103257179, -0.1500193178653717, -0.04151613265275955, 0.12127172201871872, -0.00470806285738945, -0.1260269284248352, -0.10216916352510452, -0.029931876808404922, 0.11186975985765457, -0.004724646452814341, -0.017707709223031998, -0.0022092070430517197, 0.010244317352771759, 0.021336602047085762, -0.09092345833778381, -0.08531255275011063, -0.026957925409078598, -0.03566844388842583, 0.12661948800086975, 0.046765752136707306, 0.05119004473090172, -0.03266359120607376, 0.08593596518039703, -0.11693614721298218, -0.03974904492497444, -0.051663659512996674, -0.0826718732714653, -0.0192173533141613, 0.011195003055036068, -0.028988517820835114, -0.07902590930461884, -0.057861533015966415, 0.12052866816520691, 0.036380354315042496, 0.03055112063884735, 0.0006841104477643967, 0.04042463004589081, 0.0723729208111763, 0.09734185039997101, -0.040118951350450516, 0.04583064466714859, 0.03369593620300293, -0.022806156426668167, 0.058486051857471466, -0.05181000381708145, -0.10232327878475189, 0.07642410695552826, 0.0006165457889437675, 0.03850680589675903, 0.021669859066605568, 0.03679942712187767, -0.010018542408943176, -0.0708085298538208, 0.16352194547653198, -0.07889612019062042, -0.008338811807334423, -0.01675303839147091, 0.011685274541378021, 0.04675573855638504, 0.029715076088905334, -0.007861942984163761, -0.04914095997810364, -0.0070563992485404015, -0.05659326910972595, -0.027559101581573486, -0.055889636278152466, -0.12105720490217209, -0.0014757863245904446, -0.04415693134069443, -0.03163542598485947, -0.13937626779079437, -0.21281345188617706, -0.021288489922881126, 0.06279563903808594, -0.0025727245956659317, -0.009289266541600227, 0.023232899606227875, 0.012997230514883995, -0.021083522588014603, 0.010726899839937687, -0.04525993764400482, -0.0009998651221394539, -0.007111802697181702, -0.031027184799313545, 0.05938100069761276, -0.04018242284655571, 0.022561023011803627, -0.06861014664173126, 0.021475059911608696, -0.2109716236591339, 0.08860517293214798, -0.03275105729699135, 0.0033467523753643036, -0.03711729869246483, -0.04526795446872711, 
0.0099930539727211, 0.046851273626089096, -0.009744866751134396, 0.1178215742111206, -0.13660860061645508, -0.051993414759635925, 0.18464669585227966, -0.1568167507648468, -0.0027646198868751526, 0.10027413070201874, -0.049711983650922775, 0.05502944067120552, 0.1363159716129303, 0.09922046959400177, 0.080103799700737, -0.07436706870794296, 0.00814947672188282, 0.06076841056346893, -0.0665626972913742, 0.05505308508872986, 0.08744192868471146, -0.02444778010249138, -0.13349591195583344, 0.030330143868923187, -0.07011023163795471, -0.010272026062011719, -0.025088761001825333, -0.020668234676122665, 0.007659135386347771, -0.03776025027036667, 0.030711308121681213, 0.0059322789311409, 0.017886491492390633, -0.04164884239435196, -0.08349774032831192, 0.026157258078455925, 0.07374744862318039, -0.07322762906551361, 0.04093535244464874, -0.07222405821084976, 0.06463088095188141, -0.07693853229284286, -0.0037467004731297493, -0.16569334268569946, -0.025295935571193695, 0.04593363031744957, -0.0447518564760685, 0.04959159344434738, 0.09261924773454666, 0.004491588566452265, 0.1229422390460968, -0.03932572156190872, 0.0016001473413780332, -0.006669219583272934, -0.010114368051290512, -0.05260171741247177, -0.12175823003053665, -0.0828578919172287, -0.06807796657085419, 0.09983084350824356, -0.07686592638492584, 0.02903040125966072, -0.06840844452381134, -0.02234826609492302, -0.00909111276268959, -0.058174144476652145, -0.004551558755338192, 0.01071846205741167, -0.02685132995247841, -0.04683392867445946, 0.04875093698501587, 0.049456484615802765, -0.059571973979473114, 0.07524394989013672, -0.10630792379379272, -0.05978391692042351, 0.0579318106174469, 0.012797269970178604, -0.08086967468261719, 0.09004609286785126, -0.01888229139149189, -0.012942956760525703, -0.05544644966721535, -0.04470496252179146, 0.19445711374282837, -0.023352377116680145, 0.10140526294708252, -0.09038367122411728, -0.00001971388701349497, 0.029239097610116005, -0.0441967248916626, -0.013704448938369751, 0.058007948100566864, 0.04674910381436348, -0.18152520060539246, 0.014948692172765732, 0.05188041180372238, 0.07798464596271515, 0.11104245483875275, 0.025464780628681183, -0.024476364254951477, -0.046926043927669525, -0.012760968878865242, 0.004141172394156456, 0.057774219661951065, -0.028768222779035568, -0.011099115945398808, 0.031641263514757156, 0.055681802332401276, 0.016211003065109253, -0.0802793875336647, 0.03611575812101364, 0.06463946402072906, -0.016482166945934296, -0.041132863610982895, -0.024571800604462624, -0.06016179174184799, 0.06155163049697876, 0.05323879420757294, 0.03514820709824562, 0.026102961972355843, -0.01518244855105877, -0.13464613258838654, 0.1886650025844574, -0.11440696567296982, -0.25322720408439636, -0.10749348998069763, -0.060138873755931854, -0.024260492995381355, 0.041781164705753326, 0.057199493050575256, -0.03211198002099991, -0.04272552952170372, -0.11541838198900223, 0.060074396431446075, -0.06478045880794525, -0.02995981276035309, -0.01147814467549324, -0.05283930152654648, -0.021611245349049568, -0.12684211134910583, -0.01216655969619751, -0.030632559210062027, -0.07581557333469391, 0.00754820741713047, -0.03363838046789169, 0.028986075893044472, 0.13601933419704437, 0.03607678785920143, -0.018606653437018394, -0.018470941111445427, 0.19643239676952362, 0.008292505517601967, 0.062329236418008804, 0.1115669310092926, -0.027621161192655563, 0.05283953249454498, 0.0464949756860733, 0.026202833279967308, -0.04901234805583954, 0.009927494451403618, -0.01523005124181509, 
-0.11917385458946228, -0.1742073893547058, -0.06884874403476715, -0.003603836055845022, 0.005292832851409912, 0.020166561007499695, 0.03534916788339615, 0.02580827660858631, 0.042593814432621, -0.029694650322198868, 0.03159986063838005, -0.013094395399093628, 0.08109086751937866, 0.031096898019313812, -0.0755302757024765, 0.09395327419042587, -0.06147046759724617, 0.014171553775668144, 0.10732152312994003, -0.05983450636267662, 0.19136163592338562, 0.026379002258181572, 0.06131679564714432, 0.10079101473093033, 0.021968141198158264, 0.0548013411462307, 0.09000319242477417, -0.0462324321269989, 0.005440662615001202, -0.06081394851207733, -0.051921457052230835, -0.033602241426706314, 0.048663485795259476, 0.028353514149785042, 0.016085989773273468, -0.11951137334108353, 0.013348396867513657, -0.0020969912875443697, 0.13804839551448822, 0.050339750945568085, -0.12362948060035706, -0.12329316884279251, 0.034115128219127655, -0.045120298862457275, -0.06252389401197433, 0.02949684113264084, 0.06050049886107445, -0.1541372537612915, 0.04564673826098442, -0.006494489498436451, 0.0658913403749466, -0.09401803463697433, 0.016684066504240036, -0.045696359127759933, -0.000022394582629203796, 0.004356373101472855, 0.0696437805891037, -0.13203072547912598, 0.10240035504102707, 0.021249661222100258, 0.05149512365460396, -0.08082923293113708, 0.01593712903559208, -0.010467661544680595, 0.10819839686155319, 0.11596980690956116, 0.042559072375297546, -0.05594503507018089, -0.021942609921097755, -0.04535546898841858, 0.021512296050786972, 0.06007196754217148, -0.07680626958608627, 0.06385204195976257, 0.009186336770653725, 0.006234095431864262, -0.02336658164858818, 0.014883551746606827, -0.13178034126758575, -0.12323489785194397, 0.060508012771606445, -0.07515866309404373, -0.09480562061071396, -0.05525682121515274, -0.06331056356430054, -0.05190732330083847, 0.20903325080871582, -0.11702513694763184, -0.0908278375864029, -0.09800684452056885, -0.015537504106760025, 0.04832933470606804, -0.06499440968036652, 0.047518230974674225, -0.036473724991083145, 0.09192194044589996, -0.04574495553970337, -0.10854189097881317, 0.03477991744875908, -0.1125679761171341, -0.11555376648902893, -0.04294084012508392, 0.1069680005311966, 0.11424907296895981, 0.03700532019138336, 0.013818393461406231, 0.01452743262052536, -0.001987975090742111, -0.11799001693725586, 0.017759108915925026, 0.1298593282699585, 0.0034996047616004944, 0.07047608494758606, -0.06267957389354706, 0.032051779329776764, -0.016989342868328094, 0.001593349501490593, 0.13141992688179016, 0.1832277476787567, -0.06433191150426865, 0.17324534058570862, 0.20241408050060272, -0.10517945885658264, -0.1897929459810257, -0.05572473630309105, -0.0028450945392251015, 0.04640625789761543, 0.046503785997629166, -0.18721579015254974, 0.0914405807852745, 0.032582901418209076, -0.03026728890836239, 0.024827275425195694, -0.2330295294523239, -0.1105961799621582, 0.08979282528162003, 0.056590333580970764, 0.19106009602546692, -0.08140099048614502, -0.040240321308374405, -0.01810811460018158, -0.03524711728096008, 0.044324420392513275, -0.042496658861637115, 0.09176959842443466, 0.006709743291139603, -0.02853528782725334, 0.0017363112419843674, -0.030616357922554016, 0.0957564264535904, 0.04275389015674591, 0.02386326529085636, -0.07146051526069641, -0.006932545453310013, 0.11282005906105042, -0.038406066596508026, 0.09860394895076752, 0.040752410888671875, 0.07259437441825867, -0.09592337906360626, -0.05950186774134636, -0.076054148375988, 0.043069105595350266, 
-0.04165413975715637, -0.05456714704632759, -0.06486323475837708, 0.05693957209587097, 0.03693511709570885, 0.010685231536626816, -0.003183458000421524, -0.036868512630462646, 0.04673776403069496, 0.08735787868499756, 0.0830451026558876, -0.033081792294979095, -0.07727398723363876, -0.049625497311353683, -0.04917434975504875, 0.06771146506071091, -0.0891398936510086, 0.016290001571178436, 0.027658237144351006, 0.009988137520849705, 0.09065168350934982, 0.03484673798084259, -0.13761891424655914, 0.010971605777740479, 0.03371589630842209, -0.12352835386991501, -0.10979190468788147, -0.017236949875950813, 0.020291445776820183, -0.03762871026992798, 0.056842997670173645, 0.14757993817329407, -0.035749830305576324, -0.032327279448509216, -0.0494459867477417, 0.037267137318849564, -0.019964735954999924, 0.05244229733943939, 0.06584903597831726, 0.030631419271230698, -0.07452398538589478, 0.07357926666736603, 0.03868970647454262, -0.0375271737575531, 0.04307809844613075, 0.041892170906066895, -0.09685976803302765, -0.07934613525867462, -0.06110343709588051, 0.09312468022108078, -0.019350765272974968, -0.04705791175365448, -0.001832764595746994, -0.08251927047967911, 0.06794191896915436, 0.07368523627519608, 0.04657379537820816, 0.03795604780316353, -0.0873514711856842, 0.015453452244400978, -0.05424214154481888, 0.03520498797297478, -0.03075290657579899, -0.0048188865184783936, -0.0561337023973465, 0.06759752333164215, 0.06336238980293274, 0.09776562452316284, -0.03460690379142761, -0.07559264451265335, -0.08446148782968521, -0.013620322570204735, -0.06315028667449951, -0.03244085982441902, -0.07528728246688843, -0.007171695586293936, 0.000059042125940322876, -0.0014119073748588562, 0.023094676434993744, 0.035047613084316254, -0.04394655302166939, -0.01827092096209526, -0.03353939950466156, 0.037587884813547134, -0.06280841678380966, 0.0071670785546302795, 0.014471049420535564, -0.03803325071930885, 0.0935600996017456, 0.03663080930709839, -0.013016482815146446, 0.04753924906253815, -0.02212747558951378, 0.03554890304803848, -0.019412847235798836, -0.0006533432751893997, -0.023359667509794235, -0.108927883207798, -0.005181625951081514, 0.0031261518597602844, -0.022850077599287033, 0.011042759753763676, 0.057655494660139084, -0.07197915017604828, 0.0835772454738617, 0.04740769416093826, -0.03002486377954483, -0.07248804718255997, 0.04195835441350937, -0.015198055654764175, 0.028798598796129227, 0.0698777437210083, -0.034102410078048706, 0.0531887486577034, -0.09988945722579956, -0.02855280041694641, 0.002930297050625086, -0.0073435865342617035, -0.008973594754934311, -0.05464773625135422, -0.001045815646648407, 0.006403209641575813, 0.1777455061674118, -0.023487277328968048, 0.03530881553888321, 0.011519370600581169, 0.006541017442941666, 0.047217387706041336, -0.011543573811650276, 0.07557617127895355, -0.005090885795652866, -0.027622556313872337, -0.016604594886302948, 0.0387655645608902, 0.0032586995512247086, 0.004198100417852402, 0.13625943660736084, 0.04461647570133209, 0.09220100939273834, 0.07677045464515686, 0.016915852203965187, 0.016231805086135864, -0.128806471824646, -0.08730357885360718, 0.0067143747583031654, 0.06018442288041115, -0.01915220357477665, 0.018242590129375458, 0.09075894951820374, -0.08699792623519897, 0.07215806096792221, 0.04904558137059212, -0.049847766757011414, -0.12483660131692886, -0.19205006957054138, -0.024761101230978966, -0.030603716149926186, -0.011125536635518074, -0.09145621210336685, 0.014884109608829021, 0.09159917384386063, 0.02490924671292305, 
-0.011624246835708618, 0.09512752294540405, -0.10246694087982178, -0.030747558921575546, 0.0453265905380249, -0.026807686313986778, 0.013045153580605984, 0.04960763454437256, 0.024330461397767067, -0.00524253211915493, 0.04182295873761177, 0.04085586592555046, 0.043916165828704834, 0.025603871792554855, 0.051953837275505066, -0.02441331371665001, -0.07331874221563339, -0.03197319060564041, -0.0017054826021194458, 0.05412587150931358, 0.13557124137878418, 0.023276925086975098, -0.07223853468894958, 0.0059404633939266205, 0.10719165205955505, -0.0326368473470211, -0.05020788311958313, -0.10830847918987274, 0.23653283715248108, 0.022103259339928627, 0.000568929361179471, -0.006506119854748249, -0.04611772671341896, 0.006661638617515564, 0.2134510576725006, 0.2268422544002533, 0.004694308619946241, -0.009966523386538029, 0.009950855746865273, -0.011813517659902573, 0.036804430186748505, 0.14654594659805298, 0.003764672204852104, 0.25292837619781494, -0.047619789838790894, 0.04172436520457268, -0.04079383984208107, -0.040265802294015884, -0.10253481566905975, 0.07281380146741867, -0.009318755939602852, 0.006427097134292126, -0.032067280262708664, 0.07213503122329712, -0.03776638209819794, -0.1740523725748062, 0.004793289117515087, 0.00004874798469245434, -0.061705417931079865, 0.010211016982793808, -0.0039562745951116085, 0.022321127355098724, 0.08437506854534149, -0.016817927360534668, -0.006733322981745005, 0.1344911754131317, 0.017784440889954567, -0.09789341688156128, -0.05893433094024658, 0.11527907848358154, 0.013854286633431911, 0.13897797465324402, 0.010790669359266758, 0.08161990344524384, 0.08871795982122421, 0.020449385046958923, -0.093592569231987, 0.04307808727025986, -0.01971406675875187, -0.028232231736183167, 0.008895874954760075, 0.1087597906589508, -0.0072434269823133945, 0.054840266704559326, 0.025613198056817055, -0.08685991168022156, 0.06305782496929169, 0.009752370417118073, -0.03593013063073158, -0.08210361003875732, 0.08419632911682129, -0.09171123802661896, 0.15687263011932373, 0.1243431568145752, -0.015432297252118587, -0.0448530912399292, -0.028395649045705795, 0.017581257969141006, 0.0005549117922782898, 0.04845920205116272, -0.026223191991448402, -0.13315561413764954, 0.019336743280291557, -0.07971222698688507, 0.026729367673397064, -0.2493581473827362, -0.08734600245952606, 0.029408879578113556, -0.0173631701618433, -0.018925201147794724, 0.0504157580435276, 0.045152515172958374, 0.026570506393909454, -0.03629181906580925, 0.022035259753465652, -0.03741161525249481, 0.05945054069161415, -0.1112891361117363, -0.09364940226078033 ]
null
null
transformers
# MultiBERTs Seed 4 Checkpoint 1100k (uncased)
Seed 4 intermediate checkpoint 1100k MultiBERTs (pretrained BERT) model on the English language using a masked language modeling (MLM) objective. It was introduced in [this paper](https://arxiv.org/pdf/2106.16163.pdf) and first released in [this repository](https://github.com/google-research/language/tree/master/language/multiberts). This is an intermediate checkpoint. The final checkpoint can be found at [multiberts-seed-4](https://hf.co/multberts-seed-4). This model is uncased: it does not make a difference between english and English.

Disclaimer: The team releasing MultiBERTs did not write a model card for this model, so this model card has been written by [gchhablani](https://hf.co/gchhablani).

## Model description
MultiBERTs models are transformer models pretrained on a large corpus of English data in a self-supervised fashion. This means they were pretrained on the raw texts only, with no humans labelling them in any way (which is why they can use lots of publicly available data), with an automatic process to generate inputs and labels from those texts. More precisely, they were pretrained with two objectives:
- Masked language modeling (MLM): taking a sentence, the model randomly masks 15% of the words in the input, then runs the entire masked sentence through the model and has to predict the masked words. This is different from traditional recurrent neural networks (RNNs), which usually see the words one after the other, and from autoregressive models like GPT, which internally mask the future tokens. It allows the model to learn a bidirectional representation of the sentence.
- Next sentence prediction (NSP): the model concatenates two masked sentences as inputs during pretraining. Sometimes they correspond to sentences that were next to each other in the original text, sometimes not. The model then has to predict whether the two sentences followed each other or not.

This way, the model learns an inner representation of the English language that can then be used to extract features useful for downstream tasks: if you have a dataset of labeled sentences, for instance, you can train a standard classifier using the features produced by the MultiBERTs model as inputs.

## Intended uses & limitations
You can use the raw model for either masked language modeling or next sentence prediction, but it is mostly intended to be fine-tuned on a downstream task. See the [model hub](https://huggingface.co/models?filter=multiberts) to look for fine-tuned versions on a task that interests you.
Note that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked) to make decisions, such as sequence classification, token classification or question answering. For tasks such as text generation, you should look at models like GPT2.

### How to use
Here is how to use this model to get the features of a given text in PyTorch:

```python
from transformers import BertTokenizer, BertModel
tokenizer = BertTokenizer.from_pretrained('multiberts-seed-4-1100k')
model = BertModel.from_pretrained("multiberts-seed-4-1100k")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='pt')
output = model(**encoded_input)
```

### Limitations and bias
Even if the training data used for this model could be characterized as fairly neutral, this model can have biased predictions. This bias will also affect all fine-tuned versions of this model.
For an understanding of the bias of this particular checkpoint, please try out this checkpoint with the snippet present in the [Limitation and bias section](https://huggingface.co/bert-base-uncased#limitations-and-bias) of the [bert-base-uncased](https://huggingface.co/bert-base-uncased) checkpoint.

## Training data
The MultiBERTs models were pretrained on [BookCorpus](https://yknzhu.wixsite.com/mbweb), a dataset consisting of 11,038 unpublished books, and [English Wikipedia](https://en.wikipedia.org/wiki/English_Wikipedia) (excluding lists, tables and headers).

## Training procedure

### Preprocessing
The texts are lowercased and tokenized using WordPiece and a vocabulary size of 30,000. The inputs of the model are then of the form:

```
[CLS] Sentence A [SEP] Sentence B [SEP]
```

With probability 0.5, sentence A and sentence B correspond to two consecutive sentences in the original corpus, and in the other cases sentence B is another random sentence in the corpus. Note that what is considered a sentence here is a consecutive span of text usually longer than a single sentence. The only constraint is that the result with the two "sentences" has a combined length of less than 512 tokens.
The details of the masking procedure for each sentence are the following:
- 15% of the tokens are masked.
- In 80% of the cases, the masked tokens are replaced by `[MASK]`.
- In 10% of the cases, the masked tokens are replaced by a random token (different from the one they replace).
- In the remaining 10% of the cases, the masked tokens are left as is.

### Pretraining
The full model was trained on 16 Cloud TPU v2 chips for two million steps with a batch size of 256. The sequence length was set to 512 throughout. The optimizer used is Adam with a learning rate of 1e-4, \\(\beta_{1} = 0.9\\) and \\(\beta_{2} = 0.999\\), a weight decay of 0.01, learning rate warmup for 10,000 steps, and linear decay of the learning rate afterwards.

### BibTeX entry and citation info
```bibtex
@article{DBLP:journals/corr/abs-2106-16163,
  author     = {Thibault Sellam and
                Steve Yadlowsky and
                Jason Wei and
                Naomi Saphra and
                Alexander D'Amour and
                Tal Linzen and
                Jasmijn Bastings and
                Iulia Turc and
                Jacob Eisenstein and
                Dipanjan Das and
                Ian Tenney and
                Ellie Pavlick},
  title      = {The MultiBERTs: {BERT} Reproductions for Robustness Analysis},
  journal    = {CoRR},
  volume     = {abs/2106.16163},
  year       = {2021},
  url        = {https://arxiv.org/abs/2106.16163},
  eprinttype = {arXiv},
  eprint     = {2106.16163},
  timestamp  = {Mon, 05 Jul 2021 15:15:50 +0200},
  biburl     = {https://dblp.org/rec/journals/corr/abs-2106-16163.bib},
  bibsource  = {dblp computer science bibliography, https://dblp.org}
}
```

<a href="https://huggingface.co/exbert/?model=multiberts">
	<img width="300px" src="https://cdn-media.huggingface.co/exbert/button.png">
</a>
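The masking rule described in the Preprocessing section above (15% of tokens selected; of those, 80% replaced by `[MASK]`, 10% by a random token, 10% left unchanged) can be illustrated with a short, self-contained sketch. This is not the MultiBERTs training code; it is a minimal illustration over an arbitrary list of token ids, with special-token handling and sentence-pair packing omitted, and the `[MASK]` id of 103 is an assumption borrowed from bert-base-uncased's vocabulary.

```python
import random

def mask_tokens(token_ids, vocab_size, mask_token_id, mlm_probability=0.15):
    """Illustrative sketch of the masking rule described in the card (not the original code).

    Roughly 15% of positions are selected for prediction; of those, 80% are replaced by
    the [MASK] id, 10% by a random token id, and 10% are left unchanged.
    Returns (masked_ids, labels), where labels is -100 at positions the MLM loss ignores.
    """
    masked = list(token_ids)
    labels = [-100] * len(token_ids)
    for i, tok in enumerate(token_ids):
        if random.random() < mlm_probability:
            labels[i] = tok
            r = random.random()
            if r < 0.8:
                # 80% of selected positions: replace with [MASK]
                masked[i] = mask_token_id
            elif r < 0.9:
                # 10%: a random token (the "different from the original" detail is skipped here)
                masked[i] = random.randrange(vocab_size)
            # remaining 10%: keep the original token
    return masked, labels

# Example with made-up token ids; 30,000 matches the vocabulary size stated in the card.
inputs, labels = mask_tokens([2023, 2003, 1037, 7099], vocab_size=30000, mask_token_id=103)
```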
{"language": "en", "license": "apache-2.0", "tags": ["exbert", "multiberts", "multiberts-seed-4"], "datasets": ["bookcorpus", "wikipedia"]}
null
MultiBertGunjanPatrick/multiberts-seed-4-1100k
[ "transformers", "pytorch", "bert", "pretraining", "exbert", "multiberts", "multiberts-seed-4", "en", "dataset:bookcorpus", "dataset:wikipedia", "arxiv:2106.16163", "license:apache-2.0", "endpoints_compatible", "region:us" ]
2022-03-02T23:29:04+00:00
[ "2106.16163" ]
[ "en" ]
TAGS #transformers #pytorch #bert #pretraining #exbert #multiberts #multiberts-seed-4 #en #dataset-bookcorpus #dataset-wikipedia #arxiv-2106.16163 #license-apache-2.0 #endpoints_compatible #region-us
# MultiBERTs Seed 4 Checkpoint 1100k (uncased) Seed 4 intermediate checkpoint 1100k MultiBERTs (pretrained BERT) model on English language using a masked language modeling (MLM) objective. It was introduced in this paper and first released in this repository. This is an intermediate checkpoint. The final checkpoint can be found at multiberts-seed-4. This model is uncased: it does not make a difference between english and English. Disclaimer: The team releasing MultiBERTs did not write a model card for this model so this model card has been written by gchhablani. ## Model description MultiBERTs models are transformers model pretrained on a large corpus of English data in a self-supervised fashion. This means it was pretrained on the raw texts only, with no humans labelling them in any way (which is why it can use lots of publicly available data) with an automatic process to generate inputs and labels from those texts. More precisely, it was pretrained with two objectives: - Masked language modeling (MLM): taking a sentence, the model randomly masks 15% of the words in the input then run the entire masked sentence through the model and has to predict the masked words. This is different from traditional recurrent neural networks (RNNs) that usually see the words one after the other, or from autoregressive models like GPT which internally mask the future tokens. It allows the model to learn a bidirectional representation of the sentence. - Next sentence prediction (NSP): the models concatenates two masked sentences as inputs during pretraining. Sometimes they correspond to sentences that were next to each other in the original text, sometimes not. The model then has to predict if the two sentences were following each other or not. This way, the model learns an inner representation of the English language that can then be used to extract features useful for downstream tasks: if you have a dataset of labeled sentences for instance, you can train a standard classifier using the features produced by the MultiBERTs model as inputs. ## Intended uses & limitations You can use the raw model for either masked language modeling or next sentence prediction, but it's mostly intended to be fine-tuned on a downstream task. See the model hub to look for fine-tuned versions on a task that interests you. Note that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked) to make decisions, such as sequence classification, token classification or question answering. For tasks such as text generation you should look at model like GPT2. ### How to use Here is how to use this model to get the features of a given text in PyTorch: ### Limitations and bias Even if the training data used for this model could be characterized as fairly neutral, this model can have biased predictions. This bias will also affect all fine-tuned versions of this model. For an understanding of bias of this particular checkpoint, please try out this checkpoint with the snippet present in the Limitation and bias section of the bert-base-uncased checkpoint. ## Training data The MultiBERTs models were pretrained on BookCorpus, a dataset consisting of 11,038 unpublished books and English Wikipedia (excluding lists, tables and headers). ## Training procedure ### Preprocessing The texts are lowercased and tokenized using WordPiece and a vocabulary size of 30,000. 
The inputs of the model are then of the form: With probability 0.5, sentence A and sentence B correspond to two consecutive sentences in the original corpus and in the other cases, it's another random sentence in the corpus. Note that what is considered a sentence here is a consecutive span of text usually longer than a single sentence. The only constrain is that the result with the two "sentences" has a combined length of less than 512 tokens. The details of the masking procedure for each sentence are the following: - 15% of the tokens are masked. - In 80% of the cases, the masked tokens are replaced by '[MASK]'. - In 10% of the cases, the masked tokens are replaced by a random token (different) from the one they replace. - In the 10% remaining cases, the masked tokens are left as is. ### Pretraining The full model was trained on 16 Cloud TPU v2 chips for two million steps with a batch size of 256. The sequence length was set to 512 throughout. The optimizer used is Adam with a learning rate of 1e-4, \\(\beta_{1} = 0.9\\) and \\(\beta_{2} = 0.999\\), a weight decay of 0.01, learning rate warmup for 10,000 steps and linear decay of the learning rate after. ### BibTeX entry and citation info <a href="URL <img width="300px" src="URL </a>
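The Pretraining paragraph above specifies Adam with a peak learning rate of 1e-4, 10,000 warmup steps, and linear decay afterwards over the two-million-step run. A rough sketch of that schedule follows; decaying to zero exactly at step two million is an assumption, since the card only states that the rate decays linearly after warmup.

```python
def learning_rate(step, peak_lr=1e-4, warmup_steps=10_000, total_steps=2_000_000):
    """Sketch of linear warmup followed by linear decay, using the numbers from the card.

    Decaying to zero exactly at `total_steps` is an assumption; the card only states
    that the learning rate decays linearly after the 10,000 warmup steps.
    """
    if step < warmup_steps:
        # linear warmup from 0 to peak_lr over the first 10,000 steps
        return peak_lr * step / warmup_steps
    # linear decay from peak_lr down to 0 at total_steps
    progress = (step - warmup_steps) / (total_steps - warmup_steps)
    return max(0.0, peak_lr * (1.0 - progress))

print(learning_rate(5_000))      # mid-warmup: 5e-05
print(learning_rate(1_005_000))  # halfway through decay: 5e-05
```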
[ "# MultiBERTs Seed 4 Checkpoint 1100k (uncased)\nSeed 4 intermediate checkpoint 1100k MultiBERTs (pretrained BERT) model on English language using a masked language modeling (MLM) objective. It was introduced in\nthis paper and first released in\nthis repository. This is an intermediate checkpoint.\nThe final checkpoint can be found at multiberts-seed-4. This model is uncased: it does not make a difference\nbetween english and English.\n\nDisclaimer: The team releasing MultiBERTs did not write a model card for this model so this model card has been written by gchhablani.", "## Model description\nMultiBERTs models are transformers model pretrained on a large corpus of English data in a self-supervised fashion. This means it\nwas pretrained on the raw texts only, with no humans labelling them in any way (which is why it can use lots of\npublicly available data) with an automatic process to generate inputs and labels from those texts. More precisely, it\nwas pretrained with two objectives:\n- Masked language modeling (MLM): taking a sentence, the model randomly masks 15% of the words in the input then run\n the entire masked sentence through the model and has to predict the masked words. This is different from traditional\n recurrent neural networks (RNNs) that usually see the words one after the other, or from autoregressive models like\n GPT which internally mask the future tokens. It allows the model to learn a bidirectional representation of the\n sentence.\n- Next sentence prediction (NSP): the models concatenates two masked sentences as inputs during pretraining. Sometimes\n they correspond to sentences that were next to each other in the original text, sometimes not. The model then has to\n predict if the two sentences were following each other or not.\nThis way, the model learns an inner representation of the English language that can then be used to extract features\nuseful for downstream tasks: if you have a dataset of labeled sentences for instance, you can train a standard\nclassifier using the features produced by the MultiBERTs model as inputs.", "## Intended uses & limitations\nYou can use the raw model for either masked language modeling or next sentence prediction, but it's mostly intended to\nbe fine-tuned on a downstream task. See the model hub to look for\nfine-tuned versions on a task that interests you.\nNote that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked)\nto make decisions, such as sequence classification, token classification or question answering. For tasks such as text\ngeneration you should look at model like GPT2.", "### How to use\nHere is how to use this model to get the features of a given text in PyTorch:", "### Limitations and bias\nEven if the training data used for this model could be characterized as fairly neutral, this model can have biased\npredictions. This bias will also affect all fine-tuned versions of this model. For an understanding of bias of this particular\ncheckpoint, please try out this checkpoint with the snippet present in the Limitation and bias section of the bert-base-uncased checkpoint.", "## Training data\nThe MultiBERTs models were pretrained on BookCorpus, a dataset consisting of 11,038\nunpublished books and English Wikipedia (excluding lists, tables and\nheaders).", "## Training procedure", "### Preprocessing\nThe texts are lowercased and tokenized using WordPiece and a vocabulary size of 30,000. 
The inputs of the model are\nthen of the form:\n\nWith probability 0.5, sentence A and sentence B correspond to two consecutive sentences in the original corpus and in\nthe other cases, it's another random sentence in the corpus. Note that what is considered a sentence here is a\nconsecutive span of text usually longer than a single sentence. The only constrain is that the result with the two\n\"sentences\" has a combined length of less than 512 tokens.\nThe details of the masking procedure for each sentence are the following:\n- 15% of the tokens are masked.\n- In 80% of the cases, the masked tokens are replaced by '[MASK]'.\n- In 10% of the cases, the masked tokens are replaced by a random token (different) from the one they replace.\n- In the 10% remaining cases, the masked tokens are left as is.", "### Pretraining\nThe full model was trained on 16 Cloud TPU v2 chips for two million steps with a batch size\nof 256. The sequence length was set to 512 throughout. The optimizer\nused is Adam with a learning rate of 1e-4, \\\\(\\beta_{1} = 0.9\\\\) and \\\\(\\beta_{2} = 0.999\\\\), a weight decay of 0.01,\nlearning rate warmup for 10,000 steps and linear decay of the learning rate after.", "### BibTeX entry and citation info\n\n<a href=\"URL\n\t<img width=\"300px\" src=\"URL\n</a>" ]
[ "TAGS\n#transformers #pytorch #bert #pretraining #exbert #multiberts #multiberts-seed-4 #en #dataset-bookcorpus #dataset-wikipedia #arxiv-2106.16163 #license-apache-2.0 #endpoints_compatible #region-us \n", "# MultiBERTs Seed 4 Checkpoint 1100k (uncased)\nSeed 4 intermediate checkpoint 1100k MultiBERTs (pretrained BERT) model on English language using a masked language modeling (MLM) objective. It was introduced in\nthis paper and first released in\nthis repository. This is an intermediate checkpoint.\nThe final checkpoint can be found at multiberts-seed-4. This model is uncased: it does not make a difference\nbetween english and English.\n\nDisclaimer: The team releasing MultiBERTs did not write a model card for this model so this model card has been written by gchhablani.", "## Model description\nMultiBERTs models are transformers model pretrained on a large corpus of English data in a self-supervised fashion. This means it\nwas pretrained on the raw texts only, with no humans labelling them in any way (which is why it can use lots of\npublicly available data) with an automatic process to generate inputs and labels from those texts. More precisely, it\nwas pretrained with two objectives:\n- Masked language modeling (MLM): taking a sentence, the model randomly masks 15% of the words in the input then run\n the entire masked sentence through the model and has to predict the masked words. This is different from traditional\n recurrent neural networks (RNNs) that usually see the words one after the other, or from autoregressive models like\n GPT which internally mask the future tokens. It allows the model to learn a bidirectional representation of the\n sentence.\n- Next sentence prediction (NSP): the models concatenates two masked sentences as inputs during pretraining. Sometimes\n they correspond to sentences that were next to each other in the original text, sometimes not. The model then has to\n predict if the two sentences were following each other or not.\nThis way, the model learns an inner representation of the English language that can then be used to extract features\nuseful for downstream tasks: if you have a dataset of labeled sentences for instance, you can train a standard\nclassifier using the features produced by the MultiBERTs model as inputs.", "## Intended uses & limitations\nYou can use the raw model for either masked language modeling or next sentence prediction, but it's mostly intended to\nbe fine-tuned on a downstream task. See the model hub to look for\nfine-tuned versions on a task that interests you.\nNote that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked)\nto make decisions, such as sequence classification, token classification or question answering. For tasks such as text\ngeneration you should look at model like GPT2.", "### How to use\nHere is how to use this model to get the features of a given text in PyTorch:", "### Limitations and bias\nEven if the training data used for this model could be characterized as fairly neutral, this model can have biased\npredictions. This bias will also affect all fine-tuned versions of this model. 
For an understanding of bias of this particular\ncheckpoint, please try out this checkpoint with the snippet present in the Limitation and bias section of the bert-base-uncased checkpoint.", "## Training data\nThe MultiBERTs models were pretrained on BookCorpus, a dataset consisting of 11,038\nunpublished books and English Wikipedia (excluding lists, tables and\nheaders).", "## Training procedure", "### Preprocessing\nThe texts are lowercased and tokenized using WordPiece and a vocabulary size of 30,000. The inputs of the model are\nthen of the form:\n\nWith probability 0.5, sentence A and sentence B correspond to two consecutive sentences in the original corpus and in\nthe other cases, it's another random sentence in the corpus. Note that what is considered a sentence here is a\nconsecutive span of text usually longer than a single sentence. The only constrain is that the result with the two\n\"sentences\" has a combined length of less than 512 tokens.\nThe details of the masking procedure for each sentence are the following:\n- 15% of the tokens are masked.\n- In 80% of the cases, the masked tokens are replaced by '[MASK]'.\n- In 10% of the cases, the masked tokens are replaced by a random token (different) from the one they replace.\n- In the 10% remaining cases, the masked tokens are left as is.", "### Pretraining\nThe full model was trained on 16 Cloud TPU v2 chips for two million steps with a batch size\nof 256. The sequence length was set to 512 throughout. The optimizer\nused is Adam with a learning rate of 1e-4, \\\\(\\beta_{1} = 0.9\\\\) and \\\\(\\beta_{2} = 0.999\\\\), a weight decay of 0.01,\nlearning rate warmup for 10,000 steps and linear decay of the learning rate after.", "### BibTeX entry and citation info\n\n<a href=\"URL\n\t<img width=\"300px\" src=\"URL\n</a>" ]
[ 71, 145, 335, 134, 25, 95, 48, 3, 222, 111, 34 ]
[ "passage: TAGS\n#transformers #pytorch #bert #pretraining #exbert #multiberts #multiberts-seed-4 #en #dataset-bookcorpus #dataset-wikipedia #arxiv-2106.16163 #license-apache-2.0 #endpoints_compatible #region-us \n# MultiBERTs Seed 4 Checkpoint 1100k (uncased)\nSeed 4 intermediate checkpoint 1100k MultiBERTs (pretrained BERT) model on English language using a masked language modeling (MLM) objective. It was introduced in\nthis paper and first released in\nthis repository. This is an intermediate checkpoint.\nThe final checkpoint can be found at multiberts-seed-4. This model is uncased: it does not make a difference\nbetween english and English.\n\nDisclaimer: The team releasing MultiBERTs did not write a model card for this model so this model card has been written by gchhablani.", "passage: ## Model description\nMultiBERTs models are transformers model pretrained on a large corpus of English data in a self-supervised fashion. This means it\nwas pretrained on the raw texts only, with no humans labelling them in any way (which is why it can use lots of\npublicly available data) with an automatic process to generate inputs and labels from those texts. More precisely, it\nwas pretrained with two objectives:\n- Masked language modeling (MLM): taking a sentence, the model randomly masks 15% of the words in the input then run\n the entire masked sentence through the model and has to predict the masked words. This is different from traditional\n recurrent neural networks (RNNs) that usually see the words one after the other, or from autoregressive models like\n GPT which internally mask the future tokens. It allows the model to learn a bidirectional representation of the\n sentence.\n- Next sentence prediction (NSP): the models concatenates two masked sentences as inputs during pretraining. Sometimes\n they correspond to sentences that were next to each other in the original text, sometimes not. The model then has to\n predict if the two sentences were following each other or not.\nThis way, the model learns an inner representation of the English language that can then be used to extract features\nuseful for downstream tasks: if you have a dataset of labeled sentences for instance, you can train a standard\nclassifier using the features produced by the MultiBERTs model as inputs.## Intended uses & limitations\nYou can use the raw model for either masked language modeling or next sentence prediction, but it's mostly intended to\nbe fine-tuned on a downstream task. See the model hub to look for\nfine-tuned versions on a task that interests you.\nNote that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked)\nto make decisions, such as sequence classification, token classification or question answering. For tasks such as text\ngeneration you should look at model like GPT2.### How to use\nHere is how to use this model to get the features of a given text in PyTorch:### Limitations and bias\nEven if the training data used for this model could be characterized as fairly neutral, this model can have biased\npredictions. This bias will also affect all fine-tuned versions of this model. 
For an understanding of bias of this particular\ncheckpoint, please try out this checkpoint with the snippet present in the Limitation and bias section of the bert-base-uncased checkpoint.## Training data\nThe MultiBERTs models were pretrained on BookCorpus, a dataset consisting of 11,038\nunpublished books and English Wikipedia (excluding lists, tables and\nheaders).## Training procedure" ]
[ -0.086696557700634, -0.0008393093012273312, -0.002033180557191372, 0.07212608307600021, 0.08851125091314316, 0.004450167529284954, 0.1157463788986206, 0.048494741320610046, -0.025198474526405334, 0.024650244042277336, 0.09165836870670319, 0.02839147299528122, 0.042573317885398865, 0.06660930812358856, 0.09449037909507751, -0.2590456008911133, 0.04781699925661087, -0.061859555542469025, 0.0638013556599617, 0.07393988966941833, 0.0970287099480629, -0.07365045696496964, 0.06306660175323486, 0.034698791801929474, -0.0823916420340538, -0.016086522489786148, -0.018049312755465508, -0.035948313772678375, 0.1015285849571228, 0.06845547258853912, 0.0635448545217514, -0.0013050362467765808, 0.05964573472738266, -0.08675399422645569, 0.015859460458159447, 0.04444044828414917, -0.0012032841332256794, 0.02542530559003353, -0.009786764159798622, 0.01873157173395157, 0.10632108151912689, 0.038906194269657135, 0.07711577415466309, 0.03230252116918564, -0.09545278549194336, -0.12216667830944061, -0.08266836404800415, 0.10434075444936752, 0.05428289622068405, 0.040489502251148224, -0.0060586826875805855, 0.07283477485179901, -0.02775544486939907, 0.0744684636592865, 0.10371442139148712, -0.26145145297050476, -0.008194632828235626, 0.0616670623421669, 0.04627857357263565, 0.04607537388801575, 0.010089411400258541, 0.025454320013523102, 0.007885470986366272, 0.04466141387820244, 0.027350585907697678, -0.023144956678152084, 0.11228722333908081, -0.043301403522491455, -0.15017133951187134, -0.04062260314822197, 0.124529168009758, -0.006556469947099686, -0.12494128942489624, -0.09719859063625336, -0.03087499365210533, 0.10981849581003189, -0.0028605256229639053, -0.017280980944633484, -0.003223620355129242, 0.00945870392024517, 0.028082171455025673, -0.09078745543956757, -0.08600854873657227, -0.02599317952990532, -0.03829358518123627, 0.12623578310012817, 0.0478380024433136, 0.052320655435323715, -0.03469282388687134, 0.08868098258972168, -0.11879925429821014, -0.040947526693344116, -0.04982173070311546, -0.08277061581611633, -0.017443273216485977, 0.009518103674054146, -0.024963635951280594, -0.07607010751962662, -0.06106923893094063, 0.11626487970352173, 0.037927865982055664, 0.02966129034757614, -0.0024340650998055935, 0.04033036530017853, 0.07113887369632721, 0.09538522362709045, -0.040292952209711075, 0.04408659040927887, 0.03557170182466507, -0.022151540964841843, 0.05847901850938797, -0.050956010818481445, -0.10057758539915085, 0.07587437331676483, -0.0019921045750379562, 0.03998970985412598, 0.02441861294209957, 0.036969177424907684, -0.009917385876178741, -0.06895512342453003, 0.1689770519733429, -0.07975353300571442, -0.00866327341645956, -0.017558861523866653, 0.009464774280786514, 0.04590115696191788, 0.034294337034225464, -0.008046233095228672, -0.04944503307342529, -0.005662029609084129, -0.05668308585882187, -0.02741534635424614, -0.056249044835567474, -0.12211135029792786, -0.0018192781135439873, -0.038835227489471436, -0.032277800142765045, -0.13932368159294128, -0.2174631953239441, -0.020154809579253197, 0.06329955160617828, -0.0011672037653625011, -0.009326707571744919, 0.023610498756170273, 0.014125654473900795, -0.020258840173482895, 0.009362686425447464, -0.045553579926490784, -0.0014890991151332855, -0.007262005470693111, -0.03164534643292427, 0.05822426825761795, -0.043236441910266876, 0.022701088339090347, -0.06810127198696136, 0.021175093948841095, -0.2132689356803894, 0.08463038504123688, -0.03430112823843956, 0.005566028878092766, -0.03636426106095314, -0.04425880312919617, 
0.01241176575422287, 0.04598848149180412, -0.00887986458837986, 0.1168336570262909, -0.13324175775051117, -0.04890306666493416, 0.18055826425552368, -0.15844357013702393, -0.0025828517973423004, 0.10166051983833313, -0.048622846603393555, 0.05434291809797287, 0.13645783066749573, 0.09749996662139893, 0.0861826092004776, -0.07546400278806686, 0.007373446598649025, 0.06084832549095154, -0.06872955709695816, 0.05351955443620682, 0.08766378462314606, -0.024130549281835556, -0.13611111044883728, 0.03039553388953209, -0.07169520854949951, -0.009132254868745804, -0.024784570559859276, -0.020796630531549454, 0.007273534312844276, -0.0370485782623291, 0.029497157782316208, 0.006827507168054581, 0.01721140183508396, -0.04214809089899063, -0.08444802463054657, 0.02532133273780346, 0.07470542192459106, -0.07196035236120224, 0.04225720837712288, -0.07144682109355927, 0.06312540918588638, -0.07887112349271774, -0.0053084008395671844, -0.16375954449176788, -0.025441329926252365, 0.0448853075504303, -0.043179988861083984, 0.04700522869825363, 0.08973842114210129, 0.003918868023902178, 0.12356346845626831, -0.040988825261592865, 0.001708455616608262, -0.004795676097273827, -0.00992988795042038, -0.052894942462444305, -0.12165667116641998, -0.08240251243114471, -0.06881827116012573, 0.09707663953304291, -0.07844304293394089, 0.029170852154493332, -0.06777651607990265, -0.02303326688706875, -0.008891046047210693, -0.05617285147309303, -0.006164688616991043, 0.010409650392830372, -0.02776568941771984, -0.04705056548118591, 0.04924988001585007, 0.05014100670814514, -0.05582567676901817, 0.07691363990306854, -0.10565861314535141, -0.06072130799293518, 0.05721388757228851, 0.01694943755865097, -0.08021525293588638, 0.08976009488105774, -0.019729670137166977, -0.014495729468762875, -0.05532405525445938, -0.04625103995203972, 0.19598442316055298, -0.025354646146297455, 0.10125477612018585, -0.08987574279308319, 0.0013520715292543173, 0.028414180502295494, -0.043336253613233566, -0.013567180372774601, 0.05789260193705559, 0.04754834622144699, -0.18434739112854004, 0.014150373637676239, 0.04778880625963211, 0.07700951397418976, 0.11267392337322235, 0.02717248909175396, -0.02458670549094677, -0.04487268626689911, -0.013023835606873035, 0.004687960259616375, 0.05841587483882904, -0.029618754982948303, -0.011087576858699322, 0.031614869832992554, 0.056299690157175064, 0.017971765249967575, -0.0814882218837738, 0.037237465381622314, 0.06431847810745239, -0.014648456126451492, -0.04462992772459984, -0.0251935925334692, -0.05939207971096039, 0.06254575401544571, 0.05204140394926071, 0.035830385982990265, 0.025114309042692184, -0.01477319560945034, -0.13512250781059265, 0.1874266415834427, -0.11518038809299469, -0.25063303112983704, -0.10900787264108658, -0.06299438327550888, -0.021133126690983772, 0.04066764935851097, 0.05902216583490372, -0.031713418662548065, -0.04441458731889725, -0.11539465188980103, 0.06065665930509567, -0.06099002808332443, -0.02853253111243248, -0.011214883998036385, -0.05351700633764267, -0.023171864449977875, -0.12744140625, -0.011496692895889282, -0.03011229634284973, -0.07496283948421478, 0.005927799269556999, -0.03416822478175163, 0.02754448726773262, 0.1369878202676773, 0.034720782190561295, -0.019769547507166862, -0.018538298085331917, 0.18791413307189941, 0.009219883009791374, 0.06399352103471756, 0.11101141571998596, -0.02949448674917221, 0.05474396049976349, 0.04919104650616646, 0.024212809279561043, -0.04802681505680084, 0.012695272453129292, -0.017679139971733093, -0.11955824494361877, 
-0.17681097984313965, -0.06998051702976227, -0.0037568858824670315, 0.008296044543385506, 0.02048499695956707, 0.034417953342199326, 0.016188181936740875, 0.04258587211370468, -0.0302879735827446, 0.032970525324344635, -0.014551162719726562, 0.08004105091094971, 0.02883581817150116, -0.07526818662881851, 0.09114781022071838, -0.06402270495891571, 0.01641157828271389, 0.10862454026937485, -0.05763677880167961, 0.1957796812057495, 0.021442700177431107, 0.05653075873851776, 0.10093186050653458, 0.020510807633399963, 0.05424443632364273, 0.09216758608818054, -0.04697186499834061, 0.005909913219511509, -0.059851787984371185, -0.05143886059522629, -0.03104851022362709, 0.04900772497057915, 0.027419347316026688, 0.01878342777490616, -0.11981955170631409, 0.01900559291243553, -0.0015030803624540567, 0.13462622463703156, 0.0480678416788578, -0.12440633773803711, -0.12285052239894867, 0.03381391614675522, -0.04586134850978851, -0.06467136740684509, 0.0299534872174263, 0.0637829527258873, -0.15208515524864197, 0.04640813544392586, -0.005932642146945, 0.06411291658878326, -0.09343144297599792, 0.01540819089859724, -0.04131801053881645, -0.0004309406504034996, 0.0039183031767606735, 0.06900225579738617, -0.129174143075943, 0.10589835047721863, 0.02156350389122963, 0.0526205375790596, -0.08140094578266144, 0.017061008140444756, -0.007149162236601114, 0.10853129625320435, 0.11683504283428192, 0.04252132773399353, -0.04998401924967766, -0.016262149438261986, -0.04642339050769806, 0.022180359810590744, 0.057166989892721176, -0.07403378188610077, 0.06365691125392914, 0.010490506887435913, 0.007389676291495562, -0.023611778393387794, 0.01305246353149414, -0.13223831355571747, -0.12363964319229126, 0.06023475155234337, -0.07525935024023056, -0.1003418043255806, -0.055566009134054184, -0.06479097902774811, -0.05534590035676956, 0.20747289061546326, -0.11306203901767731, -0.09145388752222061, -0.09660665690898895, -0.01641671359539032, 0.04729744791984558, -0.06573022156953812, 0.04865199699997902, -0.03459140658378601, 0.09000980854034424, -0.04631935805082321, -0.10674721002578735, 0.03570478409528732, -0.11198428273200989, -0.11349625140428543, -0.04504433274269104, 0.10506503283977509, 0.11384879797697067, 0.03644632175564766, 0.011616716161370277, 0.013050004839897156, -0.001667151227593422, -0.11797860264778137, 0.014629105105996132, 0.12964370846748352, 0.0018089935183525085, 0.07248316705226898, -0.06331928074359894, 0.02244187518954277, -0.015642723068594933, -0.00015630759298801422, 0.12966427206993103, 0.18448275327682495, -0.06375622749328613, 0.17206335067749023, 0.20027276873588562, -0.10689348727464676, -0.1919603794813156, -0.05308164656162262, -0.0018035294488072395, 0.047065455466508865, 0.04650318622589111, -0.18376711010932922, 0.09371557831764221, 0.03295798972249031, -0.031365856528282166, 0.02903500199317932, -0.2278037667274475, -0.11148372292518616, 0.09101319313049316, 0.05751156061887741, 0.18686911463737488, -0.08066283166408539, -0.03903496265411377, -0.016819337382912636, -0.04377460479736328, 0.04119870811700821, -0.043384045362472534, 0.09049943834543228, 0.005676355212926865, -0.028615811839699745, 0.0011151861399412155, -0.031733933836221695, 0.09581141173839569, 0.04307585209608078, 0.023965921252965927, -0.07178381085395813, -0.01113983802497387, 0.11618857085704803, -0.039265554398298264, 0.09992076456546783, 0.04303916171193123, 0.07303053140640259, -0.10045738518238068, -0.058834258466959, -0.0755375474691391, 0.04362945258617401, -0.041319020092487335, 
-0.05373384803533554, -0.06391676515340805, 0.05678560584783554, 0.038064707070589066, 0.01051589660346508, -0.0026743654161691666, -0.0356573686003685, 0.04478011652827263, 0.08454743772745132, 0.08397231996059418, -0.030780237168073654, -0.06978896260261536, -0.04981062561273575, -0.04940982908010483, 0.06748362630605698, -0.09476922452449799, 0.017127133905887604, 0.029597142711281776, 0.011102508753538132, 0.08831425756216049, 0.034277237951755524, -0.13594405353069305, 0.01283752266317606, 0.036155298352241516, -0.1240994930267334, -0.10443142056465149, -0.01816738396883011, 0.023721877485513687, -0.03798679634928703, 0.05707399547100067, 0.14500688016414642, -0.036362651735544205, -0.03227800875902176, -0.04906377196311951, 0.03790689632296562, -0.020535973832011223, 0.05200799182057381, 0.06543638557195663, 0.030503613874316216, -0.07417884469032288, 0.07638020813465118, 0.039374932646751404, -0.04039361700415611, 0.042275361716747284, 0.04435295611619949, -0.09630072861909866, -0.07928173989057541, -0.06133434921503067, 0.09394548088312149, -0.01608363352715969, -0.04810629040002823, -0.0023999661207199097, -0.08081652224063873, 0.06767857074737549, 0.0744268000125885, 0.04602009430527687, 0.036411553621292114, -0.08757204562425613, 0.01592327654361725, -0.05557797849178314, 0.03729948773980141, -0.0302459504455328, -0.0032902024686336517, -0.057856932282447815, 0.06799079477787018, 0.06207987666130066, 0.09637609869241714, -0.03405458480119705, -0.07272644340991974, -0.08187784999608994, -0.013749398291110992, -0.061191536486148834, -0.031164046376943588, -0.07554303854703903, -0.005589884705841541, 0.0008018985390663147, -0.0016469191759824753, 0.02076919935643673, 0.03693748265504837, -0.04444969445466995, -0.017593398690223694, -0.03224260360002518, 0.03858066350221634, -0.05997174605727196, 0.005641368217766285, 0.015033643692731857, -0.03699348121881485, 0.09366720914840698, 0.03611014038324356, -0.012171801179647446, 0.0479472316801548, -0.021107401698827744, 0.034150563180446625, -0.019797883927822113, 0.0005254112184047699, -0.022149981930851936, -0.10785797238349915, -0.004230574704706669, 0.0036155376583337784, -0.022200606763362885, 0.0108018908649683, 0.0614246129989624, -0.07269513607025146, 0.08444308489561081, 0.04741742089390755, -0.02786223217844963, -0.07256345450878143, 0.04100179672241211, -0.01578355021774769, 0.027203630656003952, 0.06981933116912842, -0.03443564847111702, 0.050624825060367584, -0.09942515939474106, -0.028919558972120285, 0.002494657877832651, -0.007219672203063965, -0.005265543237328529, -0.05161871761083603, -0.0020514093339443207, 0.006184537895023823, 0.17615164816379547, -0.024784822016954422, 0.032631658017635345, 0.01303606852889061, 0.006755203008651733, 0.039960019290447235, -0.013133756816387177, 0.07444936037063599, -0.00919741578400135, -0.027404004707932472, -0.016114769503474236, 0.03761276602745056, 0.003627716563642025, 0.005173284560441971, 0.13913381099700928, 0.042504310607910156, 0.09750695526599884, 0.0781993493437767, 0.01718384213745594, 0.01495887991040945, -0.13266801834106445, -0.09324796497821808, 0.006466743536293507, 0.06130914390087128, -0.017672792077064514, 0.012514952570199966, 0.09208031743764877, -0.0860956609249115, 0.07004724442958832, 0.047906294465065, -0.04948817193508148, -0.12449650466442108, -0.19010183215141296, -0.023233113810420036, -0.029776694253087044, -0.009657097980380058, -0.09105725586414337, 0.01537289097905159, 0.09492871165275574, 0.0243234783411026, -0.011036992073059082, 
0.09772804379463196, -0.10550446063280106, -0.030687782913446426, 0.045963354408741, -0.02685599960386753, 0.013956382870674133, 0.04926999285817146, 0.023800013586878777, -0.006573095917701721, 0.04013776406645775, 0.03917309269309044, 0.042633406817913055, 0.023870639503002167, 0.05221138149499893, -0.021486982703208923, -0.07262773811817169, -0.033269159495830536, -0.0017067096196115017, 0.05519648268818855, 0.1372918337583542, 0.023573534563183784, -0.0695817768573761, 0.007404958829283714, 0.10632728040218353, -0.03376633673906326, -0.049812521785497665, -0.10917989909648895, 0.23865237832069397, 0.023969020694494247, 0.00011720554903149605, -0.004859063308686018, -0.046559348702430725, 0.006935207173228264, 0.2094390094280243, 0.22275248169898987, 0.007331651635468006, -0.010164014995098114, 0.010969776660203934, -0.011773244477808475, 0.03413693234324455, 0.14642643928527832, 0.006759112700819969, 0.2494063675403595, -0.04730615019798279, 0.03752504289150238, -0.040770743042230606, -0.03850707784295082, -0.10058294236660004, 0.07039147615432739, -0.011445898562669754, 0.007598043419420719, -0.030946018174290657, 0.07115443050861359, -0.03735434636473656, -0.17668516933918, 0.002865087240934372, -0.0006344430148601532, -0.06369167566299438, 0.010601376183331013, 0.0015803137794137, 0.02248256467282772, 0.08321836590766907, -0.0169249027967453, -0.00664927763864398, 0.13517072796821594, 0.018388979136943817, -0.09783487021923065, -0.05981556326150894, 0.11346714943647385, 0.008788776583969593, 0.14285686612129211, 0.010938766412436962, 0.0811774954199791, 0.08810710906982422, 0.02074246108531952, -0.09656951576471329, 0.04288925975561142, -0.019964994862675667, -0.030069658532738686, 0.008098849095404148, 0.10736528038978577, -0.008813883177936077, 0.05629599839448929, 0.025020502507686615, -0.0854463055729866, 0.061496980488300323, 0.01566910743713379, -0.03460320830345154, -0.08255841583013535, 0.08344592154026031, -0.09151849150657654, 0.15775972604751587, 0.12333761155605316, -0.014983459375798702, -0.045144349336624146, -0.02852296456694603, 0.016775649040937424, 0.00045123929157853127, 0.053871650248765945, -0.025436244904994965, -0.13303284347057343, 0.019600415602326393, -0.08229498565196991, 0.029127903282642365, -0.2478976547718048, -0.08588756620883942, 0.02766045555472374, -0.02036890760064125, -0.02142428606748581, 0.05014630779623985, 0.048448216170072556, 0.02806072309613228, -0.036414582282304764, 0.018184678629040718, -0.03832089155912399, 0.05908960849046707, -0.11393684148788452, -0.09594444930553436 ]
null
null
transformers
# MultiBERTs Seed 4 Checkpoint 1200k (uncased) Seed 4 intermediate checkpoint 1200k MultiBERTs (pretrained BERT) model on the English language using a masked language modeling (MLM) objective. It was introduced in [this paper](https://arxiv.org/pdf/2106.16163.pdf) and first released in [this repository](https://github.com/google-research/language/tree/master/language/multiberts). This is an intermediate checkpoint. The final checkpoint can be found at [multiberts-seed-4](https://hf.co/multiberts-seed-4). This model is uncased: it does not make a difference between english and English. Disclaimer: The team releasing MultiBERTs did not write a model card for this model, so this model card has been written by [gchhablani](https://hf.co/gchhablani). ## Model description MultiBERTs models are transformer models pretrained on a large corpus of English data in a self-supervised fashion. This means they were pretrained on the raw texts only, with no humans labelling them in any way (which is why they can use lots of publicly available data), with an automatic process to generate inputs and labels from those texts. More precisely, they were pretrained with two objectives: - Masked language modeling (MLM): taking a sentence, the model randomly masks 15% of the words in the input, then runs the entire masked sentence through the model and has to predict the masked words. This is different from traditional recurrent neural networks (RNNs), which usually see the words one after the other, or from autoregressive models like GPT, which internally mask the future tokens. It allows the model to learn a bidirectional representation of the sentence. - Next sentence prediction (NSP): the model concatenates two masked sentences as inputs during pretraining. Sometimes they correspond to sentences that were next to each other in the original text, sometimes not. The model then has to predict whether the two sentences followed each other or not. This way, the model learns an inner representation of the English language that can then be used to extract features useful for downstream tasks: if you have a dataset of labeled sentences, for instance, you can train a standard classifier using the features produced by the MultiBERTs model as inputs. ## Intended uses & limitations You can use the raw model for either masked language modeling or next sentence prediction, but it's mostly intended to be fine-tuned on a downstream task. See the [model hub](https://huggingface.co/models?filter=multiberts) to look for fine-tuned versions on a task that interests you. Note that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked) to make decisions, such as sequence classification, token classification or question answering. For tasks such as text generation you should look at models like GPT2. ### How to use Here is how to use this model to get the features of a given text in PyTorch: ```python from transformers import BertTokenizer, BertModel tokenizer = BertTokenizer.from_pretrained('multiberts-seed-4-1200k') model = BertModel.from_pretrained("multiberts-seed-4-1200k") text = "Replace me by any text you'd like." encoded_input = tokenizer(text, return_tensors='pt') output = model(**encoded_input) ``` ### Limitations and bias Even if the training data used for this model could be characterized as fairly neutral, this model can have biased predictions. This bias will also affect all fine-tuned versions of this model. 
For an understanding of the bias of this particular checkpoint, please try out this checkpoint with the snippet present in the [Limitations and bias section](https://huggingface.co/bert-base-uncased#limitations-and-bias) of the [bert-base-uncased](https://huggingface.co/bert-base-uncased) checkpoint. ## Training data The MultiBERTs models were pretrained on [BookCorpus](https://yknzhu.wixsite.com/mbweb), a dataset consisting of 11,038 unpublished books, and [English Wikipedia](https://en.wikipedia.org/wiki/English_Wikipedia) (excluding lists, tables and headers). ## Training procedure ### Preprocessing The texts are lowercased and tokenized using WordPiece and a vocabulary size of 30,000. The inputs of the model are then of the form: ``` [CLS] Sentence A [SEP] Sentence B [SEP] ``` With probability 0.5, sentence A and sentence B correspond to two consecutive sentences in the original corpus; in the other cases, sentence B is another random sentence from the corpus. Note that what is considered a sentence here is a consecutive span of text usually longer than a single sentence. The only constraint is that the two "sentences" have a combined length of less than 512 tokens. The details of the masking procedure for each sentence are the following: - 15% of the tokens are masked. - In 80% of the cases, the masked tokens are replaced by `[MASK]`. - In 10% of the cases, the masked tokens are replaced by a random token (different from the one they replace). - In the remaining 10% of cases, the masked tokens are left as is. ### Pretraining The full model was trained on 16 Cloud TPU v2 chips for two million steps with a batch size of 256. The sequence length was set to 512 throughout. The optimizer used is Adam with a learning rate of 1e-4, \\(\beta_{1} = 0.9\\) and \\(\beta_{2} = 0.999\\), a weight decay of 0.01, learning rate warmup for 10,000 steps and linear decay of the learning rate after. ### BibTeX entry and citation info ```bibtex @article{DBLP:journals/corr/abs-2106-16163, author = {Thibault Sellam and Steve Yadlowsky and Jason Wei and Naomi Saphra and Alexander D'Amour and Tal Linzen and Jasmijn Bastings and Iulia Turc and Jacob Eisenstein and Dipanjan Das and Ian Tenney and Ellie Pavlick}, title = {The MultiBERTs: {BERT} Reproductions for Robustness Analysis}, journal = {CoRR}, volume = {abs/2106.16163}, year = {2021}, url = {https://arxiv.org/abs/2106.16163}, eprinttype = {arXiv}, eprint = {2106.16163}, timestamp = {Mon, 05 Jul 2021 15:15:50 +0200}, biburl = {https://dblp.org/rec/journals/corr/abs-2106-16163.bib}, bibsource = {dblp computer science bibliography, https://dblp.org} } ``` <a href="https://huggingface.co/exbert/?model=multiberts"> <img width="300px" src="https://cdn-media.huggingface.co/exbert/button.png"> </a>
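The feature-extraction snippet in the card above only returns hidden states. Because the checkpoint was trained with an MLM objective, it can also be asked for masked-token predictions directly. The sketch below is a minimal illustration of that and is not part of the original card: it assumes the same short `multiberts-seed-4-1200k` identifier used in the card resolves to weights that include the MLM head (a warning about unused NSP weights would be harmless), and the example sentence and top-5 cutoff are arbitrary choices.

```python
import torch
from transformers import BertTokenizer, BertForMaskedLM

# Assumption: the short checkpoint name mirrors the card's own snippet;
# a full hub id with a user prefix may be needed depending on where the weights live.
tokenizer = BertTokenizer.from_pretrained('multiberts-seed-4-1200k')
model = BertForMaskedLM.from_pretrained('multiberts-seed-4-1200k')
model.eval()

text = "The capital of France is [MASK]."
inputs = tokenizer(text, return_tensors='pt')

with torch.no_grad():
    logits = model(**inputs).logits

# Locate the [MASK] position and list the five most likely replacement tokens.
mask_positions = (inputs['input_ids'] == tokenizer.mask_token_id).nonzero(as_tuple=True)[1]
top_ids = logits[0, mask_positions[0]].topk(5).indices.tolist()
print(tokenizer.convert_ids_to_tokens(top_ids))
```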
{"language": "en", "license": "apache-2.0", "tags": ["exbert", "multiberts", "multiberts-seed-4"], "datasets": ["bookcorpus", "wikipedia"]}
null
MultiBertGunjanPatrick/multiberts-seed-4-1200k
[ "transformers", "pytorch", "bert", "pretraining", "exbert", "multiberts", "multiberts-seed-4", "en", "dataset:bookcorpus", "dataset:wikipedia", "arxiv:2106.16163", "license:apache-2.0", "endpoints_compatible", "region:us" ]
2022-03-02T23:29:04+00:00
[ "2106.16163" ]
[ "en" ]
TAGS #transformers #pytorch #bert #pretraining #exbert #multiberts #multiberts-seed-4 #en #dataset-bookcorpus #dataset-wikipedia #arxiv-2106.16163 #license-apache-2.0 #endpoints_compatible #region-us
# MultiBERTs Seed 4 Checkpoint 1200k (uncased) Seed 4 intermediate checkpoint 1200k MultiBERTs (pretrained BERT) model on English language using a masked language modeling (MLM) objective. It was introduced in this paper and first released in this repository. This is an intermediate checkpoint. The final checkpoint can be found at multiberts-seed-4. This model is uncased: it does not make a difference between english and English. Disclaimer: The team releasing MultiBERTs did not write a model card for this model so this model card has been written by gchhablani. ## Model description MultiBERTs models are transformers model pretrained on a large corpus of English data in a self-supervised fashion. This means it was pretrained on the raw texts only, with no humans labelling them in any way (which is why it can use lots of publicly available data) with an automatic process to generate inputs and labels from those texts. More precisely, it was pretrained with two objectives: - Masked language modeling (MLM): taking a sentence, the model randomly masks 15% of the words in the input then run the entire masked sentence through the model and has to predict the masked words. This is different from traditional recurrent neural networks (RNNs) that usually see the words one after the other, or from autoregressive models like GPT which internally mask the future tokens. It allows the model to learn a bidirectional representation of the sentence. - Next sentence prediction (NSP): the models concatenates two masked sentences as inputs during pretraining. Sometimes they correspond to sentences that were next to each other in the original text, sometimes not. The model then has to predict if the two sentences were following each other or not. This way, the model learns an inner representation of the English language that can then be used to extract features useful for downstream tasks: if you have a dataset of labeled sentences for instance, you can train a standard classifier using the features produced by the MultiBERTs model as inputs. ## Intended uses & limitations You can use the raw model for either masked language modeling or next sentence prediction, but it's mostly intended to be fine-tuned on a downstream task. See the model hub to look for fine-tuned versions on a task that interests you. Note that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked) to make decisions, such as sequence classification, token classification or question answering. For tasks such as text generation you should look at model like GPT2. ### How to use Here is how to use this model to get the features of a given text in PyTorch: ### Limitations and bias Even if the training data used for this model could be characterized as fairly neutral, this model can have biased predictions. This bias will also affect all fine-tuned versions of this model. For an understanding of bias of this particular checkpoint, please try out this checkpoint with the snippet present in the Limitation and bias section of the bert-base-uncased checkpoint. ## Training data The MultiBERTs models were pretrained on BookCorpus, a dataset consisting of 11,038 unpublished books and English Wikipedia (excluding lists, tables and headers). ## Training procedure ### Preprocessing The texts are lowercased and tokenized using WordPiece and a vocabulary size of 30,000. 
The inputs of the model are then of the form: With probability 0.5, sentence A and sentence B correspond to two consecutive sentences in the original corpus and in the other cases, it's another random sentence in the corpus. Note that what is considered a sentence here is a consecutive span of text usually longer than a single sentence. The only constrain is that the result with the two "sentences" has a combined length of less than 512 tokens. The details of the masking procedure for each sentence are the following: - 15% of the tokens are masked. - In 80% of the cases, the masked tokens are replaced by '[MASK]'. - In 10% of the cases, the masked tokens are replaced by a random token (different) from the one they replace. - In the 10% remaining cases, the masked tokens are left as is. ### Pretraining The full model was trained on 16 Cloud TPU v2 chips for two million steps with a batch size of 256. The sequence length was set to 512 throughout. The optimizer used is Adam with a learning rate of 1e-4, \\(\beta_{1} = 0.9\\) and \\(\beta_{2} = 0.999\\), a weight decay of 0.01, learning rate warmup for 10,000 steps and linear decay of the learning rate after. ### BibTeX entry and citation info <a href="URL <img width="300px" src="URL </a>
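The masking rule spelled out above (select 15% of the tokens, then apply the 80/10/10 treatment) is simple enough to mirror in a few lines. The following toy sketch is for illustration only and is not the original MultiBERTs preprocessing code; the function name, the tiny vocabulary, and the use of `None` to mark non-target positions are all invented for the example.

```python
import random

def mask_tokens(tokens, vocab, mask_token="[MASK]", mask_prob=0.15):
    """Toy version of the 80/10/10 masking rule described above."""
    masked, labels = [], []
    for token in tokens:
        if random.random() < mask_prob:
            labels.append(token)                     # the model must predict the original token
            roll = random.random()
            if roll < 0.8:
                masked.append(mask_token)            # 80% of the time: replace with [MASK]
            elif roll < 0.9:
                # 10% of the time: a random vocab token (a real pipeline also excludes the original)
                masked.append(random.choice(vocab))
            else:
                masked.append(token)                 # 10% of the time: leave the token unchanged
        else:
            masked.append(token)
            labels.append(None)                      # not a prediction target
    return masked, labels

# Example usage with a tiny, made-up vocabulary.
vocab = ["the", "dog", "cat", "ran", "sat", "fast"]
print(mask_tokens(["the", "dog", "ran", "fast"], vocab))
```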
[ "# MultiBERTs Seed 4 Checkpoint 1200k (uncased)\nSeed 4 intermediate checkpoint 1200k MultiBERTs (pretrained BERT) model on English language using a masked language modeling (MLM) objective. It was introduced in\nthis paper and first released in\nthis repository. This is an intermediate checkpoint.\nThe final checkpoint can be found at multiberts-seed-4. This model is uncased: it does not make a difference\nbetween english and English.\n\nDisclaimer: The team releasing MultiBERTs did not write a model card for this model so this model card has been written by gchhablani.", "## Model description\nMultiBERTs models are transformers model pretrained on a large corpus of English data in a self-supervised fashion. This means it\nwas pretrained on the raw texts only, with no humans labelling them in any way (which is why it can use lots of\npublicly available data) with an automatic process to generate inputs and labels from those texts. More precisely, it\nwas pretrained with two objectives:\n- Masked language modeling (MLM): taking a sentence, the model randomly masks 15% of the words in the input then run\n the entire masked sentence through the model and has to predict the masked words. This is different from traditional\n recurrent neural networks (RNNs) that usually see the words one after the other, or from autoregressive models like\n GPT which internally mask the future tokens. It allows the model to learn a bidirectional representation of the\n sentence.\n- Next sentence prediction (NSP): the models concatenates two masked sentences as inputs during pretraining. Sometimes\n they correspond to sentences that were next to each other in the original text, sometimes not. The model then has to\n predict if the two sentences were following each other or not.\nThis way, the model learns an inner representation of the English language that can then be used to extract features\nuseful for downstream tasks: if you have a dataset of labeled sentences for instance, you can train a standard\nclassifier using the features produced by the MultiBERTs model as inputs.", "## Intended uses & limitations\nYou can use the raw model for either masked language modeling or next sentence prediction, but it's mostly intended to\nbe fine-tuned on a downstream task. See the model hub to look for\nfine-tuned versions on a task that interests you.\nNote that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked)\nto make decisions, such as sequence classification, token classification or question answering. For tasks such as text\ngeneration you should look at model like GPT2.", "### How to use\nHere is how to use this model to get the features of a given text in PyTorch:", "### Limitations and bias\nEven if the training data used for this model could be characterized as fairly neutral, this model can have biased\npredictions. This bias will also affect all fine-tuned versions of this model. For an understanding of bias of this particular\ncheckpoint, please try out this checkpoint with the snippet present in the Limitation and bias section of the bert-base-uncased checkpoint.", "## Training data\nThe MultiBERTs models were pretrained on BookCorpus, a dataset consisting of 11,038\nunpublished books and English Wikipedia (excluding lists, tables and\nheaders).", "## Training procedure", "### Preprocessing\nThe texts are lowercased and tokenized using WordPiece and a vocabulary size of 30,000. 
The inputs of the model are\nthen of the form:\n\nWith probability 0.5, sentence A and sentence B correspond to two consecutive sentences in the original corpus and in\nthe other cases, it's another random sentence in the corpus. Note that what is considered a sentence here is a\nconsecutive span of text usually longer than a single sentence. The only constrain is that the result with the two\n\"sentences\" has a combined length of less than 512 tokens.\nThe details of the masking procedure for each sentence are the following:\n- 15% of the tokens are masked.\n- In 80% of the cases, the masked tokens are replaced by '[MASK]'.\n- In 10% of the cases, the masked tokens are replaced by a random token (different) from the one they replace.\n- In the 10% remaining cases, the masked tokens are left as is.", "### Pretraining\nThe full model was trained on 16 Cloud TPU v2 chips for two million steps with a batch size\nof 256. The sequence length was set to 512 throughout. The optimizer\nused is Adam with a learning rate of 1e-4, \\\\(\\beta_{1} = 0.9\\\\) and \\\\(\\beta_{2} = 0.999\\\\), a weight decay of 0.01,\nlearning rate warmup for 10,000 steps and linear decay of the learning rate after.", "### BibTeX entry and citation info\n\n<a href=\"URL\n\t<img width=\"300px\" src=\"URL\n</a>" ]
[ "TAGS\n#transformers #pytorch #bert #pretraining #exbert #multiberts #multiberts-seed-4 #en #dataset-bookcorpus #dataset-wikipedia #arxiv-2106.16163 #license-apache-2.0 #endpoints_compatible #region-us \n", "# MultiBERTs Seed 4 Checkpoint 1200k (uncased)\nSeed 4 intermediate checkpoint 1200k MultiBERTs (pretrained BERT) model on English language using a masked language modeling (MLM) objective. It was introduced in\nthis paper and first released in\nthis repository. This is an intermediate checkpoint.\nThe final checkpoint can be found at multiberts-seed-4. This model is uncased: it does not make a difference\nbetween english and English.\n\nDisclaimer: The team releasing MultiBERTs did not write a model card for this model so this model card has been written by gchhablani.", "## Model description\nMultiBERTs models are transformers model pretrained on a large corpus of English data in a self-supervised fashion. This means it\nwas pretrained on the raw texts only, with no humans labelling them in any way (which is why it can use lots of\npublicly available data) with an automatic process to generate inputs and labels from those texts. More precisely, it\nwas pretrained with two objectives:\n- Masked language modeling (MLM): taking a sentence, the model randomly masks 15% of the words in the input then run\n the entire masked sentence through the model and has to predict the masked words. This is different from traditional\n recurrent neural networks (RNNs) that usually see the words one after the other, or from autoregressive models like\n GPT which internally mask the future tokens. It allows the model to learn a bidirectional representation of the\n sentence.\n- Next sentence prediction (NSP): the models concatenates two masked sentences as inputs during pretraining. Sometimes\n they correspond to sentences that were next to each other in the original text, sometimes not. The model then has to\n predict if the two sentences were following each other or not.\nThis way, the model learns an inner representation of the English language that can then be used to extract features\nuseful for downstream tasks: if you have a dataset of labeled sentences for instance, you can train a standard\nclassifier using the features produced by the MultiBERTs model as inputs.", "## Intended uses & limitations\nYou can use the raw model for either masked language modeling or next sentence prediction, but it's mostly intended to\nbe fine-tuned on a downstream task. See the model hub to look for\nfine-tuned versions on a task that interests you.\nNote that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked)\nto make decisions, such as sequence classification, token classification or question answering. For tasks such as text\ngeneration you should look at model like GPT2.", "### How to use\nHere is how to use this model to get the features of a given text in PyTorch:", "### Limitations and bias\nEven if the training data used for this model could be characterized as fairly neutral, this model can have biased\npredictions. This bias will also affect all fine-tuned versions of this model. 
For an understanding of bias of this particular\ncheckpoint, please try out this checkpoint with the snippet present in the Limitation and bias section of the bert-base-uncased checkpoint.", "## Training data\nThe MultiBERTs models were pretrained on BookCorpus, a dataset consisting of 11,038\nunpublished books and English Wikipedia (excluding lists, tables and\nheaders).", "## Training procedure", "### Preprocessing\nThe texts are lowercased and tokenized using WordPiece and a vocabulary size of 30,000. The inputs of the model are\nthen of the form:\n\nWith probability 0.5, sentence A and sentence B correspond to two consecutive sentences in the original corpus and in\nthe other cases, it's another random sentence in the corpus. Note that what is considered a sentence here is a\nconsecutive span of text usually longer than a single sentence. The only constrain is that the result with the two\n\"sentences\" has a combined length of less than 512 tokens.\nThe details of the masking procedure for each sentence are the following:\n- 15% of the tokens are masked.\n- In 80% of the cases, the masked tokens are replaced by '[MASK]'.\n- In 10% of the cases, the masked tokens are replaced by a random token (different) from the one they replace.\n- In the 10% remaining cases, the masked tokens are left as is.", "### Pretraining\nThe full model was trained on 16 Cloud TPU v2 chips for two million steps with a batch size\nof 256. The sequence length was set to 512 throughout. The optimizer\nused is Adam with a learning rate of 1e-4, \\\\(\\beta_{1} = 0.9\\\\) and \\\\(\\beta_{2} = 0.999\\\\), a weight decay of 0.01,\nlearning rate warmup for 10,000 steps and linear decay of the learning rate after.", "### BibTeX entry and citation info\n\n<a href=\"URL\n\t<img width=\"300px\" src=\"URL\n</a>" ]
[ 71, 145, 335, 134, 25, 95, 48, 3, 222, 111, 34 ]
[ "passage: TAGS\n#transformers #pytorch #bert #pretraining #exbert #multiberts #multiberts-seed-4 #en #dataset-bookcorpus #dataset-wikipedia #arxiv-2106.16163 #license-apache-2.0 #endpoints_compatible #region-us \n# MultiBERTs Seed 4 Checkpoint 1200k (uncased)\nSeed 4 intermediate checkpoint 1200k MultiBERTs (pretrained BERT) model on English language using a masked language modeling (MLM) objective. It was introduced in\nthis paper and first released in\nthis repository. This is an intermediate checkpoint.\nThe final checkpoint can be found at multiberts-seed-4. This model is uncased: it does not make a difference\nbetween english and English.\n\nDisclaimer: The team releasing MultiBERTs did not write a model card for this model so this model card has been written by gchhablani.", "passage: ## Model description\nMultiBERTs models are transformers model pretrained on a large corpus of English data in a self-supervised fashion. This means it\nwas pretrained on the raw texts only, with no humans labelling them in any way (which is why it can use lots of\npublicly available data) with an automatic process to generate inputs and labels from those texts. More precisely, it\nwas pretrained with two objectives:\n- Masked language modeling (MLM): taking a sentence, the model randomly masks 15% of the words in the input then run\n the entire masked sentence through the model and has to predict the masked words. This is different from traditional\n recurrent neural networks (RNNs) that usually see the words one after the other, or from autoregressive models like\n GPT which internally mask the future tokens. It allows the model to learn a bidirectional representation of the\n sentence.\n- Next sentence prediction (NSP): the models concatenates two masked sentences as inputs during pretraining. Sometimes\n they correspond to sentences that were next to each other in the original text, sometimes not. The model then has to\n predict if the two sentences were following each other or not.\nThis way, the model learns an inner representation of the English language that can then be used to extract features\nuseful for downstream tasks: if you have a dataset of labeled sentences for instance, you can train a standard\nclassifier using the features produced by the MultiBERTs model as inputs.## Intended uses & limitations\nYou can use the raw model for either masked language modeling or next sentence prediction, but it's mostly intended to\nbe fine-tuned on a downstream task. See the model hub to look for\nfine-tuned versions on a task that interests you.\nNote that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked)\nto make decisions, such as sequence classification, token classification or question answering. For tasks such as text\ngeneration you should look at model like GPT2.### How to use\nHere is how to use this model to get the features of a given text in PyTorch:### Limitations and bias\nEven if the training data used for this model could be characterized as fairly neutral, this model can have biased\npredictions. This bias will also affect all fine-tuned versions of this model. 
For an understanding of bias of this particular\ncheckpoint, please try out this checkpoint with the snippet present in the Limitation and bias section of the bert-base-uncased checkpoint.## Training data\nThe MultiBERTs models were pretrained on BookCorpus, a dataset consisting of 11,038\nunpublished books and English Wikipedia (excluding lists, tables and\nheaders).## Training procedure" ]
[ -0.08653675019741058, 0.0015203688526526093, -0.0020701908506453037, 0.07209581881761551, 0.08747144043445587, 0.0038576177321374416, 0.11582446843385696, 0.04898414388298988, -0.035166263580322266, 0.02403128333389759, 0.09156906604766846, 0.02607850357890129, 0.04282751306891441, 0.061802033334970474, 0.09651655703783035, -0.2600165605545044, 0.04911429435014725, -0.061992350965738297, 0.054893337190151215, 0.07478884607553482, 0.0967063158750534, -0.07112589478492737, 0.06281894445419312, 0.03436354920268059, -0.08567129075527191, -0.015360378660261631, -0.017084328457713127, -0.03470480814576149, 0.10211285948753357, 0.07045744359493256, 0.06273964047431946, 0.0005828402936458588, 0.05920106917619705, -0.0847981870174408, 0.0165107361972332, 0.043468233197927475, -0.0019531650468707085, 0.025059718638658524, -0.009690340608358383, 0.017905766144394875, 0.1061844453215599, 0.04010343551635742, 0.07641046494245529, 0.033164720982313156, -0.09562608599662781, -0.11537999659776688, -0.0811258852481842, 0.1052306592464447, 0.05276261270046234, 0.04207570105791092, -0.005333932116627693, 0.0722915530204773, -0.030901886522769928, 0.07355742901563644, 0.10493429005146027, -0.258373886346817, -0.009504206478595734, 0.06767059117555618, 0.044212669134140015, 0.047189466655254364, 0.0104229636490345, 0.024978410452604294, 0.007138829678297043, 0.045199498534202576, 0.02597011625766754, -0.02337375283241272, 0.11245153844356537, -0.04359012097120285, -0.1499176323413849, -0.04095934331417084, 0.11877986788749695, -0.006132470443844795, -0.12468856573104858, -0.095627561211586, -0.03066832572221756, 0.11123306304216385, -0.002690451219677925, -0.016043834388256073, -0.004006173927336931, 0.009287240915000439, 0.02331199124455452, -0.09124712646007538, -0.08443834632635117, -0.027625644579529762, -0.03813944011926651, 0.12720158696174622, 0.04716910421848297, 0.051927611231803894, -0.03543293476104736, 0.08799047023057938, -0.12071168422698975, -0.03879556804895401, -0.0518728643655777, -0.08380789309740067, -0.018108241260051727, 0.011054771021008492, -0.026890015229582787, -0.07503744214773178, -0.05999032035470009, 0.11933206021785736, 0.03262948989868164, 0.030582182109355927, -0.0005724355578422546, 0.03982837125658989, 0.0687856525182724, 0.09268661588430405, -0.04121287912130356, 0.04917028546333313, 0.034356895834207535, -0.02149266004562378, 0.05610252544283867, -0.04985359311103821, -0.10137423127889633, 0.07496438920497894, -0.0022285766899585724, 0.0383497029542923, 0.023327015340328217, 0.035792190581560135, -0.008153491653501987, -0.069996677339077, 0.1680189073085785, -0.07861902564764023, -0.010361039079725742, -0.018052710220217705, 0.00993812270462513, 0.045892730355262756, 0.033250678330659866, -0.009094258770346642, -0.048558399081230164, -0.00500065740197897, -0.05617573484778404, -0.028499223291873932, -0.05678138881921768, -0.1192445307970047, -0.0014455504715442657, -0.04236987978219986, -0.03143412992358208, -0.1407928466796875, -0.21620237827301025, -0.02085116133093834, 0.06370440870523453, -0.0024849693290889263, -0.011549119837582111, 0.02528681606054306, 0.012914938852190971, -0.021231120452284813, 0.00927506759762764, -0.04660682752728462, -0.00029654987156391144, -0.007112486287951469, -0.03082716464996338, 0.05862407386302948, -0.04378623887896538, 0.023243067786097527, -0.06793239712715149, 0.021332992240786552, -0.21324995160102844, 0.0868741050362587, -0.03414861485362053, 0.005004579201340675, -0.03578091785311699, -0.04475349932909012, 0.008902393281459808, 
0.04686012119054794, -0.008609142154455185, 0.11644398421049118, -0.13599735498428345, -0.04853565990924835, 0.18434622883796692, -0.15824642777442932, -0.002424057573080063, 0.09962622821331024, -0.04945451021194458, 0.05422665923833847, 0.1349828988313675, 0.09840331226587296, 0.0878896713256836, -0.07144703716039658, 0.008554539643228054, 0.060716353356838226, -0.0678703784942627, 0.051109351217746735, 0.0870552808046341, -0.024202311411499977, -0.13701841235160828, 0.03000112622976303, -0.0721961036324501, -0.00827487651258707, -0.0241486057639122, -0.020484240725636482, 0.00769663043320179, -0.03902697190642357, 0.02728896588087082, 0.006981699261814356, 0.01818794012069702, -0.04043423384428024, -0.08314249664545059, 0.03019098937511444, 0.07564651966094971, -0.07209639251232147, 0.04178328439593315, -0.07031553983688354, 0.0639185756444931, -0.0771445706486702, -0.005010985769331455, -0.16789484024047852, -0.026139196008443832, 0.042716316878795624, -0.046465691179037094, 0.049598924815654755, 0.09023122489452362, 0.004952609073370695, 0.12266088277101517, -0.04236074537038803, 0.0030207866802811623, -0.004728328436613083, -0.010082416236400604, -0.05091731995344162, -0.12090019881725311, -0.08031979948282242, -0.06810557842254639, 0.09850157797336578, -0.07542236149311066, 0.029926838353276253, -0.0674201175570488, -0.02225489355623722, -0.010133089497685432, -0.056716203689575195, -0.005545751191675663, 0.010858858935534954, -0.027297230437397957, -0.047430552542209625, 0.04866565763950348, 0.05091679468750954, -0.05653566122055054, 0.07608414441347122, -0.10444703698158264, -0.062687449157238, 0.0569620355963707, 0.019672641530632973, -0.08196475356817245, 0.09015926718711853, -0.019855769351124763, -0.01354395691305399, -0.05658499896526337, -0.04393843561410904, 0.1941280961036682, -0.025785867124795914, 0.10068517923355103, -0.09146609902381897, 0.0032895710319280624, 0.028664441779255867, -0.04334458336234093, -0.01253153383731842, 0.05927376076579094, 0.05209266021847725, -0.1796642392873764, 0.015310626477003098, 0.046586111187934875, 0.07518218457698822, 0.11111991107463837, 0.025609247386455536, -0.0230623297393322, -0.044394344091415405, -0.012413646094501019, 0.005696222651749849, 0.05805734544992447, -0.020912539213895798, -0.009215142577886581, 0.030731461942195892, 0.05789780616760254, 0.018537554889917374, -0.0814986377954483, 0.0357217900454998, 0.06518533080816269, -0.015066366642713547, -0.04234163463115692, -0.024781666696071625, -0.059776391834020615, 0.062223244458436966, 0.05392908304929733, 0.034385908395051956, 0.02628561109304428, -0.013859129510819912, -0.13386619091033936, 0.18950146436691284, -0.11417298018932343, -0.2518463134765625, -0.10898583382368088, -0.06091345101594925, -0.024483749642968178, 0.03986996412277222, 0.05838882178068161, -0.033356696367263794, -0.042200371623039246, -0.11456210166215897, 0.058855727314949036, -0.06142651289701462, -0.0283533725887537, -0.012813901528716087, -0.05255600064992905, -0.019698046147823334, -0.1265873908996582, -0.010898351669311523, -0.030173085629940033, -0.07356419414281845, 0.006700856611132622, -0.035934507846832275, 0.029032696038484573, 0.13665014505386353, 0.032979872077703476, -0.019503269344568253, -0.018085533753037453, 0.19271615147590637, 0.009079325944185257, 0.06241617351770401, 0.11124029010534286, -0.02814909629523754, 0.05432785674929619, 0.04962655156850815, 0.025418225675821304, -0.04758066684007645, 0.012179271318018436, -0.015270925126969814, -0.12090259045362473, 
-0.17643529176712036, -0.06858320534229279, -0.0030918321572244167, 0.00673764618113637, 0.019875697791576385, 0.03539258614182472, 0.02285270020365715, 0.042475443333387375, -0.029266629368066788, 0.03134588897228241, -0.014953002333641052, 0.07982129603624344, 0.026119112968444824, -0.07233786582946777, 0.09213072061538696, -0.06314565241336823, 0.01617220602929592, 0.10854241251945496, -0.05916432663798332, 0.19582012295722961, 0.021848684176802635, 0.05530307814478874, 0.10011409223079681, 0.0216447152197361, 0.05691302567720413, 0.09060398489236832, -0.04793328419327736, 0.005941799841821194, -0.05927291512489319, -0.05144011229276657, -0.03249239921569824, 0.0487162247300148, 0.025579672306776047, 0.018018145114183426, -0.11882614344358444, 0.019825085997581482, -0.0030666030943393707, 0.13525336980819702, 0.04703895375132561, -0.12351572513580322, -0.12347447872161865, 0.03309742733836174, -0.04480621591210365, -0.06359659880399704, 0.03188234940171242, 0.060669492930173874, -0.15169736742973328, 0.04472074657678604, -0.006105942651629448, 0.06633675843477249, -0.09297093749046326, 0.016064155846834183, -0.04102806746959686, -0.000605306588113308, 0.003725148504599929, 0.06897290050983429, -0.12854570150375366, 0.1062401607632637, 0.020866606384515762, 0.05092094466090202, -0.07873973250389099, 0.016123052686452866, -0.007766196504235268, 0.11087282001972198, 0.1155908852815628, 0.043580375611782074, -0.05235251039266586, -0.01391796674579382, -0.04658989980816841, 0.021423809230327606, 0.05712546408176422, -0.0745885968208313, 0.06262774765491486, 0.011333723552525043, 0.0069505623541772366, -0.023896194994449615, 0.008600182831287384, -0.13197968900203705, -0.12357769906520844, 0.06008792296051979, -0.07787303626537323, -0.09580843895673752, -0.05526246875524521, -0.06469003111124039, -0.053354546427726746, 0.20625463128089905, -0.11136644333600998, -0.09149646759033203, -0.09769125282764435, -0.015324704349040985, 0.04768531024456024, -0.0657450407743454, 0.04791168496012688, -0.036836545914411545, 0.08906938135623932, -0.04590409994125366, -0.10921287536621094, 0.03502491116523743, -0.11173010617494583, -0.1128147542476654, -0.04292755573987961, 0.10381971299648285, 0.1142277866601944, 0.03711038827896118, 0.010317490436136723, 0.012633864767849445, -0.0005532819777727127, -0.11837996542453766, 0.013501277193427086, 0.1267051100730896, 0.0002556443214416504, 0.07077930867671967, -0.06072591245174408, 0.02680361270904541, -0.015480276197195053, 0.0016747768968343735, 0.12834785878658295, 0.18269547820091248, -0.06464222073554993, 0.17127947509288788, 0.20230107009410858, -0.10525663197040558, -0.1916959583759308, -0.05221673846244812, -0.0014922702684998512, 0.046860333532094955, 0.04677340015769005, -0.18493178486824036, 0.0920913815498352, 0.03234250843524933, -0.030890360474586487, 0.02120859920978546, -0.22835150361061096, -0.1107388585805893, 0.08835084736347198, 0.055885788053274155, 0.18947219848632812, -0.08166863769292831, -0.039772145450115204, -0.015980040654540062, -0.041927412152290344, 0.042694494128227234, -0.04098925739526749, 0.0900447741150856, 0.004897762089967728, -0.027224626392126083, 0.000875614583492279, -0.03174750134348869, 0.0956166684627533, 0.04284834861755371, 0.022874590009450912, -0.0721287876367569, -0.007802747189998627, 0.10994437336921692, -0.039366744458675385, 0.09777913242578506, 0.043282512575387955, 0.0747910887002945, -0.10057947039604187, -0.058336976915597916, -0.07681502401828766, 0.04178416728973389, -0.04202454537153244, 
-0.053546834737062454, -0.0636228397488594, 0.05942419916391373, 0.037267960608005524, 0.011018389835953712, -0.002379201352596283, -0.037036895751953125, 0.046079810708761215, 0.08373001217842102, 0.08242753893136978, -0.030656125396490097, -0.07465846836566925, -0.0483216755092144, -0.048333778977394104, 0.06726846098899841, -0.09918305277824402, 0.016575761139392853, 0.02988017536699772, 0.010461686179041862, 0.08797506242990494, 0.034711576998233795, -0.13667793571949005, 0.01130823977291584, 0.035739365965127945, -0.12482959032058716, -0.10486356914043427, -0.018506675958633423, 0.023944076150655746, -0.037635792046785355, 0.05733596533536911, 0.14541743695735931, -0.03759586438536644, -0.03178872913122177, -0.0475970059633255, 0.03770298510789871, -0.020417483523488045, 0.0503794364631176, 0.06763193011283875, 0.030369020998477936, -0.07363604009151459, 0.07372045516967773, 0.03944472596049309, -0.04136783629655838, 0.04108922928571701, 0.04512856900691986, -0.09752030670642853, -0.07867412269115448, -0.05714142322540283, 0.08898185938596725, -0.022016433998942375, -0.04907626658678055, -0.002470577135682106, -0.08201546221971512, 0.06834885478019714, 0.07644998282194138, 0.046653881669044495, 0.03520356863737106, -0.08633261919021606, 0.01488138921558857, -0.055530451238155365, 0.035597626119852066, -0.031288303434848785, -0.002240249887108803, -0.05614764988422394, 0.06319872289896011, 0.06296156346797943, 0.09716616570949554, -0.033422425389289856, -0.0728582814335823, -0.08413957804441452, -0.01375342532992363, -0.06881331652402878, -0.03331497311592102, -0.0747465118765831, -0.005001103039830923, 0.00015861401334404945, -0.0008117109537124634, 0.021331412717700005, 0.03618878871202469, -0.045147694647312164, -0.01739976368844509, -0.03342144936323166, 0.036664094775915146, -0.059358593076467514, 0.008140619844198227, 0.014703895896673203, -0.03596258535981178, 0.09263777732849121, 0.036096010357141495, -0.011090202257037163, 0.0471327006816864, -0.01477705780416727, 0.033848486840724945, -0.02169819362461567, 0.0007706508040428162, -0.021643981337547302, -0.10625481605529785, -0.0037541035562753677, 0.0036851856857538223, -0.02220805175602436, 0.010800881311297417, 0.05766332149505615, -0.07207489013671875, 0.08923479169607162, 0.04737872630357742, -0.02745267003774643, -0.07295771688222885, 0.041020140051841736, -0.014488158747553825, 0.02833610400557518, 0.06842299550771713, -0.0334993414580822, 0.051329661160707474, -0.09812870621681213, -0.029526572674512863, 0.00273340567946434, -0.006749708205461502, -0.007513102144002914, -0.052843447774648666, -0.0018281536176800728, 0.006972714327275753, 0.17314985394477844, -0.024259891360998154, 0.03601274639368057, 0.013062282465398312, 0.006695140153169632, 0.04432849958539009, -0.013098061084747314, 0.06964612007141113, -0.006585943512618542, -0.027075083926320076, -0.012661133892834187, 0.037876714020967484, 0.0017674770206212997, 0.003823578357696533, 0.14071756601333618, 0.04324788600206375, 0.09402323514223099, 0.07626497000455856, 0.019644230604171753, 0.017421407625079155, -0.1292210817337036, -0.0909300446510315, 0.005916586145758629, 0.05924897640943527, -0.019054988399147987, 0.010955125093460083, 0.09203027188777924, -0.08366309851408005, 0.07107135653495789, 0.0465085431933403, -0.049900367856025696, -0.12486342340707779, -0.18724657595157623, -0.023241382092237473, -0.031382717192173004, -0.010276679880917072, -0.09096486121416092, 0.01349746435880661, 0.09909684956073761, 0.02524705044925213, -0.009770852513611317, 
0.09819306433200836, -0.10350072383880615, -0.03005153313279152, 0.04397107660770416, -0.027022158727049828, 0.016070079058408737, 0.05056504160165787, 0.02443697862327099, -0.005554741248488426, 0.04111083596944809, 0.03871960565447807, 0.04355975612998009, 0.025982383638620377, 0.05276883393526077, -0.021689027547836304, -0.07151701301336288, -0.03284631669521332, -0.001136814709752798, 0.05439047887921333, 0.13874420523643494, 0.02257673814892769, -0.06901317834854126, 0.006970854476094246, 0.10677230358123779, -0.03393790125846863, -0.05184229463338852, -0.10968976467847824, 0.23736244440078735, 0.022295895963907242, 0.0012453035451471806, -0.006059714592993259, -0.04692627489566803, 0.00595472939312458, 0.2141399383544922, 0.22452197968959808, 0.006127333268523216, -0.010672595351934433, 0.01163401547819376, -0.01150660403072834, 0.036583542823791504, 0.1468384563922882, 0.00707484595477581, 0.251348614692688, -0.04742296785116196, 0.03549884259700775, -0.04128651320934296, -0.03857235610485077, -0.10013739764690399, 0.07038237899541855, -0.010629396885633469, 0.0076628392562270164, -0.030912505462765694, 0.07260362058877945, -0.0393114909529686, -0.17494811117649078, 0.0027786213904619217, -0.0012391936033964157, -0.06423727422952652, 0.010297274217009544, 0.0002238750457763672, 0.02124025672674179, 0.08225059509277344, -0.017872203141450882, -0.007108401041477919, 0.1340063363313675, 0.018273117020726204, -0.09822670370340347, -0.061306580901145935, 0.11522510647773743, 0.016049273312091827, 0.1377388834953308, 0.010793688707053661, 0.08085072785615921, 0.08847412467002869, 0.020649151876568794, -0.09688827395439148, 0.04083744063973427, -0.01901405118405819, -0.032560817897319794, 0.006164789665490389, 0.10751748085021973, -0.007799708284437656, 0.05586499720811844, 0.02455226704478264, -0.08510808646678925, 0.06008004769682884, 0.014768794178962708, -0.03302758187055588, -0.08359847217798233, 0.08432133495807648, -0.09108035266399384, 0.15768271684646606, 0.12472060322761536, -0.014787797816097736, -0.044369108974933624, -0.02827429212629795, 0.016875533387064934, -0.0002262941561639309, 0.051540907472372055, -0.025788120925426483, -0.13379570841789246, 0.018854215741157532, -0.0843723863363266, 0.028223630040884018, -0.2506454288959503, -0.08593189716339111, 0.029465217143297195, -0.018676474690437317, -0.019648898392915726, 0.05195353925228119, 0.04782465845346451, 0.027415290474891663, -0.036117881536483765, 0.022581856697797775, -0.03982683643698692, 0.05860109254717827, -0.11201544106006622, -0.09601692110300064 ]
null
null
transformers
# MultiBERTs Seed 4 Checkpoint 120k (uncased) Seed 4 intermediate checkpoint 120k MultiBERTs (pretrained BERT) model on the English language using a masked language modeling (MLM) objective. It was introduced in [this paper](https://arxiv.org/pdf/2106.16163.pdf) and first released in [this repository](https://github.com/google-research/language/tree/master/language/multiberts). This is an intermediate checkpoint. The final checkpoint can be found at [multiberts-seed-4](https://hf.co/multiberts-seed-4). This model is uncased: it does not make a difference between english and English. Disclaimer: The team releasing MultiBERTs did not write a model card for this model, so this model card has been written by [gchhablani](https://hf.co/gchhablani). ## Model description MultiBERTs models are transformer models pretrained on a large corpus of English data in a self-supervised fashion. This means they were pretrained on the raw texts only, with no humans labelling them in any way (which is why they can use lots of publicly available data), with an automatic process to generate inputs and labels from those texts. More precisely, they were pretrained with two objectives: - Masked language modeling (MLM): taking a sentence, the model randomly masks 15% of the words in the input, then runs the entire masked sentence through the model and has to predict the masked words. This is different from traditional recurrent neural networks (RNNs), which usually see the words one after the other, or from autoregressive models like GPT, which internally mask the future tokens. It allows the model to learn a bidirectional representation of the sentence. - Next sentence prediction (NSP): the model concatenates two masked sentences as inputs during pretraining. Sometimes they correspond to sentences that were next to each other in the original text, sometimes not. The model then has to predict whether the two sentences followed each other or not. This way, the model learns an inner representation of the English language that can then be used to extract features useful for downstream tasks: if you have a dataset of labeled sentences, for instance, you can train a standard classifier using the features produced by the MultiBERTs model as inputs. ## Intended uses & limitations You can use the raw model for either masked language modeling or next sentence prediction, but it's mostly intended to be fine-tuned on a downstream task. See the [model hub](https://huggingface.co/models?filter=multiberts) to look for fine-tuned versions on a task that interests you. Note that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked) to make decisions, such as sequence classification, token classification or question answering. For tasks such as text generation you should look at models like GPT2. ### How to use Here is how to use this model to get the features of a given text in PyTorch: ```python from transformers import BertTokenizer, BertModel tokenizer = BertTokenizer.from_pretrained('multiberts-seed-4-120k') model = BertModel.from_pretrained("multiberts-seed-4-120k") text = "Replace me by any text you'd like." encoded_input = tokenizer(text, return_tensors='pt') output = model(**encoded_input) ``` ### Limitations and bias Even if the training data used for this model could be characterized as fairly neutral, this model can have biased predictions. This bias will also affect all fine-tuned versions of this model. 
For an understanding of the bias of this particular checkpoint, please try out this checkpoint with the snippet present in the [Limitations and bias section](https://huggingface.co/bert-base-uncased#limitations-and-bias) of the [bert-base-uncased](https://huggingface.co/bert-base-uncased) checkpoint. ## Training data The MultiBERTs models were pretrained on [BookCorpus](https://yknzhu.wixsite.com/mbweb), a dataset consisting of 11,038 unpublished books, and [English Wikipedia](https://en.wikipedia.org/wiki/English_Wikipedia) (excluding lists, tables and headers). ## Training procedure ### Preprocessing The texts are lowercased and tokenized using WordPiece and a vocabulary size of 30,000. The inputs of the model are then of the form: ``` [CLS] Sentence A [SEP] Sentence B [SEP] ``` With probability 0.5, sentence A and sentence B correspond to two consecutive sentences in the original corpus; in the other cases, sentence B is another random sentence from the corpus. Note that what is considered a sentence here is a consecutive span of text usually longer than a single sentence. The only constraint is that the two "sentences" have a combined length of less than 512 tokens. The details of the masking procedure for each sentence are the following: - 15% of the tokens are masked. - In 80% of the cases, the masked tokens are replaced by `[MASK]`. - In 10% of the cases, the masked tokens are replaced by a random token (different from the one they replace). - In the remaining 10% of cases, the masked tokens are left as is. ### Pretraining The full model was trained on 16 Cloud TPU v2 chips for two million steps with a batch size of 256. The sequence length was set to 512 throughout. The optimizer used is Adam with a learning rate of 1e-4, \\(\beta_{1} = 0.9\\) and \\(\beta_{2} = 0.999\\), a weight decay of 0.01, learning rate warmup for 10,000 steps and linear decay of the learning rate after. ### BibTeX entry and citation info ```bibtex @article{DBLP:journals/corr/abs-2106-16163, author = {Thibault Sellam and Steve Yadlowsky and Jason Wei and Naomi Saphra and Alexander D'Amour and Tal Linzen and Jasmijn Bastings and Iulia Turc and Jacob Eisenstein and Dipanjan Das and Ian Tenney and Ellie Pavlick}, title = {The MultiBERTs: {BERT} Reproductions for Robustness Analysis}, journal = {CoRR}, volume = {abs/2106.16163}, year = {2021}, url = {https://arxiv.org/abs/2106.16163}, eprinttype = {arXiv}, eprint = {2106.16163}, timestamp = {Mon, 05 Jul 2021 15:15:50 +0200}, biburl = {https://dblp.org/rec/journals/corr/abs-2106-16163.bib}, bibsource = {dblp computer science bibliography, https://dblp.org} } ``` <a href="https://huggingface.co/exbert/?model=multiberts"> <img width="300px" src="https://cdn-media.huggingface.co/exbert/button.png"> </a>
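To make the pretraining hyperparameters quoted in the card above concrete, here is an illustrative sketch of an equivalent optimizer and learning-rate schedule. It is not the original training script: PyTorch's AdamW (Adam with decoupled weight decay) and the `get_linear_schedule_with_warmup` helper from `transformers` stand in for Google's implementation, and the snippet only constructs the objects rather than running the two-million-step loop.

```python
from torch.optim import AdamW
from transformers import BertConfig, BertForPreTraining, get_linear_schedule_with_warmup

# Values quoted in the Pretraining section of the card.
total_steps = 2_000_000
warmup_steps = 10_000

# BERT-base-shaped model, randomly initialised purely for illustration.
model = BertForPreTraining(BertConfig())

optimizer = AdamW(
    model.parameters(),
    lr=1e-4,
    betas=(0.9, 0.999),
    weight_decay=0.01,
)
scheduler = get_linear_schedule_with_warmup(
    optimizer,
    num_warmup_steps=warmup_steps,
    num_training_steps=total_steps,
)

# In an actual run, optimizer.step() and scheduler.step() would be called
# once per batch of 256 sequences of length 512.
```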
{"language": "en", "license": "apache-2.0", "tags": ["exbert", "multiberts", "multiberts-seed-4"], "datasets": ["bookcorpus", "wikipedia"]}
null
MultiBertGunjanPatrick/multiberts-seed-4-120k
[ "transformers", "pytorch", "bert", "pretraining", "exbert", "multiberts", "multiberts-seed-4", "en", "dataset:bookcorpus", "dataset:wikipedia", "arxiv:2106.16163", "license:apache-2.0", "endpoints_compatible", "region:us" ]
2022-03-02T23:29:04+00:00
[ "2106.16163" ]
[ "en" ]
TAGS #transformers #pytorch #bert #pretraining #exbert #multiberts #multiberts-seed-4 #en #dataset-bookcorpus #dataset-wikipedia #arxiv-2106.16163 #license-apache-2.0 #endpoints_compatible #region-us
# MultiBERTs Seed 4 Checkpoint 120k (uncased) Seed 4 intermediate checkpoint 120k MultiBERTs (pretrained BERT) model on English language using a masked language modeling (MLM) objective. It was introduced in this paper and first released in this repository. This is an intermediate checkpoint. The final checkpoint can be found at multiberts-seed-4. This model is uncased: it does not make a difference between english and English. Disclaimer: The team releasing MultiBERTs did not write a model card for this model so this model card has been written by gchhablani. ## Model description MultiBERTs models are transformers model pretrained on a large corpus of English data in a self-supervised fashion. This means it was pretrained on the raw texts only, with no humans labelling them in any way (which is why it can use lots of publicly available data) with an automatic process to generate inputs and labels from those texts. More precisely, it was pretrained with two objectives: - Masked language modeling (MLM): taking a sentence, the model randomly masks 15% of the words in the input then run the entire masked sentence through the model and has to predict the masked words. This is different from traditional recurrent neural networks (RNNs) that usually see the words one after the other, or from autoregressive models like GPT which internally mask the future tokens. It allows the model to learn a bidirectional representation of the sentence. - Next sentence prediction (NSP): the models concatenates two masked sentences as inputs during pretraining. Sometimes they correspond to sentences that were next to each other in the original text, sometimes not. The model then has to predict if the two sentences were following each other or not. This way, the model learns an inner representation of the English language that can then be used to extract features useful for downstream tasks: if you have a dataset of labeled sentences for instance, you can train a standard classifier using the features produced by the MultiBERTs model as inputs. ## Intended uses & limitations You can use the raw model for either masked language modeling or next sentence prediction, but it's mostly intended to be fine-tuned on a downstream task. See the model hub to look for fine-tuned versions on a task that interests you. Note that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked) to make decisions, such as sequence classification, token classification or question answering. For tasks such as text generation you should look at model like GPT2. ### How to use Here is how to use this model to get the features of a given text in PyTorch: ### Limitations and bias Even if the training data used for this model could be characterized as fairly neutral, this model can have biased predictions. This bias will also affect all fine-tuned versions of this model. For an understanding of bias of this particular checkpoint, please try out this checkpoint with the snippet present in the Limitation and bias section of the bert-base-uncased checkpoint. ## Training data The MultiBERTs models were pretrained on BookCorpus, a dataset consisting of 11,038 unpublished books and English Wikipedia (excluding lists, tables and headers). ## Training procedure ### Preprocessing The texts are lowercased and tokenized using WordPiece and a vocabulary size of 30,000. 
The inputs of the model are then of the form: With probability 0.5, sentence A and sentence B correspond to two consecutive sentences in the original corpus and in the other cases, it's another random sentence in the corpus. Note that what is considered a sentence here is a consecutive span of text usually longer than a single sentence. The only constrain is that the result with the two "sentences" has a combined length of less than 512 tokens. The details of the masking procedure for each sentence are the following: - 15% of the tokens are masked. - In 80% of the cases, the masked tokens are replaced by '[MASK]'. - In 10% of the cases, the masked tokens are replaced by a random token (different) from the one they replace. - In the 10% remaining cases, the masked tokens are left as is. ### Pretraining The full model was trained on 16 Cloud TPU v2 chips for two million steps with a batch size of 256. The sequence length was set to 512 throughout. The optimizer used is Adam with a learning rate of 1e-4, \\(\beta_{1} = 0.9\\) and \\(\beta_{2} = 0.999\\), a weight decay of 0.01, learning rate warmup for 10,000 steps and linear decay of the learning rate after. ### BibTeX entry and citation info <a href="URL <img width="300px" src="URL </a>
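The card above also describes the NSP objective. As a small, hedged example of exercising that head, the sketch below loads the checkpoint as a `BertForNextSentencePrediction` model; the short checkpoint identifier mirrors the name used elsewhere in the card, and the two example sentences are made up.

```python
import torch
from transformers import BertTokenizer, BertForNextSentencePrediction

tokenizer = BertTokenizer.from_pretrained('multiberts-seed-4-120k')
model = BertForNextSentencePrediction.from_pretrained('multiberts-seed-4-120k')
model.eval()

sentence_a = "The children walked to the park."
sentence_b = "They played on the swings for an hour."
inputs = tokenizer(sentence_a, sentence_b, return_tensors='pt')

with torch.no_grad():
    logits = model(**inputs).logits

# Class 0 means "sentence B follows sentence A"; class 1 means "random sentence".
probs = torch.softmax(logits, dim=-1)
print(f"P(B follows A) = {probs[0, 0].item():.3f}")
```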
[ "# MultiBERTs Seed 4 Checkpoint 120k (uncased)\nSeed 4 intermediate checkpoint 120k MultiBERTs (pretrained BERT) model on English language using a masked language modeling (MLM) objective. It was introduced in\nthis paper and first released in\nthis repository. This is an intermediate checkpoint.\nThe final checkpoint can be found at multiberts-seed-4. This model is uncased: it does not make a difference\nbetween english and English.\n\nDisclaimer: The team releasing MultiBERTs did not write a model card for this model so this model card has been written by gchhablani.", "## Model description\nMultiBERTs models are transformers model pretrained on a large corpus of English data in a self-supervised fashion. This means it\nwas pretrained on the raw texts only, with no humans labelling them in any way (which is why it can use lots of\npublicly available data) with an automatic process to generate inputs and labels from those texts. More precisely, it\nwas pretrained with two objectives:\n- Masked language modeling (MLM): taking a sentence, the model randomly masks 15% of the words in the input then run\n the entire masked sentence through the model and has to predict the masked words. This is different from traditional\n recurrent neural networks (RNNs) that usually see the words one after the other, or from autoregressive models like\n GPT which internally mask the future tokens. It allows the model to learn a bidirectional representation of the\n sentence.\n- Next sentence prediction (NSP): the models concatenates two masked sentences as inputs during pretraining. Sometimes\n they correspond to sentences that were next to each other in the original text, sometimes not. The model then has to\n predict if the two sentences were following each other or not.\nThis way, the model learns an inner representation of the English language that can then be used to extract features\nuseful for downstream tasks: if you have a dataset of labeled sentences for instance, you can train a standard\nclassifier using the features produced by the MultiBERTs model as inputs.", "## Intended uses & limitations\nYou can use the raw model for either masked language modeling or next sentence prediction, but it's mostly intended to\nbe fine-tuned on a downstream task. See the model hub to look for\nfine-tuned versions on a task that interests you.\nNote that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked)\nto make decisions, such as sequence classification, token classification or question answering. For tasks such as text\ngeneration you should look at model like GPT2.", "### How to use\nHere is how to use this model to get the features of a given text in PyTorch:", "### Limitations and bias\nEven if the training data used for this model could be characterized as fairly neutral, this model can have biased\npredictions. This bias will also affect all fine-tuned versions of this model. For an understanding of bias of this particular\ncheckpoint, please try out this checkpoint with the snippet present in the Limitation and bias section of the bert-base-uncased checkpoint.", "## Training data\nThe MultiBERTs models were pretrained on BookCorpus, a dataset consisting of 11,038\nunpublished books and English Wikipedia (excluding lists, tables and\nheaders).", "## Training procedure", "### Preprocessing\nThe texts are lowercased and tokenized using WordPiece and a vocabulary size of 30,000. 
The inputs of the model are\nthen of the form:\n\nWith probability 0.5, sentence A and sentence B correspond to two consecutive sentences in the original corpus and in\nthe other cases, it's another random sentence in the corpus. Note that what is considered a sentence here is a\nconsecutive span of text usually longer than a single sentence. The only constrain is that the result with the two\n\"sentences\" has a combined length of less than 512 tokens.\nThe details of the masking procedure for each sentence are the following:\n- 15% of the tokens are masked.\n- In 80% of the cases, the masked tokens are replaced by '[MASK]'.\n- In 10% of the cases, the masked tokens are replaced by a random token (different) from the one they replace.\n- In the 10% remaining cases, the masked tokens are left as is.", "### Pretraining\nThe full model was trained on 16 Cloud TPU v2 chips for two million steps with a batch size\nof 256. The sequence length was set to 512 throughout. The optimizer\nused is Adam with a learning rate of 1e-4, \\\\(\\beta_{1} = 0.9\\\\) and \\\\(\\beta_{2} = 0.999\\\\), a weight decay of 0.01,\nlearning rate warmup for 10,000 steps and linear decay of the learning rate after.", "### BibTeX entry and citation info\n\n<a href=\"URL\n\t<img width=\"300px\" src=\"URL\n</a>" ]
[ "TAGS\n#transformers #pytorch #bert #pretraining #exbert #multiberts #multiberts-seed-4 #en #dataset-bookcorpus #dataset-wikipedia #arxiv-2106.16163 #license-apache-2.0 #endpoints_compatible #region-us \n", "# MultiBERTs Seed 4 Checkpoint 120k (uncased)\nSeed 4 intermediate checkpoint 120k MultiBERTs (pretrained BERT) model on English language using a masked language modeling (MLM) objective. It was introduced in\nthis paper and first released in\nthis repository. This is an intermediate checkpoint.\nThe final checkpoint can be found at multiberts-seed-4. This model is uncased: it does not make a difference\nbetween english and English.\n\nDisclaimer: The team releasing MultiBERTs did not write a model card for this model so this model card has been written by gchhablani.", "## Model description\nMultiBERTs models are transformers model pretrained on a large corpus of English data in a self-supervised fashion. This means it\nwas pretrained on the raw texts only, with no humans labelling them in any way (which is why it can use lots of\npublicly available data) with an automatic process to generate inputs and labels from those texts. More precisely, it\nwas pretrained with two objectives:\n- Masked language modeling (MLM): taking a sentence, the model randomly masks 15% of the words in the input then run\n the entire masked sentence through the model and has to predict the masked words. This is different from traditional\n recurrent neural networks (RNNs) that usually see the words one after the other, or from autoregressive models like\n GPT which internally mask the future tokens. It allows the model to learn a bidirectional representation of the\n sentence.\n- Next sentence prediction (NSP): the models concatenates two masked sentences as inputs during pretraining. Sometimes\n they correspond to sentences that were next to each other in the original text, sometimes not. The model then has to\n predict if the two sentences were following each other or not.\nThis way, the model learns an inner representation of the English language that can then be used to extract features\nuseful for downstream tasks: if you have a dataset of labeled sentences for instance, you can train a standard\nclassifier using the features produced by the MultiBERTs model as inputs.", "## Intended uses & limitations\nYou can use the raw model for either masked language modeling or next sentence prediction, but it's mostly intended to\nbe fine-tuned on a downstream task. See the model hub to look for\nfine-tuned versions on a task that interests you.\nNote that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked)\nto make decisions, such as sequence classification, token classification or question answering. For tasks such as text\ngeneration you should look at model like GPT2.", "### How to use\nHere is how to use this model to get the features of a given text in PyTorch:", "### Limitations and bias\nEven if the training data used for this model could be characterized as fairly neutral, this model can have biased\npredictions. This bias will also affect all fine-tuned versions of this model. 
For an understanding of bias of this particular\ncheckpoint, please try out this checkpoint with the snippet present in the Limitation and bias section of the bert-base-uncased checkpoint.", "## Training data\nThe MultiBERTs models were pretrained on BookCorpus, a dataset consisting of 11,038\nunpublished books and English Wikipedia (excluding lists, tables and\nheaders).", "## Training procedure", "### Preprocessing\nThe texts are lowercased and tokenized using WordPiece and a vocabulary size of 30,000. The inputs of the model are\nthen of the form:\n\nWith probability 0.5, sentence A and sentence B correspond to two consecutive sentences in the original corpus and in\nthe other cases, it's another random sentence in the corpus. Note that what is considered a sentence here is a\nconsecutive span of text usually longer than a single sentence. The only constrain is that the result with the two\n\"sentences\" has a combined length of less than 512 tokens.\nThe details of the masking procedure for each sentence are the following:\n- 15% of the tokens are masked.\n- In 80% of the cases, the masked tokens are replaced by '[MASK]'.\n- In 10% of the cases, the masked tokens are replaced by a random token (different) from the one they replace.\n- In the 10% remaining cases, the masked tokens are left as is.", "### Pretraining\nThe full model was trained on 16 Cloud TPU v2 chips for two million steps with a batch size\nof 256. The sequence length was set to 512 throughout. The optimizer\nused is Adam with a learning rate of 1e-4, \\\\(\\beta_{1} = 0.9\\\\) and \\\\(\\beta_{2} = 0.999\\\\), a weight decay of 0.01,\nlearning rate warmup for 10,000 steps and linear decay of the learning rate after.", "### BibTeX entry and citation info\n\n<a href=\"URL\n\t<img width=\"300px\" src=\"URL\n</a>" ]
[ 71, 145, 335, 134, 25, 95, 48, 3, 222, 111, 34 ]
[ "passage: TAGS\n#transformers #pytorch #bert #pretraining #exbert #multiberts #multiberts-seed-4 #en #dataset-bookcorpus #dataset-wikipedia #arxiv-2106.16163 #license-apache-2.0 #endpoints_compatible #region-us \n# MultiBERTs Seed 4 Checkpoint 120k (uncased)\nSeed 4 intermediate checkpoint 120k MultiBERTs (pretrained BERT) model on English language using a masked language modeling (MLM) objective. It was introduced in\nthis paper and first released in\nthis repository. This is an intermediate checkpoint.\nThe final checkpoint can be found at multiberts-seed-4. This model is uncased: it does not make a difference\nbetween english and English.\n\nDisclaimer: The team releasing MultiBERTs did not write a model card for this model so this model card has been written by gchhablani.", "passage: ## Model description\nMultiBERTs models are transformers model pretrained on a large corpus of English data in a self-supervised fashion. This means it\nwas pretrained on the raw texts only, with no humans labelling them in any way (which is why it can use lots of\npublicly available data) with an automatic process to generate inputs and labels from those texts. More precisely, it\nwas pretrained with two objectives:\n- Masked language modeling (MLM): taking a sentence, the model randomly masks 15% of the words in the input then run\n the entire masked sentence through the model and has to predict the masked words. This is different from traditional\n recurrent neural networks (RNNs) that usually see the words one after the other, or from autoregressive models like\n GPT which internally mask the future tokens. It allows the model to learn a bidirectional representation of the\n sentence.\n- Next sentence prediction (NSP): the models concatenates two masked sentences as inputs during pretraining. Sometimes\n they correspond to sentences that were next to each other in the original text, sometimes not. The model then has to\n predict if the two sentences were following each other or not.\nThis way, the model learns an inner representation of the English language that can then be used to extract features\nuseful for downstream tasks: if you have a dataset of labeled sentences for instance, you can train a standard\nclassifier using the features produced by the MultiBERTs model as inputs.## Intended uses & limitations\nYou can use the raw model for either masked language modeling or next sentence prediction, but it's mostly intended to\nbe fine-tuned on a downstream task. See the model hub to look for\nfine-tuned versions on a task that interests you.\nNote that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked)\nto make decisions, such as sequence classification, token classification or question answering. For tasks such as text\ngeneration you should look at model like GPT2.### How to use\nHere is how to use this model to get the features of a given text in PyTorch:### Limitations and bias\nEven if the training data used for this model could be characterized as fairly neutral, this model can have biased\npredictions. This bias will also affect all fine-tuned versions of this model. 
For an understanding of bias of this particular\ncheckpoint, please try out this checkpoint with the snippet present in the Limitation and bias section of the bert-base-uncased checkpoint.## Training data\nThe MultiBERTs models were pretrained on BookCorpus, a dataset consisting of 11,038\nunpublished books and English Wikipedia (excluding lists, tables and\nheaders).## Training procedure" ]
[ -0.08604766428470612, 0.0031776924151927233, -0.0021525193005800247, 0.07035258412361145, 0.08808592706918716, 0.003450831864029169, 0.11672850698232651, 0.048991065472364426, -0.03292688727378845, 0.02435285970568657, 0.09202884137630463, 0.02998543158173561, 0.04130177944898605, 0.0610014833509922, 0.09547246992588043, -0.2602527141571045, 0.04888032749295235, -0.062128230929374695, 0.05301570147275925, 0.07545556128025055, 0.09765519201755524, -0.0712762176990509, 0.06323254108428955, 0.03560487926006317, -0.08452343940734863, -0.015716496855020523, -0.016666272655129433, -0.033845268189907074, 0.10177785158157349, 0.07022804021835327, 0.062142230570316315, 0.001027798280119896, 0.0583992525935173, -0.08668877184391022, 0.016686145216226578, 0.044717323035001755, -0.002874604891985655, 0.025302954018115997, -0.008099934086203575, 0.016450688242912292, 0.10839584469795227, 0.03699129447340965, 0.07558033615350723, 0.03423130884766579, -0.09567780792713165, -0.11825979501008987, -0.08059476315975189, 0.10060043632984161, 0.05035431310534477, 0.043375164270401, -0.006485040299594402, 0.07331370562314987, -0.031768523156642914, 0.07430917024612427, 0.10616081953048706, -0.2616969645023346, -0.008768048137426376, 0.06547493487596512, 0.04574771970510483, 0.043001964688301086, 0.009134621359407902, 0.02705351449549198, 0.0062315501272678375, 0.04471105337142944, 0.0275590680539608, -0.024128910154104233, 0.11932852864265442, -0.04314708337187767, -0.15076836943626404, -0.040833815932273865, 0.11952903866767883, -0.005377499386668205, -0.12505844235420227, -0.09752268344163895, -0.030588669702410698, 0.1158360168337822, -0.003709162585437298, -0.015266191214323044, -0.003285575658082962, 0.009667554870247841, 0.023918211460113525, -0.09094972908496857, -0.08581449091434479, -0.026624051854014397, -0.036374859511852264, 0.12615853548049927, 0.047390416264534, 0.051420439034700394, -0.0348813496530056, 0.08587907254695892, -0.11775879561901093, -0.03949308395385742, -0.05117235332727432, -0.08302229642868042, -0.018384059891104698, 0.011499449610710144, -0.028665754944086075, -0.07778041064739227, -0.05831087380647659, 0.11925561726093292, 0.032409410923719406, 0.03143917769193649, -0.000005098525434732437, 0.03967515379190445, 0.07029382139444351, 0.09404534101486206, -0.03981451317667961, 0.04918261244893074, 0.034090686589479446, -0.02311912178993225, 0.0561998151242733, -0.050406698137521744, -0.10182639956474304, 0.07570941746234894, -0.0012782756239175797, 0.03817719593644142, 0.022152725607156754, 0.036589495837688446, -0.009050611406564713, -0.07084403932094574, 0.1679534912109375, -0.07880322635173798, -0.00986291654407978, -0.015637025237083435, 0.010862194001674652, 0.04789050668478012, 0.03274379298090935, -0.008919385261833668, -0.0483722910284996, -0.00752854160964489, -0.055931925773620605, -0.027798056602478027, -0.05642615258693695, -0.12034322321414948, -0.0009534931741654873, -0.04208831489086151, -0.03134681656956673, -0.1400250494480133, -0.21375900506973267, -0.02095133811235428, 0.06374058127403259, -0.002152177970856428, -0.009520730935037136, 0.024609990417957306, 0.014160925522446632, -0.020435625687241554, 0.010762599296867847, -0.04555892199277878, -0.0009419899433851242, -0.006998999044299126, -0.02976125106215477, 0.05823611468076706, -0.04250200837850571, 0.0234907828271389, -0.0677039846777916, 0.021304262802004814, -0.21063248813152313, 0.08669737726449966, -0.033195577561855316, 0.002206169068813324, -0.03698362410068512, -0.046040475368499756, 
0.01100064441561699, 0.04685856029391289, -0.009966571815311909, 0.11611758172512054, -0.13523371517658234, -0.05059532821178436, 0.18409310281276703, -0.15781298279762268, -0.0008409135043621063, 0.10092698037624359, -0.04958932846784592, 0.0536387637257576, 0.13454709947109222, 0.09696058928966522, 0.08569261431694031, -0.07391989976167679, 0.009699588641524315, 0.06075020506978035, -0.06731633841991425, 0.05499720573425293, 0.08882662653923035, -0.023283693939447403, -0.13678212463855743, 0.03013167902827263, -0.07221730053424835, -0.00920243188738823, -0.025026921182870865, -0.019576959311962128, 0.008053291589021683, -0.03815850242972374, 0.030350327491760254, 0.0070742531679570675, 0.01809806190431118, -0.040646493434906006, -0.08407539129257202, 0.03159549832344055, 0.07523958384990692, -0.07345064729452133, 0.04086076840758324, -0.07141866534948349, 0.06319225579500198, -0.07599884271621704, -0.004667757079005241, -0.16781330108642578, -0.02728862874209881, 0.04266751557588577, -0.04542143642902374, 0.05079755559563637, 0.09459725022315979, 0.005174359772354364, 0.12390430271625519, -0.0410185381770134, 0.0031263637356460094, -0.006087899208068848, -0.01058266032487154, -0.05021194368600845, -0.12278047204017639, -0.08184735476970673, -0.06816151738166809, 0.10340844094753265, -0.07897030562162399, 0.028970494866371155, -0.06919007003307343, -0.02123124711215496, -0.009303811937570572, -0.05667649954557419, -0.0045756930485367775, 0.00994128081947565, -0.027382180094718933, -0.04672729969024658, 0.04989815503358841, 0.05026630312204361, -0.05705363303422928, 0.07702333480119705, -0.10573913156986237, -0.060097064822912216, 0.05682945251464844, 0.015149291604757309, -0.0808631181716919, 0.09042983502149582, -0.01963033154606819, -0.013253947719931602, -0.05593065172433853, -0.043152544647455215, 0.19334296882152557, -0.025316961109638214, 0.10183356702327728, -0.09111548960208893, 0.0020934147760272026, 0.028619838878512383, -0.04490920901298523, -0.013221724890172482, 0.06107623130083084, 0.04944005608558655, -0.1863052397966385, 0.015387680381536484, 0.04944636672735214, 0.07564936578273773, 0.1137501448392868, 0.025711776688694954, -0.024195725098252296, -0.04529903829097748, -0.01164680439978838, 0.005962949246168137, 0.056915685534477234, -0.024133604019880295, -0.009183692745864391, 0.03144153207540512, 0.05657067894935608, 0.017725102603435516, -0.08103759586811066, 0.03574950620532036, 0.06494924426078796, -0.016190504655241966, -0.043150778859853745, -0.023815685883164406, -0.059995103627443314, 0.062035076320171356, 0.05247084051370621, 0.03456912189722061, 0.02623290941119194, -0.014167867600917816, -0.1344396471977234, 0.18945255875587463, -0.11507223546504974, -0.25506162643432617, -0.1084652692079544, -0.06012706458568573, -0.025127818807959557, 0.040417660027742386, 0.05826728045940399, -0.03247666358947754, -0.04280100762844086, -0.11405567824840546, 0.05978907644748688, -0.06361234933137894, -0.02963051199913025, -0.013988083228468895, -0.051614731550216675, -0.018772199749946594, -0.12679484486579895, -0.011098174378275871, -0.030243657529354095, -0.07425304502248764, 0.00762482825666666, -0.03599732369184494, 0.02888546884059906, 0.13525469601154327, 0.03458157926797867, -0.01988931931555271, -0.01780569553375244, 0.1941213756799698, 0.009068354964256287, 0.06099986657500267, 0.11425388604402542, -0.028398096561431885, 0.053959596902132034, 0.04485659301280975, 0.025474056601524353, -0.04831904172897339, 0.011730070225894451, -0.013642325066030025, 
-0.11976327002048492, -0.17637234926223755, -0.06889314204454422, -0.004494454246014357, 0.00541185587644577, 0.019629010930657387, 0.03575895354151726, 0.021222438663244247, 0.04036472737789154, -0.030045336112380028, 0.03103133849799633, -0.013627994805574417, 0.0798511803150177, 0.03085958957672119, -0.07367867976427078, 0.0928991436958313, -0.06315656006336212, 0.015060637146234512, 0.10836990177631378, -0.05975005403161049, 0.19108283519744873, 0.024110905826091766, 0.06031559407711029, 0.10083405673503876, 0.022006575018167496, 0.05581153929233551, 0.08972665667533875, -0.047896504402160645, 0.005413374863564968, -0.060653697699308395, -0.05177877098321915, -0.0338326022028923, 0.048541851341724396, 0.028576117008924484, 0.01567036658525467, -0.11969006061553955, 0.018642112612724304, -0.0032668234780430794, 0.13560007512569427, 0.04958905652165413, -0.12406480312347412, -0.12383068352937698, 0.032742664217948914, -0.04444561153650284, -0.06281489133834839, 0.030319396406412125, 0.05947939679026604, -0.15237922966480255, 0.046320848166942596, -0.00670391321182251, 0.06662895530462265, -0.0934126079082489, 0.01687844656407833, -0.04256429523229599, 0.0006521595641970634, 0.004346439149230719, 0.07088946551084518, -0.13102290034294128, 0.10319934785366058, 0.021191351115703583, 0.04950631782412529, -0.07987311482429504, 0.015291713178157806, -0.008610814809799194, 0.11005353182554245, 0.11471360176801682, 0.0425652414560318, -0.054432809352874756, -0.016126591712236404, -0.04474499076604843, 0.022196084260940552, 0.060242660343647, -0.07556959241628647, 0.0634038895368576, 0.008849244564771652, 0.0056441351771354675, -0.023230066522955894, 0.010920733213424683, -0.13276943564414978, -0.12365128844976425, 0.0605204775929451, -0.07749273627996445, -0.09741847962141037, -0.05490565299987793, -0.06446986645460129, -0.05359108746051788, 0.2094806432723999, -0.11160654574632645, -0.09008796513080597, -0.09865814447402954, -0.015388041734695435, 0.04696062207221985, -0.06559835374355316, 0.04684114828705788, -0.03716142103075981, 0.09163869917392731, -0.04541819542646408, -0.11018162965774536, 0.03545558452606201, -0.112232506275177, -0.1146487221121788, -0.04322623461484909, 0.1056661307811737, 0.11446176469326019, 0.03720899298787117, 0.011741723865270615, 0.01220022514462471, -0.0014913491904735565, -0.11760294437408447, 0.015226325020194054, 0.12685513496398926, 0.002125171944499016, 0.07088927179574966, -0.062208689749240875, 0.029960237443447113, -0.015146175399422646, 0.0023132245987653732, 0.12957948446273804, 0.1830526739358902, -0.06340788304805756, 0.17267479002475739, 0.20008111000061035, -0.10438203066587448, -0.19098053872585297, -0.054191671311855316, -0.0008052047342061996, 0.046485818922519684, 0.04825303703546524, -0.1870468556880951, 0.09145067632198334, 0.03130844235420227, -0.030552471056580544, 0.024454724043607712, -0.22860980033874512, -0.10978308320045471, 0.08837130665779114, 0.05590885505080223, 0.1894766390323639, -0.08209315687417984, -0.03908862918615341, -0.017458457499742508, -0.04074942320585251, 0.04578445851802826, -0.04528852552175522, 0.09179840236902237, 0.0059976838529109955, -0.02746731787919998, 0.0009587695822119713, -0.030924145132303238, 0.09581204503774643, 0.0411686897277832, 0.022563813254237175, -0.0710810050368309, -0.005259035155177116, 0.11124995350837708, -0.03852458670735359, 0.0983789786696434, 0.03995700553059578, 0.0744696706533432, -0.09916198253631592, -0.05921337753534317, -0.07698454707860947, 0.04401494935154915, 
-0.042006149888038635, -0.05299947038292885, -0.06293043494224548, 0.05839269608259201, 0.03572898730635643, 0.011360032483935356, -0.000594092532992363, -0.03800288960337639, 0.04592779278755188, 0.08770865947008133, 0.08249028772115707, -0.02893950417637825, -0.07625214755535126, -0.05046232044696808, -0.04781545698642731, 0.06788493692874908, -0.09580879658460617, 0.017358584329485893, 0.02797696925699711, 0.010637488216161728, 0.08977615833282471, 0.03403395414352417, -0.137279212474823, 0.01113166008144617, 0.03427855670452118, -0.12535414099693298, -0.10747593641281128, -0.018513288348913193, 0.02241688221693039, -0.03720738738775253, 0.05756023898720741, 0.14809362590312958, -0.035961851477622986, -0.03143538907170296, -0.04786434769630432, 0.03683694079518318, -0.020336057990789413, 0.05042807012796402, 0.06679104268550873, 0.030595893040299416, -0.07398205995559692, 0.07277519255876541, 0.03836759924888611, -0.038909971714019775, 0.041697628796100616, 0.04314814507961273, -0.09696143865585327, -0.07968764007091522, -0.057776834815740585, 0.092615507543087, -0.0221091341227293, -0.049513425678014755, -0.0032423678785562515, -0.08179299533367157, 0.06876921653747559, 0.07509761303663254, 0.04750273376703262, 0.036667581647634506, -0.08648651838302612, 0.015103849582374096, -0.0550224632024765, 0.03478096425533295, -0.031233834102749825, -0.003994902595877647, -0.05661626160144806, 0.06329631805419922, 0.06354217976331711, 0.09845555573701859, -0.03413611650466919, -0.07429903745651245, -0.08485843986272812, -0.01432795450091362, -0.06686116755008698, -0.03221234306693077, -0.07400701940059662, -0.004906599409878254, 0.00039789360016584396, -0.0012521501630544662, 0.022642910480499268, 0.0365137904882431, -0.04466431587934494, -0.01791938580572605, -0.033294662833213806, 0.03814681991934776, -0.06215808540582657, 0.007420430891215801, 0.015070288442075253, -0.037242744117975235, 0.09321150928735733, 0.0369623526930809, -0.012077972292900085, 0.04842963069677353, -0.019450193271040916, 0.03360627964138985, -0.020461292937397957, -0.00019295886158943176, -0.02281441166996956, -0.10822831839323044, -0.004308539442718029, 0.003220638260245323, -0.022813234478235245, 0.010143210180103779, 0.05795866996049881, -0.07184626162052155, 0.08781807124614716, 0.0463826060295105, -0.02855682373046875, -0.0719149187207222, 0.04085516557097435, -0.017283523455262184, 0.028655659407377243, 0.06972527503967285, -0.033695172518491745, 0.052679479122161865, -0.0989340990781784, -0.029620159417390823, 0.0035359645262360573, -0.00721697136759758, -0.008716786280274391, -0.05326637625694275, -0.0015524523332715034, 0.005916678346693516, 0.17306636273860931, -0.02390648052096367, 0.03962390124797821, 0.011827755719423294, 0.00795917771756649, 0.04504498466849327, -0.012914618477225304, 0.07058382034301758, -0.006804347969591618, -0.02603050507605076, -0.01409817487001419, 0.03774094581604004, 0.002590099349617958, 0.005127608776092529, 0.13949047029018402, 0.04463344067335129, 0.09125497192144394, 0.07569830119609833, 0.018593544140458107, 0.017391666769981384, -0.1330333650112152, -0.08745791018009186, 0.0073847537860274315, 0.060232844203710556, -0.01890818029642105, 0.013685230165719986, 0.09324084967374802, -0.08522763848304749, 0.07117404043674469, 0.04671574383974075, -0.04938776046037674, -0.12537401914596558, -0.19025643169879913, -0.02453559637069702, -0.030726885423064232, -0.010285278782248497, -0.09077649563550949, 0.014769472181797028, 0.09583073109388351, 0.02560325711965561, 
-0.010401546955108643, 0.09561438858509064, -0.10188203305006027, -0.031526513397693634, 0.04415596276521683, -0.027509063482284546, 0.014377167448401451, 0.051331035792827606, 0.02497968077659607, -0.004411853849887848, 0.04076283052563667, 0.03989219665527344, 0.04302256554365158, 0.027139756828546524, 0.05341637134552002, -0.023660942912101746, -0.07312658429145813, -0.03243539482355118, 0.0006425054743885994, 0.05453871190547943, 0.13780945539474487, 0.022736193612217903, -0.07013038545846939, 0.0061505017802119255, 0.10734448581933975, -0.03251092508435249, -0.05053205043077469, -0.10913436859846115, 0.24017715454101562, 0.02181425131857395, 0.0016924738883972168, -0.006021629087626934, -0.046103063970804214, 0.00724010169506073, 0.21302929520606995, 0.22552314400672913, 0.003963469993323088, -0.00972546823322773, 0.011150508187711239, -0.011261297389864922, 0.03804289549589157, 0.14639322459697723, 0.005964761599898338, 0.25344032049179077, -0.04789416491985321, 0.03522448241710663, -0.041983477771282196, -0.03824353218078613, -0.10109151899814606, 0.07070674002170563, -0.009543508291244507, 0.007010856177657843, -0.02974342741072178, 0.07240777462720871, -0.03926899656653404, -0.1786244660615921, 0.002728832885622978, 0.0004887244431301951, -0.06351056694984436, 0.009734384715557098, -0.0003248043358325958, 0.021773071959614754, 0.08363831043243408, -0.018770404160022736, -0.007919395342469215, 0.13504363596439362, 0.01825510524213314, -0.09809094667434692, -0.0582609698176384, 0.1151973158121109, 0.01533243153244257, 0.13630031049251556, 0.010476954281330109, 0.08122404664754868, 0.08826334774494171, 0.021284013986587524, -0.09439544379711151, 0.04185759276151657, -0.01938386633992195, -0.032494574785232544, 0.007599535398185253, 0.10840654373168945, -0.008215137757360935, 0.05684004724025726, 0.025030476972460747, -0.08656143397092819, 0.061744146049022675, 0.013366296887397766, -0.034353021532297134, -0.08168896287679672, 0.08557456731796265, -0.09169988334178925, 0.1569264978170395, 0.1253659725189209, -0.013756250031292439, -0.04513426125049591, -0.029103552922606468, 0.01891026273369789, 0.0008567017503082752, 0.04899512231349945, -0.026914164423942566, -0.13214902579784393, 0.018942344933748245, -0.08248262107372284, 0.027036119252443314, -0.2508842349052429, -0.08697179704904556, 0.029843566939234734, -0.017889872193336487, -0.01930675283074379, 0.050583723932504654, 0.04627284035086632, 0.027195829898118973, -0.03652951866388321, 0.02126551792025566, -0.03888135403394699, 0.0585845485329628, -0.1121683418750763, -0.09544908255338669 ]
null
null
transformers
# MultiBERTs Seed 4 Checkpoint 1300k (uncased) Seed 4 intermediate checkpoint 1300k MultiBERTs (pretrained BERT) model on English language using a masked language modeling (MLM) objective. It was introduced in [this paper](https://arxiv.org/pdf/2106.16163.pdf) and first released in [this repository](https://github.com/google-research/language/tree/master/language/multiberts). This is an intermediate checkpoint. The final checkpoint can be found at [multiberts-seed-4](https://hf.co/multberts-seed-4). This model is uncased: it does not make a difference between english and English. Disclaimer: The team releasing MultiBERTs did not write a model card for this model so this model card has been written by [gchhablani](https://hf.co/gchhablani). ## Model description MultiBERTs models are transformers model pretrained on a large corpus of English data in a self-supervised fashion. This means it was pretrained on the raw texts only, with no humans labelling them in any way (which is why it can use lots of publicly available data) with an automatic process to generate inputs and labels from those texts. More precisely, it was pretrained with two objectives: - Masked language modeling (MLM): taking a sentence, the model randomly masks 15% of the words in the input then run the entire masked sentence through the model and has to predict the masked words. This is different from traditional recurrent neural networks (RNNs) that usually see the words one after the other, or from autoregressive models like GPT which internally mask the future tokens. It allows the model to learn a bidirectional representation of the sentence. - Next sentence prediction (NSP): the models concatenates two masked sentences as inputs during pretraining. Sometimes they correspond to sentences that were next to each other in the original text, sometimes not. The model then has to predict if the two sentences were following each other or not. This way, the model learns an inner representation of the English language that can then be used to extract features useful for downstream tasks: if you have a dataset of labeled sentences for instance, you can train a standard classifier using the features produced by the MultiBERTs model as inputs. ## Intended uses & limitations You can use the raw model for either masked language modeling or next sentence prediction, but it's mostly intended to be fine-tuned on a downstream task. See the [model hub](https://huggingface.co/models?filter=multiberts) to look for fine-tuned versions on a task that interests you. Note that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked) to make decisions, such as sequence classification, token classification or question answering. For tasks such as text generation you should look at model like GPT2. ### How to use Here is how to use this model to get the features of a given text in PyTorch: ```python from transformers import BertTokenizer, BertModel tokenizer = BertTokenizer.from_pretrained('multiberts-seed-4-1300k') model = BertModel.from_pretrained("multiberts-seed-4-1300k") text = "Replace me by any text you'd like." encoded_input = tokenizer(text, return_tensors='pt') output = model(**encoded_input) ``` ### Limitations and bias Even if the training data used for this model could be characterized as fairly neutral, this model can have biased predictions. This bias will also affect all fine-tuned versions of this model. 
For an understanding of bias of this particular checkpoint, please try out this checkpoint with the snippet present in the [Limitation and bias section](https://huggingface.co/bert-base-uncased#limitations-and-bias) of the [bert-base-uncased](https://huggingface.co/bert-base-uncased) checkpoint. ## Training data The MultiBERTs models were pretrained on [BookCorpus](https://yknzhu.wixsite.com/mbweb), a dataset consisting of 11,038 unpublished books and [English Wikipedia](https://en.wikipedia.org/wiki/English_Wikipedia) (excluding lists, tables and headers). ## Training procedure ### Preprocessing The texts are lowercased and tokenized using WordPiece and a vocabulary size of 30,000. The inputs of the model are then of the form: ``` [CLS] Sentence A [SEP] Sentence B [SEP] ``` With probability 0.5, sentence A and sentence B correspond to two consecutive sentences in the original corpus and in the other cases, it's another random sentence in the corpus. Note that what is considered a sentence here is a consecutive span of text usually longer than a single sentence. The only constraint is that the result with the two "sentences" has a combined length of less than 512 tokens. The details of the masking procedure for each sentence are the following: - 15% of the tokens are masked. - In 80% of the cases, the masked tokens are replaced by `[MASK]`. - In 10% of the cases, the masked tokens are replaced by a random token (different from the one they replace). - In the remaining 10% of cases, the masked tokens are left as is. ### Pretraining The full model was trained on 16 Cloud TPU v2 chips for two million steps with a batch size of 256. The sequence length was set to 512 throughout. The optimizer used is Adam with a learning rate of 1e-4, \\(\beta_{1} = 0.9\\) and \\(\beta_{2} = 0.999\\), a weight decay of 0.01, learning rate warmup for 10,000 steps and linear decay of the learning rate after. ### BibTeX entry and citation info ```bibtex @article{DBLP:journals/corr/abs-2106-16163, author = {Thibault Sellam and Steve Yadlowsky and Jason Wei and Naomi Saphra and Alexander D'Amour and Tal Linzen and Jasmijn Bastings and Iulia Turc and Jacob Eisenstein and Dipanjan Das and Ian Tenney and Ellie Pavlick}, title = {The MultiBERTs: {BERT} Reproductions for Robustness Analysis}, journal = {CoRR}, volume = {abs/2106.16163}, year = {2021}, url = {https://arxiv.org/abs/2106.16163}, eprinttype = {arXiv}, eprint = {2106.16163}, timestamp = {Mon, 05 Jul 2021 15:15:50 +0200}, biburl = {https://dblp.org/rec/journals/corr/abs-2106-16163.bib}, bibsource = {dblp computer science bibliography, https://dblp.org} } ``` <a href="https://huggingface.co/exbert/?model=multiberts"> <img width="300px" src="https://cdn-media.huggingface.co/exbert/button.png"> </a>
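Beyond the feature-extraction snippet in the card, the raw checkpoint can also be exercised as a masked language model, as the intended-uses section notes. The sketch below is an assumption about how one might do that with `BertForMaskedLM`; the short checkpoint name mirrors the card's own snippet, and loading from the Hub may require the full repository id (e.g. `MultiBertGunjanPatrick/multiberts-seed-4-1300k`).

```python
import torch
from transformers import BertTokenizer, BertForMaskedLM

# Hedged sketch: predict the token behind [MASK] with this checkpoint.
# The checkpoint name follows the card's snippet; the full Hub id may be needed.
tokenizer = BertTokenizer.from_pretrained("multiberts-seed-4-1300k")
model = BertForMaskedLM.from_pretrained("multiberts-seed-4-1300k")

text = "The capital of France is [MASK]."
inputs = tokenizer(text, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

# Locate the [MASK] position and take the highest-scoring token there.
mask_positions = (inputs["input_ids"] == tokenizer.mask_token_id).nonzero(as_tuple=True)[1]
predicted_ids = logits[0, mask_positions].argmax(dim=-1)
print(tokenizer.decode(predicted_ids))
```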
{"language": "en", "license": "apache-2.0", "tags": ["exbert", "multiberts", "multiberts-seed-4"], "datasets": ["bookcorpus", "wikipedia"]}
null
MultiBertGunjanPatrick/multiberts-seed-4-1300k
[ "transformers", "pytorch", "bert", "pretraining", "exbert", "multiberts", "multiberts-seed-4", "en", "dataset:bookcorpus", "dataset:wikipedia", "arxiv:2106.16163", "license:apache-2.0", "endpoints_compatible", "region:us" ]
2022-03-02T23:29:04+00:00
[ "2106.16163" ]
[ "en" ]
TAGS #transformers #pytorch #bert #pretraining #exbert #multiberts #multiberts-seed-4 #en #dataset-bookcorpus #dataset-wikipedia #arxiv-2106.16163 #license-apache-2.0 #endpoints_compatible #region-us
# MultiBERTs Seed 4 Checkpoint 1300k (uncased) Seed 4 intermediate checkpoint 1300k MultiBERTs (pretrained BERT) model on English language using a masked language modeling (MLM) objective. It was introduced in this paper and first released in this repository. This is an intermediate checkpoint. The final checkpoint can be found at multiberts-seed-4. This model is uncased: it does not make a difference between english and English. Disclaimer: The team releasing MultiBERTs did not write a model card for this model so this model card has been written by gchhablani. ## Model description MultiBERTs models are transformers model pretrained on a large corpus of English data in a self-supervised fashion. This means it was pretrained on the raw texts only, with no humans labelling them in any way (which is why it can use lots of publicly available data) with an automatic process to generate inputs and labels from those texts. More precisely, it was pretrained with two objectives: - Masked language modeling (MLM): taking a sentence, the model randomly masks 15% of the words in the input then run the entire masked sentence through the model and has to predict the masked words. This is different from traditional recurrent neural networks (RNNs) that usually see the words one after the other, or from autoregressive models like GPT which internally mask the future tokens. It allows the model to learn a bidirectional representation of the sentence. - Next sentence prediction (NSP): the models concatenates two masked sentences as inputs during pretraining. Sometimes they correspond to sentences that were next to each other in the original text, sometimes not. The model then has to predict if the two sentences were following each other or not. This way, the model learns an inner representation of the English language that can then be used to extract features useful for downstream tasks: if you have a dataset of labeled sentences for instance, you can train a standard classifier using the features produced by the MultiBERTs model as inputs. ## Intended uses & limitations You can use the raw model for either masked language modeling or next sentence prediction, but it's mostly intended to be fine-tuned on a downstream task. See the model hub to look for fine-tuned versions on a task that interests you. Note that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked) to make decisions, such as sequence classification, token classification or question answering. For tasks such as text generation you should look at model like GPT2. ### How to use Here is how to use this model to get the features of a given text in PyTorch: ### Limitations and bias Even if the training data used for this model could be characterized as fairly neutral, this model can have biased predictions. This bias will also affect all fine-tuned versions of this model. For an understanding of bias of this particular checkpoint, please try out this checkpoint with the snippet present in the Limitation and bias section of the bert-base-uncased checkpoint. ## Training data The MultiBERTs models were pretrained on BookCorpus, a dataset consisting of 11,038 unpublished books and English Wikipedia (excluding lists, tables and headers). ## Training procedure ### Preprocessing The texts are lowercased and tokenized using WordPiece and a vocabulary size of 30,000. 
The inputs of the model are then of the form: With probability 0.5, sentence A and sentence B correspond to two consecutive sentences in the original corpus and in the other cases, it's another random sentence in the corpus. Note that what is considered a sentence here is a consecutive span of text usually longer than a single sentence. The only constraint is that the result with the two "sentences" has a combined length of less than 512 tokens. The details of the masking procedure for each sentence are the following: - 15% of the tokens are masked. - In 80% of the cases, the masked tokens are replaced by '[MASK]'. - In 10% of the cases, the masked tokens are replaced by a random token (different from the one they replace). - In the remaining 10% of cases, the masked tokens are left as is. ### Pretraining The full model was trained on 16 Cloud TPU v2 chips for two million steps with a batch size of 256. The sequence length was set to 512 throughout. The optimizer used is Adam with a learning rate of 1e-4, \\(\beta_{1} = 0.9\\) and \\(\beta_{2} = 0.999\\), a weight decay of 0.01, learning rate warmup for 10,000 steps and linear decay of the learning rate after. ### BibTeX entry and citation info <a href="URL <img width="300px" src="URL </a>
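The pretraining schedule above (learning-rate warmup for 10,000 steps, then linear decay over the two-million-step run) can be written down as a simple function. This is a sketch of the stated schedule shape under the assumption that the decay targets zero at the final step; it is not taken from the authors' training code.

```python
def lr_at_step(step, peak_lr=1e-4, warmup_steps=10_000, total_steps=2_000_000):
    """Linear warmup to peak_lr, then linear decay to zero (illustrative sketch)."""
    if step < warmup_steps:
        return peak_lr * step / warmup_steps
    remaining = max(total_steps - step, 0)
    return peak_lr * remaining / (total_steps - warmup_steps)

print(lr_at_step(5_000))      # mid-warmup: half of the peak rate
print(lr_at_step(10_000))     # end of warmup: peak rate (1e-4)
print(lr_at_step(1_005_000))  # roughly halfway through the decay
```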
[ "# MultiBERTs Seed 4 Checkpoint 1300k (uncased)\nSeed 4 intermediate checkpoint 1300k MultiBERTs (pretrained BERT) model on English language using a masked language modeling (MLM) objective. It was introduced in\nthis paper and first released in\nthis repository. This is an intermediate checkpoint.\nThe final checkpoint can be found at multiberts-seed-4. This model is uncased: it does not make a difference\nbetween english and English.\n\nDisclaimer: The team releasing MultiBERTs did not write a model card for this model so this model card has been written by gchhablani.", "## Model description\nMultiBERTs models are transformers model pretrained on a large corpus of English data in a self-supervised fashion. This means it\nwas pretrained on the raw texts only, with no humans labelling them in any way (which is why it can use lots of\npublicly available data) with an automatic process to generate inputs and labels from those texts. More precisely, it\nwas pretrained with two objectives:\n- Masked language modeling (MLM): taking a sentence, the model randomly masks 15% of the words in the input then run\n the entire masked sentence through the model and has to predict the masked words. This is different from traditional\n recurrent neural networks (RNNs) that usually see the words one after the other, or from autoregressive models like\n GPT which internally mask the future tokens. It allows the model to learn a bidirectional representation of the\n sentence.\n- Next sentence prediction (NSP): the models concatenates two masked sentences as inputs during pretraining. Sometimes\n they correspond to sentences that were next to each other in the original text, sometimes not. The model then has to\n predict if the two sentences were following each other or not.\nThis way, the model learns an inner representation of the English language that can then be used to extract features\nuseful for downstream tasks: if you have a dataset of labeled sentences for instance, you can train a standard\nclassifier using the features produced by the MultiBERTs model as inputs.", "## Intended uses & limitations\nYou can use the raw model for either masked language modeling or next sentence prediction, but it's mostly intended to\nbe fine-tuned on a downstream task. See the model hub to look for\nfine-tuned versions on a task that interests you.\nNote that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked)\nto make decisions, such as sequence classification, token classification or question answering. For tasks such as text\ngeneration you should look at model like GPT2.", "### How to use\nHere is how to use this model to get the features of a given text in PyTorch:", "### Limitations and bias\nEven if the training data used for this model could be characterized as fairly neutral, this model can have biased\npredictions. This bias will also affect all fine-tuned versions of this model. For an understanding of bias of this particular\ncheckpoint, please try out this checkpoint with the snippet present in the Limitation and bias section of the bert-base-uncased checkpoint.", "## Training data\nThe MultiBERTs models were pretrained on BookCorpus, a dataset consisting of 11,038\nunpublished books and English Wikipedia (excluding lists, tables and\nheaders).", "## Training procedure", "### Preprocessing\nThe texts are lowercased and tokenized using WordPiece and a vocabulary size of 30,000. 
The inputs of the model are\nthen of the form:\n\nWith probability 0.5, sentence A and sentence B correspond to two consecutive sentences in the original corpus and in\nthe other cases, it's another random sentence in the corpus. Note that what is considered a sentence here is a\nconsecutive span of text usually longer than a single sentence. The only constrain is that the result with the two\n\"sentences\" has a combined length of less than 512 tokens.\nThe details of the masking procedure for each sentence are the following:\n- 15% of the tokens are masked.\n- In 80% of the cases, the masked tokens are replaced by '[MASK]'.\n- In 10% of the cases, the masked tokens are replaced by a random token (different) from the one they replace.\n- In the 10% remaining cases, the masked tokens are left as is.", "### Pretraining\nThe full model was trained on 16 Cloud TPU v2 chips for two million steps with a batch size\nof 256. The sequence length was set to 512 throughout. The optimizer\nused is Adam with a learning rate of 1e-4, \\\\(\\beta_{1} = 0.9\\\\) and \\\\(\\beta_{2} = 0.999\\\\), a weight decay of 0.01,\nlearning rate warmup for 10,000 steps and linear decay of the learning rate after.", "### BibTeX entry and citation info\n\n<a href=\"URL\n\t<img width=\"300px\" src=\"URL\n</a>" ]
[ "TAGS\n#transformers #pytorch #bert #pretraining #exbert #multiberts #multiberts-seed-4 #en #dataset-bookcorpus #dataset-wikipedia #arxiv-2106.16163 #license-apache-2.0 #endpoints_compatible #region-us \n", "# MultiBERTs Seed 4 Checkpoint 1300k (uncased)\nSeed 4 intermediate checkpoint 1300k MultiBERTs (pretrained BERT) model on English language using a masked language modeling (MLM) objective. It was introduced in\nthis paper and first released in\nthis repository. This is an intermediate checkpoint.\nThe final checkpoint can be found at multiberts-seed-4. This model is uncased: it does not make a difference\nbetween english and English.\n\nDisclaimer: The team releasing MultiBERTs did not write a model card for this model so this model card has been written by gchhablani.", "## Model description\nMultiBERTs models are transformers model pretrained on a large corpus of English data in a self-supervised fashion. This means it\nwas pretrained on the raw texts only, with no humans labelling them in any way (which is why it can use lots of\npublicly available data) with an automatic process to generate inputs and labels from those texts. More precisely, it\nwas pretrained with two objectives:\n- Masked language modeling (MLM): taking a sentence, the model randomly masks 15% of the words in the input then run\n the entire masked sentence through the model and has to predict the masked words. This is different from traditional\n recurrent neural networks (RNNs) that usually see the words one after the other, or from autoregressive models like\n GPT which internally mask the future tokens. It allows the model to learn a bidirectional representation of the\n sentence.\n- Next sentence prediction (NSP): the models concatenates two masked sentences as inputs during pretraining. Sometimes\n they correspond to sentences that were next to each other in the original text, sometimes not. The model then has to\n predict if the two sentences were following each other or not.\nThis way, the model learns an inner representation of the English language that can then be used to extract features\nuseful for downstream tasks: if you have a dataset of labeled sentences for instance, you can train a standard\nclassifier using the features produced by the MultiBERTs model as inputs.", "## Intended uses & limitations\nYou can use the raw model for either masked language modeling or next sentence prediction, but it's mostly intended to\nbe fine-tuned on a downstream task. See the model hub to look for\nfine-tuned versions on a task that interests you.\nNote that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked)\nto make decisions, such as sequence classification, token classification or question answering. For tasks such as text\ngeneration you should look at model like GPT2.", "### How to use\nHere is how to use this model to get the features of a given text in PyTorch:", "### Limitations and bias\nEven if the training data used for this model could be characterized as fairly neutral, this model can have biased\npredictions. This bias will also affect all fine-tuned versions of this model. 
For an understanding of bias of this particular\ncheckpoint, please try out this checkpoint with the snippet present in the Limitation and bias section of the bert-base-uncased checkpoint.", "## Training data\nThe MultiBERTs models were pretrained on BookCorpus, a dataset consisting of 11,038\nunpublished books and English Wikipedia (excluding lists, tables and\nheaders).", "## Training procedure", "### Preprocessing\nThe texts are lowercased and tokenized using WordPiece and a vocabulary size of 30,000. The inputs of the model are\nthen of the form:\n\nWith probability 0.5, sentence A and sentence B correspond to two consecutive sentences in the original corpus and in\nthe other cases, it's another random sentence in the corpus. Note that what is considered a sentence here is a\nconsecutive span of text usually longer than a single sentence. The only constrain is that the result with the two\n\"sentences\" has a combined length of less than 512 tokens.\nThe details of the masking procedure for each sentence are the following:\n- 15% of the tokens are masked.\n- In 80% of the cases, the masked tokens are replaced by '[MASK]'.\n- In 10% of the cases, the masked tokens are replaced by a random token (different) from the one they replace.\n- In the 10% remaining cases, the masked tokens are left as is.", "### Pretraining\nThe full model was trained on 16 Cloud TPU v2 chips for two million steps with a batch size\nof 256. The sequence length was set to 512 throughout. The optimizer\nused is Adam with a learning rate of 1e-4, \\\\(\\beta_{1} = 0.9\\\\) and \\\\(\\beta_{2} = 0.999\\\\), a weight decay of 0.01,\nlearning rate warmup for 10,000 steps and linear decay of the learning rate after.", "### BibTeX entry and citation info\n\n<a href=\"URL\n\t<img width=\"300px\" src=\"URL\n</a>" ]
[ 71, 145, 335, 134, 25, 95, 48, 3, 222, 111, 34 ]
[ "passage: TAGS\n#transformers #pytorch #bert #pretraining #exbert #multiberts #multiberts-seed-4 #en #dataset-bookcorpus #dataset-wikipedia #arxiv-2106.16163 #license-apache-2.0 #endpoints_compatible #region-us \n# MultiBERTs Seed 4 Checkpoint 1300k (uncased)\nSeed 4 intermediate checkpoint 1300k MultiBERTs (pretrained BERT) model on English language using a masked language modeling (MLM) objective. It was introduced in\nthis paper and first released in\nthis repository. This is an intermediate checkpoint.\nThe final checkpoint can be found at multiberts-seed-4. This model is uncased: it does not make a difference\nbetween english and English.\n\nDisclaimer: The team releasing MultiBERTs did not write a model card for this model so this model card has been written by gchhablani.", "passage: ## Model description\nMultiBERTs models are transformers model pretrained on a large corpus of English data in a self-supervised fashion. This means it\nwas pretrained on the raw texts only, with no humans labelling them in any way (which is why it can use lots of\npublicly available data) with an automatic process to generate inputs and labels from those texts. More precisely, it\nwas pretrained with two objectives:\n- Masked language modeling (MLM): taking a sentence, the model randomly masks 15% of the words in the input then run\n the entire masked sentence through the model and has to predict the masked words. This is different from traditional\n recurrent neural networks (RNNs) that usually see the words one after the other, or from autoregressive models like\n GPT which internally mask the future tokens. It allows the model to learn a bidirectional representation of the\n sentence.\n- Next sentence prediction (NSP): the models concatenates two masked sentences as inputs during pretraining. Sometimes\n they correspond to sentences that were next to each other in the original text, sometimes not. The model then has to\n predict if the two sentences were following each other or not.\nThis way, the model learns an inner representation of the English language that can then be used to extract features\nuseful for downstream tasks: if you have a dataset of labeled sentences for instance, you can train a standard\nclassifier using the features produced by the MultiBERTs model as inputs.## Intended uses & limitations\nYou can use the raw model for either masked language modeling or next sentence prediction, but it's mostly intended to\nbe fine-tuned on a downstream task. See the model hub to look for\nfine-tuned versions on a task that interests you.\nNote that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked)\nto make decisions, such as sequence classification, token classification or question answering. For tasks such as text\ngeneration you should look at model like GPT2.### How to use\nHere is how to use this model to get the features of a given text in PyTorch:### Limitations and bias\nEven if the training data used for this model could be characterized as fairly neutral, this model can have biased\npredictions. This bias will also affect all fine-tuned versions of this model. 
For an understanding of bias of this particular\ncheckpoint, please try out this checkpoint with the snippet present in the Limitation and bias section of the bert-base-uncased checkpoint.## Training data\nThe MultiBERTs models were pretrained on BookCorpus, a dataset consisting of 11,038\nunpublished books and English Wikipedia (excluding lists, tables and\nheaders).## Training procedure" ]
[ -0.08502521365880966, -0.00039200461469590664, -0.0020341016352176666, 0.07213838398456573, 0.08570177108049393, 0.0028890427201986313, 0.1124415248632431, 0.04952174797654152, -0.03308873623609543, 0.02294551022350788, 0.0928039401769638, 0.025487318634986877, 0.04220130294561386, 0.06507841497659683, 0.09872357547283173, -0.2623428404331207, 0.04841426759958267, -0.06494515389204025, 0.058426566421985626, 0.07530609518289566, 0.09798254817724228, -0.07123896479606628, 0.06247469410300255, 0.036183424293994904, -0.07957654446363449, -0.014071021229028702, -0.017316708341240883, -0.03578266501426697, 0.09994026273488998, 0.06945706903934479, 0.06276015937328339, 0.0007384940981864929, 0.058694809675216675, -0.08809739351272583, 0.01576424203813076, 0.042514629662036896, -0.0010783500038087368, 0.024165643379092216, -0.011619357392191887, 0.01838216371834278, 0.10883113741874695, 0.037973325699567795, 0.07801321148872375, 0.03226746618747711, -0.09591780602931976, -0.11594878137111664, -0.08316118270158768, 0.10444615036249161, 0.05305860936641693, 0.0432208888232708, -0.004680696874856949, 0.07312655448913574, -0.029192203655838966, 0.07553482055664062, 0.1019122451543808, -0.2550066411495209, -0.010301078669726849, 0.06930926442146301, 0.04699157923460007, 0.04755713790655136, 0.010958720929920673, 0.025916321203112602, 0.005805149674415588, 0.04354982078075409, 0.0243888720870018, -0.023157887160778046, 0.11169888079166412, -0.04260135442018509, -0.1504838615655899, -0.0436464287340641, 0.12091653048992157, -0.00643826462328434, -0.12258724868297577, -0.09642360359430313, -0.031372878700494766, 0.1053367406129837, -0.00461450032889843, -0.018609270453453064, -0.0030678515322506428, 0.009767578914761543, 0.02373429574072361, -0.09160850197076797, -0.08433491736650467, -0.026001710444688797, -0.03939628228545189, 0.1229487955570221, 0.04630493372678757, 0.05238723382353783, -0.03415101021528244, 0.0865994244813919, -0.12288331985473633, -0.03810257092118263, -0.05093060061335564, -0.08142149448394775, -0.018233444541692734, 0.008207488805055618, -0.026771772652864456, -0.07721113413572311, -0.059265173971652985, 0.12384684383869171, 0.027726545929908752, 0.031321264803409576, -0.0002009025774896145, 0.03998301550745964, 0.06953854113817215, 0.09364636242389679, -0.043239496648311615, 0.049212824553251266, 0.037673190236091614, -0.02395709604024887, 0.05958418548107147, -0.05095832794904709, -0.09932542592287064, 0.07387439906597137, -0.003109566867351532, 0.037212248891592026, 0.026271017268300056, 0.03693939372897148, -0.007993100211024284, -0.06883155554533005, 0.1708011031150818, -0.07925410568714142, -0.008675988763570786, -0.018494779244065285, 0.011321084573864937, 0.04419780522584915, 0.03497602418065071, -0.006576962769031525, -0.047358423471450806, -0.009592079557478428, -0.05507930368185043, -0.03029424324631691, -0.05637168884277344, -0.11992011964321136, 0.0002978299744427204, -0.02774726040661335, -0.03245566412806511, -0.1383487731218338, -0.22169458866119385, -0.019376495853066444, 0.06463169306516647, -0.0018012383952736855, -0.013015772216022015, 0.02442675270140171, 0.013954060152173042, -0.020572803914546967, 0.010553332976996899, -0.04552200809121132, -0.00010030250996351242, -0.00827980786561966, -0.029643209651112556, 0.05691307783126831, -0.042907703667879105, 0.020707089453935623, -0.0695781409740448, 0.022059375420212746, -0.21732649207115173, 0.08570074290037155, -0.03464429825544357, 0.005352506414055824, -0.035177551209926605, -0.0451231375336647, 
0.006992785260081291, 0.04647726193070412, -0.009768418967723846, 0.11601351201534271, -0.12789447605609894, -0.0499349907040596, 0.1770685911178589, -0.15909287333488464, -0.0014637894928455353, 0.10160627216100693, -0.04842371493577957, 0.053770143538713455, 0.13338978588581085, 0.09859667718410492, 0.09257206320762634, -0.07239273190498352, 0.011226457543671131, 0.06365898996591568, -0.06933978199958801, 0.053364336490631104, 0.08754138648509979, -0.023388788104057312, -0.1393038034439087, 0.03140689432621002, -0.07337944209575653, -0.007661408744752407, -0.024635540321469307, -0.02274983562529087, 0.008403556421399117, -0.03651557117700577, 0.027704015374183655, 0.006080443970859051, 0.019017117097973824, -0.04059649631381035, -0.08286875486373901, 0.026198960840702057, 0.07471992820501328, -0.07123421877622604, 0.04424228146672249, -0.0681367889046669, 0.06383004784584045, -0.07499464601278305, -0.006130353547632694, -0.1632000207901001, -0.027211811393499374, 0.04301564395427704, -0.04573502019047737, 0.0470164492726326, 0.08572788536548615, 0.0054367100819945335, 0.12120156735181808, -0.042481519281864166, 0.005725712515413761, -0.005104782059788704, -0.009734532795846462, -0.04976169764995575, -0.12061509490013123, -0.08104894310235977, -0.06808461248874664, 0.10440365970134735, -0.07821763306856155, 0.030446041375398636, -0.06660359352827072, -0.023610420525074005, -0.011456888169050217, -0.057801585644483566, -0.004515201784670353, 0.010551492683589458, -0.026972778141498566, -0.04525786638259888, 0.0479980930685997, 0.05119091644883156, -0.05878550931811333, 0.07512585818767548, -0.1046590730547905, -0.0624123215675354, 0.056664884090423584, 0.014753265306353569, -0.08328256011009216, 0.08832550048828125, -0.021166769787669182, -0.014854099601507187, -0.056462131440639496, -0.046450354158878326, 0.1932469606399536, -0.02619257941842079, 0.10027027130126953, -0.09005819261074066, 0.00098843639716506, 0.029014626517891884, -0.0441795289516449, -0.014971059747040272, 0.058972492814064026, 0.04721697419881821, -0.17458276450634003, 0.015238795429468155, 0.04235507547855377, 0.0718638077378273, 0.11225517839193344, 0.02729344367980957, -0.022380826994776726, -0.04366723448038101, -0.010823811404407024, 0.005021671764552593, 0.05514556169509888, -0.025675714015960693, -0.006405158434063196, 0.030884241685271263, 0.05708220601081848, 0.019432218745350838, -0.08048827946186066, 0.03604953736066818, 0.06661415100097656, -0.016866294667124748, -0.046229299157857895, -0.024346431717276573, -0.06004231795668602, 0.06037087365984917, 0.05278601497411728, 0.03682640194892883, 0.026231523603200912, -0.01442569401115179, -0.13621768355369568, 0.18796291947364807, -0.11506593972444534, -0.2526412010192871, -0.11077801883220673, -0.060560181736946106, -0.024208521470427513, 0.038537848740816116, 0.05680328607559204, -0.03034953773021698, -0.04232380911707878, -0.11679735779762268, 0.05675230920314789, -0.06377029418945312, -0.028724228963255882, -0.010934524238109589, -0.05367853492498398, -0.019033219665288925, -0.12547102570533752, -0.01001899316906929, -0.02954251877963543, -0.07712841033935547, 0.005193023011088371, -0.03582765534520149, 0.02998882159590721, 0.13547387719154358, 0.03294754400849342, -0.019330613315105438, -0.018865156918764114, 0.19361776113510132, 0.009215474128723145, 0.06341034173965454, 0.11076082289218903, -0.025577742606401443, 0.05353434011340141, 0.050464142113924026, 0.025234509259462357, -0.04594379663467407, 0.01131385751068592, -0.016351470723748207, 
-0.12156790494918823, -0.1734999716281891, -0.07050064206123352, -0.004797047469764948, 0.008466397412121296, 0.01886412315070629, 0.036374565213918686, 0.016939090564846992, 0.040784500539302826, -0.029473282396793365, 0.03276746720075607, -0.01078459620475769, 0.08038334548473358, 0.03331900015473366, -0.07409892231225967, 0.09005944430828094, -0.062148768454790115, 0.016676440834999084, 0.10936298966407776, -0.05530973896384239, 0.18562860786914825, 0.0215397197753191, 0.05771496519446373, 0.09862314164638519, 0.02272338792681694, 0.0553404726088047, 0.08780384063720703, -0.047885335981845856, 0.005531110800802708, -0.05831281095743179, -0.0517558753490448, -0.031859491020441055, 0.04769735410809517, 0.025869421660900116, 0.01892855390906334, -0.11897924542427063, 0.019071530550718307, 0.000653355848044157, 0.128754660487175, 0.04487854614853859, -0.12481309473514557, -0.12425173074007034, 0.032441072165966034, -0.04589901491999626, -0.061182014644145966, 0.031787388026714325, 0.05688135325908661, -0.15222367644309998, 0.048736389726400375, -0.006234538741409779, 0.06451142579317093, -0.09157028794288635, 0.015604084357619286, -0.03985602408647537, 0.0011253058910369873, 0.003998257219791412, 0.07033158838748932, -0.12725213170051575, 0.10665284097194672, 0.021094024181365967, 0.04922167956829071, -0.07921533286571503, 0.016841862350702286, -0.010142389684915543, 0.11039365082979202, 0.11750645935535431, 0.04425433278083801, -0.055059850215911865, -0.014134065248072147, -0.045158375054597855, 0.022326597943902016, 0.05411558598279953, -0.07380122691392899, 0.06117077171802521, 0.010857192799448967, 0.00642778817564249, -0.023419851437211037, 0.011788444593548775, -0.13264936208724976, -0.12405449151992798, 0.06087599694728851, -0.07575643062591553, -0.10303571820259094, -0.055181894451379776, -0.06374693661928177, -0.05870632827281952, 0.2118595838546753, -0.11387012153863907, -0.09075289219617844, -0.09876767545938492, -0.0102301687002182, 0.04808931052684784, -0.06540156900882721, 0.04758273810148239, -0.0359225794672966, 0.09062439203262329, -0.04785274714231491, -0.10813590884208679, 0.0359312929213047, -0.1123000979423523, -0.11368315666913986, -0.043065547943115234, 0.10348091274499893, 0.11292754858732224, 0.037126436829566956, 0.009551841765642166, 0.01261875219643116, -0.0015559159219264984, -0.11964841932058334, 0.011884380131959915, 0.13197006285190582, -0.004341281950473785, 0.06938271224498749, -0.05969920754432678, 0.029496222734451294, -0.013377325609326363, 0.0006758011877536774, 0.12851892411708832, 0.18661290407180786, -0.0644579529762268, 0.17422950267791748, 0.20158186554908752, -0.10658913850784302, -0.19276335835456848, -0.053664401173591614, -0.0023692836984992027, 0.045783523470163345, 0.04960470274090767, -0.1809471845626831, 0.09183401614427567, 0.03674640506505966, -0.03245716542005539, 0.01637783646583557, -0.2318211942911148, -0.11233063042163849, 0.08732984960079193, 0.056277260184288025, 0.18830278515815735, -0.07934478670358658, -0.040362682193517685, -0.015314068645238876, -0.04147806763648987, 0.036552369594573975, -0.042937494814395905, 0.08803674578666687, 0.00611511804163456, -0.02635655179619789, 0.0009846119210124016, -0.03242896497249603, 0.09578240662813187, 0.04151051491498947, 0.021409351378679276, -0.07157707214355469, -0.005455385893583298, 0.11755043268203735, -0.038738854229450226, 0.09782718122005463, 0.04594758525490761, 0.07502873241901398, -0.09762094169855118, -0.05836106464266777, -0.07742509245872498, 0.045175839215517044, 
-0.04272235184907913, -0.05269402638077736, -0.06640200316905975, 0.058894455432891846, 0.04021503031253815, 0.009639953263103962, -0.0007442645728588104, -0.03680774196982384, 0.045516885817050934, 0.09239846467971802, 0.082181416451931, -0.02914479747414589, -0.06656599044799805, -0.04633928835391998, -0.049429308623075485, 0.06549856811761856, -0.09056517481803894, 0.016897045075893402, 0.029528982937335968, 0.011399243958294392, 0.08465887606143951, 0.03422682732343674, -0.1363445520401001, 0.010637108236551285, 0.03608712926506996, -0.12633967399597168, -0.10404667258262634, -0.020201444625854492, 0.022609634324908257, -0.04042467847466469, 0.05306243151426315, 0.1457289457321167, -0.03834637254476547, -0.03134710341691971, -0.04607482999563217, 0.0386321134865284, -0.020942773669958115, 0.05108499526977539, 0.06478804349899292, 0.029726292937994003, -0.07330504059791565, 0.07478651404380798, 0.03992202877998352, -0.04455244913697243, 0.03963322937488556, 0.046125151216983795, -0.09668097645044327, -0.07860098779201508, -0.05953295901417732, 0.08730266243219376, -0.024869175627827644, -0.046385154128074646, -0.0035758744925260544, -0.08001834154129028, 0.06804770231246948, 0.07517952471971512, 0.04688076302409172, 0.03592479228973389, -0.0849142000079155, 0.01732822321355343, -0.05682193115353584, 0.03742068260908127, -0.032138220965862274, -0.0032002441585063934, -0.0570993572473526, 0.06428931653499603, 0.06147027015686035, 0.09452279657125473, -0.03421887755393982, -0.07391917705535889, -0.08553031831979752, -0.012639493681490421, -0.05285270884633064, -0.033792559057474136, -0.07895708829164505, -0.004546463955193758, 0.0008707470260560513, -0.0004832819104194641, 0.01833965629339218, 0.03679133951663971, -0.04366606846451759, -0.017662566155195236, -0.03520074859261513, 0.03778226673603058, -0.05994740501046181, 0.007973454892635345, 0.01748783513903618, -0.03501782566308975, 0.09230586886405945, 0.03530752658843994, -0.010568298399448395, 0.04680737853050232, -0.02588050067424774, 0.03520483523607254, -0.02208060212433338, 0.00010815216228365898, -0.022304601967334747, -0.10790857672691345, -0.004907516296952963, 0.005524607375264168, -0.02546655759215355, 0.011887257918715477, 0.06018390506505966, -0.07293254137039185, 0.0860862284898758, 0.048850931227207184, -0.029118169099092484, -0.07313869148492813, 0.040804654359817505, -0.015511410310864449, 0.026846613734960556, 0.06681845337152481, -0.036543507128953934, 0.04928375408053398, -0.10047894716262817, -0.029749009758234024, 0.0033861324191093445, -0.005917713046073914, -0.008584188297390938, -0.05016167461872101, -0.00123568344861269, 0.007249915972352028, 0.18211011588573456, -0.02401379868388176, 0.03429204225540161, 0.01415992621332407, 0.004139827564358711, 0.0465238019824028, -0.01399993896484375, 0.07672998309135437, -0.0065740300342440605, -0.02701389417052269, -0.011889820918440819, 0.03832743689417839, 0.002752923406660557, 0.0040678903460502625, 0.13826358318328857, 0.04099804908037186, 0.09478983283042908, 0.07739822566509247, 0.01860181614756584, 0.02051059529185295, -0.12644371390342712, -0.08986040949821472, 0.005730266682803631, 0.06067879498004913, -0.019423149526119232, 0.0031942110508680344, 0.09775087237358093, -0.08871925622224808, 0.07133655995130539, 0.04838867485523224, -0.050101324915885925, -0.12739944458007812, -0.19586753845214844, -0.023380916565656662, -0.03541942685842514, -0.009740859270095825, -0.09241028130054474, 0.015294315293431282, 0.09269078075885773, 0.024938510730862617, 
-0.00819989200681448, 0.09732067584991455, -0.10485710203647614, -0.02866433560848236, 0.04079360142350197, -0.02755623683333397, 0.01641368307173252, 0.0512889102101326, 0.020705843344330788, -0.0052471160888671875, 0.0408005453646183, 0.03818270191550255, 0.04155386984348297, 0.0283670574426651, 0.05356653779745102, -0.022189343348145485, -0.07163204997777939, -0.033657610416412354, -0.0023244847543537617, 0.05545744299888611, 0.13342571258544922, 0.02270789071917534, -0.07069043815135956, 0.006939604412764311, 0.10824890434741974, -0.033354707062244415, -0.047778528183698654, -0.11147167533636093, 0.2411220669746399, 0.0255508404225111, -0.00017615780234336853, -0.005886605009436607, -0.04696585610508919, 0.006803365424275398, 0.21438996493816376, 0.22812795639038086, 0.006926672998815775, -0.010039739310741425, 0.010569628328084946, -0.011473660357296467, 0.03785005584359169, 0.145447239279747, 0.005122058093547821, 0.2472599744796753, -0.04678560048341751, 0.033146169036626816, -0.04122060164809227, -0.03967719152569771, -0.10169252753257751, 0.07188980281352997, -0.010208843275904655, 0.007205558009445667, -0.030696846544742584, 0.07363197207450867, -0.03997368738055229, -0.17504987120628357, 0.0009392118081450462, -0.0005330932326614857, -0.061162132769823074, 0.010330707766115665, 0.001041344366967678, 0.02237202413380146, 0.08025912940502167, -0.017222736030817032, -0.0068600052036345005, 0.12854644656181335, 0.019637303426861763, -0.09728661924600601, -0.06107572466135025, 0.1149803102016449, 0.023421790450811386, 0.14377513527870178, 0.012061942368745804, 0.07868402451276779, 0.08843261748552322, 0.020906556397676468, -0.09702648967504501, 0.04286889359354973, -0.017827391624450684, -0.031338222324848175, 0.006014508660882711, 0.10918840020895004, -0.008252816274762154, 0.06087775155901909, 0.024021126329898834, -0.08600495755672455, 0.06219880282878876, 0.00991397351026535, -0.03179451823234558, -0.0823628231883049, 0.08342640846967697, -0.09050992131233215, 0.1580306440591812, 0.1254071295261383, -0.01401890441775322, -0.043191418051719666, -0.028498077765107155, 0.016664287075400352, -0.0019321241416037083, 0.05501450598239899, -0.025300242006778717, -0.13333483040332794, 0.01819218136370182, -0.08306417614221573, 0.027137096971273422, -0.24447405338287354, -0.0881318598985672, 0.02864862233400345, -0.019160374999046326, -0.018138062208890915, 0.053857192397117615, 0.04759899899363518, 0.025317447260022163, -0.03548605740070343, 0.026136489585042, -0.03790399059653282, 0.05916038900613785, -0.11406779289245605, -0.09495964646339417 ]
null
null
transformers
# MultiBERTs Seed 4 Checkpoint 1400k (uncased) Seed 4 intermediate checkpoint 1400k MultiBERTs (pretrained BERT) model on the English language using a masked language modeling (MLM) objective. It was introduced in [this paper](https://arxiv.org/pdf/2106.16163.pdf) and first released in [this repository](https://github.com/google-research/language/tree/master/language/multiberts). This is an intermediate checkpoint. The final checkpoint can be found at [multiberts-seed-4](https://hf.co/multberts-seed-4). This model is uncased: it does not make a difference between english and English. Disclaimer: The team releasing MultiBERTs did not write a model card for this model, so this model card has been written by [gchhablani](https://hf.co/gchhablani). ## Model description MultiBERTs models are transformer models pretrained on a large corpus of English data in a self-supervised fashion. This means they were pretrained on the raw texts only, with no humans labelling them in any way (which is why they can use lots of publicly available data), with an automatic process to generate inputs and labels from those texts. More precisely, they were pretrained with two objectives: - Masked language modeling (MLM): taking a sentence, the model randomly masks 15% of the words in the input, then runs the entire masked sentence through the model and has to predict the masked words. This is different from traditional recurrent neural networks (RNNs) that usually see the words one after the other, or from autoregressive models like GPT which internally mask the future tokens. It allows the model to learn a bidirectional representation of the sentence. - Next sentence prediction (NSP): the model concatenates two masked sentences as inputs during pretraining. Sometimes they correspond to sentences that were next to each other in the original text, sometimes not. The model then has to predict whether the two sentences were following each other or not. This way, the model learns an inner representation of the English language that can then be used to extract features useful for downstream tasks: if you have a dataset of labeled sentences, for instance, you can train a standard classifier using the features produced by the MultiBERTs model as inputs. ## Intended uses & limitations You can use the raw model for either masked language modeling or next sentence prediction, but it's mostly intended to be fine-tuned on a downstream task. See the [model hub](https://huggingface.co/models?filter=multiberts) to look for fine-tuned versions on a task that interests you. Note that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked) to make decisions, such as sequence classification, token classification or question answering. For tasks such as text generation, you should look at models like GPT2. ### How to use Here is how to use this model to get the features of a given text in PyTorch: ```python from transformers import BertTokenizer, BertModel tokenizer = BertTokenizer.from_pretrained('multiberts-seed-4-1400k') model = BertModel.from_pretrained("multiberts-seed-4-1400k") text = "Replace me by any text you'd like." encoded_input = tokenizer(text, return_tensors='pt') output = model(**encoded_input) ``` ### Limitations and bias Even if the training data used for this model could be characterized as fairly neutral, this model can have biased predictions. This bias will also affect all fine-tuned versions of this model. 
For an understanding of bias of this particular checkpoint, please try out this checkpoint with the snippet present in the [Limitations and bias section](https://huggingface.co/bert-base-uncased#limitations-and-bias) of the [bert-base-uncased](https://huggingface.co/bert-base-uncased) checkpoint. ## Training data The MultiBERTs models were pretrained on [BookCorpus](https://yknzhu.wixsite.com/mbweb), a dataset consisting of 11,038 unpublished books and [English Wikipedia](https://en.wikipedia.org/wiki/English_Wikipedia) (excluding lists, tables and headers). ## Training procedure ### Preprocessing The texts are lowercased and tokenized using WordPiece and a vocabulary size of 30,000. The inputs of the model are then of the form: ``` [CLS] Sentence A [SEP] Sentence B [SEP] ``` With probability 0.5, sentence A and sentence B correspond to two consecutive sentences in the original corpus, and in the other cases sentence B is another random sentence from the corpus. Note that what is considered a sentence here is a consecutive span of text usually longer than a single sentence. The only constraint is that the result with the two "sentences" has a combined length of less than 512 tokens. The details of the masking procedure for each sentence are the following: - 15% of the tokens are masked. - In 80% of the cases, the masked tokens are replaced by `[MASK]`. - In 10% of the cases, the masked tokens are replaced by a random token (different from the one they replace). - In the 10% remaining cases, the masked tokens are left as is. A short code sketch of this masking rule is given after the citation info below. ### Pretraining The full model was trained on 16 Cloud TPU v2 chips for two million steps with a batch size of 256. The sequence length was set to 512 throughout. The optimizer used is Adam with a learning rate of 1e-4, \\(\beta_{1} = 0.9\\) and \\(\beta_{2} = 0.999\\), a weight decay of 0.01, learning rate warmup for 10,000 steps and linear decay of the learning rate after. ### BibTeX entry and citation info ```bibtex @article{DBLP:journals/corr/abs-2106-16163, author = {Thibault Sellam and Steve Yadlowsky and Jason Wei and Naomi Saphra and Alexander D'Amour and Tal Linzen and Jasmijn Bastings and Iulia Turc and Jacob Eisenstein and Dipanjan Das and Ian Tenney and Ellie Pavlick}, title = {The MultiBERTs: {BERT} Reproductions for Robustness Analysis}, journal = {CoRR}, volume = {abs/2106.16163}, year = {2021}, url = {https://arxiv.org/abs/2106.16163}, eprinttype = {arXiv}, eprint = {2106.16163}, timestamp = {Mon, 05 Jul 2021 15:15:50 +0200}, biburl = {https://dblp.org/rec/journals/corr/abs-2106-16163.bib}, bibsource = {dblp computer science bibliography, https://dblp.org} } ``` <a href="https://huggingface.co/exbert/?model=multiberts"> <img width="300px" src="https://cdn-media.huggingface.co/exbert/button.png"> </a>
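The masking rule described under Preprocessing above can be illustrated with a short, self-contained sketch. This is not the original MultiBERTs data pipeline, only a minimal illustration of the 15% / 80% / 10% / 10% split; the checkpoint name is the same one assumed in the usage snippet above.

```python
import random
from transformers import BertTokenizer

# Tokenizer from the usage snippet above (assumed to resolve to this checkpoint).
tokenizer = BertTokenizer.from_pretrained('multiberts-seed-4-1400k')

def mask_tokens(token_ids, mask_prob=0.15):
    """Apply the MLM masking rule: select 15% of tokens; of those,
    80% -> [MASK], 10% -> a random token, 10% -> left unchanged."""
    masked = list(token_ids)
    labels = [-100] * len(token_ids)            # -100 = position ignored by the loss
    special = set(tokenizer.all_special_ids)    # never mask [CLS], [SEP], [PAD], ...
    for i, tok in enumerate(token_ids):
        if tok in special or random.random() >= mask_prob:
            continue
        labels[i] = tok                          # the model must predict the original token
        roll = random.random()
        if roll < 0.8:
            masked[i] = tokenizer.mask_token_id  # 80%: replace with [MASK]
        elif roll < 0.9:
            masked[i] = random.randrange(tokenizer.vocab_size)  # 10%: random token
        # remaining 10%: keep the original token
    return masked, labels

ids = tokenizer("Replace me by any text you'd like.")["input_ids"]
masked_ids, labels = mask_tokens(ids)
print(tokenizer.convert_ids_to_tokens(masked_ids))
```

The original pipeline may differ in detail (for example in how token spans are selected); the sketch only mirrors the percentages stated in the card.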
{"language": "en", "license": "apache-2.0", "tags": ["exbert", "multiberts", "multiberts-seed-4"], "datasets": ["bookcorpus", "wikipedia"]}
null
MultiBertGunjanPatrick/multiberts-seed-4-1400k
[ "transformers", "pytorch", "bert", "pretraining", "exbert", "multiberts", "multiberts-seed-4", "en", "dataset:bookcorpus", "dataset:wikipedia", "arxiv:2106.16163", "license:apache-2.0", "endpoints_compatible", "region:us" ]
2022-03-02T23:29:04+00:00
[ "2106.16163" ]
[ "en" ]
TAGS #transformers #pytorch #bert #pretraining #exbert #multiberts #multiberts-seed-4 #en #dataset-bookcorpus #dataset-wikipedia #arxiv-2106.16163 #license-apache-2.0 #endpoints_compatible #region-us
# MultiBERTs Seed 4 Checkpoint 1400k (uncased) Seed 4 intermediate checkpoint 1400k MultiBERTs (pretrained BERT) model on English language using a masked language modeling (MLM) objective. It was introduced in this paper and first released in this repository. This is an intermediate checkpoint. The final checkpoint can be found at multiberts-seed-4. This model is uncased: it does not make a difference between english and English. Disclaimer: The team releasing MultiBERTs did not write a model card for this model so this model card has been written by gchhablani. ## Model description MultiBERTs models are transformers model pretrained on a large corpus of English data in a self-supervised fashion. This means it was pretrained on the raw texts only, with no humans labelling them in any way (which is why it can use lots of publicly available data) with an automatic process to generate inputs and labels from those texts. More precisely, it was pretrained with two objectives: - Masked language modeling (MLM): taking a sentence, the model randomly masks 15% of the words in the input then run the entire masked sentence through the model and has to predict the masked words. This is different from traditional recurrent neural networks (RNNs) that usually see the words one after the other, or from autoregressive models like GPT which internally mask the future tokens. It allows the model to learn a bidirectional representation of the sentence. - Next sentence prediction (NSP): the models concatenates two masked sentences as inputs during pretraining. Sometimes they correspond to sentences that were next to each other in the original text, sometimes not. The model then has to predict if the two sentences were following each other or not. This way, the model learns an inner representation of the English language that can then be used to extract features useful for downstream tasks: if you have a dataset of labeled sentences for instance, you can train a standard classifier using the features produced by the MultiBERTs model as inputs. ## Intended uses & limitations You can use the raw model for either masked language modeling or next sentence prediction, but it's mostly intended to be fine-tuned on a downstream task. See the model hub to look for fine-tuned versions on a task that interests you. Note that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked) to make decisions, such as sequence classification, token classification or question answering. For tasks such as text generation you should look at model like GPT2. ### How to use Here is how to use this model to get the features of a given text in PyTorch: ### Limitations and bias Even if the training data used for this model could be characterized as fairly neutral, this model can have biased predictions. This bias will also affect all fine-tuned versions of this model. For an understanding of bias of this particular checkpoint, please try out this checkpoint with the snippet present in the Limitation and bias section of the bert-base-uncased checkpoint. ## Training data The MultiBERTs models were pretrained on BookCorpus, a dataset consisting of 11,038 unpublished books and English Wikipedia (excluding lists, tables and headers). ## Training procedure ### Preprocessing The texts are lowercased and tokenized using WordPiece and a vocabulary size of 30,000. 
The inputs of the model are then of the form: With probability 0.5, sentence A and sentence B correspond to two consecutive sentences in the original corpus and in the other cases, it's another random sentence in the corpus. Note that what is considered a sentence here is a consecutive span of text usually longer than a single sentence. The only constrain is that the result with the two "sentences" has a combined length of less than 512 tokens. The details of the masking procedure for each sentence are the following: - 15% of the tokens are masked. - In 80% of the cases, the masked tokens are replaced by '[MASK]'. - In 10% of the cases, the masked tokens are replaced by a random token (different) from the one they replace. - In the 10% remaining cases, the masked tokens are left as is. ### Pretraining The full model was trained on 16 Cloud TPU v2 chips for two million steps with a batch size of 256. The sequence length was set to 512 throughout. The optimizer used is Adam with a learning rate of 1e-4, \\(\beta_{1} = 0.9\\) and \\(\beta_{2} = 0.999\\), a weight decay of 0.01, learning rate warmup for 10,000 steps and linear decay of the learning rate after. ### BibTeX entry and citation info <a href="URL <img width="300px" src="URL </a>
[ "# MultiBERTs Seed 4 Checkpoint 1400k (uncased)\nSeed 4 intermediate checkpoint 1400k MultiBERTs (pretrained BERT) model on English language using a masked language modeling (MLM) objective. It was introduced in\nthis paper and first released in\nthis repository. This is an intermediate checkpoint.\nThe final checkpoint can be found at multiberts-seed-4. This model is uncased: it does not make a difference\nbetween english and English.\n\nDisclaimer: The team releasing MultiBERTs did not write a model card for this model so this model card has been written by gchhablani.", "## Model description\nMultiBERTs models are transformers model pretrained on a large corpus of English data in a self-supervised fashion. This means it\nwas pretrained on the raw texts only, with no humans labelling them in any way (which is why it can use lots of\npublicly available data) with an automatic process to generate inputs and labels from those texts. More precisely, it\nwas pretrained with two objectives:\n- Masked language modeling (MLM): taking a sentence, the model randomly masks 15% of the words in the input then run\n the entire masked sentence through the model and has to predict the masked words. This is different from traditional\n recurrent neural networks (RNNs) that usually see the words one after the other, or from autoregressive models like\n GPT which internally mask the future tokens. It allows the model to learn a bidirectional representation of the\n sentence.\n- Next sentence prediction (NSP): the models concatenates two masked sentences as inputs during pretraining. Sometimes\n they correspond to sentences that were next to each other in the original text, sometimes not. The model then has to\n predict if the two sentences were following each other or not.\nThis way, the model learns an inner representation of the English language that can then be used to extract features\nuseful for downstream tasks: if you have a dataset of labeled sentences for instance, you can train a standard\nclassifier using the features produced by the MultiBERTs model as inputs.", "## Intended uses & limitations\nYou can use the raw model for either masked language modeling or next sentence prediction, but it's mostly intended to\nbe fine-tuned on a downstream task. See the model hub to look for\nfine-tuned versions on a task that interests you.\nNote that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked)\nto make decisions, such as sequence classification, token classification or question answering. For tasks such as text\ngeneration you should look at model like GPT2.", "### How to use\nHere is how to use this model to get the features of a given text in PyTorch:", "### Limitations and bias\nEven if the training data used for this model could be characterized as fairly neutral, this model can have biased\npredictions. This bias will also affect all fine-tuned versions of this model. For an understanding of bias of this particular\ncheckpoint, please try out this checkpoint with the snippet present in the Limitation and bias section of the bert-base-uncased checkpoint.", "## Training data\nThe MultiBERTs models were pretrained on BookCorpus, a dataset consisting of 11,038\nunpublished books and English Wikipedia (excluding lists, tables and\nheaders).", "## Training procedure", "### Preprocessing\nThe texts are lowercased and tokenized using WordPiece and a vocabulary size of 30,000. 
The inputs of the model are\nthen of the form:\n\nWith probability 0.5, sentence A and sentence B correspond to two consecutive sentences in the original corpus and in\nthe other cases, it's another random sentence in the corpus. Note that what is considered a sentence here is a\nconsecutive span of text usually longer than a single sentence. The only constrain is that the result with the two\n\"sentences\" has a combined length of less than 512 tokens.\nThe details of the masking procedure for each sentence are the following:\n- 15% of the tokens are masked.\n- In 80% of the cases, the masked tokens are replaced by '[MASK]'.\n- In 10% of the cases, the masked tokens are replaced by a random token (different) from the one they replace.\n- In the 10% remaining cases, the masked tokens are left as is.", "### Pretraining\nThe full model was trained on 16 Cloud TPU v2 chips for two million steps with a batch size\nof 256. The sequence length was set to 512 throughout. The optimizer\nused is Adam with a learning rate of 1e-4, \\\\(\\beta_{1} = 0.9\\\\) and \\\\(\\beta_{2} = 0.999\\\\), a weight decay of 0.01,\nlearning rate warmup for 10,000 steps and linear decay of the learning rate after.", "### BibTeX entry and citation info\n\n<a href=\"URL\n\t<img width=\"300px\" src=\"URL\n</a>" ]
[ "TAGS\n#transformers #pytorch #bert #pretraining #exbert #multiberts #multiberts-seed-4 #en #dataset-bookcorpus #dataset-wikipedia #arxiv-2106.16163 #license-apache-2.0 #endpoints_compatible #region-us \n", "# MultiBERTs Seed 4 Checkpoint 1400k (uncased)\nSeed 4 intermediate checkpoint 1400k MultiBERTs (pretrained BERT) model on English language using a masked language modeling (MLM) objective. It was introduced in\nthis paper and first released in\nthis repository. This is an intermediate checkpoint.\nThe final checkpoint can be found at multiberts-seed-4. This model is uncased: it does not make a difference\nbetween english and English.\n\nDisclaimer: The team releasing MultiBERTs did not write a model card for this model so this model card has been written by gchhablani.", "## Model description\nMultiBERTs models are transformers model pretrained on a large corpus of English data in a self-supervised fashion. This means it\nwas pretrained on the raw texts only, with no humans labelling them in any way (which is why it can use lots of\npublicly available data) with an automatic process to generate inputs and labels from those texts. More precisely, it\nwas pretrained with two objectives:\n- Masked language modeling (MLM): taking a sentence, the model randomly masks 15% of the words in the input then run\n the entire masked sentence through the model and has to predict the masked words. This is different from traditional\n recurrent neural networks (RNNs) that usually see the words one after the other, or from autoregressive models like\n GPT which internally mask the future tokens. It allows the model to learn a bidirectional representation of the\n sentence.\n- Next sentence prediction (NSP): the models concatenates two masked sentences as inputs during pretraining. Sometimes\n they correspond to sentences that were next to each other in the original text, sometimes not. The model then has to\n predict if the two sentences were following each other or not.\nThis way, the model learns an inner representation of the English language that can then be used to extract features\nuseful for downstream tasks: if you have a dataset of labeled sentences for instance, you can train a standard\nclassifier using the features produced by the MultiBERTs model as inputs.", "## Intended uses & limitations\nYou can use the raw model for either masked language modeling or next sentence prediction, but it's mostly intended to\nbe fine-tuned on a downstream task. See the model hub to look for\nfine-tuned versions on a task that interests you.\nNote that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked)\nto make decisions, such as sequence classification, token classification or question answering. For tasks such as text\ngeneration you should look at model like GPT2.", "### How to use\nHere is how to use this model to get the features of a given text in PyTorch:", "### Limitations and bias\nEven if the training data used for this model could be characterized as fairly neutral, this model can have biased\npredictions. This bias will also affect all fine-tuned versions of this model. 
For an understanding of bias of this particular\ncheckpoint, please try out this checkpoint with the snippet present in the Limitation and bias section of the bert-base-uncased checkpoint.", "## Training data\nThe MultiBERTs models were pretrained on BookCorpus, a dataset consisting of 11,038\nunpublished books and English Wikipedia (excluding lists, tables and\nheaders).", "## Training procedure", "### Preprocessing\nThe texts are lowercased and tokenized using WordPiece and a vocabulary size of 30,000. The inputs of the model are\nthen of the form:\n\nWith probability 0.5, sentence A and sentence B correspond to two consecutive sentences in the original corpus and in\nthe other cases, it's another random sentence in the corpus. Note that what is considered a sentence here is a\nconsecutive span of text usually longer than a single sentence. The only constrain is that the result with the two\n\"sentences\" has a combined length of less than 512 tokens.\nThe details of the masking procedure for each sentence are the following:\n- 15% of the tokens are masked.\n- In 80% of the cases, the masked tokens are replaced by '[MASK]'.\n- In 10% of the cases, the masked tokens are replaced by a random token (different) from the one they replace.\n- In the 10% remaining cases, the masked tokens are left as is.", "### Pretraining\nThe full model was trained on 16 Cloud TPU v2 chips for two million steps with a batch size\nof 256. The sequence length was set to 512 throughout. The optimizer\nused is Adam with a learning rate of 1e-4, \\\\(\\beta_{1} = 0.9\\\\) and \\\\(\\beta_{2} = 0.999\\\\), a weight decay of 0.01,\nlearning rate warmup for 10,000 steps and linear decay of the learning rate after.", "### BibTeX entry and citation info\n\n<a href=\"URL\n\t<img width=\"300px\" src=\"URL\n</a>" ]
[ 71, 145, 335, 134, 25, 95, 48, 3, 222, 111, 34 ]
[ "passage: TAGS\n#transformers #pytorch #bert #pretraining #exbert #multiberts #multiberts-seed-4 #en #dataset-bookcorpus #dataset-wikipedia #arxiv-2106.16163 #license-apache-2.0 #endpoints_compatible #region-us \n# MultiBERTs Seed 4 Checkpoint 1400k (uncased)\nSeed 4 intermediate checkpoint 1400k MultiBERTs (pretrained BERT) model on English language using a masked language modeling (MLM) objective. It was introduced in\nthis paper and first released in\nthis repository. This is an intermediate checkpoint.\nThe final checkpoint can be found at multiberts-seed-4. This model is uncased: it does not make a difference\nbetween english and English.\n\nDisclaimer: The team releasing MultiBERTs did not write a model card for this model so this model card has been written by gchhablani.", "passage: ## Model description\nMultiBERTs models are transformers model pretrained on a large corpus of English data in a self-supervised fashion. This means it\nwas pretrained on the raw texts only, with no humans labelling them in any way (which is why it can use lots of\npublicly available data) with an automatic process to generate inputs and labels from those texts. More precisely, it\nwas pretrained with two objectives:\n- Masked language modeling (MLM): taking a sentence, the model randomly masks 15% of the words in the input then run\n the entire masked sentence through the model and has to predict the masked words. This is different from traditional\n recurrent neural networks (RNNs) that usually see the words one after the other, or from autoregressive models like\n GPT which internally mask the future tokens. It allows the model to learn a bidirectional representation of the\n sentence.\n- Next sentence prediction (NSP): the models concatenates two masked sentences as inputs during pretraining. Sometimes\n they correspond to sentences that were next to each other in the original text, sometimes not. The model then has to\n predict if the two sentences were following each other or not.\nThis way, the model learns an inner representation of the English language that can then be used to extract features\nuseful for downstream tasks: if you have a dataset of labeled sentences for instance, you can train a standard\nclassifier using the features produced by the MultiBERTs model as inputs.## Intended uses & limitations\nYou can use the raw model for either masked language modeling or next sentence prediction, but it's mostly intended to\nbe fine-tuned on a downstream task. See the model hub to look for\nfine-tuned versions on a task that interests you.\nNote that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked)\nto make decisions, such as sequence classification, token classification or question answering. For tasks such as text\ngeneration you should look at model like GPT2.### How to use\nHere is how to use this model to get the features of a given text in PyTorch:### Limitations and bias\nEven if the training data used for this model could be characterized as fairly neutral, this model can have biased\npredictions. This bias will also affect all fine-tuned versions of this model. 
For an understanding of bias of this particular\ncheckpoint, please try out this checkpoint with the snippet present in the Limitation and bias section of the bert-base-uncased checkpoint.## Training data\nThe MultiBERTs models were pretrained on BookCorpus, a dataset consisting of 11,038\nunpublished books and English Wikipedia (excluding lists, tables and\nheaders).## Training procedure" ]
[ -0.08727583289146423, 0.002272488083690405, -0.0021800666581839323, 0.0714723989367485, 0.0876312106847763, 0.005506949964910746, 0.11499719321727753, 0.04856809228658676, -0.03347314894199371, 0.022980770096182823, 0.09241481125354767, 0.023650173097848892, 0.041588637977838516, 0.061390846967697144, 0.09711045771837234, -0.25915253162384033, 0.0478106290102005, -0.06300223618745804, 0.05269607901573181, 0.0747288316488266, 0.09795576333999634, -0.07230357080698013, 0.06316278129816055, 0.03687722980976105, -0.08409367501735687, -0.01586909033358097, -0.01682947389781475, -0.03470728546380997, 0.10177960991859436, 0.07074757665395737, 0.06282390654087067, 0.00008394010365009308, 0.05851311609148979, -0.08477041125297546, 0.016188673675060272, 0.044070105999708176, -0.0025979767087846994, 0.025557775050401688, -0.009467154741287231, 0.01868700422346592, 0.10660530626773834, 0.038322433829307556, 0.07603682577610016, 0.033618438988924026, -0.09604011476039886, -0.12630723416805267, -0.08014529943466187, 0.10495542734861374, 0.051110975444316864, 0.04244077205657959, -0.005592310801148415, 0.0693528801202774, -0.0297840628772974, 0.07368656247854233, 0.0983332097530365, -0.258277028799057, -0.008762253448367119, 0.06270905584096909, 0.043934546411037445, 0.04582623392343521, 0.009405935183167458, 0.026678748428821564, 0.006698150187730789, 0.045535217970609665, 0.02926025167107582, -0.023835033178329468, 0.1181253045797348, -0.04265594482421875, -0.1490345299243927, -0.04152282327413559, 0.11974017322063446, -0.0067020077258348465, -0.12436676025390625, -0.0981685072183609, -0.0301109179854393, 0.11503744125366211, -0.005421011243015528, -0.01692911982536316, -0.0029300590977072716, 0.010288551449775696, 0.02640131115913391, -0.0922258198261261, -0.08575211465358734, -0.02535223215818405, -0.03701050207018852, 0.12654776871204376, 0.04720161110162735, 0.052334997802972794, -0.034930966794490814, 0.08466502279043198, -0.1199842095375061, -0.040385570377111435, -0.049812037497758865, -0.08248647302389145, -0.017665134742856026, 0.010711747221648693, -0.02754463255405426, -0.07366330176591873, -0.0588245652616024, 0.11844754219055176, 0.027095332741737366, 0.03078872337937355, 0.0030550691299140453, 0.03905657306313515, 0.06985032558441162, 0.09060029685497284, -0.04234012961387634, 0.04679829627275467, 0.035775259137153625, -0.024097729474306107, 0.05690803378820419, -0.05061299353837967, -0.10087886452674866, 0.0742143765091896, -0.0024621570482850075, 0.03673873096704483, 0.024852197617292404, 0.03612145036458969, -0.00704866973683238, -0.06938984245061874, 0.16839566826820374, -0.07854478806257248, -0.008226599544286728, -0.016209252178668976, 0.011214805766940117, 0.04519740492105484, 0.03337451070547104, -0.008587751537561417, -0.04881710186600685, -0.006224014796316624, -0.0569869764149189, -0.02878425270318985, -0.05505829304456711, -0.12090621888637543, -0.0012331679463386536, -0.033718496561050415, -0.03147444501519203, -0.14063678681850433, -0.21704256534576416, -0.02003781497478485, 0.0638478696346283, -0.002232164144515991, -0.010653660632669926, 0.027521604672074318, 0.01473269797861576, -0.020298076793551445, 0.010328310541808605, -0.047548387199640274, -0.0005886862054467201, -0.007121570408344269, -0.028546571731567383, 0.05834704637527466, -0.04259093478322029, 0.022679001092910767, -0.06801428645849228, 0.02170838601887226, -0.21586771309375763, 0.08670524507761002, -0.032917752861976624, 0.006000285968184471, -0.03583887219429016, -0.04756595194339752, 
0.010433249175548553, 0.048064932227134705, -0.009091808460652828, 0.11729550361633301, -0.13142095506191254, -0.05151982605457306, 0.1785670518875122, -0.15932375192642212, -0.0030617043375968933, 0.10167467594146729, -0.048275113105773926, 0.05363956466317177, 0.13433386385440826, 0.09954952448606491, 0.0922435000538826, -0.07394237816333771, 0.011057511903345585, 0.0616699643433094, -0.06937988847494125, 0.052687499672174454, 0.08780574798583984, -0.023680273443460464, -0.13669505715370178, 0.030684906989336014, -0.07223275303840637, -0.00723935104906559, -0.024905981495976448, -0.021310364827513695, 0.008214915171265602, -0.03757760301232338, 0.029850207269191742, 0.006930863950401545, 0.019104545935988426, -0.04073340445756912, -0.08199191093444824, 0.03184988349676132, 0.07550103962421417, -0.07181304693222046, 0.042415179312229156, -0.0700802430510521, 0.06527672708034515, -0.0748404711484909, -0.003789097536355257, -0.1657973825931549, -0.02615952491760254, 0.04157676547765732, -0.049495793879032135, 0.048436567187309265, 0.08965255320072174, 0.0056993067264556885, 0.12486177682876587, -0.04292730242013931, 0.00376302283257246, -0.004488587379455566, -0.00973600149154663, -0.050661951303482056, -0.12418441474437714, -0.0814564973115921, -0.06588073074817657, 0.10444676876068115, -0.07997800409793854, 0.029813755303621292, -0.06923414021730423, -0.022069187834858894, -0.010445930063724518, -0.05585841089487076, -0.006118373014032841, 0.010527387261390686, -0.026776131242513657, -0.04545404762029648, 0.049567677080631256, 0.05118599534034729, -0.056841447949409485, 0.0769994854927063, -0.10301386564970016, -0.056549955159425735, 0.05739787966012955, 0.011162644252181053, -0.08196765184402466, 0.0899961069226265, -0.01922711543738842, -0.013973916880786419, -0.05432478338479996, -0.04580765217542648, 0.19326487183570862, -0.026538178324699402, 0.1031457930803299, -0.09014972299337387, 0.0027416583616286516, 0.02946661226451397, -0.04451139271259308, -0.014302514493465424, 0.05825457721948624, 0.044625427573919296, -0.17766790091991425, 0.01520482823252678, 0.04805535078048706, 0.07561877369880676, 0.11229673027992249, 0.02636679820716381, -0.022745048627257347, -0.04587706923484802, -0.013694639317691326, 0.003980868496000767, 0.05717555060982704, -0.026870574802160263, -0.008527505211532116, 0.03211211413145065, 0.057002484798431396, 0.019018415361642838, -0.07963667809963226, 0.036429475992918015, 0.06664901226758957, -0.015326173976063728, -0.047066863626241684, -0.025467783212661743, -0.05985056608915329, 0.061586543917655945, 0.05342242121696472, 0.03711376339197159, 0.026047321036458015, -0.013964815996587276, -0.1355481743812561, 0.18819038569927216, -0.11611688137054443, -0.25446265935897827, -0.11215543001890182, -0.0621897429227829, -0.024450337514281273, 0.04013581946492195, 0.05856258422136307, -0.03253860026597977, -0.04285556077957153, -0.11317343264818192, 0.05892539396882057, -0.06295671314001083, -0.028525421395897865, -0.012171301990747452, -0.05242106318473816, -0.016583804041147232, -0.12569300830364227, -0.010113358497619629, -0.029942955821752548, -0.06879047304391861, 0.006583827547729015, -0.03321528807282448, 0.029445435851812363, 0.1352028250694275, 0.03392747417092323, -0.02014884725213051, -0.017075184732675552, 0.19785991311073303, 0.008540019392967224, 0.06307578831911087, 0.11488772183656693, -0.026827003806829453, 0.053792305290699005, 0.04801567643880844, 0.027039499953389168, -0.04678088054060936, 0.009619873948395252, -0.016879962757229805, 
-0.1190471425652504, -0.1733105480670929, -0.07096299529075623, -0.005452287849038839, 0.004263679496943951, 0.019700461998581886, 0.03491569310426712, 0.022128967568278313, 0.03955145180225372, -0.02891751565039158, 0.03400435298681259, -0.013393152505159378, 0.08003177493810654, 0.03666120767593384, -0.07395946979522705, 0.0915103554725647, -0.06251804530620575, 0.016247693449258804, 0.11005780100822449, -0.06131329387426376, 0.19013574719429016, 0.022181224077939987, 0.05918770283460617, 0.10022851079702377, 0.023469816893339157, 0.05627252534031868, 0.0879741758108139, -0.048102058470249176, 0.0054525816813111305, -0.05905408412218094, -0.05233849585056305, -0.03188493475317955, 0.048169996589422226, 0.025904197245836258, 0.016972992569208145, -0.1203223317861557, 0.017061814665794373, -0.0016374371480196714, 0.13368162512779236, 0.04775116965174675, -0.12450534105300903, -0.1267767995595932, 0.03308989480137825, -0.044482309371232986, -0.06343502551317215, 0.029009178280830383, 0.06203842908143997, -0.150657057762146, 0.046955473721027374, -0.004557202570140362, 0.06578675657510757, -0.09019912779331207, 0.017598159611225128, -0.03956250846385956, 0.001811102032661438, 0.0034935087896883488, 0.0694839358329773, -0.12859053909778595, 0.10439018905162811, 0.02220335230231285, 0.05057873576879501, -0.07817089557647705, 0.015761172398924828, -0.008675768040120602, 0.11555242538452148, 0.11470203101634979, 0.04500596970319748, -0.05128242075443268, -0.015354042872786522, -0.042742859572172165, 0.020940357819199562, 0.05660998076200485, -0.07348798215389252, 0.06153688579797745, 0.009964320808649063, 0.00596244353801012, -0.023465801030397415, 0.011762896552681923, -0.13106058537960052, -0.12278556823730469, 0.06063259020447731, -0.07574735581874847, -0.0991029217839241, -0.055224232375621796, -0.06428724527359009, -0.0559997484087944, 0.20884159207344055, -0.11308060586452484, -0.0921599417924881, -0.09980432689189911, -0.014200109988451004, 0.047457337379455566, -0.06567676365375519, 0.04816716909408569, -0.03597698360681534, 0.09228173643350601, -0.045608535408973694, -0.10897587239742279, 0.034598320722579956, -0.11264117062091827, -0.1127256527543068, -0.0434294231235981, 0.10482977330684662, 0.11240499466657639, 0.036893293261528015, 0.011433593928813934, 0.011876726523041725, -0.0036138109862804413, -0.11891268193721771, 0.01496296375989914, 0.12503759562969208, 0.003452785313129425, 0.06891587376594543, -0.060820840299129486, 0.02639341726899147, -0.013493450358510017, 0.0019292961806058884, 0.12738987803459167, 0.18450793623924255, -0.06479212641716003, 0.17187002301216125, 0.20720399916172028, -0.10472773015499115, -0.19103547930717468, -0.053700607270002365, -0.002550683915615082, 0.04611290618777275, 0.04584529623389244, -0.18499070405960083, 0.0924123227596283, 0.03271086513996124, -0.030778566375374794, 0.022322360426187515, -0.22708147764205933, -0.11230874061584473, 0.08826180547475815, 0.05536039546132088, 0.1896006464958191, -0.08086619526147842, -0.040241554379463196, -0.0173629317432642, -0.042451970279216766, 0.03861301392316818, -0.04371881112456322, 0.08861494064331055, 0.005562383681535721, -0.027986738830804825, 0.000807708129286766, -0.03139950707554817, 0.09766216576099396, 0.042755335569381714, 0.022959543392062187, -0.07144878804683685, -0.003801673650741577, 0.11492917686700821, -0.038329605013132095, 0.09713547676801682, 0.04443846642971039, 0.07461058348417282, -0.10186079889535904, -0.05832201987504959, -0.0770958662033081, 0.04319678246974945, 
-0.04243391007184982, -0.0532539002597332, -0.06408171355724335, 0.05856224149465561, 0.036853428930044174, 0.009953653439879417, -0.004161067306995392, -0.037519317120313644, 0.04582861438393593, 0.08896991610527039, 0.08001618832349777, -0.032999489456415176, -0.07405178248882294, -0.04767673835158348, -0.04907844215631485, 0.06714354455471039, -0.09667330980300903, 0.018100548535585403, 0.029525015503168106, 0.01119643822312355, 0.08853843808174133, 0.03371400386095047, -0.13569030165672302, 0.010323701426386833, 0.03660891205072403, -0.12691384553909302, -0.1059039756655693, -0.019667092710733414, 0.017861561849713326, -0.0379568487405777, 0.05436056852340698, 0.14607778191566467, -0.03714049607515335, -0.03090137615799904, -0.04726593941450119, 0.03682110086083412, -0.021157409995794296, 0.05195175111293793, 0.06714551150798798, 0.03085601143538952, -0.07493790239095688, 0.0726466029882431, 0.03930358588695526, -0.04061990603804588, 0.04012351483106613, 0.04551989957690239, -0.09825928509235382, -0.07989254593849182, -0.0607418492436409, 0.09094981104135513, -0.020877977833151817, -0.05040382221341133, -0.002560531720519066, -0.08286932855844498, 0.0686442106962204, 0.07043452560901642, 0.046451959758996964, 0.034912824630737305, -0.08604257553815842, 0.015088634565472603, -0.05606372654438019, 0.03621191158890724, -0.03162967413663864, -0.0047334469854831696, -0.05691874027252197, 0.068089559674263, 0.06282214820384979, 0.0976400300860405, -0.03476102650165558, -0.0737214908003807, -0.08437337726354599, -0.014115868136286736, -0.06652148813009262, -0.03364533185958862, -0.07631606608629227, -0.004583391826599836, -0.00002392195165157318, -0.0026378482580184937, 0.020137889310717583, 0.03661709278821945, -0.04369492456316948, -0.017517397180199623, -0.03144390508532524, 0.03772727772593498, -0.061490412801504135, 0.007291519083082676, 0.014908144250512123, -0.0362776480615139, 0.09430420398712158, 0.03806410729885101, -0.011417664587497711, 0.04705442860722542, -0.013626161031425, 0.03348402678966522, -0.02142038196325302, -0.00004152068868279457, -0.020948555320501328, -0.10893265902996063, -0.004381147678941488, 0.004043687134981155, -0.025273866951465607, 0.011759969405829906, 0.0587441548705101, -0.07127071171998978, 0.08869168162345886, 0.047826312482357025, -0.028590761125087738, -0.07279800623655319, 0.04075198620557785, -0.01377183385193348, 0.02683614008128643, 0.07000861316919327, -0.03444768488407135, 0.050669264048337936, -0.10051780939102173, -0.03007161244750023, 0.00259269867092371, -0.006970696151256561, -0.006575362756848335, -0.052768729627132416, -0.0019942093640565872, 0.00660658348351717, 0.1760927140712738, -0.02466387301683426, 0.03704569861292839, 0.013590688817203045, 0.008125584572553635, 0.041066162288188934, -0.012716524302959442, 0.07256725430488586, -0.007405973970890045, -0.02736479975283146, -0.013945087790489197, 0.03941533714532852, 0.0036527859047055244, 0.007445789873600006, 0.136025533080101, 0.04385102167725563, 0.09451043605804443, 0.07557237148284912, 0.018073072656989098, 0.017563000321388245, -0.1328272819519043, -0.08684343099594116, 0.0056192753836512566, 0.06073548644781113, -0.018619684502482414, 0.008974025025963783, 0.09484352171421051, -0.08722546696662903, 0.071976438164711, 0.046148963272571564, -0.049783479422330856, -0.12376774847507477, -0.18679536879062653, -0.022606123238801956, -0.033374153077602386, -0.011801361106336117, -0.09134407341480255, 0.01472239289432764, 0.09214790165424347, 0.0246401559561491, -0.01078847423195839, 
0.09586992114782333, -0.1029183492064476, -0.030586211010813713, 0.04249667003750801, -0.026925453916192055, 0.015562971122562885, 0.052203159779310226, 0.02313155308365822, -0.005415571853518486, 0.04095076024532318, 0.03848259896039963, 0.04266579821705818, 0.026881933212280273, 0.053432218730449677, -0.023365940898656845, -0.07272221148014069, -0.033182885497808456, -0.0005766451358795166, 0.054274193942546844, 0.13412314653396606, 0.021393416449427605, -0.06977203488349915, 0.006139044649899006, 0.10922014713287354, -0.03208637982606888, -0.049613792449235916, -0.11030326783657074, 0.2399614006280899, 0.023757172748446465, 0.0007671758066862822, -0.005378802306950092, -0.04627713933587074, 0.0066458117216825485, 0.21546052396297455, 0.2272358238697052, 0.0037161000072956085, -0.010093074291944504, 0.011533706448972225, -0.011682745069265366, 0.03794070705771446, 0.14739571511745453, 0.005806367844343185, 0.25050389766693115, -0.04526177793741226, 0.037005625665187836, -0.04343472421169281, -0.03863496333360672, -0.10141438245773315, 0.07616627216339111, -0.010094454512000084, 0.006542831659317017, -0.030092507600784302, 0.0727045014500618, -0.037343915551900864, -0.1788366734981537, 0.0035854624584317207, -0.0005621854215860367, -0.06266121566295624, 0.00948752835392952, -0.0008660517632961273, 0.0224230345338583, 0.08179890364408493, -0.01716497540473938, -0.007719857152551413, 0.134790301322937, 0.018075967207551003, -0.0968945175409317, -0.06157195568084717, 0.11658893525600433, 0.016583185642957687, 0.1398676633834839, 0.010969888418912888, 0.07955919951200485, 0.08794272691011429, 0.021488357335329056, -0.09649978578090668, 0.04026472568511963, -0.01991412788629532, -0.03142083063721657, 0.0066523319110274315, 0.1074318066239357, -0.00763852521777153, 0.05232490599155426, 0.025195280089974403, -0.08799701929092407, 0.06008502095937729, 0.011600524187088013, -0.03546399623155594, -0.08049753308296204, 0.08265329897403717, -0.08988016098737717, 0.15600702166557312, 0.12382745742797852, -0.014579015783965588, -0.04471372067928314, -0.028089484199881554, 0.018112655729055405, 0.0012045525945723057, 0.05291607975959778, -0.02584964409470558, -0.13406088948249817, 0.019651560112833977, -0.08535019308328629, 0.027377363294363022, -0.24911093711853027, -0.08736957609653473, 0.02991734817624092, -0.017475441098213196, -0.017760474234819412, 0.05254107713699341, 0.04734604060649872, 0.026984289288520813, -0.03581860661506653, 0.022867340594530106, -0.038586873561143875, 0.05935443192720413, -0.11104989051818848, -0.09518823772668839 ]
null
null
transformers
# MultiBERTs Seed 4 Checkpoint 140k (uncased) Seed 4 intermediate checkpoint 140k MultiBERTs (pretrained BERT) model on the English language using a masked language modeling (MLM) objective. It was introduced in [this paper](https://arxiv.org/pdf/2106.16163.pdf) and first released in [this repository](https://github.com/google-research/language/tree/master/language/multiberts). This is an intermediate checkpoint. The final checkpoint can be found at [multiberts-seed-4](https://hf.co/multberts-seed-4). This model is uncased: it does not make a difference between english and English. Disclaimer: The team releasing MultiBERTs did not write a model card for this model, so this model card has been written by [gchhablani](https://hf.co/gchhablani). ## Model description MultiBERTs models are transformer models pretrained on a large corpus of English data in a self-supervised fashion. This means they were pretrained on the raw texts only, with no humans labelling them in any way (which is why they can use lots of publicly available data), with an automatic process to generate inputs and labels from those texts. More precisely, they were pretrained with two objectives: - Masked language modeling (MLM): taking a sentence, the model randomly masks 15% of the words in the input, then runs the entire masked sentence through the model and has to predict the masked words. This is different from traditional recurrent neural networks (RNNs) that usually see the words one after the other, or from autoregressive models like GPT which internally mask the future tokens. It allows the model to learn a bidirectional representation of the sentence. - Next sentence prediction (NSP): the model concatenates two masked sentences as inputs during pretraining. Sometimes they correspond to sentences that were next to each other in the original text, sometimes not. The model then has to predict whether the two sentences were following each other or not. This way, the model learns an inner representation of the English language that can then be used to extract features useful for downstream tasks: if you have a dataset of labeled sentences, for instance, you can train a standard classifier using the features produced by the MultiBERTs model as inputs. ## Intended uses & limitations You can use the raw model for either masked language modeling or next sentence prediction, but it's mostly intended to be fine-tuned on a downstream task. See the [model hub](https://huggingface.co/models?filter=multiberts) to look for fine-tuned versions on a task that interests you. Note that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked) to make decisions, such as sequence classification, token classification or question answering. For tasks such as text generation, you should look at models like GPT2. ### How to use Here is how to use this model to get the features of a given text in PyTorch: ```python from transformers import BertTokenizer, BertModel tokenizer = BertTokenizer.from_pretrained('multiberts-seed-4-140k') model = BertModel.from_pretrained("multiberts-seed-4-140k") text = "Replace me by any text you'd like." encoded_input = tokenizer(text, return_tensors='pt') output = model(**encoded_input) ``` ### Limitations and bias Even if the training data used for this model could be characterized as fairly neutral, this model can have biased predictions. This bias will also affect all fine-tuned versions of this model. 
For an understanding of bias of this particular checkpoint, please try out this checkpoint with the snippet present in the [Limitations and bias section](https://huggingface.co/bert-base-uncased#limitations-and-bias) of the [bert-base-uncased](https://huggingface.co/bert-base-uncased) checkpoint. ## Training data The MultiBERTs models were pretrained on [BookCorpus](https://yknzhu.wixsite.com/mbweb), a dataset consisting of 11,038 unpublished books and [English Wikipedia](https://en.wikipedia.org/wiki/English_Wikipedia) (excluding lists, tables and headers). ## Training procedure ### Preprocessing The texts are lowercased and tokenized using WordPiece and a vocabulary size of 30,000. The inputs of the model are then of the form: ``` [CLS] Sentence A [SEP] Sentence B [SEP] ``` With probability 0.5, sentence A and sentence B correspond to two consecutive sentences in the original corpus, and in the other cases sentence B is another random sentence from the corpus. Note that what is considered a sentence here is a consecutive span of text usually longer than a single sentence. The only constraint is that the result with the two "sentences" has a combined length of less than 512 tokens. The details of the masking procedure for each sentence are the following: - 15% of the tokens are masked. - In 80% of the cases, the masked tokens are replaced by `[MASK]`. - In 10% of the cases, the masked tokens are replaced by a random token (different from the one they replace). - In the 10% remaining cases, the masked tokens are left as is. ### Pretraining The full model was trained on 16 Cloud TPU v2 chips for two million steps with a batch size of 256. The sequence length was set to 512 throughout. The optimizer used is Adam with a learning rate of 1e-4, \\(\beta_{1} = 0.9\\) and \\(\beta_{2} = 0.999\\), a weight decay of 0.01, learning rate warmup for 10,000 steps and linear decay of the learning rate after. A short sketch of this learning-rate schedule is given after the citation info below. ### BibTeX entry and citation info ```bibtex @article{DBLP:journals/corr/abs-2106-16163, author = {Thibault Sellam and Steve Yadlowsky and Jason Wei and Naomi Saphra and Alexander D'Amour and Tal Linzen and Jasmijn Bastings and Iulia Turc and Jacob Eisenstein and Dipanjan Das and Ian Tenney and Ellie Pavlick}, title = {The MultiBERTs: {BERT} Reproductions for Robustness Analysis}, journal = {CoRR}, volume = {abs/2106.16163}, year = {2021}, url = {https://arxiv.org/abs/2106.16163}, eprinttype = {arXiv}, eprint = {2106.16163}, timestamp = {Mon, 05 Jul 2021 15:15:50 +0200}, biburl = {https://dblp.org/rec/journals/corr/abs-2106-16163.bib}, bibsource = {dblp computer science bibliography, https://dblp.org} } ``` <a href="https://huggingface.co/exbert/?model=multiberts"> <img width="300px" src="https://cdn-media.huggingface.co/exbert/button.png"> </a>
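The warmup-then-linear-decay learning-rate schedule described under Pretraining above can be written down in a few lines. The sketch below is an illustration only, using the values stated in the card (peak learning rate 1e-4, 10,000 warmup steps, two million total steps); it is not the original pretraining code.

```python
# Illustrative reconstruction of the schedule described above: linear warmup to the
# peak rate over the first 10,000 steps, then linear decay to zero by the final step.
# All constants are taken from the card, not from the original training scripts.
PEAK_LR = 1e-4
WARMUP_STEPS = 10_000
TOTAL_STEPS = 2_000_000

def learning_rate(step: int) -> float:
    if step < WARMUP_STEPS:
        return PEAK_LR * step / WARMUP_STEPS
    return max(0.0, PEAK_LR * (TOTAL_STEPS - step) / (TOTAL_STEPS - WARMUP_STEPS))

for s in (0, 5_000, 10_000, 1_000_000, 2_000_000):
    print(f"step {s:>9}: lr = {learning_rate(s):.2e}")
```

The `get_linear_schedule_with_warmup` helper in the transformers library produces the same shape when given these step counts, though the card does not say which implementation the original runs used.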
{"language": "en", "license": "apache-2.0", "tags": ["exbert", "multiberts", "multiberts-seed-4"], "datasets": ["bookcorpus", "wikipedia"]}
null
MultiBertGunjanPatrick/multiberts-seed-4-140k
[ "transformers", "pytorch", "bert", "pretraining", "exbert", "multiberts", "multiberts-seed-4", "en", "dataset:bookcorpus", "dataset:wikipedia", "arxiv:2106.16163", "license:apache-2.0", "endpoints_compatible", "region:us" ]
2022-03-02T23:29:04+00:00
[ "2106.16163" ]
[ "en" ]
TAGS #transformers #pytorch #bert #pretraining #exbert #multiberts #multiberts-seed-4 #en #dataset-bookcorpus #dataset-wikipedia #arxiv-2106.16163 #license-apache-2.0 #endpoints_compatible #region-us
# MultiBERTs Seed 4 Checkpoint 140k (uncased) Seed 4 intermediate checkpoint 140k MultiBERTs (pretrained BERT) model on English language using a masked language modeling (MLM) objective. It was introduced in this paper and first released in this repository. This is an intermediate checkpoint. The final checkpoint can be found at multiberts-seed-4. This model is uncased: it does not make a difference between english and English. Disclaimer: The team releasing MultiBERTs did not write a model card for this model so this model card has been written by gchhablani. ## Model description MultiBERTs models are transformers model pretrained on a large corpus of English data in a self-supervised fashion. This means it was pretrained on the raw texts only, with no humans labelling them in any way (which is why it can use lots of publicly available data) with an automatic process to generate inputs and labels from those texts. More precisely, it was pretrained with two objectives: - Masked language modeling (MLM): taking a sentence, the model randomly masks 15% of the words in the input then run the entire masked sentence through the model and has to predict the masked words. This is different from traditional recurrent neural networks (RNNs) that usually see the words one after the other, or from autoregressive models like GPT which internally mask the future tokens. It allows the model to learn a bidirectional representation of the sentence. - Next sentence prediction (NSP): the models concatenates two masked sentences as inputs during pretraining. Sometimes they correspond to sentences that were next to each other in the original text, sometimes not. The model then has to predict if the two sentences were following each other or not. This way, the model learns an inner representation of the English language that can then be used to extract features useful for downstream tasks: if you have a dataset of labeled sentences for instance, you can train a standard classifier using the features produced by the MultiBERTs model as inputs. ## Intended uses & limitations You can use the raw model for either masked language modeling or next sentence prediction, but it's mostly intended to be fine-tuned on a downstream task. See the model hub to look for fine-tuned versions on a task that interests you. Note that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked) to make decisions, such as sequence classification, token classification or question answering. For tasks such as text generation you should look at model like GPT2. ### How to use Here is how to use this model to get the features of a given text in PyTorch: ### Limitations and bias Even if the training data used for this model could be characterized as fairly neutral, this model can have biased predictions. This bias will also affect all fine-tuned versions of this model. For an understanding of bias of this particular checkpoint, please try out this checkpoint with the snippet present in the Limitation and bias section of the bert-base-uncased checkpoint. ## Training data The MultiBERTs models were pretrained on BookCorpus, a dataset consisting of 11,038 unpublished books and English Wikipedia (excluding lists, tables and headers). ## Training procedure ### Preprocessing The texts are lowercased and tokenized using WordPiece and a vocabulary size of 30,000. 
The inputs of the model are then of the form: With probability 0.5, sentence A and sentence B correspond to two consecutive sentences in the original corpus and in the other cases, it's another random sentence in the corpus. Note that what is considered a sentence here is a consecutive span of text usually longer than a single sentence. The only constrain is that the result with the two "sentences" has a combined length of less than 512 tokens. The details of the masking procedure for each sentence are the following: - 15% of the tokens are masked. - In 80% of the cases, the masked tokens are replaced by '[MASK]'. - In 10% of the cases, the masked tokens are replaced by a random token (different) from the one they replace. - In the 10% remaining cases, the masked tokens are left as is. ### Pretraining The full model was trained on 16 Cloud TPU v2 chips for two million steps with a batch size of 256. The sequence length was set to 512 throughout. The optimizer used is Adam with a learning rate of 1e-4, \\(\beta_{1} = 0.9\\) and \\(\beta_{2} = 0.999\\), a weight decay of 0.01, learning rate warmup for 10,000 steps and linear decay of the learning rate after. ### BibTeX entry and citation info <a href="URL <img width="300px" src="URL </a>
[ "# MultiBERTs Seed 4 Checkpoint 140k (uncased)\nSeed 4 intermediate checkpoint 140k MultiBERTs (pretrained BERT) model on English language using a masked language modeling (MLM) objective. It was introduced in\nthis paper and first released in\nthis repository. This is an intermediate checkpoint.\nThe final checkpoint can be found at multiberts-seed-4. This model is uncased: it does not make a difference\nbetween english and English.\n\nDisclaimer: The team releasing MultiBERTs did not write a model card for this model so this model card has been written by gchhablani.", "## Model description\nMultiBERTs models are transformers model pretrained on a large corpus of English data in a self-supervised fashion. This means it\nwas pretrained on the raw texts only, with no humans labelling them in any way (which is why it can use lots of\npublicly available data) with an automatic process to generate inputs and labels from those texts. More precisely, it\nwas pretrained with two objectives:\n- Masked language modeling (MLM): taking a sentence, the model randomly masks 15% of the words in the input then run\n the entire masked sentence through the model and has to predict the masked words. This is different from traditional\n recurrent neural networks (RNNs) that usually see the words one after the other, or from autoregressive models like\n GPT which internally mask the future tokens. It allows the model to learn a bidirectional representation of the\n sentence.\n- Next sentence prediction (NSP): the models concatenates two masked sentences as inputs during pretraining. Sometimes\n they correspond to sentences that were next to each other in the original text, sometimes not. The model then has to\n predict if the two sentences were following each other or not.\nThis way, the model learns an inner representation of the English language that can then be used to extract features\nuseful for downstream tasks: if you have a dataset of labeled sentences for instance, you can train a standard\nclassifier using the features produced by the MultiBERTs model as inputs.", "## Intended uses & limitations\nYou can use the raw model for either masked language modeling or next sentence prediction, but it's mostly intended to\nbe fine-tuned on a downstream task. See the model hub to look for\nfine-tuned versions on a task that interests you.\nNote that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked)\nto make decisions, such as sequence classification, token classification or question answering. For tasks such as text\ngeneration you should look at model like GPT2.", "### How to use\nHere is how to use this model to get the features of a given text in PyTorch:", "### Limitations and bias\nEven if the training data used for this model could be characterized as fairly neutral, this model can have biased\npredictions. This bias will also affect all fine-tuned versions of this model. For an understanding of bias of this particular\ncheckpoint, please try out this checkpoint with the snippet present in the Limitation and bias section of the bert-base-uncased checkpoint.", "## Training data\nThe MultiBERTs models were pretrained on BookCorpus, a dataset consisting of 11,038\nunpublished books and English Wikipedia (excluding lists, tables and\nheaders).", "## Training procedure", "### Preprocessing\nThe texts are lowercased and tokenized using WordPiece and a vocabulary size of 30,000. 
The inputs of the model are\nthen of the form:\n\nWith probability 0.5, sentence A and sentence B correspond to two consecutive sentences in the original corpus and in\nthe other cases, it's another random sentence in the corpus. Note that what is considered a sentence here is a\nconsecutive span of text usually longer than a single sentence. The only constrain is that the result with the two\n\"sentences\" has a combined length of less than 512 tokens.\nThe details of the masking procedure for each sentence are the following:\n- 15% of the tokens are masked.\n- In 80% of the cases, the masked tokens are replaced by '[MASK]'.\n- In 10% of the cases, the masked tokens are replaced by a random token (different) from the one they replace.\n- In the 10% remaining cases, the masked tokens are left as is.", "### Pretraining\nThe full model was trained on 16 Cloud TPU v2 chips for two million steps with a batch size\nof 256. The sequence length was set to 512 throughout. The optimizer\nused is Adam with a learning rate of 1e-4, \\\\(\\beta_{1} = 0.9\\\\) and \\\\(\\beta_{2} = 0.999\\\\), a weight decay of 0.01,\nlearning rate warmup for 10,000 steps and linear decay of the learning rate after.", "### BibTeX entry and citation info\n\n<a href=\"URL\n\t<img width=\"300px\" src=\"URL\n</a>" ]
[ "TAGS\n#transformers #pytorch #bert #pretraining #exbert #multiberts #multiberts-seed-4 #en #dataset-bookcorpus #dataset-wikipedia #arxiv-2106.16163 #license-apache-2.0 #endpoints_compatible #region-us \n", "# MultiBERTs Seed 4 Checkpoint 140k (uncased)\nSeed 4 intermediate checkpoint 140k MultiBERTs (pretrained BERT) model on English language using a masked language modeling (MLM) objective. It was introduced in\nthis paper and first released in\nthis repository. This is an intermediate checkpoint.\nThe final checkpoint can be found at multiberts-seed-4. This model is uncased: it does not make a difference\nbetween english and English.\n\nDisclaimer: The team releasing MultiBERTs did not write a model card for this model so this model card has been written by gchhablani.", "## Model description\nMultiBERTs models are transformers model pretrained on a large corpus of English data in a self-supervised fashion. This means it\nwas pretrained on the raw texts only, with no humans labelling them in any way (which is why it can use lots of\npublicly available data) with an automatic process to generate inputs and labels from those texts. More precisely, it\nwas pretrained with two objectives:\n- Masked language modeling (MLM): taking a sentence, the model randomly masks 15% of the words in the input then run\n the entire masked sentence through the model and has to predict the masked words. This is different from traditional\n recurrent neural networks (RNNs) that usually see the words one after the other, or from autoregressive models like\n GPT which internally mask the future tokens. It allows the model to learn a bidirectional representation of the\n sentence.\n- Next sentence prediction (NSP): the models concatenates two masked sentences as inputs during pretraining. Sometimes\n they correspond to sentences that were next to each other in the original text, sometimes not. The model then has to\n predict if the two sentences were following each other or not.\nThis way, the model learns an inner representation of the English language that can then be used to extract features\nuseful for downstream tasks: if you have a dataset of labeled sentences for instance, you can train a standard\nclassifier using the features produced by the MultiBERTs model as inputs.", "## Intended uses & limitations\nYou can use the raw model for either masked language modeling or next sentence prediction, but it's mostly intended to\nbe fine-tuned on a downstream task. See the model hub to look for\nfine-tuned versions on a task that interests you.\nNote that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked)\nto make decisions, such as sequence classification, token classification or question answering. For tasks such as text\ngeneration you should look at model like GPT2.", "### How to use\nHere is how to use this model to get the features of a given text in PyTorch:", "### Limitations and bias\nEven if the training data used for this model could be characterized as fairly neutral, this model can have biased\npredictions. This bias will also affect all fine-tuned versions of this model. 
For an understanding of bias of this particular\ncheckpoint, please try out this checkpoint with the snippet present in the Limitation and bias section of the bert-base-uncased checkpoint.", "## Training data\nThe MultiBERTs models were pretrained on BookCorpus, a dataset consisting of 11,038\nunpublished books and English Wikipedia (excluding lists, tables and\nheaders).", "## Training procedure", "### Preprocessing\nThe texts are lowercased and tokenized using WordPiece and a vocabulary size of 30,000. The inputs of the model are\nthen of the form:\n\nWith probability 0.5, sentence A and sentence B correspond to two consecutive sentences in the original corpus and in\nthe other cases, it's another random sentence in the corpus. Note that what is considered a sentence here is a\nconsecutive span of text usually longer than a single sentence. The only constrain is that the result with the two\n\"sentences\" has a combined length of less than 512 tokens.\nThe details of the masking procedure for each sentence are the following:\n- 15% of the tokens are masked.\n- In 80% of the cases, the masked tokens are replaced by '[MASK]'.\n- In 10% of the cases, the masked tokens are replaced by a random token (different) from the one they replace.\n- In the 10% remaining cases, the masked tokens are left as is.", "### Pretraining\nThe full model was trained on 16 Cloud TPU v2 chips for two million steps with a batch size\nof 256. The sequence length was set to 512 throughout. The optimizer\nused is Adam with a learning rate of 1e-4, \\\\(\\beta_{1} = 0.9\\\\) and \\\\(\\beta_{2} = 0.999\\\\), a weight decay of 0.01,\nlearning rate warmup for 10,000 steps and linear decay of the learning rate after.", "### BibTeX entry and citation info\n\n<a href=\"URL\n\t<img width=\"300px\" src=\"URL\n</a>" ]
[ 71, 145, 335, 134, 25, 95, 48, 3, 222, 111, 34 ]
[ "passage: TAGS\n#transformers #pytorch #bert #pretraining #exbert #multiberts #multiberts-seed-4 #en #dataset-bookcorpus #dataset-wikipedia #arxiv-2106.16163 #license-apache-2.0 #endpoints_compatible #region-us \n# MultiBERTs Seed 4 Checkpoint 140k (uncased)\nSeed 4 intermediate checkpoint 140k MultiBERTs (pretrained BERT) model on English language using a masked language modeling (MLM) objective. It was introduced in\nthis paper and first released in\nthis repository. This is an intermediate checkpoint.\nThe final checkpoint can be found at multiberts-seed-4. This model is uncased: it does not make a difference\nbetween english and English.\n\nDisclaimer: The team releasing MultiBERTs did not write a model card for this model so this model card has been written by gchhablani.", "passage: ## Model description\nMultiBERTs models are transformers model pretrained on a large corpus of English data in a self-supervised fashion. This means it\nwas pretrained on the raw texts only, with no humans labelling them in any way (which is why it can use lots of\npublicly available data) with an automatic process to generate inputs and labels from those texts. More precisely, it\nwas pretrained with two objectives:\n- Masked language modeling (MLM): taking a sentence, the model randomly masks 15% of the words in the input then run\n the entire masked sentence through the model and has to predict the masked words. This is different from traditional\n recurrent neural networks (RNNs) that usually see the words one after the other, or from autoregressive models like\n GPT which internally mask the future tokens. It allows the model to learn a bidirectional representation of the\n sentence.\n- Next sentence prediction (NSP): the models concatenates two masked sentences as inputs during pretraining. Sometimes\n they correspond to sentences that were next to each other in the original text, sometimes not. The model then has to\n predict if the two sentences were following each other or not.\nThis way, the model learns an inner representation of the English language that can then be used to extract features\nuseful for downstream tasks: if you have a dataset of labeled sentences for instance, you can train a standard\nclassifier using the features produced by the MultiBERTs model as inputs.## Intended uses & limitations\nYou can use the raw model for either masked language modeling or next sentence prediction, but it's mostly intended to\nbe fine-tuned on a downstream task. See the model hub to look for\nfine-tuned versions on a task that interests you.\nNote that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked)\nto make decisions, such as sequence classification, token classification or question answering. For tasks such as text\ngeneration you should look at model like GPT2.### How to use\nHere is how to use this model to get the features of a given text in PyTorch:### Limitations and bias\nEven if the training data used for this model could be characterized as fairly neutral, this model can have biased\npredictions. This bias will also affect all fine-tuned versions of this model. 
For an understanding of bias of this particular\ncheckpoint, please try out this checkpoint with the snippet present in the Limitation and bias section of the bert-base-uncased checkpoint.## Training data\nThe MultiBERTs models were pretrained on BookCorpus, a dataset consisting of 11,038\nunpublished books and English Wikipedia (excluding lists, tables and\nheaders).## Training procedure" ]
[ -0.08746246993541718, 0.0009472195524722338, -0.0021970239467918873, 0.069978728890419, 0.08823245018720627, 0.005353750195354223, 0.1161578968167305, 0.048766329884529114, -0.032695021480321884, 0.023878058418631554, 0.09251043200492859, 0.028022654354572296, 0.04139750078320503, 0.06287317723035812, 0.09635430574417114, -0.2587711215019226, 0.048636116087436676, -0.062339119613170624, 0.05318313091993332, 0.07500855624675751, 0.09855152666568756, -0.07249066978693008, 0.06272177398204803, 0.03683152794837952, -0.08485496789216995, -0.016585698351264, -0.016262486577033997, -0.03383907303214073, 0.10096710920333862, 0.07112956047058105, 0.06235908716917038, 0.0006941687315702438, 0.05755802243947983, -0.08648285269737244, 0.016453158110380173, 0.04474160075187683, -0.0023204945027828217, 0.025865882635116577, -0.008679497987031937, 0.01815582625567913, 0.10669445246458054, 0.03715065121650696, 0.07545974105596542, 0.03406891971826553, -0.09539942443370819, -0.12789668142795563, -0.08011873066425323, 0.10219050943851471, 0.05103566125035286, 0.042115069925785065, -0.006063432432711124, 0.07038162648677826, -0.02981259860098362, 0.07397685945034027, 0.10062673687934875, -0.26190289855003357, -0.008407073095440865, 0.064555324614048, 0.046134479343891144, 0.04219985008239746, 0.009931100532412529, 0.028165273368358612, 0.006637111306190491, 0.04514180123806, 0.030421487987041473, -0.023997128009796143, 0.11988568305969238, -0.04324779659509659, -0.15016576647758484, -0.04115850478410721, 0.12198016792535782, -0.006161853671073914, -0.1259649097919464, -0.09976471960544586, -0.029252298176288605, 0.11858662217855453, -0.00548770884051919, -0.017010314390063286, -0.0034970627166330814, 0.011046384461224079, 0.026133956387639046, -0.09167569130659103, -0.086832195520401, -0.026241429150104523, -0.03580082952976227, 0.12612313032150269, 0.04739786684513092, 0.05211414769291878, -0.03412853926420212, 0.08496546745300293, -0.11871020495891571, -0.0398980975151062, -0.050174519419670105, -0.08291284739971161, -0.017977513372898102, 0.010604149661958218, -0.02870546281337738, -0.07711372524499893, -0.05736774578690529, 0.11815302819013596, 0.02932392805814743, 0.030941609293222427, 0.001782524399459362, 0.03947276622056961, 0.07135401666164398, 0.09268587827682495, -0.04131564497947693, 0.04890456050634384, 0.03600718826055527, -0.02437618374824524, 0.05696270242333412, -0.05111997202038765, -0.10177206993103027, 0.07568387687206268, -0.0016124611720442772, 0.03778550401329994, 0.023222224786877632, 0.03660016506910324, -0.009059084579348564, -0.07079136371612549, 0.16998225450515747, -0.07870118319988251, -0.007803038693964481, -0.015315012075006962, 0.012170987203717232, 0.04775089770555496, 0.03210774064064026, -0.008753939531743526, -0.048336245119571686, -0.006858525797724724, -0.05622211843729019, -0.02817666158080101, -0.05508771166205406, -0.12182575464248657, -0.0014264057390391827, -0.03460860997438431, -0.03106841817498207, -0.1401727795600891, -0.21508166193962097, -0.01972910389304161, 0.06388841569423676, -0.002476146910339594, -0.010050063021481037, 0.02711791917681694, 0.014973768964409828, -0.01944621652364731, 0.010872888378798962, -0.04582320898771286, -0.0013388674706220627, -0.007227490656077862, -0.02790137380361557, 0.05831187963485718, -0.04184233024716377, 0.022598598152399063, -0.06732498109340668, 0.022482460364699364, -0.2112710177898407, 0.08755666762590408, -0.03367636725306511, 0.0030213389545679092, -0.03665244206786156, -0.047332413494586945, 0.012170843780040741, 
0.04803544282913208, -0.00948939099907875, 0.116258904337883, -0.13330209255218506, -0.05159973353147507, 0.18325971066951752, -0.1592203974723816, -0.0022627897560596466, 0.10188667476177216, -0.04889558255672455, 0.05389852821826935, 0.13437819480895996, 0.09705539047718048, 0.09041811525821686, -0.075750432908535, 0.011327573098242283, 0.062010299414396286, -0.06861564517021179, 0.05568385124206543, 0.08847872912883759, -0.02251027151942253, -0.13774473965168, 0.030861224979162216, -0.07218591868877411, -0.008048564195632935, -0.02536151558160782, -0.020421231165528297, 0.007288888096809387, -0.03731123358011246, 0.031230982393026352, 0.006413499359041452, 0.018496762961149216, -0.04047219082713127, -0.0828322172164917, 0.03200021758675575, 0.07541745156049728, -0.07361261546611786, 0.041278257966041565, -0.07089368999004364, 0.0653746947646141, -0.07575423270463943, -0.003383566625416279, -0.16617444157600403, -0.02633531391620636, 0.043497730046510696, -0.047455791383981705, 0.04837404191493988, 0.09264498949050903, 0.0046952576376497746, 0.12518073618412018, -0.04158298671245575, 0.0034567448310554028, -0.005333779379725456, -0.01031797006726265, -0.050297338515520096, -0.12465919554233551, -0.08305755257606506, -0.06689155101776123, 0.1039004772901535, -0.08108824491500854, 0.02954941615462303, -0.0690237358212471, -0.02245006524026394, -0.01050209067761898, -0.05589353293180466, -0.004894038662314415, 0.010427814908325672, -0.027410851791501045, -0.04590606689453125, 0.04964510351419449, 0.05076988786458969, -0.057884663343429565, 0.07726171612739563, -0.10488627105951309, -0.05755789205431938, 0.0568884015083313, 0.01082950085401535, -0.08129475265741348, 0.08872853964567184, -0.019152043387293816, -0.013964170590043068, -0.054278627038002014, -0.04530246928334236, 0.19257453083992004, -0.026549413800239563, 0.10281001031398773, -0.09025370329618454, 0.002016246784478426, 0.029102148488163948, -0.04501796513795853, -0.014663305133581161, 0.0579756535589695, 0.045853421092033386, -0.18214412033557892, 0.01532735675573349, 0.050284966826438904, 0.07608311623334885, 0.11366423964500427, 0.025873608887195587, -0.023792460560798645, -0.04608293995261192, -0.01181076280772686, 0.004917198792099953, 0.05755101144313812, -0.02780415490269661, -0.008665180765092373, 0.032628268003463745, 0.05641397833824158, 0.018544018268585205, -0.08022366464138031, 0.036358267068862915, 0.06640973687171936, -0.01650961861014366, -0.0431189201772213, -0.024935027584433556, -0.06012452021241188, 0.06169292703270912, 0.05270363390445709, 0.03740125149488449, 0.026539409533143044, -0.01384368073195219, -0.13556745648384094, 0.1892927587032318, -0.11635509878396988, -0.25669920444488525, -0.1101977527141571, -0.06069391965866089, -0.024199580773711205, 0.04041866213083267, 0.0583658292889595, -0.031739313155412674, -0.04315776377916336, -0.113941490650177, 0.060181569308042526, -0.0647740587592125, -0.02896767668426037, -0.012514863163232803, -0.052868545055389404, -0.017084818333387375, -0.12633168697357178, -0.010436199605464935, -0.02863452397286892, -0.07120084762573242, 0.007468049414455891, -0.033112525939941406, 0.0291617251932621, 0.13500241935253143, 0.034660473465919495, -0.020241571590304375, -0.017540473490953445, 0.19745030999183655, 0.009691178798675537, 0.06211330369114876, 0.1160062924027443, -0.02747298777103424, 0.05365511775016785, 0.04416240006685257, 0.026258084923028946, -0.048053354024887085, 0.00993829034268856, -0.01593325100839138, -0.11936178803443909, -0.1736207902431488, 
-0.07037903368473053, -0.005225156433880329, 0.005308677442371845, 0.018483957275748253, 0.03493283689022064, 0.022610753774642944, 0.04014094918966293, -0.030096199363470078, 0.03326783329248428, -0.013026338070631027, 0.08049074560403824, 0.03650359436869621, -0.07469647377729416, 0.09276232123374939, -0.06252874433994293, 0.01651015877723694, 0.110007643699646, -0.06042641028761864, 0.18805797398090363, 0.02360638603568077, 0.05938518047332764, 0.10143646597862244, 0.02331874892115593, 0.056093979626894, 0.08879870176315308, -0.04871674254536629, 0.005053922533988953, -0.05975351482629776, -0.05220692604780197, -0.0324212908744812, 0.04793331027030945, 0.026896867901086807, 0.015519652515649796, -0.12061025202274323, 0.016670070588588715, -0.002711799694225192, 0.1338786780834198, 0.05011190101504326, -0.12353408336639404, -0.12693919241428375, 0.032914355397224426, -0.04351355880498886, -0.06396237760782242, 0.028180250898003578, 0.061759620904922485, -0.15187326073646545, 0.04758869484066963, -0.005801808089017868, 0.06598956137895584, -0.09132879227399826, 0.017039241269230843, -0.04040679708123207, 0.002419952303171158, 0.004262713715434074, 0.07075843214988708, -0.13184885680675507, 0.10101036727428436, 0.022139985114336014, 0.04986441880464554, -0.0796470046043396, 0.016662681475281715, -0.010091317817568779, 0.11322629451751709, 0.112686388194561, 0.043577127158641815, -0.053900133818387985, -0.017654817551374435, -0.04320959001779556, 0.02140205353498459, 0.058614205569028854, -0.0744323804974556, 0.06216535344719887, 0.009193412959575653, 0.006071495823562145, -0.02304762788116932, 0.01413322240114212, -0.13360150158405304, -0.12385530024766922, 0.06088683754205704, -0.07460546493530273, -0.09833461791276932, -0.05524773150682449, -0.06405609101057053, -0.05372065305709839, 0.21280474960803986, -0.11488962918519974, -0.09146340936422348, -0.09948204457759857, -0.013452231884002686, 0.04700776934623718, -0.06468933820724487, 0.047910355031490326, -0.03761892765760422, 0.09277993440628052, -0.045109715312719345, -0.11023540794849396, 0.03503471985459328, -0.11329007893800735, -0.115119069814682, -0.04365755617618561, 0.1062803715467453, 0.11304257810115814, 0.03632296621799469, 0.013136821798980236, 0.011047732084989548, -0.002132408320903778, -0.11891786754131317, 0.01595008745789528, 0.12635722756385803, 0.0035238321870565414, 0.06930996477603912, -0.062245868146419525, 0.027846917510032654, -0.013373352587223053, 0.0015998873859643936, 0.12909434735774994, 0.18449406325817108, -0.0645555853843689, 0.17308948934078217, 0.202473983168602, -0.10462488234043121, -0.18941500782966614, -0.055357132107019424, -0.0011908262968063354, 0.046583112329244614, 0.047249194234609604, -0.18757128715515137, 0.09172152727842331, 0.03215679153800011, -0.030492596328258514, 0.0260750874876976, -0.22844262421131134, -0.11094464361667633, 0.08732925355434418, 0.056476060301065445, 0.18966612219810486, -0.08285939693450928, -0.039706818759441376, -0.017861491069197655, -0.04016594588756561, 0.04440128058195114, -0.04355882853269577, 0.09014105796813965, 0.005992058664560318, -0.02924424037337303, 0.000951613299548626, -0.030742354691028595, 0.09844868630170822, 0.04055091738700867, 0.023105056956410408, -0.07110843062400818, -0.001303190365433693, 0.11655231565237045, -0.03751974552869797, 0.09820979833602905, 0.040214553475379944, 0.07454389333724976, -0.101354219019413, -0.0596260204911232, -0.07614357024431229, 0.0433146171271801, -0.04264549911022186, -0.053092051297426224, -0.0636340081691742, 
0.05692359060049057, 0.0357610359787941, 0.010752992704510689, -0.004191886633634567, -0.037496425211429596, 0.045905034989118576, 0.08840025961399078, 0.08120951056480408, -0.029751021414995193, -0.0742875337600708, -0.04996214061975479, -0.048698943108320236, 0.06828703731298447, -0.09493037313222885, 0.01847969740629196, 0.027926433831453323, 0.009967428632080555, 0.08920510113239288, 0.03329455852508545, -0.13688893616199493, 0.011333728209137917, 0.03532438352704048, -0.12707719206809998, -0.10647353529930115, -0.018492799252271652, 0.01995772495865822, -0.03722795099020004, 0.056071214377880096, 0.14733493328094482, -0.03533557802438736, -0.031505219638347626, -0.04802281782031059, 0.036769747734069824, -0.020785026252269745, 0.05247065797448158, 0.06601786613464355, 0.031085580587387085, -0.07417607307434082, 0.07125811278820038, 0.038846373558044434, -0.038587842136621475, 0.04179254174232483, 0.04203641414642334, -0.09702914953231812, -0.08021783828735352, -0.06038017198443413, 0.0905526727437973, -0.019239889457821846, -0.05116342008113861, -0.001542307436466217, -0.08272095024585724, 0.06850969791412354, 0.07093381881713867, 0.04687982797622681, 0.037255577743053436, -0.08597142994403839, 0.015006904490292072, -0.05583883076906204, 0.03539923578500748, -0.031220996752381325, -0.0047131311148405075, -0.05789162963628769, 0.06718520820140839, 0.0634126141667366, 0.09817653149366379, -0.03501388430595398, -0.07456977665424347, -0.08467443287372589, -0.015190467238426208, -0.06787282973527908, -0.032094500958919525, -0.07567442208528519, -0.0052183279767632484, 0.0006288778968155384, -0.0019085481762886047, 0.022324370220303535, 0.03703971952199936, -0.043154001235961914, -0.01796245016157627, -0.032800715416669846, 0.03804599121212959, -0.06276901811361313, 0.006636825390160084, 0.01436711847782135, -0.037361666560173035, 0.09445618838071823, 0.037830859422683716, -0.011569302529096603, 0.04713905602693558, -0.01992032676935196, 0.03516753762960434, -0.01986522413790226, -0.0003262776881456375, -0.023080725222826004, -0.1091669499874115, -0.003941893577575684, 0.00432182103395462, -0.02531825751066208, 0.011107560247182846, 0.05994022637605667, -0.07158885896205902, 0.08652402460575104, 0.04709304869174957, -0.028524193912744522, -0.07124385982751846, 0.04089543968439102, -0.01525990478694439, 0.027935899794101715, 0.06942397356033325, -0.034934334456920624, 0.051720064133405685, -0.10049786418676376, -0.030156895518302917, 0.0024250070564448833, -0.006476137787103653, -0.007522685453295708, -0.05284172296524048, -0.001920289359986782, 0.006193287670612335, 0.17704027891159058, -0.023997172713279724, 0.03912264108657837, 0.011892024427652359, 0.00684844795614481, 0.04706581309437752, -0.011862948536872864, 0.07312195003032684, -0.007552492432296276, -0.025951068848371506, -0.014560654759407043, 0.03925589844584465, 0.003560151904821396, 0.006487235426902771, 0.13525676727294922, 0.04360755532979965, 0.08856900781393051, 0.07530363649129868, 0.016862737014889717, 0.016611868515610695, -0.1324542760848999, -0.08769499510526657, 0.005855252966284752, 0.06102364882826805, -0.019183678552508354, 0.012811293825507164, 0.09279656410217285, -0.08724576234817505, 0.07132360339164734, 0.047123003751039505, -0.050217241048812866, -0.12431937456130981, -0.19189730286598206, -0.023836908861994743, -0.032355278730392456, -0.011782243847846985, -0.09148556739091873, 0.014463551342487335, 0.09007880091667175, 0.024468228220939636, -0.01161168422549963, 0.09445150941610336, -0.09965229034423828, 
-0.03083144873380661, 0.04384062439203262, -0.027431584894657135, 0.014096911065280437, 0.05282276123762131, 0.023936767131090164, -0.004204446449875832, 0.04151551425457001, 0.03944350779056549, 0.042802199721336365, 0.027547167614102364, 0.05384642630815506, -0.024055976420640945, -0.07356984913349152, -0.03294633328914642, -0.0005603330209851265, 0.05493857339024544, 0.1341649442911148, 0.02213452011346817, -0.07028891146183014, 0.005831429269164801, 0.10787634551525116, -0.0305483341217041, -0.04835879057645798, -0.1099647805094719, 0.23968634009361267, 0.021793067455291748, 0.0012001751456409693, -0.005017687100917101, -0.0456710085272789, 0.006946450099349022, 0.21336519718170166, 0.2258443832397461, 0.003272040281444788, -0.009620518423616886, 0.010681109502911568, -0.011815061792731285, 0.03830850124359131, 0.148244708776474, 0.005335124209523201, 0.2516010105609894, -0.04616942256689072, 0.035574544221162796, -0.04371640086174011, -0.03846672549843788, -0.10238030552864075, 0.07306499034166336, -0.00958986021578312, 0.006430125795304775, -0.029348311945796013, 0.07243673503398895, -0.036825843155384064, -0.17728735506534576, 0.0033180546015501022, 0.001413568970747292, -0.06260108947753906, 0.009382486343383789, -0.002305293455719948, 0.022023113444447517, 0.08215995877981186, -0.018252823501825333, -0.0074419486336410046, 0.13612768054008484, 0.01781287230551243, -0.09711065888404846, -0.05848047137260437, 0.11533347517251968, 0.013417751528322697, 0.13813789188861847, 0.010537078604102135, 0.08182156831026077, 0.08784501999616623, 0.021735861897468567, -0.09477110952138901, 0.041115276515483856, -0.019573209807276726, -0.031511157751083374, 0.008176461793482304, 0.10807590931653976, -0.007526199333369732, 0.05439058691263199, 0.025741688907146454, -0.08753706514835358, 0.061214618384838104, 0.010452359914779663, -0.03582638129591942, -0.08034653961658478, 0.08394604921340942, -0.09129997342824936, 0.15567830204963684, 0.12428461015224457, -0.015381342731416225, -0.0456237718462944, -0.028686245903372765, 0.019736990332603455, 0.000895568635314703, 0.052172087132930756, -0.026661450043320656, -0.13396835327148438, 0.019884072244167328, -0.08403182029724121, 0.027508383616805077, -0.24824559688568115, -0.0873759537935257, 0.030023587867617607, -0.016974888741970062, -0.018219351768493652, 0.05188588425517082, 0.045490626245737076, 0.026976797729730606, -0.036689393222332, 0.021105777472257614, -0.03878705948591232, 0.059628501534461975, -0.11151736974716187, -0.09457628428936005 ]
null
null
transformers
# MultiBERTs Seed 4 Checkpoint 1500k (uncased) Seed 4 intermediate checkpoint (1500k steps) of the MultiBERTs (pretrained BERT) model, pretrained on English using a masked language modeling (MLM) objective. It was introduced in [this paper](https://arxiv.org/pdf/2106.16163.pdf) and first released in [this repository](https://github.com/google-research/language/tree/master/language/multiberts). This is an intermediate checkpoint. The final checkpoint can be found at [multiberts-seed-4](https://hf.co/multiberts-seed-4). This model is uncased: it does not make a difference between english and English. Disclaimer: The team releasing MultiBERTs did not write a model card for this model, so this model card has been written by [gchhablani](https://hf.co/gchhablani). ## Model description MultiBERTs models are transformer models pretrained on a large corpus of English data in a self-supervised fashion. This means they were pretrained on the raw texts only, with no humans labelling them in any way (which is why they can use lots of publicly available data), with an automatic process to generate inputs and labels from those texts. More precisely, they were pretrained with two objectives: - Masked language modeling (MLM): taking a sentence, the model randomly masks 15% of the words in the input, then runs the entire masked sentence through the model and has to predict the masked words. This is different from traditional recurrent neural networks (RNNs), which usually see the words one after the other, and from autoregressive models like GPT, which internally mask the future tokens. It allows the model to learn a bidirectional representation of the sentence. - Next sentence prediction (NSP): the model concatenates two masked sentences as inputs during pretraining. Sometimes they correspond to sentences that were next to each other in the original text, sometimes not. The model then has to predict whether the two sentences followed each other or not. This way, the model learns an inner representation of the English language that can then be used to extract features useful for downstream tasks: if you have a dataset of labeled sentences, for instance, you can train a standard classifier using the features produced by the MultiBERTs model as inputs. ## Intended uses & limitations You can use the raw model for either masked language modeling or next sentence prediction, but it's mostly intended to be fine-tuned on a downstream task. See the [model hub](https://huggingface.co/models?filter=multiberts) to look for fine-tuned versions on a task that interests you. Note that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked) to make decisions, such as sequence classification, token classification or question answering. For tasks such as text generation you should look at models like GPT-2. ### How to use Here is how to use this model to get the features of a given text in PyTorch: ```python from transformers import BertTokenizer, BertModel tokenizer = BertTokenizer.from_pretrained('multiberts-seed-4-1500k') model = BertModel.from_pretrained("multiberts-seed-4-1500k") text = "Replace me by any text you'd like." encoded_input = tokenizer(text, return_tensors='pt') output = model(**encoded_input) ``` ### Limitations and bias Even if the training data used for this model could be characterized as fairly neutral, this model can have biased predictions. This bias will also affect all fine-tuned versions of this model. 
For an understanding of the bias of this particular checkpoint, please try out this checkpoint with the snippet present in the [Limitations and bias section](https://huggingface.co/bert-base-uncased#limitations-and-bias) of the [bert-base-uncased](https://huggingface.co/bert-base-uncased) checkpoint. ## Training data The MultiBERTs models were pretrained on [BookCorpus](https://yknzhu.wixsite.com/mbweb), a dataset consisting of 11,038 unpublished books, and [English Wikipedia](https://en.wikipedia.org/wiki/English_Wikipedia) (excluding lists, tables and headers). ## Training procedure ### Preprocessing The texts are lowercased and tokenized using WordPiece with a vocabulary size of 30,000. The inputs of the model are then of the form: ``` [CLS] Sentence A [SEP] Sentence B [SEP] ``` With probability 0.5, sentence A and sentence B correspond to two consecutive sentences in the original corpus; in the other cases, sentence B is another random sentence from the corpus. Note that what is considered a sentence here is a consecutive span of text usually longer than a single sentence. The only constraint is that the result with the two "sentences" has a combined length of less than 512 tokens. The details of the masking procedure for each sentence are the following: - 15% of the tokens are masked. - In 80% of the cases, the masked tokens are replaced by `[MASK]`. - In 10% of the cases, the masked tokens are replaced by a random token (different from the one they replace). - In the remaining 10% of cases, the masked tokens are left as is. ### Pretraining The full model was trained on 16 Cloud TPU v2 chips for two million steps with a batch size of 256. The sequence length was set to 512 throughout. The optimizer used is Adam with a learning rate of 1e-4, \\(\beta_{1} = 0.9\\) and \\(\beta_{2} = 0.999\\), a weight decay of 0.01, learning rate warmup for 10,000 steps, and linear decay of the learning rate afterwards. ### BibTeX entry and citation info ```bibtex @article{DBLP:journals/corr/abs-2106-16163, author = {Thibault Sellam and Steve Yadlowsky and Jason Wei and Naomi Saphra and Alexander D'Amour and Tal Linzen and Jasmijn Bastings and Iulia Turc and Jacob Eisenstein and Dipanjan Das and Ian Tenney and Ellie Pavlick}, title = {The MultiBERTs: {BERT} Reproductions for Robustness Analysis}, journal = {CoRR}, volume = {abs/2106.16163}, year = {2021}, url = {https://arxiv.org/abs/2106.16163}, eprinttype = {arXiv}, eprint = {2106.16163}, timestamp = {Mon, 05 Jul 2021 15:15:50 +0200}, biburl = {https://dblp.org/rec/journals/corr/abs-2106-16163.bib}, bibsource = {dblp computer science bibliography, https://dblp.org} } ``` <a href="https://huggingface.co/exbert/?model=multiberts"> <img width="300px" src="https://cdn-media.huggingface.co/exbert/button.png"> </a>
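As a rough illustration of the optimization setup described in the pretraining section (Adam, peak learning rate 1e-4, 10,000 warmup steps, linear decay over two million steps), here is a hedged PyTorch sketch. It is not the original TPU training code, and expressing the 0.01 weight decay through `torch.optim.AdamW` is an assumption; the data pipeline and training loop are omitted.

```python
import torch
from transformers import BertForPreTraining, get_linear_schedule_with_warmup

# Hedged sketch of the optimization setup described above; not the original training code.
model = BertForPreTraining.from_pretrained("multiberts-seed-4-1500k")

# Adam with lr 1e-4, beta1=0.9, beta2=0.999 and 0.01 weight decay (expressed here via AdamW).
optimizer = torch.optim.AdamW(
    model.parameters(), lr=1e-4, betas=(0.9, 0.999), weight_decay=0.01
)

# 10,000 warmup steps, then linear decay over the two million training steps.
scheduler = get_linear_schedule_with_warmup(
    optimizer, num_warmup_steps=10_000, num_training_steps=2_000_000
)

# In a training loop one would call loss.backward(), optimizer.step(), and scheduler.step()
# on batches of 256 sequences of length 512, with the combined MLM + NSP loss.
```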
{"language": "en", "license": "apache-2.0", "tags": ["exbert", "multiberts", "multiberts-seed-4"], "datasets": ["bookcorpus", "wikipedia"]}
null
MultiBertGunjanPatrick/multiberts-seed-4-1500k
[ "transformers", "pytorch", "bert", "pretraining", "exbert", "multiberts", "multiberts-seed-4", "en", "dataset:bookcorpus", "dataset:wikipedia", "arxiv:2106.16163", "license:apache-2.0", "endpoints_compatible", "region:us" ]
2022-03-02T23:29:04+00:00
[ "2106.16163" ]
[ "en" ]
TAGS #transformers #pytorch #bert #pretraining #exbert #multiberts #multiberts-seed-4 #en #dataset-bookcorpus #dataset-wikipedia #arxiv-2106.16163 #license-apache-2.0 #endpoints_compatible #region-us
# MultiBERTs Seed 4 Checkpoint 1500k (uncased) Seed 4 intermediate checkpoint 1500k MultiBERTs (pretrained BERT) model on English language using a masked language modeling (MLM) objective. It was introduced in this paper and first released in this repository. This is an intermediate checkpoint. The final checkpoint can be found at multiberts-seed-4. This model is uncased: it does not make a difference between english and English. Disclaimer: The team releasing MultiBERTs did not write a model card for this model so this model card has been written by gchhablani. ## Model description MultiBERTs models are transformers model pretrained on a large corpus of English data in a self-supervised fashion. This means it was pretrained on the raw texts only, with no humans labelling them in any way (which is why it can use lots of publicly available data) with an automatic process to generate inputs and labels from those texts. More precisely, it was pretrained with two objectives: - Masked language modeling (MLM): taking a sentence, the model randomly masks 15% of the words in the input then run the entire masked sentence through the model and has to predict the masked words. This is different from traditional recurrent neural networks (RNNs) that usually see the words one after the other, or from autoregressive models like GPT which internally mask the future tokens. It allows the model to learn a bidirectional representation of the sentence. - Next sentence prediction (NSP): the models concatenates two masked sentences as inputs during pretraining. Sometimes they correspond to sentences that were next to each other in the original text, sometimes not. The model then has to predict if the two sentences were following each other or not. This way, the model learns an inner representation of the English language that can then be used to extract features useful for downstream tasks: if you have a dataset of labeled sentences for instance, you can train a standard classifier using the features produced by the MultiBERTs model as inputs. ## Intended uses & limitations You can use the raw model for either masked language modeling or next sentence prediction, but it's mostly intended to be fine-tuned on a downstream task. See the model hub to look for fine-tuned versions on a task that interests you. Note that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked) to make decisions, such as sequence classification, token classification or question answering. For tasks such as text generation you should look at model like GPT2. ### How to use Here is how to use this model to get the features of a given text in PyTorch: ### Limitations and bias Even if the training data used for this model could be characterized as fairly neutral, this model can have biased predictions. This bias will also affect all fine-tuned versions of this model. For an understanding of bias of this particular checkpoint, please try out this checkpoint with the snippet present in the Limitation and bias section of the bert-base-uncased checkpoint. ## Training data The MultiBERTs models were pretrained on BookCorpus, a dataset consisting of 11,038 unpublished books and English Wikipedia (excluding lists, tables and headers). ## Training procedure ### Preprocessing The texts are lowercased and tokenized using WordPiece and a vocabulary size of 30,000. 
The inputs of the model are then of the form: With probability 0.5, sentence A and sentence B correspond to two consecutive sentences in the original corpus and in the other cases, it's another random sentence in the corpus. Note that what is considered a sentence here is a consecutive span of text usually longer than a single sentence. The only constrain is that the result with the two "sentences" has a combined length of less than 512 tokens. The details of the masking procedure for each sentence are the following: - 15% of the tokens are masked. - In 80% of the cases, the masked tokens are replaced by '[MASK]'. - In 10% of the cases, the masked tokens are replaced by a random token (different) from the one they replace. - In the 10% remaining cases, the masked tokens are left as is. ### Pretraining The full model was trained on 16 Cloud TPU v2 chips for two million steps with a batch size of 256. The sequence length was set to 512 throughout. The optimizer used is Adam with a learning rate of 1e-4, \\(\beta_{1} = 0.9\\) and \\(\beta_{2} = 0.999\\), a weight decay of 0.01, learning rate warmup for 10,000 steps and linear decay of the learning rate after. ### BibTeX entry and citation info <a href="URL <img width="300px" src="URL </a>
[ "# MultiBERTs Seed 4 Checkpoint 1500k (uncased)\nSeed 4 intermediate checkpoint 1500k MultiBERTs (pretrained BERT) model on English language using a masked language modeling (MLM) objective. It was introduced in\nthis paper and first released in\nthis repository. This is an intermediate checkpoint.\nThe final checkpoint can be found at multiberts-seed-4. This model is uncased: it does not make a difference\nbetween english and English.\n\nDisclaimer: The team releasing MultiBERTs did not write a model card for this model so this model card has been written by gchhablani.", "## Model description\nMultiBERTs models are transformers model pretrained on a large corpus of English data in a self-supervised fashion. This means it\nwas pretrained on the raw texts only, with no humans labelling them in any way (which is why it can use lots of\npublicly available data) with an automatic process to generate inputs and labels from those texts. More precisely, it\nwas pretrained with two objectives:\n- Masked language modeling (MLM): taking a sentence, the model randomly masks 15% of the words in the input then run\n the entire masked sentence through the model and has to predict the masked words. This is different from traditional\n recurrent neural networks (RNNs) that usually see the words one after the other, or from autoregressive models like\n GPT which internally mask the future tokens. It allows the model to learn a bidirectional representation of the\n sentence.\n- Next sentence prediction (NSP): the models concatenates two masked sentences as inputs during pretraining. Sometimes\n they correspond to sentences that were next to each other in the original text, sometimes not. The model then has to\n predict if the two sentences were following each other or not.\nThis way, the model learns an inner representation of the English language that can then be used to extract features\nuseful for downstream tasks: if you have a dataset of labeled sentences for instance, you can train a standard\nclassifier using the features produced by the MultiBERTs model as inputs.", "## Intended uses & limitations\nYou can use the raw model for either masked language modeling or next sentence prediction, but it's mostly intended to\nbe fine-tuned on a downstream task. See the model hub to look for\nfine-tuned versions on a task that interests you.\nNote that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked)\nto make decisions, such as sequence classification, token classification or question answering. For tasks such as text\ngeneration you should look at model like GPT2.", "### How to use\nHere is how to use this model to get the features of a given text in PyTorch:", "### Limitations and bias\nEven if the training data used for this model could be characterized as fairly neutral, this model can have biased\npredictions. This bias will also affect all fine-tuned versions of this model. For an understanding of bias of this particular\ncheckpoint, please try out this checkpoint with the snippet present in the Limitation and bias section of the bert-base-uncased checkpoint.", "## Training data\nThe MultiBERTs models were pretrained on BookCorpus, a dataset consisting of 11,038\nunpublished books and English Wikipedia (excluding lists, tables and\nheaders).", "## Training procedure", "### Preprocessing\nThe texts are lowercased and tokenized using WordPiece and a vocabulary size of 30,000. 
The inputs of the model are\nthen of the form:\n\nWith probability 0.5, sentence A and sentence B correspond to two consecutive sentences in the original corpus and in\nthe other cases, it's another random sentence in the corpus. Note that what is considered a sentence here is a\nconsecutive span of text usually longer than a single sentence. The only constrain is that the result with the two\n\"sentences\" has a combined length of less than 512 tokens.\nThe details of the masking procedure for each sentence are the following:\n- 15% of the tokens are masked.\n- In 80% of the cases, the masked tokens are replaced by '[MASK]'.\n- In 10% of the cases, the masked tokens are replaced by a random token (different) from the one they replace.\n- In the 10% remaining cases, the masked tokens are left as is.", "### Pretraining\nThe full model was trained on 16 Cloud TPU v2 chips for two million steps with a batch size\nof 256. The sequence length was set to 512 throughout. The optimizer\nused is Adam with a learning rate of 1e-4, \\\\(\\beta_{1} = 0.9\\\\) and \\\\(\\beta_{2} = 0.999\\\\), a weight decay of 0.01,\nlearning rate warmup for 10,000 steps and linear decay of the learning rate after.", "### BibTeX entry and citation info\n\n<a href=\"URL\n\t<img width=\"300px\" src=\"URL\n</a>" ]
[ "TAGS\n#transformers #pytorch #bert #pretraining #exbert #multiberts #multiberts-seed-4 #en #dataset-bookcorpus #dataset-wikipedia #arxiv-2106.16163 #license-apache-2.0 #endpoints_compatible #region-us \n", "# MultiBERTs Seed 4 Checkpoint 1500k (uncased)\nSeed 4 intermediate checkpoint 1500k MultiBERTs (pretrained BERT) model on English language using a masked language modeling (MLM) objective. It was introduced in\nthis paper and first released in\nthis repository. This is an intermediate checkpoint.\nThe final checkpoint can be found at multiberts-seed-4. This model is uncased: it does not make a difference\nbetween english and English.\n\nDisclaimer: The team releasing MultiBERTs did not write a model card for this model so this model card has been written by gchhablani.", "## Model description\nMultiBERTs models are transformers model pretrained on a large corpus of English data in a self-supervised fashion. This means it\nwas pretrained on the raw texts only, with no humans labelling them in any way (which is why it can use lots of\npublicly available data) with an automatic process to generate inputs and labels from those texts. More precisely, it\nwas pretrained with two objectives:\n- Masked language modeling (MLM): taking a sentence, the model randomly masks 15% of the words in the input then run\n the entire masked sentence through the model and has to predict the masked words. This is different from traditional\n recurrent neural networks (RNNs) that usually see the words one after the other, or from autoregressive models like\n GPT which internally mask the future tokens. It allows the model to learn a bidirectional representation of the\n sentence.\n- Next sentence prediction (NSP): the models concatenates two masked sentences as inputs during pretraining. Sometimes\n they correspond to sentences that were next to each other in the original text, sometimes not. The model then has to\n predict if the two sentences were following each other or not.\nThis way, the model learns an inner representation of the English language that can then be used to extract features\nuseful for downstream tasks: if you have a dataset of labeled sentences for instance, you can train a standard\nclassifier using the features produced by the MultiBERTs model as inputs.", "## Intended uses & limitations\nYou can use the raw model for either masked language modeling or next sentence prediction, but it's mostly intended to\nbe fine-tuned on a downstream task. See the model hub to look for\nfine-tuned versions on a task that interests you.\nNote that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked)\nto make decisions, such as sequence classification, token classification or question answering. For tasks such as text\ngeneration you should look at model like GPT2.", "### How to use\nHere is how to use this model to get the features of a given text in PyTorch:", "### Limitations and bias\nEven if the training data used for this model could be characterized as fairly neutral, this model can have biased\npredictions. This bias will also affect all fine-tuned versions of this model. 
For an understanding of bias of this particular\ncheckpoint, please try out this checkpoint with the snippet present in the Limitation and bias section of the bert-base-uncased checkpoint.", "## Training data\nThe MultiBERTs models were pretrained on BookCorpus, a dataset consisting of 11,038\nunpublished books and English Wikipedia (excluding lists, tables and\nheaders).", "## Training procedure", "### Preprocessing\nThe texts are lowercased and tokenized using WordPiece and a vocabulary size of 30,000. The inputs of the model are\nthen of the form:\n\nWith probability 0.5, sentence A and sentence B correspond to two consecutive sentences in the original corpus and in\nthe other cases, it's another random sentence in the corpus. Note that what is considered a sentence here is a\nconsecutive span of text usually longer than a single sentence. The only constrain is that the result with the two\n\"sentences\" has a combined length of less than 512 tokens.\nThe details of the masking procedure for each sentence are the following:\n- 15% of the tokens are masked.\n- In 80% of the cases, the masked tokens are replaced by '[MASK]'.\n- In 10% of the cases, the masked tokens are replaced by a random token (different) from the one they replace.\n- In the 10% remaining cases, the masked tokens are left as is.", "### Pretraining\nThe full model was trained on 16 Cloud TPU v2 chips for two million steps with a batch size\nof 256. The sequence length was set to 512 throughout. The optimizer\nused is Adam with a learning rate of 1e-4, \\\\(\\beta_{1} = 0.9\\\\) and \\\\(\\beta_{2} = 0.999\\\\), a weight decay of 0.01,\nlearning rate warmup for 10,000 steps and linear decay of the learning rate after.", "### BibTeX entry and citation info\n\n<a href=\"URL\n\t<img width=\"300px\" src=\"URL\n</a>" ]
[ 71, 145, 335, 134, 25, 95, 48, 3, 222, 111, 34 ]
[ "passage: TAGS\n#transformers #pytorch #bert #pretraining #exbert #multiberts #multiberts-seed-4 #en #dataset-bookcorpus #dataset-wikipedia #arxiv-2106.16163 #license-apache-2.0 #endpoints_compatible #region-us \n# MultiBERTs Seed 4 Checkpoint 1500k (uncased)\nSeed 4 intermediate checkpoint 1500k MultiBERTs (pretrained BERT) model on English language using a masked language modeling (MLM) objective. It was introduced in\nthis paper and first released in\nthis repository. This is an intermediate checkpoint.\nThe final checkpoint can be found at multiberts-seed-4. This model is uncased: it does not make a difference\nbetween english and English.\n\nDisclaimer: The team releasing MultiBERTs did not write a model card for this model so this model card has been written by gchhablani.", "passage: ## Model description\nMultiBERTs models are transformers model pretrained on a large corpus of English data in a self-supervised fashion. This means it\nwas pretrained on the raw texts only, with no humans labelling them in any way (which is why it can use lots of\npublicly available data) with an automatic process to generate inputs and labels from those texts. More precisely, it\nwas pretrained with two objectives:\n- Masked language modeling (MLM): taking a sentence, the model randomly masks 15% of the words in the input then run\n the entire masked sentence through the model and has to predict the masked words. This is different from traditional\n recurrent neural networks (RNNs) that usually see the words one after the other, or from autoregressive models like\n GPT which internally mask the future tokens. It allows the model to learn a bidirectional representation of the\n sentence.\n- Next sentence prediction (NSP): the models concatenates two masked sentences as inputs during pretraining. Sometimes\n they correspond to sentences that were next to each other in the original text, sometimes not. The model then has to\n predict if the two sentences were following each other or not.\nThis way, the model learns an inner representation of the English language that can then be used to extract features\nuseful for downstream tasks: if you have a dataset of labeled sentences for instance, you can train a standard\nclassifier using the features produced by the MultiBERTs model as inputs.## Intended uses & limitations\nYou can use the raw model for either masked language modeling or next sentence prediction, but it's mostly intended to\nbe fine-tuned on a downstream task. See the model hub to look for\nfine-tuned versions on a task that interests you.\nNote that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked)\nto make decisions, such as sequence classification, token classification or question answering. For tasks such as text\ngeneration you should look at model like GPT2.### How to use\nHere is how to use this model to get the features of a given text in PyTorch:### Limitations and bias\nEven if the training data used for this model could be characterized as fairly neutral, this model can have biased\npredictions. This bias will also affect all fine-tuned versions of this model. 
For an understanding of bias of this particular\ncheckpoint, please try out this checkpoint with the snippet present in the Limitation and bias section of the bert-base-uncased checkpoint.## Training data\nThe MultiBERTs models were pretrained on BookCorpus, a dataset consisting of 11,038\nunpublished books and English Wikipedia (excluding lists, tables and\nheaders).## Training procedure" ]
[ -0.08822177350521088, 0.0008266887743957341, -0.0022523165680468082, 0.07208286225795746, 0.08914847671985626, 0.0044580488465726376, 0.11476735025644302, 0.0490606464445591, -0.0322485975921154, 0.02269212156534195, 0.09293556213378906, 0.025899071246385574, 0.04324491322040558, 0.06223837286233902, 0.0947280004620552, -0.2542917728424072, 0.0483817383646965, -0.06387996673583984, 0.04602935537695885, 0.0739867240190506, 0.09752444922924042, -0.07375745475292206, 0.06346457451581955, 0.03541199862957001, -0.0861903503537178, -0.015612536109983921, -0.016768500208854675, -0.033805154263973236, 0.10029782354831696, 0.07314861565828323, 0.061578743159770966, 0.001787036657333374, 0.058441683650016785, -0.08772158622741699, 0.016936548054218292, 0.04309166222810745, -0.002163463272154331, 0.025956757366657257, -0.007455330342054367, 0.01803983747959137, 0.1008039265871048, 0.03972196951508522, 0.07489772886037827, 0.035034675151109695, -0.09547817707061768, -0.11418221890926361, -0.07992147654294968, 0.10707126557826996, 0.051885493099689484, 0.03955405578017235, -0.004397272132337093, 0.06513465940952301, -0.03130009025335312, 0.07469449937343597, 0.10131652653217316, -0.25362828373908997, -0.010462895967066288, 0.06989256292581558, 0.043484050780534744, 0.04743050038814545, 0.011802112683653831, 0.02826557494699955, 0.0075349099934101105, 0.04430986940860748, 0.02897254005074501, -0.02424968034029007, 0.11919568479061127, -0.04183657094836235, -0.14923028647899628, -0.042691782116889954, 0.12016880512237549, -0.007058333605527878, -0.12464502453804016, -0.09750419110059738, -0.0288030244410038, 0.11494438350200653, -0.004506520461291075, -0.018460525199770927, -0.0036810049787163734, 0.009987080469727516, 0.027993574738502502, -0.09399204701185226, -0.08404022455215454, -0.02804744988679886, -0.033776137977838516, 0.12833090126514435, 0.046927571296691895, 0.0535028912127018, -0.03542359173297882, 0.08371607959270477, -0.12320803105831146, -0.03870179131627083, -0.05146157741546631, -0.0826791524887085, -0.018501460552215576, 0.009119287133216858, -0.027368417009711266, -0.08241536468267441, -0.05780075117945671, 0.12134277820587158, 0.02867036685347557, 0.0302177332341671, 0.0004917155019938946, 0.039111167192459106, 0.07066744565963745, 0.09067227691411972, -0.04354477673768997, 0.048792824149131775, 0.0360943078994751, -0.020091447979211807, 0.05340208485722542, -0.05092312768101692, -0.1014891117811203, 0.07503709197044373, -0.003607361577451229, 0.039730411022901535, 0.023891927674412727, 0.03467722237110138, -0.008083533495664597, -0.07120141386985779, 0.16918310523033142, -0.07697467505931854, -0.009874303825199604, -0.017669670283794403, 0.01062304899096489, 0.03976631164550781, 0.03244945406913757, -0.007144568022340536, -0.04921130836009979, -0.0027221031486988068, -0.0558493547141552, -0.027527790516614914, -0.05597785860300064, -0.12231431901454926, -0.001952498685568571, -0.039982035756111145, -0.03034338913857937, -0.1410595178604126, -0.21944659948349, -0.0196607057005167, 0.0640076994895935, -0.003987885545939207, -0.012359978631138802, 0.026207203045487404, 0.012965349480509758, -0.02091040462255478, 0.01108659990131855, -0.0446443110704422, 0.0000757286325097084, -0.008579093031585217, -0.02795802615582943, 0.05861857533454895, -0.04020722955465317, 0.023699218407273293, -0.06546668708324432, 0.020867520943284035, -0.21251316368579865, 0.09009336680173874, -0.035059407353401184, 0.006435088813304901, -0.035796165466308594, -0.044477082788944244, 0.009434530511498451, 
0.0476679652929306, -0.006913439370691776, 0.11924374103546143, -0.13628974556922913, -0.05058056861162186, 0.17901432514190674, -0.1589757800102234, -0.004644803702831268, 0.10109779238700867, -0.048658475279808044, 0.05279826745390892, 0.13256694376468658, 0.0972277820110321, 0.092064768075943, -0.07303296774625778, 0.012983583845198154, 0.061181195080280304, -0.06972525268793106, 0.05177328735589981, 0.08835709095001221, -0.022708971053361893, -0.14124777913093567, 0.03096708655357361, -0.07224015891551971, -0.0067242905497550964, -0.026801012456417084, -0.021898968145251274, 0.007203610613942146, -0.041018836200237274, 0.028505589812994003, 0.0053938571363687515, 0.018897807225584984, -0.04051101580262184, -0.08013658225536346, 0.030372466892004013, 0.0739462748169899, -0.0695946142077446, 0.04036654531955719, -0.07006321102380753, 0.06527211517095566, -0.07894130051136017, -0.003357166890054941, -0.1671547293663025, -0.018840083852410316, 0.04275164008140564, -0.047143224626779556, 0.048848263919353485, 0.09019960463047028, 0.005593837238848209, 0.12325885146856308, -0.042409636080265045, 0.003775504417717457, -0.004948481917381287, -0.010900281369686127, -0.051154810935258865, -0.12169590592384338, -0.07955551147460938, -0.06622456014156342, 0.09242880344390869, -0.07618236541748047, 0.030858198180794716, -0.07015077024698257, -0.02397497557103634, -0.009343942627310753, -0.05675657093524933, -0.004587071016430855, 0.011835767887532711, -0.02621133252978325, -0.04499129205942154, 0.04894139617681503, 0.05019518360495567, -0.05400348827242851, 0.07426067441701889, -0.09874685108661652, -0.05554809048771858, 0.05545021593570709, 0.015652794390916824, -0.08131550997495651, 0.09132815897464752, -0.01869194209575653, -0.012419017031788826, -0.05751250684261322, -0.04407930374145508, 0.19130992889404297, -0.02457544580101967, 0.10278333723545074, -0.09206175804138184, 0.003908542916178703, 0.030889518558979034, -0.04292818158864975, -0.014927300624549389, 0.058997590094804764, 0.05124538391828537, -0.17549966275691986, 0.014532800763845444, 0.04426443576812744, 0.07732050120830536, 0.11105401813983917, 0.027931831777095795, -0.02159043401479721, -0.04659257456660271, -0.012096475809812546, 0.005696618929505348, 0.05801643058657646, -0.0294114388525486, -0.0083709005266428, 0.0310906320810318, 0.05676897242665291, 0.019304584711790085, -0.07996882498264313, 0.035363659262657166, 0.06829972565174103, -0.016284288838505745, -0.04602934420108795, -0.024877583608031273, -0.059447281062603, 0.06197923421859741, 0.05243787169456482, 0.03777880221605301, 0.024905096739530563, -0.014905091375112534, -0.13488157093524933, 0.18730226159095764, -0.11570193618535995, -0.26165711879730225, -0.11127642542123795, -0.05501996725797653, -0.02576504461467266, 0.03981182351708412, 0.05831135809421539, -0.033315155655145645, -0.04383900761604309, -0.11346681416034698, 0.06302676349878311, -0.0638391450047493, -0.02851584181189537, -0.00887882150709629, -0.051598094403743744, -0.01783246174454689, -0.1265793740749359, -0.009959368035197258, -0.029719293117523193, -0.07068996131420135, 0.005996402353048325, -0.030600670725107193, 0.03125433996319771, 0.1331946849822998, 0.03236430883407593, -0.019281256943941116, -0.017460860311985016, 0.2013653963804245, 0.01018257811665535, 0.060969650745391846, 0.11409960687160492, -0.027443483471870422, 0.052882470190525055, 0.05064092576503754, 0.02752160094678402, -0.048394761979579926, 0.010960398241877556, -0.014534677378833294, -0.11934305727481842, 
-0.17370173335075378, -0.06892895698547363, -0.0055330307222902775, 0.006533592473715544, 0.018985148519277573, 0.034104738384485245, 0.023949353024363518, 0.04051480069756508, -0.028803519904613495, 0.032780300825834274, -0.013265334069728851, 0.07815425843000412, 0.030580148100852966, -0.07462216168642044, 0.09289544820785522, -0.06097128614783287, 0.017742091789841652, 0.10952983796596527, -0.06061927229166031, 0.18981721997261047, 0.02293088287115097, 0.05901984125375748, 0.10131525993347168, 0.02191588282585144, 0.05715261399745941, 0.09124188125133514, -0.05009695142507553, 0.004720263183116913, -0.05801999196410179, -0.050417907536029816, -0.03455958887934685, 0.046793028712272644, 0.027851013466715813, 0.019130561500787735, -0.12154954671859741, 0.02324005961418152, -0.0011397346388548613, 0.136415034532547, 0.044254984706640244, -0.1176173985004425, -0.12537150084972382, 0.032598212361335754, -0.04415804147720337, -0.0635286197066307, 0.030105087906122208, 0.06646618247032166, -0.15010276436805725, 0.04354996234178543, -0.0067943790927529335, 0.06780468672513962, -0.08837816119194031, 0.017331911250948906, -0.04133978486061096, 0.002559971995651722, 0.002873957622796297, 0.06849929690361023, -0.1380508989095688, 0.10270193964242935, 0.022515006363391876, 0.05089200288057327, -0.07870256900787354, 0.016981754451990128, -0.009814162738621235, 0.11248258501291275, 0.11598752439022064, 0.04490131884813309, -0.051439959555864334, -0.019861944019794464, -0.04393758252263069, 0.020491600036621094, 0.056709978729486465, -0.07509465515613556, 0.06003692373633385, 0.010327605530619621, 0.006032504141330719, -0.023594016209244728, 0.01604904979467392, -0.13665884733200073, -0.12394178658723831, 0.06151872128248215, -0.07944825291633606, -0.10095460712909698, -0.05593838542699814, -0.06329275667667389, -0.05499449372291565, 0.20590603351593018, -0.11685699224472046, -0.09166571497917175, -0.10034464299678802, -0.01255248486995697, 0.04662220552563667, -0.0657910630106926, 0.04853109270334244, -0.037639718502759933, 0.09130090475082397, -0.04858008772134781, -0.10890841484069824, 0.032677311450242996, -0.11505719274282455, -0.11276878416538239, -0.0427892841398716, 0.1032119169831276, 0.1131216362118721, 0.03678105026483536, 0.010018426924943924, 0.010765478014945984, 0.0006153024733066559, -0.11856783926486969, 0.015766488388180733, 0.1252364069223404, -0.002863869071006775, 0.07110165804624557, -0.05870012938976288, 0.025937292724847794, -0.014326760545372963, 0.00003761984407901764, 0.12808290123939514, 0.1833920180797577, -0.06391981989145279, 0.17320218682289124, 0.20179200172424316, -0.10559377074241638, -0.1934172809123993, -0.05142436549067497, -0.0005854256451129913, 0.04721953719854355, 0.045172858983278275, -0.18608614802360535, 0.09117214381694794, 0.03129099681973457, -0.029657721519470215, 0.02003190666437149, -0.2312212586402893, -0.11327841877937317, 0.08673243224620819, 0.05697466433048248, 0.19096991419792175, -0.0801113173365593, -0.04036116227507591, -0.014989824965596199, -0.04242254048585892, 0.043532125651836395, -0.03574318438768387, 0.08749494701623917, 0.004865236580371857, -0.03130074962973595, 0.0021920157596468925, -0.03180284425616264, 0.0960843563079834, 0.04135141521692276, 0.02566806972026825, -0.07019637525081635, -0.008046001195907593, 0.10701829940080643, -0.0391799621284008, 0.09724263101816177, 0.04106855392456055, 0.07598695158958435, -0.0973130539059639, -0.05852808430790901, -0.07606340944766998, 0.04204782098531723, -0.04321601241827011, 
-0.05293066054582596, -0.06302749365568161, 0.059571705758571625, 0.03649994358420372, 0.010930610820651054, -0.0004403926432132721, -0.03705386072397232, 0.04520358517765999, 0.09171167016029358, 0.08237706124782562, -0.04033758118748665, -0.07204456627368927, -0.04752866551280022, -0.04776933416724205, 0.06837848573923111, -0.09671398997306824, 0.020091606304049492, 0.030259763821959496, 0.010906090959906578, 0.09068378806114197, 0.033730119466781616, -0.13443346321582794, 0.009548550471663475, 0.0361131876707077, -0.12459596246480942, -0.1110730767250061, -0.021775227040052414, 0.01956145092844963, -0.03794841095805168, 0.052031584084033966, 0.1462005376815796, -0.03815333545207977, -0.03130461275577545, -0.0495300218462944, 0.03812994435429573, -0.019339244812726974, 0.05485112965106964, 0.06553643941879272, 0.03240470960736275, -0.07508711516857147, 0.07480126619338989, 0.03842300921678543, -0.038094259798526764, 0.03979448974132538, 0.04270530119538307, -0.0963452160358429, -0.07926979660987854, -0.05779506638646126, 0.09277936071157455, -0.021075302734971046, -0.04832763224840164, -0.002824964001774788, -0.08315690606832504, 0.06982254981994629, 0.07686319202184677, 0.04589206352829933, 0.03332507982850075, -0.08503879606723785, 0.015673454850912094, -0.054833076894283295, 0.03960081934928894, -0.030781852081418037, -0.00546703115105629, -0.05654241144657135, 0.0677931010723114, 0.061885252594947815, 0.09652575105428696, -0.03396695479750633, -0.07241536676883698, -0.08590927720069885, -0.013915983960032463, -0.07107243686914444, -0.03185706213116646, -0.07630796730518341, -0.003869147738441825, 0.0006642071530222893, -0.002688433974981308, 0.02096351608633995, 0.0372249037027359, -0.04433654993772507, -0.017405588179826736, -0.03264031559228897, 0.037543293088674545, -0.06071498990058899, 0.006795207038521767, 0.01494109258055687, -0.03529768064618111, 0.09269912540912628, 0.03446408361196518, -0.012550482526421547, 0.04636765271425247, -0.01974489539861679, 0.033833958208560944, -0.022176507860422134, 0.0012373409699648619, -0.021576086059212685, -0.10925063490867615, -0.0013751005753874779, 0.004280392080545425, -0.026747122406959534, 0.011484798043966293, 0.056803878396749496, -0.07163345813751221, 0.08841787278652191, 0.04777326434850693, -0.0255318321287632, -0.07183432579040527, 0.042366839945316315, -0.012644240632653236, 0.027938971295952797, 0.06865932047367096, -0.033687662333250046, 0.05200368911027908, -0.10019159317016602, -0.02915526181459427, 0.002925274893641472, -0.006622571498155594, -0.014701304957270622, -0.05221769958734512, -0.0017735008150339127, 0.005860228091478348, 0.17461203038692474, -0.02046586200594902, 0.03722718358039856, 0.013020716607570648, 0.01011656504124403, 0.04340767860412598, -0.012077055871486664, 0.06849204003810883, -0.006535167805850506, -0.02764218859374523, -0.01446062047034502, 0.04059969633817673, 0.0044232700020074844, 0.0028875991702079773, 0.1396867036819458, 0.04382489249110222, 0.09166789054870605, 0.07424293458461761, 0.018743563443422318, 0.015480450354516506, -0.12641623616218567, -0.08650670945644379, 0.0038411924615502357, 0.05982968956232071, -0.020146531984210014, 0.005534196272492409, 0.09213738143444061, -0.08784425258636475, 0.07428163290023804, 0.04732353985309601, -0.049587614834308624, -0.1251411736011505, -0.1897825300693512, -0.022888416424393654, -0.03269992023706436, -0.010607298463582993, -0.09364092350006104, 0.013575306162238121, 0.09153921157121658, 0.024420935660600662, -0.010898959822952747, 
0.09196759760379791, -0.10329601168632507, -0.027203161269426346, 0.044196717441082, -0.027529599145054817, 0.0158150102943182, 0.050360701978206635, 0.02421976998448372, -0.004217475652694702, 0.04242495819926262, 0.03882234916090965, 0.04254286363720894, 0.024159986525774002, 0.05230291932821274, -0.022374430671334267, -0.07280956208705902, -0.03218593820929527, -0.00411013700067997, 0.05278299003839493, 0.13967233896255493, 0.02235615998506546, -0.06788439303636551, 0.007052680477499962, 0.11132079362869263, -0.030462415888905525, -0.04885214567184448, -0.11235406994819641, 0.23009739816188812, 0.024126917123794556, 0.005230416543781757, -0.004092168994247913, -0.046408481895923615, 0.004076594486832619, 0.21837465465068817, 0.22615954279899597, 0.004263106733560562, -0.010277524590492249, 0.011539662256836891, -0.011726023629307747, 0.038291867822408676, 0.14547011256217957, 0.005583347752690315, 0.24711579084396362, -0.046176694333553314, 0.03911197930574417, -0.04121766611933708, -0.0407525971531868, -0.09589532017707825, 0.07140527665615082, -0.008951114490628242, 0.007736303843557835, -0.029836775735020638, 0.07111509889364243, -0.040638938546180725, -0.17514772713184357, 0.0034562265500426292, -0.0015465582255274057, -0.06167690083384514, 0.0074399420991539955, -0.0026427023112773895, 0.021364983171224594, 0.080565445125103, -0.016574732959270477, -0.007281825412064791, 0.1347467601299286, 0.01703786849975586, -0.09550540149211884, -0.06554347276687622, 0.11515629291534424, 0.018504662439227104, 0.1379321813583374, 0.011747708544135094, 0.07646983116865158, 0.0879596471786499, 0.02190893329679966, -0.0974784567952156, 0.037363599985837936, -0.020183410495519638, -0.02984517626464367, 0.00542792584747076, 0.11066097021102905, -0.007126216776669025, 0.05993090569972992, 0.024890020489692688, -0.08794551342725754, 0.05910121649503708, 0.00935065746307373, -0.03492925688624382, -0.0807383805513382, 0.08527445793151855, -0.09000985324382782, 0.15442398190498352, 0.1254471391439438, -0.014127494767308235, -0.04371681809425354, -0.029007183387875557, 0.019963769242167473, -0.0006672018207609653, 0.06013329327106476, -0.026545939967036247, -0.13572674989700317, 0.020999858155846596, -0.08661586046218872, 0.026902101933956146, -0.2487424910068512, -0.08768488466739655, 0.029344899579882622, -0.017323657870292664, -0.016761474311351776, 0.0533282496035099, 0.05173679441213608, 0.02846325933933258, -0.03567276895046234, 0.019711120054125786, -0.03900131210684776, 0.059929169714450836, -0.10676014423370361, -0.09293901920318604 ]
null
null
transformers
# MultiBERTs Seed 4 Checkpoint 1600k (uncased)
Seed 4 intermediate checkpoint 1600k MultiBERTs (pretrained BERT) model on English language using a masked language modeling (MLM) objective. It was introduced in
[this paper](https://arxiv.org/pdf/2106.16163.pdf) and first released in
[this repository](https://github.com/google-research/language/tree/master/language/multiberts). This is an intermediate checkpoint.
The final checkpoint can be found at [multiberts-seed-4](https://hf.co/multberts-seed-4). This model is uncased: it does not make a difference
between english and English.

Disclaimer: The team releasing MultiBERTs did not write a model card for this model so this model card has been written by [gchhablani](https://hf.co/gchhablani).

## Model description
MultiBERTs models are transformer models pretrained on a large corpus of English data in a self-supervised fashion. This means they
were pretrained on the raw texts only, with no humans labelling them in any way (which is why they can use lots of
publicly available data) with an automatic process to generate inputs and labels from those texts. More precisely, they
were pretrained with two objectives:
- Masked language modeling (MLM): taking a sentence, the model randomly masks 15% of the words in the input, then runs
  the entire masked sentence through the model and has to predict the masked words. This is different from traditional
  recurrent neural networks (RNNs) that usually see the words one after the other, or from autoregressive models like
  GPT which internally mask the future tokens. It allows the model to learn a bidirectional representation of the
  sentence.
- Next sentence prediction (NSP): the model concatenates two masked sentences as inputs during pretraining. Sometimes
  they correspond to sentences that were next to each other in the original text, sometimes not. The model then has to
  predict if the two sentences were following each other or not.
This way, the model learns an inner representation of the English language that can then be used to extract features
useful for downstream tasks: if you have a dataset of labeled sentences for instance, you can train a standard
classifier using the features produced by the MultiBERTs model as inputs.

## Intended uses & limitations
You can use the raw model for either masked language modeling or next sentence prediction, but it's mostly intended to
be fine-tuned on a downstream task. See the [model hub](https://huggingface.co/models?filter=multiberts) to look for
fine-tuned versions on a task that interests you.
Note that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked)
to make decisions, such as sequence classification, token classification or question answering. For tasks such as text
generation you should look at models like GPT2.

### How to use
Here is how to use this model to get the features of a given text in PyTorch:
```python
from transformers import BertTokenizer, BertModel
tokenizer = BertTokenizer.from_pretrained('multiberts-seed-4-1600k')
model = BertModel.from_pretrained("multiberts-seed-4-1600k")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='pt')
output = model(**encoded_input)
```

### Limitations and bias
Even if the training data used for this model could be characterized as fairly neutral, this model can have biased
predictions. This bias will also affect all fine-tuned versions of this model.
For an understanding of bias of this particular
checkpoint, please try out this checkpoint with the snippet present in the [Limitations and bias section](https://huggingface.co/bert-base-uncased#limitations-and-bias) of the [bert-base-uncased](https://huggingface.co/bert-base-uncased) checkpoint.

## Training data
The MultiBERTs models were pretrained on [BookCorpus](https://yknzhu.wixsite.com/mbweb), a dataset consisting of 11,038
unpublished books and [English Wikipedia](https://en.wikipedia.org/wiki/English_Wikipedia) (excluding lists, tables and
headers).

## Training procedure

### Preprocessing
The texts are lowercased and tokenized using WordPiece and a vocabulary size of 30,000. The inputs of the model are
then of the form:

```
[CLS] Sentence A [SEP] Sentence B [SEP]
```

With probability 0.5, sentence A and sentence B correspond to two consecutive sentences in the original corpus and in
the other cases, it's another random sentence in the corpus. Note that what is considered a sentence here is a
consecutive span of text usually longer than a single sentence. The only constraint is that the result with the two
"sentences" has a combined length of less than 512 tokens.
The details of the masking procedure for each sentence are the following:
- 15% of the tokens are masked.
- In 80% of the cases, the masked tokens are replaced by `[MASK]`.
- In 10% of the cases, the masked tokens are replaced by a random token (different) from the one they replace.
- In the 10% remaining cases, the masked tokens are left as is.

### Pretraining
The full model was trained on 16 Cloud TPU v2 chips for two million steps with a batch size
of 256. The sequence length was set to 512 throughout. The optimizer
used is Adam with a learning rate of 1e-4, \\(\beta_{1} = 0.9\\) and \\(\beta_{2} = 0.999\\), a weight decay of 0.01,
learning rate warmup for 10,000 steps and linear decay of the learning rate after.

### BibTeX entry and citation info
```bibtex
@article{DBLP:journals/corr/abs-2106-16163,
  author     = {Thibault Sellam and Steve Yadlowsky and Jason Wei and Naomi Saphra and
                Alexander D'Amour and Tal Linzen and Jasmijn Bastings and Iulia Turc and
                Jacob Eisenstein and Dipanjan Das and Ian Tenney and Ellie Pavlick},
  title      = {The MultiBERTs: {BERT} Reproductions for Robustness Analysis},
  journal    = {CoRR},
  volume     = {abs/2106.16163},
  year       = {2021},
  url        = {https://arxiv.org/abs/2106.16163},
  eprinttype = {arXiv},
  eprint     = {2106.16163},
  timestamp  = {Mon, 05 Jul 2021 15:15:50 +0200},
  biburl     = {https://dblp.org/rec/journals/corr/abs-2106-16163.bib},
  bibsource  = {dblp computer science bibliography, https://dblp.org}
}
```
<a href="https://huggingface.co/exbert/?model=multiberts">
	<img width="300px" src="https://cdn-media.huggingface.co/exbert/button.png">
</a>
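To make the masking percentages in the Preprocessing section above concrete, here is a minimal, hypothetical sketch of the 15% / 80% / 10% / 10% recipe. It is not the authors' data-generation code; the `bert-base-uncased` tokenizer and the `mask_tokens` helper are stand-ins chosen only for illustration, and the logic follows the commonly used Hugging Face MLM data-collator recipe.

```python
# Hypothetical sketch of the MLM masking recipe described above.
# Assumptions: the 'bert-base-uncased' tokenizer as a stand-in for the
# MultiBERTs vocabulary, and a single 1-D tensor of token IDs.
import torch
from transformers import BertTokenizer

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")

def mask_tokens(input_ids: torch.Tensor, mlm_probability: float = 0.15):
    """Return (masked_input_ids, labels) following the 15%/80%/10%/10% rule."""
    labels = input_ids.clone()

    # Select 15% of the (non-special) tokens for prediction.
    probability_matrix = torch.full(labels.shape, mlm_probability)
    special_tokens_mask = torch.tensor(
        tokenizer.get_special_tokens_mask(labels.tolist(), already_has_special_tokens=True),
        dtype=torch.bool,
    )
    probability_matrix.masked_fill_(special_tokens_mask, value=0.0)
    masked_indices = torch.bernoulli(probability_matrix).bool()
    labels[~masked_indices] = -100  # loss is only computed on the selected positions

    # 80% of the selected tokens are replaced by [MASK].
    indices_replaced = torch.bernoulli(torch.full(labels.shape, 0.8)).bool() & masked_indices
    input_ids[indices_replaced] = tokenizer.convert_tokens_to_ids(tokenizer.mask_token)

    # 10% are replaced by a random token; the remaining 10% are left unchanged.
    indices_random = (
        torch.bernoulli(torch.full(labels.shape, 0.5)).bool() & masked_indices & ~indices_replaced
    )
    random_tokens = torch.randint(len(tokenizer), labels.shape, dtype=torch.long)
    input_ids[indices_random] = random_tokens[indices_random]

    return input_ids, labels

ids = tokenizer("Replace me by any text you'd like.", return_tensors="pt")["input_ids"][0]
masked_ids, labels = mask_tokens(ids.clone())
```

In practice, `transformers.DataCollatorForLanguageModeling` implements essentially the same logic if you want dynamic masking when continuing MLM training from this checkpoint.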
{"language": "en", "license": "apache-2.0", "tags": ["exbert", "multiberts", "multiberts-seed-4"], "datasets": ["bookcorpus", "wikipedia"]}
null
MultiBertGunjanPatrick/multiberts-seed-4-1600k
[ "transformers", "pytorch", "bert", "pretraining", "exbert", "multiberts", "multiberts-seed-4", "en", "dataset:bookcorpus", "dataset:wikipedia", "arxiv:2106.16163", "license:apache-2.0", "endpoints_compatible", "region:us" ]
2022-03-02T23:29:04+00:00
[ "2106.16163" ]
[ "en" ]
TAGS #transformers #pytorch #bert #pretraining #exbert #multiberts #multiberts-seed-4 #en #dataset-bookcorpus #dataset-wikipedia #arxiv-2106.16163 #license-apache-2.0 #endpoints_compatible #region-us
# MultiBERTs Seed 4 Checkpoint 1600k (uncased) Seed 4 intermediate checkpoint 1600k MultiBERTs (pretrained BERT) model on English language using a masked language modeling (MLM) objective. It was introduced in this paper and first released in this repository. This is an intermediate checkpoint. The final checkpoint can be found at multiberts-seed-4. This model is uncased: it does not make a difference between english and English. Disclaimer: The team releasing MultiBERTs did not write a model card for this model so this model card has been written by gchhablani. ## Model description MultiBERTs models are transformers model pretrained on a large corpus of English data in a self-supervised fashion. This means it was pretrained on the raw texts only, with no humans labelling them in any way (which is why it can use lots of publicly available data) with an automatic process to generate inputs and labels from those texts. More precisely, it was pretrained with two objectives: - Masked language modeling (MLM): taking a sentence, the model randomly masks 15% of the words in the input then run the entire masked sentence through the model and has to predict the masked words. This is different from traditional recurrent neural networks (RNNs) that usually see the words one after the other, or from autoregressive models like GPT which internally mask the future tokens. It allows the model to learn a bidirectional representation of the sentence. - Next sentence prediction (NSP): the models concatenates two masked sentences as inputs during pretraining. Sometimes they correspond to sentences that were next to each other in the original text, sometimes not. The model then has to predict if the two sentences were following each other or not. This way, the model learns an inner representation of the English language that can then be used to extract features useful for downstream tasks: if you have a dataset of labeled sentences for instance, you can train a standard classifier using the features produced by the MultiBERTs model as inputs. ## Intended uses & limitations You can use the raw model for either masked language modeling or next sentence prediction, but it's mostly intended to be fine-tuned on a downstream task. See the model hub to look for fine-tuned versions on a task that interests you. Note that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked) to make decisions, such as sequence classification, token classification or question answering. For tasks such as text generation you should look at model like GPT2. ### How to use Here is how to use this model to get the features of a given text in PyTorch: ### Limitations and bias Even if the training data used for this model could be characterized as fairly neutral, this model can have biased predictions. This bias will also affect all fine-tuned versions of this model. For an understanding of bias of this particular checkpoint, please try out this checkpoint with the snippet present in the Limitation and bias section of the bert-base-uncased checkpoint. ## Training data The MultiBERTs models were pretrained on BookCorpus, a dataset consisting of 11,038 unpublished books and English Wikipedia (excluding lists, tables and headers). ## Training procedure ### Preprocessing The texts are lowercased and tokenized using WordPiece and a vocabulary size of 30,000. 
The inputs of the model are then of the form: With probability 0.5, sentence A and sentence B correspond to two consecutive sentences in the original corpus and in the other cases, it's another random sentence in the corpus. Note that what is considered a sentence here is a consecutive span of text usually longer than a single sentence. The only constrain is that the result with the two "sentences" has a combined length of less than 512 tokens. The details of the masking procedure for each sentence are the following: - 15% of the tokens are masked. - In 80% of the cases, the masked tokens are replaced by '[MASK]'. - In 10% of the cases, the masked tokens are replaced by a random token (different) from the one they replace. - In the 10% remaining cases, the masked tokens are left as is. ### Pretraining The full model was trained on 16 Cloud TPU v2 chips for two million steps with a batch size of 256. The sequence length was set to 512 throughout. The optimizer used is Adam with a learning rate of 1e-4, \\(\beta_{1} = 0.9\\) and \\(\beta_{2} = 0.999\\), a weight decay of 0.01, learning rate warmup for 10,000 steps and linear decay of the learning rate after. ### BibTeX entry and citation info <a href="URL <img width="300px" src="URL </a>
[ "# MultiBERTs Seed 4 Checkpoint 1600k (uncased)\nSeed 4 intermediate checkpoint 1600k MultiBERTs (pretrained BERT) model on English language using a masked language modeling (MLM) objective. It was introduced in\nthis paper and first released in\nthis repository. This is an intermediate checkpoint.\nThe final checkpoint can be found at multiberts-seed-4. This model is uncased: it does not make a difference\nbetween english and English.\n\nDisclaimer: The team releasing MultiBERTs did not write a model card for this model so this model card has been written by gchhablani.", "## Model description\nMultiBERTs models are transformers model pretrained on a large corpus of English data in a self-supervised fashion. This means it\nwas pretrained on the raw texts only, with no humans labelling them in any way (which is why it can use lots of\npublicly available data) with an automatic process to generate inputs and labels from those texts. More precisely, it\nwas pretrained with two objectives:\n- Masked language modeling (MLM): taking a sentence, the model randomly masks 15% of the words in the input then run\n the entire masked sentence through the model and has to predict the masked words. This is different from traditional\n recurrent neural networks (RNNs) that usually see the words one after the other, or from autoregressive models like\n GPT which internally mask the future tokens. It allows the model to learn a bidirectional representation of the\n sentence.\n- Next sentence prediction (NSP): the models concatenates two masked sentences as inputs during pretraining. Sometimes\n they correspond to sentences that were next to each other in the original text, sometimes not. The model then has to\n predict if the two sentences were following each other or not.\nThis way, the model learns an inner representation of the English language that can then be used to extract features\nuseful for downstream tasks: if you have a dataset of labeled sentences for instance, you can train a standard\nclassifier using the features produced by the MultiBERTs model as inputs.", "## Intended uses & limitations\nYou can use the raw model for either masked language modeling or next sentence prediction, but it's mostly intended to\nbe fine-tuned on a downstream task. See the model hub to look for\nfine-tuned versions on a task that interests you.\nNote that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked)\nto make decisions, such as sequence classification, token classification or question answering. For tasks such as text\ngeneration you should look at model like GPT2.", "### How to use\nHere is how to use this model to get the features of a given text in PyTorch:", "### Limitations and bias\nEven if the training data used for this model could be characterized as fairly neutral, this model can have biased\npredictions. This bias will also affect all fine-tuned versions of this model. For an understanding of bias of this particular\ncheckpoint, please try out this checkpoint with the snippet present in the Limitation and bias section of the bert-base-uncased checkpoint.", "## Training data\nThe MultiBERTs models were pretrained on BookCorpus, a dataset consisting of 11,038\nunpublished books and English Wikipedia (excluding lists, tables and\nheaders).", "## Training procedure", "### Preprocessing\nThe texts are lowercased and tokenized using WordPiece and a vocabulary size of 30,000. 
The inputs of the model are\nthen of the form:\n\nWith probability 0.5, sentence A and sentence B correspond to two consecutive sentences in the original corpus and in\nthe other cases, it's another random sentence in the corpus. Note that what is considered a sentence here is a\nconsecutive span of text usually longer than a single sentence. The only constrain is that the result with the two\n\"sentences\" has a combined length of less than 512 tokens.\nThe details of the masking procedure for each sentence are the following:\n- 15% of the tokens are masked.\n- In 80% of the cases, the masked tokens are replaced by '[MASK]'.\n- In 10% of the cases, the masked tokens are replaced by a random token (different) from the one they replace.\n- In the 10% remaining cases, the masked tokens are left as is.", "### Pretraining\nThe full model was trained on 16 Cloud TPU v2 chips for two million steps with a batch size\nof 256. The sequence length was set to 512 throughout. The optimizer\nused is Adam with a learning rate of 1e-4, \\\\(\\beta_{1} = 0.9\\\\) and \\\\(\\beta_{2} = 0.999\\\\), a weight decay of 0.01,\nlearning rate warmup for 10,000 steps and linear decay of the learning rate after.", "### BibTeX entry and citation info\n\n<a href=\"URL\n\t<img width=\"300px\" src=\"URL\n</a>" ]
[ "TAGS\n#transformers #pytorch #bert #pretraining #exbert #multiberts #multiberts-seed-4 #en #dataset-bookcorpus #dataset-wikipedia #arxiv-2106.16163 #license-apache-2.0 #endpoints_compatible #region-us \n", "# MultiBERTs Seed 4 Checkpoint 1600k (uncased)\nSeed 4 intermediate checkpoint 1600k MultiBERTs (pretrained BERT) model on English language using a masked language modeling (MLM) objective. It was introduced in\nthis paper and first released in\nthis repository. This is an intermediate checkpoint.\nThe final checkpoint can be found at multiberts-seed-4. This model is uncased: it does not make a difference\nbetween english and English.\n\nDisclaimer: The team releasing MultiBERTs did not write a model card for this model so this model card has been written by gchhablani.", "## Model description\nMultiBERTs models are transformers model pretrained on a large corpus of English data in a self-supervised fashion. This means it\nwas pretrained on the raw texts only, with no humans labelling them in any way (which is why it can use lots of\npublicly available data) with an automatic process to generate inputs and labels from those texts. More precisely, it\nwas pretrained with two objectives:\n- Masked language modeling (MLM): taking a sentence, the model randomly masks 15% of the words in the input then run\n the entire masked sentence through the model and has to predict the masked words. This is different from traditional\n recurrent neural networks (RNNs) that usually see the words one after the other, or from autoregressive models like\n GPT which internally mask the future tokens. It allows the model to learn a bidirectional representation of the\n sentence.\n- Next sentence prediction (NSP): the models concatenates two masked sentences as inputs during pretraining. Sometimes\n they correspond to sentences that were next to each other in the original text, sometimes not. The model then has to\n predict if the two sentences were following each other or not.\nThis way, the model learns an inner representation of the English language that can then be used to extract features\nuseful for downstream tasks: if you have a dataset of labeled sentences for instance, you can train a standard\nclassifier using the features produced by the MultiBERTs model as inputs.", "## Intended uses & limitations\nYou can use the raw model for either masked language modeling or next sentence prediction, but it's mostly intended to\nbe fine-tuned on a downstream task. See the model hub to look for\nfine-tuned versions on a task that interests you.\nNote that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked)\nto make decisions, such as sequence classification, token classification or question answering. For tasks such as text\ngeneration you should look at model like GPT2.", "### How to use\nHere is how to use this model to get the features of a given text in PyTorch:", "### Limitations and bias\nEven if the training data used for this model could be characterized as fairly neutral, this model can have biased\npredictions. This bias will also affect all fine-tuned versions of this model. 
For an understanding of bias of this particular\ncheckpoint, please try out this checkpoint with the snippet present in the Limitation and bias section of the bert-base-uncased checkpoint.", "## Training data\nThe MultiBERTs models were pretrained on BookCorpus, a dataset consisting of 11,038\nunpublished books and English Wikipedia (excluding lists, tables and\nheaders).", "## Training procedure", "### Preprocessing\nThe texts are lowercased and tokenized using WordPiece and a vocabulary size of 30,000. The inputs of the model are\nthen of the form:\n\nWith probability 0.5, sentence A and sentence B correspond to two consecutive sentences in the original corpus and in\nthe other cases, it's another random sentence in the corpus. Note that what is considered a sentence here is a\nconsecutive span of text usually longer than a single sentence. The only constrain is that the result with the two\n\"sentences\" has a combined length of less than 512 tokens.\nThe details of the masking procedure for each sentence are the following:\n- 15% of the tokens are masked.\n- In 80% of the cases, the masked tokens are replaced by '[MASK]'.\n- In 10% of the cases, the masked tokens are replaced by a random token (different) from the one they replace.\n- In the 10% remaining cases, the masked tokens are left as is.", "### Pretraining\nThe full model was trained on 16 Cloud TPU v2 chips for two million steps with a batch size\nof 256. The sequence length was set to 512 throughout. The optimizer\nused is Adam with a learning rate of 1e-4, \\\\(\\beta_{1} = 0.9\\\\) and \\\\(\\beta_{2} = 0.999\\\\), a weight decay of 0.01,\nlearning rate warmup for 10,000 steps and linear decay of the learning rate after.", "### BibTeX entry and citation info\n\n<a href=\"URL\n\t<img width=\"300px\" src=\"URL\n</a>" ]
[ 71, 145, 335, 134, 25, 95, 48, 3, 222, 111, 34 ]
[ "passage: TAGS\n#transformers #pytorch #bert #pretraining #exbert #multiberts #multiberts-seed-4 #en #dataset-bookcorpus #dataset-wikipedia #arxiv-2106.16163 #license-apache-2.0 #endpoints_compatible #region-us \n# MultiBERTs Seed 4 Checkpoint 1600k (uncased)\nSeed 4 intermediate checkpoint 1600k MultiBERTs (pretrained BERT) model on English language using a masked language modeling (MLM) objective. It was introduced in\nthis paper and first released in\nthis repository. This is an intermediate checkpoint.\nThe final checkpoint can be found at multiberts-seed-4. This model is uncased: it does not make a difference\nbetween english and English.\n\nDisclaimer: The team releasing MultiBERTs did not write a model card for this model so this model card has been written by gchhablani.", "passage: ## Model description\nMultiBERTs models are transformers model pretrained on a large corpus of English data in a self-supervised fashion. This means it\nwas pretrained on the raw texts only, with no humans labelling them in any way (which is why it can use lots of\npublicly available data) with an automatic process to generate inputs and labels from those texts. More precisely, it\nwas pretrained with two objectives:\n- Masked language modeling (MLM): taking a sentence, the model randomly masks 15% of the words in the input then run\n the entire masked sentence through the model and has to predict the masked words. This is different from traditional\n recurrent neural networks (RNNs) that usually see the words one after the other, or from autoregressive models like\n GPT which internally mask the future tokens. It allows the model to learn a bidirectional representation of the\n sentence.\n- Next sentence prediction (NSP): the models concatenates two masked sentences as inputs during pretraining. Sometimes\n they correspond to sentences that were next to each other in the original text, sometimes not. The model then has to\n predict if the two sentences were following each other or not.\nThis way, the model learns an inner representation of the English language that can then be used to extract features\nuseful for downstream tasks: if you have a dataset of labeled sentences for instance, you can train a standard\nclassifier using the features produced by the MultiBERTs model as inputs.## Intended uses & limitations\nYou can use the raw model for either masked language modeling or next sentence prediction, but it's mostly intended to\nbe fine-tuned on a downstream task. See the model hub to look for\nfine-tuned versions on a task that interests you.\nNote that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked)\nto make decisions, such as sequence classification, token classification or question answering. For tasks such as text\ngeneration you should look at model like GPT2.### How to use\nHere is how to use this model to get the features of a given text in PyTorch:### Limitations and bias\nEven if the training data used for this model could be characterized as fairly neutral, this model can have biased\npredictions. This bias will also affect all fine-tuned versions of this model. 
For an understanding of bias of this particular\ncheckpoint, please try out this checkpoint with the snippet present in the Limitation and bias section of the bert-base-uncased checkpoint.## Training data\nThe MultiBERTs models were pretrained on BookCorpus, a dataset consisting of 11,038\nunpublished books and English Wikipedia (excluding lists, tables and\nheaders).## Training procedure" ]
[ -0.08755269646644592, -0.005399511195719242, -0.0021746850106865168, 0.07374091446399689, 0.08798501640558243, 0.004372819792479277, 0.11073397845029831, 0.048512496054172516, -0.03380700945854187, 0.02234078384935856, 0.09216126799583435, 0.026083048433065414, 0.043629344552755356, 0.06263754516839981, 0.09649506211280823, -0.25211915373802185, 0.046077631413936615, -0.06483161449432373, 0.050413891673088074, 0.0744093507528305, 0.09739953279495239, -0.07405132055282593, 0.06463049352169037, 0.03554442524909973, -0.0903363823890686, -0.012877507135272026, -0.015696901828050613, -0.03558507189154625, 0.10208950936794281, 0.06984516978263855, 0.06406877934932709, 0.0031407568603754044, 0.060673847794532776, -0.08107537031173706, 0.016015855595469475, 0.044266968965530396, -0.001015358604490757, 0.024604741483926773, -0.006952075287699699, 0.0195616502314806, 0.10853718221187592, 0.03746313601732254, 0.07560668140649796, 0.03451070934534073, -0.09599417448043823, -0.12185677140951157, -0.07997769117355347, 0.09879103302955627, 0.049925755709409714, 0.04131506755948067, -0.005576967261731625, 0.0662766844034195, -0.028694264590740204, 0.07510606944561005, 0.09848567843437195, -0.2547547519207001, -0.01005833875387907, 0.06753259152173996, 0.04338373616337776, 0.05171985924243927, 0.01232697069644928, 0.026847125962376595, 0.007483303546905518, 0.04474423825740814, 0.030089113861322403, -0.024887144565582275, 0.11452127248048782, -0.04142904654145241, -0.14823117852210999, -0.04239220172166824, 0.1245383620262146, -0.008761543780565262, -0.12520799040794373, -0.09424221515655518, -0.03029981255531311, 0.11399541050195694, -0.003828314132988453, -0.01824047975242138, -0.003114401362836361, 0.010859420523047447, 0.023627910763025284, -0.0925319641828537, -0.08649516105651855, -0.028278306126594543, -0.037682220339775085, 0.12155798822641373, 0.04730799049139023, 0.055374179035425186, -0.034494444727897644, 0.08372171968221664, -0.11799044907093048, -0.03953103348612785, -0.04843791946768761, -0.0835772156715393, -0.019812148064374924, 0.011285417713224888, -0.025597138330340385, -0.07722456753253937, -0.058711569756269455, 0.11525706201791763, 0.024020858108997345, 0.03251570835709572, -0.00558257382363081, 0.03862086310982704, 0.06837484985589981, 0.09065590798854828, -0.04018506780266762, 0.048570796847343445, 0.03627853840589523, -0.0244419127702713, 0.05536702647805214, -0.05070320516824722, -0.09966923296451569, 0.07653156667947769, -0.007140146568417549, 0.03826127573847771, 0.023691030219197273, 0.03617369011044502, -0.008783231489360332, -0.06910542398691177, 0.16963733732700348, -0.0793537050485611, -0.009134854190051556, -0.017098333686590195, 0.01123281754553318, 0.04129808396100998, 0.03330785036087036, -0.01036769151687622, -0.04772542044520378, -0.007678958587348461, -0.05553774535655975, -0.030557699501514435, -0.056721024215221405, -0.12209278345108032, -0.0013772621750831604, -0.036963894963264465, -0.03151562064886093, -0.1405620127916336, -0.21841958165168762, -0.017715957015752792, 0.06193041801452637, -0.003570402041077614, -0.009626539424061775, 0.026604993268847466, 0.012453345581889153, -0.021949268877506256, 0.011111794039607048, -0.03804107755422592, -0.0007045930251479149, -0.00795819703489542, -0.025171898305416107, 0.06002739071846008, -0.038355328142642975, 0.022362014278769493, -0.06697015464305878, 0.020571857690811157, -0.21099770069122314, 0.08697911351919174, -0.03628002852201462, 0.004995167255401611, -0.03524858132004738, -0.04582766443490982, 
0.00972810946404934, 0.04869919270277023, -0.00851091556251049, 0.11670531332492828, -0.1343611478805542, -0.05179346352815628, 0.18000885844230652, -0.16032874584197998, -0.002715524286031723, 0.10255792737007141, -0.04826996102929115, 0.054020002484321594, 0.13241881132125854, 0.09975694119930267, 0.08356232196092606, -0.07193445414304733, 0.013736906461417675, 0.060448694974184036, -0.07138432562351227, 0.05265350267291069, 0.08681001514196396, -0.022058047354221344, -0.14080430567264557, 0.032256320118904114, -0.06865343451499939, -0.009695980697870255, -0.025623267516493797, -0.021338442340493202, 0.006559059023857117, -0.03822861611843109, 0.028702344745397568, 0.007301722653210163, 0.017207754775881767, -0.04450647905468941, -0.08305211365222931, 0.021724896505475044, 0.07308948040008545, -0.07007843255996704, 0.04172665253281593, -0.07135306298732758, 0.0614478774368763, -0.07429298013448715, -0.0034686485305428505, -0.16575303673744202, -0.022028736770153046, 0.04378719627857208, -0.041254863142967224, 0.04837779328227043, 0.09243578463792801, 0.004461746662855148, 0.12373839318752289, -0.04172337055206299, 0.004325936548411846, -0.004619713872671127, -0.010211870074272156, -0.04850342869758606, -0.12376159429550171, -0.07944148778915405, -0.06694530695676804, 0.09649814665317535, -0.07458582520484924, 0.029932640492916107, -0.06537113338708878, -0.02540234662592411, -0.010071882978081703, -0.05714914947748184, -0.006557982414960861, 0.010811210609972477, -0.02568638138473034, -0.04593179374933243, 0.04839558154344559, 0.05088616907596588, -0.053692128509283066, 0.07517316937446594, -0.10169662535190582, -0.060996152460575104, 0.05871301889419556, 0.01101418025791645, -0.08239080011844635, 0.0976007878780365, -0.020121980458498, -0.013417359441518784, -0.05581267178058624, -0.045548517256975174, 0.19127613306045532, -0.027940694242715836, 0.10158704221248627, -0.09039495885372162, 0.00388571759685874, 0.03133288770914078, -0.04225901514291763, -0.014397293329238892, 0.06037011742591858, 0.04705995321273804, -0.17520901560783386, 0.013998840004205704, 0.0478646382689476, 0.07718496024608612, 0.10888184607028961, 0.026864342391490936, -0.02206418849527836, -0.04521729797124863, -0.01439508143812418, 0.006363173946738243, 0.057725369930267334, -0.032780833542346954, -0.009470775723457336, 0.0321294330060482, 0.05623869225382805, 0.019214404746890068, -0.08078406751155853, 0.03714312985539436, 0.06723201274871826, -0.016220461577177048, -0.04218944162130356, -0.025180375203490257, -0.05951528996229172, 0.06287623196840286, 0.05066041648387909, 0.03574633225798607, 0.02547166496515274, -0.015019122511148453, -0.13497434556484222, 0.18679824471473694, -0.11708512157201767, -0.258126825094223, -0.1105044037103653, -0.059475332498550415, -0.02214379422366619, 0.04167959839105606, 0.058681026101112366, -0.03544344753026962, -0.04486009106040001, -0.11463643610477448, 0.061174504458904266, -0.06772293895483017, -0.028433283790946007, -0.007900901138782501, -0.051829464733600616, -0.018454179167747498, -0.1250641942024231, -0.011284848675131798, -0.029772378504276276, -0.0772857815027237, 0.008494852110743523, -0.03217955678701401, 0.030779924243688583, 0.13293585181236267, 0.032018329948186874, -0.01991739310324192, -0.017335057258605957, 0.19094252586364746, 0.010906057432293892, 0.06362214684486389, 0.11560223996639252, -0.025893475860357285, 0.052608996629714966, 0.048252709209918976, 0.025932885706424713, -0.0478072389960289, 0.011846133507788181, -0.017171241343021393, 
-0.12256306409835815, -0.17759345471858978, -0.06853188574314117, -0.0053517515771090984, 0.010809679515659809, 0.01668175309896469, 0.03358549252152443, 0.01704435423016548, 0.04068082198500633, -0.02687314711511135, 0.03258804976940155, -0.01432669535279274, 0.0787554681301117, 0.0367719829082489, -0.07668288052082062, 0.09379535168409348, -0.062065087258815765, 0.017636016011238098, 0.11140482872724533, -0.06340388208627701, 0.18959057331085205, 0.022560827434062958, 0.06183277443051338, 0.1012580469250679, 0.024673812091350555, 0.055086150765419006, 0.08884965628385544, -0.04519745707511902, 0.006425907369703054, -0.05948019027709961, -0.05078095942735672, -0.03590041399002075, 0.04723912104964256, 0.021517185494303703, 0.01674986630678177, -0.12068229168653488, 0.0241408534348011, -0.0009431926300749183, 0.128947451710701, 0.04466995224356651, -0.12340600788593292, -0.12585137784481049, 0.0327799990773201, -0.04171651601791382, -0.06269057095050812, 0.030512571334838867, 0.06625494360923767, -0.14974355697631836, 0.043987471610307693, -0.0050271255895495415, 0.0641276091337204, -0.08799441158771515, 0.017956702038645744, -0.041180189698934555, 0.002810130827128887, 0.003807277651503682, 0.06832818686962128, -0.13205130398273468, 0.10461408644914627, 0.022441694512963295, 0.04991709440946579, -0.0800507441163063, 0.01822647452354431, -0.00957946851849556, 0.10940859466791153, 0.11724839359521866, 0.04359833151102066, -0.0644303411245346, -0.01750795915722847, -0.042934074997901917, 0.02233891375362873, 0.05689764767885208, -0.07172766327857971, 0.05979909747838974, 0.01110991183668375, 0.006843294017016888, -0.022937757894396782, 0.018231380730867386, -0.13300490379333496, -0.12365460395812988, 0.05985242873430252, -0.07762061059474945, -0.10862930864095688, -0.055916160345077515, -0.06208069249987602, -0.053412966430187225, 0.21671992540359497, -0.11488303542137146, -0.09245716035366058, -0.09945961087942123, -0.010869123041629791, 0.04646620899438858, -0.06465836614370346, 0.046398602426052094, -0.036560527980327606, 0.09159322082996368, -0.04682710021734238, -0.1096709668636322, 0.034222811460494995, -0.1140734851360321, -0.11350440979003906, -0.04415642470121384, 0.10553891211748123, 0.11306772381067276, 0.03700760751962662, 0.010728276334702969, 0.012628093361854553, -0.0019399840384721756, -0.11769424378871918, 0.019920358434319496, 0.12444912642240524, 0.0013623293489217758, 0.06863150000572205, -0.059108056128025055, 0.023773960769176483, -0.011970611289143562, -0.0009635556489229202, 0.12554678320884705, 0.1826457679271698, -0.06391167640686035, 0.1733759045600891, 0.2014981210231781, -0.10635311901569366, -0.19311648607254028, -0.050417929887771606, -0.0038177361711859703, 0.0461050383746624, 0.04803616926074028, -0.18431523442268372, 0.09318340569734573, 0.032305918633937836, -0.0302071925252676, 0.01833733543753624, -0.22709862887859344, -0.11191847175359726, 0.08600810170173645, 0.05826984718441963, 0.18890243768692017, -0.08076135069131851, -0.03927435725927353, -0.015632938593626022, -0.04111643135547638, 0.039172135293483734, -0.041968971490859985, 0.0871618241071701, 0.004857044667005539, -0.026309184730052948, 0.0015366030856966972, -0.03222339227795601, 0.09450642019510269, 0.04262085258960724, 0.023521937429904938, -0.07083788514137268, -0.003859158605337143, 0.11474980413913727, -0.03646116703748703, 0.09537576884031296, 0.043780308216810226, 0.07593598961830139, -0.09717923402786255, -0.05821762979030609, -0.07565602660179138, 0.04352326691150665, 
-0.0433308407664299, -0.05333785340189934, -0.06462530791759491, 0.05822565034031868, 0.0355142280459404, 0.009180165827274323, -0.006543487310409546, -0.03612305596470833, 0.04530418664216995, 0.09599123895168304, 0.08052441477775574, -0.03632611408829689, -0.06577195972204208, -0.04588334262371063, -0.049115125089883804, 0.06489831209182739, -0.09780273586511612, 0.018109166994690895, 0.029574738815426826, 0.009231152944266796, 0.08633477240800858, 0.03461666777729988, -0.13690127432346344, 0.010026547126471996, 0.037004776298999786, -0.12748444080352783, -0.1047431230545044, -0.019215259701013565, 0.020668093115091324, -0.03689616918563843, 0.05166111886501312, 0.14524197578430176, -0.035980209708213806, -0.03071717545390129, -0.04965656250715256, 0.039706841111183167, -0.018411803990602493, 0.054458100348711014, 0.06367562711238861, 0.030493546277284622, -0.07528029382228851, 0.0743025690317154, 0.0397883765399456, -0.04071766883134842, 0.0399167463183403, 0.04494763910770416, -0.0976245179772377, -0.07916930317878723, -0.05715532228350639, 0.08965521305799484, -0.019585100933909416, -0.04832523316144943, -0.001965392380952835, -0.0843312218785286, 0.06831260770559311, 0.06746926158666611, 0.04608915373682976, 0.03480115532875061, -0.084067702293396, 0.016540925949811935, -0.055504314601421356, 0.03891049325466156, -0.03211038559675217, -0.004612285643815994, -0.05253250151872635, 0.07091795653104782, 0.06142133101820946, 0.09715056419372559, -0.03421120345592499, -0.07006251811981201, -0.0830162987112999, -0.013094116002321243, -0.06506174802780151, -0.030874740332365036, -0.07481975108385086, -0.007040579337626696, 0.0007330663502216339, -0.0024011246860027313, 0.019529065117239952, 0.03843512386083603, -0.04349350184202194, -0.01834641955792904, -0.032885558903217316, 0.03813020884990692, -0.06191142648458481, 0.006710666231811047, 0.01594703644514084, -0.03657400608062744, 0.09178446978330612, 0.03659152612090111, -0.011302925646305084, 0.04844406247138977, -0.01978151686489582, 0.034349799156188965, -0.022434482350945473, 0.0023845776449888945, -0.020826339721679688, -0.10693015158176422, -0.0018543109763413668, 0.00546158105134964, -0.028689928352832794, 0.010463674552738667, 0.057705432176589966, -0.07146724313497543, 0.09127242118120193, 0.04676716774702072, -0.027508459985256195, -0.07176797091960907, 0.04138152673840523, -0.011239562183618546, 0.027947770431637764, 0.0661904513835907, -0.03564012423157692, 0.049502644687891006, -0.10163287818431854, -0.030231598764657974, 0.0030529052019119263, -0.007811494171619415, -0.007430387660861015, -0.052961040288209915, -0.00022923946380615234, 0.00514364056289196, 0.17644526064395905, -0.01915637031197548, 0.03763076290488243, 0.014023756608366966, 0.00551464781165123, 0.04672301560640335, -0.012864390388131142, 0.06738162040710449, -0.00974278524518013, -0.027423974126577377, -0.011775299906730652, 0.03707100823521614, 0.005107542499899864, 0.007501758635044098, 0.14172148704528809, 0.0425407737493515, 0.09101136028766632, 0.07300115376710892, 0.01826474443078041, 0.015540613792836666, -0.12802039086818695, -0.08763066679239273, 0.0030290838330984116, 0.05953136086463928, -0.019179699942469597, 0.005254138261079788, 0.0916912779211998, -0.0866803377866745, 0.07248292118310928, 0.044527988880872726, -0.04995749145746231, -0.12529748678207397, -0.18559907376766205, -0.02419060282409191, -0.030615853145718575, -0.010415713302791119, -0.09292875230312347, 0.015249943360686302, 0.09243672341108322, 0.02487814798951149, 
-0.011914516799151897, 0.09446249902248383, -0.10064030438661575, -0.026909727603197098, 0.042296718806028366, -0.026777952909469604, 0.012821842916309834, 0.054506681859493256, 0.02232550084590912, -0.004534661769866943, 0.04459883272647858, 0.03869357705116272, 0.042125359177589417, 0.024708639830350876, 0.051172755658626556, -0.02291279099881649, -0.07082414627075195, -0.03460843116044998, -0.0038031148724257946, 0.05445053428411484, 0.13480769097805023, 0.022832201793789864, -0.06913938373327255, 0.007898720912635326, 0.10809087753295898, -0.03100627288222313, -0.045544467866420746, -0.10827215760946274, 0.23294620215892792, 0.022920768707990646, 0.0008531862404197454, -0.002634267322719097, -0.04569801315665245, 0.0055384524166584015, 0.2175370454788208, 0.2273438572883606, 0.006015181075781584, -0.011121466755867004, 0.009623966179788113, -0.012077752500772476, 0.03747621923685074, 0.14694571495056152, 0.005896100774407387, 0.24468696117401123, -0.046108782291412354, 0.03539388254284859, -0.042735300958156586, -0.03956575319170952, -0.10054683685302734, 0.0721805989742279, -0.008948560804128647, 0.006949468515813351, -0.02930404245853424, 0.06965947151184082, -0.03698698431253433, -0.18316307663917542, 0.008266246877610683, -0.0009576031006872654, -0.05980342626571655, 0.010936399921774864, 0.000911240465939045, 0.021021412685513496, 0.08167162537574768, -0.01894574984908104, -0.0055045923218131065, 0.1300084888935089, 0.01811349205672741, -0.097515769302845, -0.059283532202243805, 0.11412845551967621, 0.013084441423416138, 0.1411011517047882, 0.011289669200778008, 0.08025462925434113, 0.08750185370445251, 0.022165697067975998, -0.09748027473688126, 0.042584966868162155, -0.020090928301215172, -0.030970130115747452, 0.0068758223205804825, 0.1103372722864151, -0.009071684442460537, 0.054648324847221375, 0.02578245848417282, -0.08420674502849579, 0.060252055525779724, 0.010027725249528885, -0.0336284413933754, -0.07905290275812149, 0.08145339787006378, -0.09207519888877869, 0.1548873484134674, 0.12548673152923584, -0.013339139521121979, -0.042821742594242096, -0.03017771802842617, 0.017707252874970436, -0.00017651822417974472, 0.059547025710344315, -0.025020916014909744, -0.13502518832683563, 0.019020088016986847, -0.08095814287662506, 0.02692423015832901, -0.24866676330566406, -0.08734390139579773, 0.028289638459682465, -0.017308298498392105, -0.019253559410572052, 0.05347169563174248, 0.04821516200900078, 0.02720627374947071, -0.03534167259931564, 0.02766520157456398, -0.03791039064526558, 0.06043550372123718, -0.1092035174369812, -0.09298408031463623 ]
null
null
transformers
# MultiBERTs Seed 4 Checkpoint 160k (uncased) Seed 4 intermediate checkpoint 160k MultiBERTs (pretrained BERT) model on the English language using a masked language modeling (MLM) objective. It was introduced in [this paper](https://arxiv.org/pdf/2106.16163.pdf) and first released in [this repository](https://github.com/google-research/language/tree/master/language/multiberts). This is an intermediate checkpoint. The final checkpoint can be found at [multiberts-seed-4](https://hf.co/multiberts-seed-4). This model is uncased: it does not make a difference between english and English. Disclaimer: The team releasing MultiBERTs did not write a model card for this model so this model card has been written by [gchhablani](https://hf.co/gchhablani). ## Model description MultiBERTs models are transformer models pretrained on a large corpus of English data in a self-supervised fashion. This means it was pretrained on the raw texts only, with no humans labelling them in any way (which is why it can use lots of publicly available data) with an automatic process to generate inputs and labels from those texts. More precisely, it was pretrained with two objectives: - Masked language modeling (MLM): taking a sentence, the model randomly masks 15% of the words in the input, then runs the entire masked sentence through the model and has to predict the masked words. This is different from traditional recurrent neural networks (RNNs) that usually see the words one after the other, or from autoregressive models like GPT which internally mask the future tokens. It allows the model to learn a bidirectional representation of the sentence. - Next sentence prediction (NSP): the model concatenates two masked sentences as inputs during pretraining. Sometimes they correspond to sentences that were next to each other in the original text, sometimes not. The model then has to predict if the two sentences were following each other or not. This way, the model learns an inner representation of the English language that can then be used to extract features useful for downstream tasks: if you have a dataset of labeled sentences for instance, you can train a standard classifier using the features produced by the MultiBERTs model as inputs. ## Intended uses & limitations You can use the raw model for either masked language modeling or next sentence prediction, but it's mostly intended to be fine-tuned on a downstream task. See the [model hub](https://huggingface.co/models?filter=multiberts) to look for fine-tuned versions on a task that interests you. Note that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked) to make decisions, such as sequence classification, token classification or question answering. For tasks such as text generation you should look at models like GPT2. ### How to use Here is how to use this model to get the features of a given text in PyTorch: ```python from transformers import BertTokenizer, BertModel tokenizer = BertTokenizer.from_pretrained('multiberts-seed-4-160k') model = BertModel.from_pretrained("multiberts-seed-4-160k") text = "Replace me by any text you'd like." encoded_input = tokenizer(text, return_tensors='pt') output = model(**encoded_input) ``` ### Limitations and bias Even if the training data used for this model could be characterized as fairly neutral, this model can have biased predictions. This bias will also affect all fine-tuned versions of this model. 
For an understanding of bias of this particular checkpoint, please try out this checkpoint with the snippet present in the [Limitations and bias section](https://huggingface.co/bert-base-uncased#limitations-and-bias) of the [bert-base-uncased](https://huggingface.co/bert-base-uncased) checkpoint. ## Training data The MultiBERTs models were pretrained on [BookCorpus](https://yknzhu.wixsite.com/mbweb), a dataset consisting of 11,038 unpublished books and [English Wikipedia](https://en.wikipedia.org/wiki/English_Wikipedia) (excluding lists, tables and headers). ## Training procedure ### Preprocessing The texts are lowercased and tokenized using WordPiece and a vocabulary size of 30,000. The inputs of the model are then of the form: ``` [CLS] Sentence A [SEP] Sentence B [SEP] ``` With probability 0.5, sentence A and sentence B correspond to two consecutive sentences in the original corpus and in the other cases, it's another random sentence in the corpus. Note that what is considered a sentence here is a consecutive span of text usually longer than a single sentence. The only constraint is that the result with the two "sentences" has a combined length of less than 512 tokens. The details of the masking procedure for each sentence are the following: - 15% of the tokens are masked. - In 80% of the cases, the masked tokens are replaced by `[MASK]`. - In 10% of the cases, the masked tokens are replaced by a random token (different from the one they replace). - In the 10% remaining cases, the masked tokens are left as is. ### Pretraining The full model was trained on 16 Cloud TPU v2 chips for two million steps with a batch size of 256. The sequence length was set to 512 throughout. The optimizer used is Adam with a learning rate of 1e-4, \\(\beta_{1} = 0.9\\) and \\(\beta_{2} = 0.999\\), a weight decay of 0.01, learning rate warmup for 10,000 steps and linear decay of the learning rate after. ### BibTeX entry and citation info ```bibtex @article{DBLP:journals/corr/abs-2106-16163, author = {Thibault Sellam and Steve Yadlowsky and Jason Wei and Naomi Saphra and Alexander D'Amour and Tal Linzen and Jasmijn Bastings and Iulia Turc and Jacob Eisenstein and Dipanjan Das and Ian Tenney and Ellie Pavlick}, title = {The MultiBERTs: {BERT} Reproductions for Robustness Analysis}, journal = {CoRR}, volume = {abs/2106.16163}, year = {2021}, url = {https://arxiv.org/abs/2106.16163}, eprinttype = {arXiv}, eprint = {2106.16163}, timestamp = {Mon, 05 Jul 2021 15:15:50 +0200}, biburl = {https://dblp.org/rec/journals/corr/abs-2106-16163.bib}, bibsource = {dblp computer science bibliography, https://dblp.org} } ``` <a href="https://huggingface.co/exbert/?model=multiberts"> <img width="300px" src="https://cdn-media.huggingface.co/exbert/button.png"> </a>
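The "How to use" snippet in the card above returns hidden-state features only, although the card notes that the raw model can also be used for masked language modeling. The following is a minimal, hedged sketch of such a masked-token query, not part of the original card; it assumes the checkpoint is loaded from the hub id recorded in this row, `MultiBertGunjanPatrick/multiberts-seed-4-160k`, and the example sentence is purely illustrative.

```python
# Hedged sketch (not from the original card): masked-token prediction with this
# intermediate checkpoint. The repository id below comes from this row's "id" field.
import torch
from transformers import BertTokenizer, BertForMaskedLM

model_name = "MultiBertGunjanPatrick/multiberts-seed-4-160k"  # assumed hub id for this checkpoint
tokenizer = BertTokenizer.from_pretrained(model_name)
model = BertForMaskedLM.from_pretrained(model_name)
model.eval()

text = "The capital of France is [MASK]."  # illustrative input, not from the card
inputs = tokenizer(text, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

# Pick the most likely token at the [MASK] position.
mask_index = (inputs["input_ids"] == tokenizer.mask_token_id).nonzero(as_tuple=True)[1]
predicted_id = logits[0, mask_index].argmax(dim=-1)
print(tokenizer.decode(predicted_id))
```

Because this is an intermediate (160k-step) checkpoint, the prediction may well differ from what the fully trained seed-4 model would produce.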
{"language": "en", "license": "apache-2.0", "tags": ["exbert", "multiberts", "multiberts-seed-4"], "datasets": ["bookcorpus", "wikipedia"]}
null
MultiBertGunjanPatrick/multiberts-seed-4-160k
[ "transformers", "pytorch", "bert", "pretraining", "exbert", "multiberts", "multiberts-seed-4", "en", "dataset:bookcorpus", "dataset:wikipedia", "arxiv:2106.16163", "license:apache-2.0", "endpoints_compatible", "region:us" ]
2022-03-02T23:29:04+00:00
[ "2106.16163" ]
[ "en" ]
TAGS #transformers #pytorch #bert #pretraining #exbert #multiberts #multiberts-seed-4 #en #dataset-bookcorpus #dataset-wikipedia #arxiv-2106.16163 #license-apache-2.0 #endpoints_compatible #region-us
# MultiBERTs Seed 4 Checkpoint 160k (uncased) Seed 4 intermediate checkpoint 160k MultiBERTs (pretrained BERT) model on English language using a masked language modeling (MLM) objective. It was introduced in this paper and first released in this repository. This is an intermediate checkpoint. The final checkpoint can be found at multiberts-seed-4. This model is uncased: it does not make a difference between english and English. Disclaimer: The team releasing MultiBERTs did not write a model card for this model so this model card has been written by gchhablani. ## Model description MultiBERTs models are transformers model pretrained on a large corpus of English data in a self-supervised fashion. This means it was pretrained on the raw texts only, with no humans labelling them in any way (which is why it can use lots of publicly available data) with an automatic process to generate inputs and labels from those texts. More precisely, it was pretrained with two objectives: - Masked language modeling (MLM): taking a sentence, the model randomly masks 15% of the words in the input then run the entire masked sentence through the model and has to predict the masked words. This is different from traditional recurrent neural networks (RNNs) that usually see the words one after the other, or from autoregressive models like GPT which internally mask the future tokens. It allows the model to learn a bidirectional representation of the sentence. - Next sentence prediction (NSP): the models concatenates two masked sentences as inputs during pretraining. Sometimes they correspond to sentences that were next to each other in the original text, sometimes not. The model then has to predict if the two sentences were following each other or not. This way, the model learns an inner representation of the English language that can then be used to extract features useful for downstream tasks: if you have a dataset of labeled sentences for instance, you can train a standard classifier using the features produced by the MultiBERTs model as inputs. ## Intended uses & limitations You can use the raw model for either masked language modeling or next sentence prediction, but it's mostly intended to be fine-tuned on a downstream task. See the model hub to look for fine-tuned versions on a task that interests you. Note that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked) to make decisions, such as sequence classification, token classification or question answering. For tasks such as text generation you should look at model like GPT2. ### How to use Here is how to use this model to get the features of a given text in PyTorch: ### Limitations and bias Even if the training data used for this model could be characterized as fairly neutral, this model can have biased predictions. This bias will also affect all fine-tuned versions of this model. For an understanding of bias of this particular checkpoint, please try out this checkpoint with the snippet present in the Limitation and bias section of the bert-base-uncased checkpoint. ## Training data The MultiBERTs models were pretrained on BookCorpus, a dataset consisting of 11,038 unpublished books and English Wikipedia (excluding lists, tables and headers). ## Training procedure ### Preprocessing The texts are lowercased and tokenized using WordPiece and a vocabulary size of 30,000. 
The inputs of the model are then of the form: With probability 0.5, sentence A and sentence B correspond to two consecutive sentences in the original corpus and in the other cases, it's another random sentence in the corpus. Note that what is considered a sentence here is a consecutive span of text usually longer than a single sentence. The only constrain is that the result with the two "sentences" has a combined length of less than 512 tokens. The details of the masking procedure for each sentence are the following: - 15% of the tokens are masked. - In 80% of the cases, the masked tokens are replaced by '[MASK]'. - In 10% of the cases, the masked tokens are replaced by a random token (different) from the one they replace. - In the 10% remaining cases, the masked tokens are left as is. ### Pretraining The full model was trained on 16 Cloud TPU v2 chips for two million steps with a batch size of 256. The sequence length was set to 512 throughout. The optimizer used is Adam with a learning rate of 1e-4, \\(\beta_{1} = 0.9\\) and \\(\beta_{2} = 0.999\\), a weight decay of 0.01, learning rate warmup for 10,000 steps and linear decay of the learning rate after. ### BibTeX entry and citation info <a href="URL <img width="300px" src="URL </a>
[ "# MultiBERTs Seed 4 Checkpoint 160k (uncased)\nSeed 4 intermediate checkpoint 160k MultiBERTs (pretrained BERT) model on English language using a masked language modeling (MLM) objective. It was introduced in\nthis paper and first released in\nthis repository. This is an intermediate checkpoint.\nThe final checkpoint can be found at multiberts-seed-4. This model is uncased: it does not make a difference\nbetween english and English.\n\nDisclaimer: The team releasing MultiBERTs did not write a model card for this model so this model card has been written by gchhablani.", "## Model description\nMultiBERTs models are transformers model pretrained on a large corpus of English data in a self-supervised fashion. This means it\nwas pretrained on the raw texts only, with no humans labelling them in any way (which is why it can use lots of\npublicly available data) with an automatic process to generate inputs and labels from those texts. More precisely, it\nwas pretrained with two objectives:\n- Masked language modeling (MLM): taking a sentence, the model randomly masks 15% of the words in the input then run\n the entire masked sentence through the model and has to predict the masked words. This is different from traditional\n recurrent neural networks (RNNs) that usually see the words one after the other, or from autoregressive models like\n GPT which internally mask the future tokens. It allows the model to learn a bidirectional representation of the\n sentence.\n- Next sentence prediction (NSP): the models concatenates two masked sentences as inputs during pretraining. Sometimes\n they correspond to sentences that were next to each other in the original text, sometimes not. The model then has to\n predict if the two sentences were following each other or not.\nThis way, the model learns an inner representation of the English language that can then be used to extract features\nuseful for downstream tasks: if you have a dataset of labeled sentences for instance, you can train a standard\nclassifier using the features produced by the MultiBERTs model as inputs.", "## Intended uses & limitations\nYou can use the raw model for either masked language modeling or next sentence prediction, but it's mostly intended to\nbe fine-tuned on a downstream task. See the model hub to look for\nfine-tuned versions on a task that interests you.\nNote that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked)\nto make decisions, such as sequence classification, token classification or question answering. For tasks such as text\ngeneration you should look at model like GPT2.", "### How to use\nHere is how to use this model to get the features of a given text in PyTorch:", "### Limitations and bias\nEven if the training data used for this model could be characterized as fairly neutral, this model can have biased\npredictions. This bias will also affect all fine-tuned versions of this model. For an understanding of bias of this particular\ncheckpoint, please try out this checkpoint with the snippet present in the Limitation and bias section of the bert-base-uncased checkpoint.", "## Training data\nThe MultiBERTs models were pretrained on BookCorpus, a dataset consisting of 11,038\nunpublished books and English Wikipedia (excluding lists, tables and\nheaders).", "## Training procedure", "### Preprocessing\nThe texts are lowercased and tokenized using WordPiece and a vocabulary size of 30,000. 
The inputs of the model are\nthen of the form:\n\nWith probability 0.5, sentence A and sentence B correspond to two consecutive sentences in the original corpus and in\nthe other cases, it's another random sentence in the corpus. Note that what is considered a sentence here is a\nconsecutive span of text usually longer than a single sentence. The only constrain is that the result with the two\n\"sentences\" has a combined length of less than 512 tokens.\nThe details of the masking procedure for each sentence are the following:\n- 15% of the tokens are masked.\n- In 80% of the cases, the masked tokens are replaced by '[MASK]'.\n- In 10% of the cases, the masked tokens are replaced by a random token (different) from the one they replace.\n- In the 10% remaining cases, the masked tokens are left as is.", "### Pretraining\nThe full model was trained on 16 Cloud TPU v2 chips for two million steps with a batch size\nof 256. The sequence length was set to 512 throughout. The optimizer\nused is Adam with a learning rate of 1e-4, \\\\(\\beta_{1} = 0.9\\\\) and \\\\(\\beta_{2} = 0.999\\\\), a weight decay of 0.01,\nlearning rate warmup for 10,000 steps and linear decay of the learning rate after.", "### BibTeX entry and citation info\n\n<a href=\"URL\n\t<img width=\"300px\" src=\"URL\n</a>" ]
[ "TAGS\n#transformers #pytorch #bert #pretraining #exbert #multiberts #multiberts-seed-4 #en #dataset-bookcorpus #dataset-wikipedia #arxiv-2106.16163 #license-apache-2.0 #endpoints_compatible #region-us \n", "# MultiBERTs Seed 4 Checkpoint 160k (uncased)\nSeed 4 intermediate checkpoint 160k MultiBERTs (pretrained BERT) model on English language using a masked language modeling (MLM) objective. It was introduced in\nthis paper and first released in\nthis repository. This is an intermediate checkpoint.\nThe final checkpoint can be found at multiberts-seed-4. This model is uncased: it does not make a difference\nbetween english and English.\n\nDisclaimer: The team releasing MultiBERTs did not write a model card for this model so this model card has been written by gchhablani.", "## Model description\nMultiBERTs models are transformers model pretrained on a large corpus of English data in a self-supervised fashion. This means it\nwas pretrained on the raw texts only, with no humans labelling them in any way (which is why it can use lots of\npublicly available data) with an automatic process to generate inputs and labels from those texts. More precisely, it\nwas pretrained with two objectives:\n- Masked language modeling (MLM): taking a sentence, the model randomly masks 15% of the words in the input then run\n the entire masked sentence through the model and has to predict the masked words. This is different from traditional\n recurrent neural networks (RNNs) that usually see the words one after the other, or from autoregressive models like\n GPT which internally mask the future tokens. It allows the model to learn a bidirectional representation of the\n sentence.\n- Next sentence prediction (NSP): the models concatenates two masked sentences as inputs during pretraining. Sometimes\n they correspond to sentences that were next to each other in the original text, sometimes not. The model then has to\n predict if the two sentences were following each other or not.\nThis way, the model learns an inner representation of the English language that can then be used to extract features\nuseful for downstream tasks: if you have a dataset of labeled sentences for instance, you can train a standard\nclassifier using the features produced by the MultiBERTs model as inputs.", "## Intended uses & limitations\nYou can use the raw model for either masked language modeling or next sentence prediction, but it's mostly intended to\nbe fine-tuned on a downstream task. See the model hub to look for\nfine-tuned versions on a task that interests you.\nNote that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked)\nto make decisions, such as sequence classification, token classification or question answering. For tasks such as text\ngeneration you should look at model like GPT2.", "### How to use\nHere is how to use this model to get the features of a given text in PyTorch:", "### Limitations and bias\nEven if the training data used for this model could be characterized as fairly neutral, this model can have biased\npredictions. This bias will also affect all fine-tuned versions of this model. 
For an understanding of bias of this particular\ncheckpoint, please try out this checkpoint with the snippet present in the Limitation and bias section of the bert-base-uncased checkpoint.", "## Training data\nThe MultiBERTs models were pretrained on BookCorpus, a dataset consisting of 11,038\nunpublished books and English Wikipedia (excluding lists, tables and\nheaders).", "## Training procedure", "### Preprocessing\nThe texts are lowercased and tokenized using WordPiece and a vocabulary size of 30,000. The inputs of the model are\nthen of the form:\n\nWith probability 0.5, sentence A and sentence B correspond to two consecutive sentences in the original corpus and in\nthe other cases, it's another random sentence in the corpus. Note that what is considered a sentence here is a\nconsecutive span of text usually longer than a single sentence. The only constrain is that the result with the two\n\"sentences\" has a combined length of less than 512 tokens.\nThe details of the masking procedure for each sentence are the following:\n- 15% of the tokens are masked.\n- In 80% of the cases, the masked tokens are replaced by '[MASK]'.\n- In 10% of the cases, the masked tokens are replaced by a random token (different) from the one they replace.\n- In the 10% remaining cases, the masked tokens are left as is.", "### Pretraining\nThe full model was trained on 16 Cloud TPU v2 chips for two million steps with a batch size\nof 256. The sequence length was set to 512 throughout. The optimizer\nused is Adam with a learning rate of 1e-4, \\\\(\\beta_{1} = 0.9\\\\) and \\\\(\\beta_{2} = 0.999\\\\), a weight decay of 0.01,\nlearning rate warmup for 10,000 steps and linear decay of the learning rate after.", "### BibTeX entry and citation info\n\n<a href=\"URL\n\t<img width=\"300px\" src=\"URL\n</a>" ]
[ 71, 145, 335, 134, 25, 95, 48, 3, 222, 111, 34 ]
[ "passage: TAGS\n#transformers #pytorch #bert #pretraining #exbert #multiberts #multiberts-seed-4 #en #dataset-bookcorpus #dataset-wikipedia #arxiv-2106.16163 #license-apache-2.0 #endpoints_compatible #region-us \n# MultiBERTs Seed 4 Checkpoint 160k (uncased)\nSeed 4 intermediate checkpoint 160k MultiBERTs (pretrained BERT) model on English language using a masked language modeling (MLM) objective. It was introduced in\nthis paper and first released in\nthis repository. This is an intermediate checkpoint.\nThe final checkpoint can be found at multiberts-seed-4. This model is uncased: it does not make a difference\nbetween english and English.\n\nDisclaimer: The team releasing MultiBERTs did not write a model card for this model so this model card has been written by gchhablani.", "passage: ## Model description\nMultiBERTs models are transformers model pretrained on a large corpus of English data in a self-supervised fashion. This means it\nwas pretrained on the raw texts only, with no humans labelling them in any way (which is why it can use lots of\npublicly available data) with an automatic process to generate inputs and labels from those texts. More precisely, it\nwas pretrained with two objectives:\n- Masked language modeling (MLM): taking a sentence, the model randomly masks 15% of the words in the input then run\n the entire masked sentence through the model and has to predict the masked words. This is different from traditional\n recurrent neural networks (RNNs) that usually see the words one after the other, or from autoregressive models like\n GPT which internally mask the future tokens. It allows the model to learn a bidirectional representation of the\n sentence.\n- Next sentence prediction (NSP): the models concatenates two masked sentences as inputs during pretraining. Sometimes\n they correspond to sentences that were next to each other in the original text, sometimes not. The model then has to\n predict if the two sentences were following each other or not.\nThis way, the model learns an inner representation of the English language that can then be used to extract features\nuseful for downstream tasks: if you have a dataset of labeled sentences for instance, you can train a standard\nclassifier using the features produced by the MultiBERTs model as inputs.## Intended uses & limitations\nYou can use the raw model for either masked language modeling or next sentence prediction, but it's mostly intended to\nbe fine-tuned on a downstream task. See the model hub to look for\nfine-tuned versions on a task that interests you.\nNote that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked)\nto make decisions, such as sequence classification, token classification or question answering. For tasks such as text\ngeneration you should look at model like GPT2.### How to use\nHere is how to use this model to get the features of a given text in PyTorch:### Limitations and bias\nEven if the training data used for this model could be characterized as fairly neutral, this model can have biased\npredictions. This bias will also affect all fine-tuned versions of this model. 
For an understanding of bias of this particular\ncheckpoint, please try out this checkpoint with the snippet present in the Limitation and bias section of the bert-base-uncased checkpoint.## Training data\nThe MultiBERTs models were pretrained on BookCorpus, a dataset consisting of 11,038\nunpublished books and English Wikipedia (excluding lists, tables and\nheaders).## Training procedure" ]
[ -0.08651437610387802, -0.0013168023433536291, -0.0021946595516055822, 0.07101782411336899, 0.0886918306350708, 0.003345029428601265, 0.11498373746871948, 0.04871316999197006, -0.030768685042858124, 0.023907924070954323, 0.09227819740772247, 0.028286654502153397, 0.042659930884838104, 0.06557503342628479, 0.09623660892248154, -0.25707879662513733, 0.04684547707438469, -0.06286723166704178, 0.05384295433759689, 0.0742563009262085, 0.09864477068185806, -0.07290719449520111, 0.06420339643955231, 0.035992804914712906, -0.08733080327510834, -0.013601237908005714, -0.01721365936100483, -0.03336073458194733, 0.10064385831356049, 0.07093720883131027, 0.06268948316574097, 0.00220351479947567, 0.05792009085416794, -0.085172638297081, 0.01594117283821106, 0.04449072852730751, -0.0020550210028886795, 0.025065725669264793, -0.005504356697201729, 0.01730477251112461, 0.11019660532474518, 0.03658130392432213, 0.074895478785038, 0.034245096147060394, -0.09497302770614624, -0.12028125673532486, -0.07981926202774048, 0.09725260734558105, 0.04990136995911598, 0.04142716899514198, -0.006376471370458603, 0.07024497538805008, -0.029751921072602272, 0.07517313212156296, 0.10401014983654022, -0.260551393032074, -0.009416397660970688, 0.06617169082164764, 0.045374710112810135, 0.04462319612503052, 0.011947927996516228, 0.02752789855003357, 0.007348105311393738, 0.044881753623485565, 0.02841798961162567, -0.02406160533428192, 0.11816516518592834, -0.04333022981882095, -0.14970222115516663, -0.041396040469408035, 0.12328404933214188, -0.00861348956823349, -0.12516501545906067, -0.09797564148902893, -0.029958758503198624, 0.11812940239906311, -0.0031075142323970795, -0.01743372157216072, -0.0030062617734074593, 0.010805463418364525, 0.026021385565400124, -0.09171368181705475, -0.08660190552473068, -0.026751425117254257, -0.036757323890924454, 0.12185945361852646, 0.04758874326944351, 0.05281562730669975, -0.03413010016083717, 0.08537527918815613, -0.11809518933296204, -0.04025217518210411, -0.04975276440382004, -0.08235494792461395, -0.017244994640350342, 0.011205951683223248, -0.026958780363202095, -0.08089044690132141, -0.05789939686655998, 0.11389516294002533, 0.029656998813152313, 0.03049623593688011, -0.004345704801380634, 0.03967893123626709, 0.07128386199474335, 0.09373387694358826, -0.03861021250486374, 0.04942430555820465, 0.03529069945216179, -0.023812398314476013, 0.05585267394781113, -0.0509403720498085, -0.10168658196926117, 0.0767088234424591, -0.003017653711140156, 0.03922083601355553, 0.024052627384662628, 0.03565026819705963, -0.010566841810941696, -0.07048571109771729, 0.16911722719669342, -0.07934582978487015, -0.008658737875521183, -0.01640472374856472, 0.011746054515242577, 0.0460110604763031, 0.03333169221878052, -0.009476622566580772, -0.048605915158987045, -0.00860354583710432, -0.055541619658470154, -0.027952995151281357, -0.056992121040821075, -0.12157401442527771, -0.0017610788345336914, -0.04046051204204559, -0.03170468285679817, -0.14135782420635223, -0.21578329801559448, -0.019192447885870934, 0.06199399754405022, -0.0030145575292408466, -0.00912695936858654, 0.024425674229860306, 0.014385910704731941, -0.021081317216157913, 0.01070854626595974, -0.0409388542175293, -0.0012439200654625893, -0.007658899761736393, -0.02717922069132328, 0.05849447101354599, -0.039432212710380554, 0.02325446344912052, -0.06809956580400467, 0.021177569404244423, -0.20881816744804382, 0.08810415118932724, -0.03402075543999672, 0.0028375759720802307, -0.036388467997312546, -0.04524713754653931, 
0.012685529887676239, 0.04837244004011154, -0.009802942164242268, 0.11697319149971008, -0.13681365549564362, -0.05168299376964569, 0.18222931027412415, -0.159469336271286, -0.00008184462785720825, 0.10218863189220428, -0.04856905713677406, 0.05390318110585213, 0.13464878499507904, 0.09820521622896194, 0.08585887402296066, -0.07335812598466873, 0.010873517952859402, 0.061614371836185455, -0.06718793511390686, 0.05561980977654457, 0.08826472610235214, -0.022742975503206253, -0.138569176197052, 0.031686168164014816, -0.07171711325645447, -0.009053833782672882, -0.025801364332437515, -0.01974490098655224, 0.0066977012902498245, -0.037990741431713104, 0.02996024861931801, 0.0072571891359984875, 0.017862271517515182, -0.042532067745923996, -0.08345384150743484, 0.028271906077861786, 0.07410648465156555, -0.07203534245491028, 0.04095315560698509, -0.07170048356056213, 0.06262806057929993, -0.07625141739845276, -0.0035759233869612217, -0.16664166748523712, -0.024408776313066483, 0.04449935257434845, -0.041244566440582275, 0.048878543078899384, 0.09693139046430588, 0.00422696303576231, 0.12448616325855255, -0.04035424441099167, 0.0029231682419776917, -0.004857499152421951, -0.011077417060732841, -0.04998824745416641, -0.1241697371006012, -0.08206775784492493, -0.06861047446727753, 0.09781867265701294, -0.0778483897447586, 0.028822265565395355, -0.06638922542333603, -0.021448707208037376, -0.008714770898222923, -0.05696263536810875, -0.004541914910078049, 0.010925688780844212, -0.02760109305381775, -0.04662976413965225, 0.048600196838378906, 0.05007907748222351, -0.05620158463716507, 0.07703961431980133, -0.10417740046977997, -0.05767334625124931, 0.05701224505901337, 0.013271177187561989, -0.07973617315292358, 0.09463793784379959, -0.019558727741241455, -0.013407344929873943, -0.0556698814034462, -0.043906524777412415, 0.19324776530265808, -0.027279343456029892, 0.1022111177444458, -0.09069620072841644, 0.002041511470451951, 0.028267187997698784, -0.044825222343206406, -0.015611032024025917, 0.06144557520747185, 0.04416559636592865, -0.18214233219623566, 0.015537332743406296, 0.050197601318359375, 0.07735351473093033, 0.11494185775518417, 0.026729896664619446, -0.023703444749116898, -0.04574720561504364, -0.011151102371513844, 0.006883583497256041, 0.055858857929706573, -0.028458990156650543, -0.010238682851195335, 0.03201211988925934, 0.05563374608755112, 0.018672965466976166, -0.08065526187419891, 0.0371260941028595, 0.0662783682346344, -0.017113249748945236, -0.039448559284210205, -0.02442394569516182, -0.06008261814713478, 0.06239299848675728, 0.05064106732606888, 0.036662641912698746, 0.025736693292856216, -0.014353644102811813, -0.13502588868141174, 0.18829062581062317, -0.11667976528406143, -0.2585587501525879, -0.10792899131774902, -0.05889102816581726, -0.02378891035914421, 0.04182121530175209, 0.057984329760074615, -0.03359004855155945, -0.0440421998500824, -0.11513456702232361, 0.06190069392323494, -0.06635058671236038, -0.02889086864888668, -0.011939877644181252, -0.05117756128311157, -0.01796230860054493, -0.126570463180542, -0.011222932487726212, -0.028923386707901955, -0.07764260470867157, 0.007390139624476433, -0.03338728845119476, 0.029060252010822296, 0.13229672610759735, 0.03425220400094986, -0.01960695907473564, -0.017490852624177933, 0.19166609644889832, 0.011754170060157776, 0.060690294951200485, 0.11666487902402878, -0.027103537693619728, 0.0525643453001976, 0.044788897037506104, 0.024523787200450897, -0.04844155162572861, 0.011179006658494473, -0.016365040093660355, 
-0.12116637080907822, -0.17662540078163147, -0.06901352107524872, -0.005121815018355846, 0.008850665763020515, 0.01874612830579281, 0.03539862856268883, 0.016816861927509308, 0.040125731378793716, -0.029815668240189552, 0.03422572463750839, -0.014627136290073395, 0.07871939241886139, 0.03238982334733009, -0.07606440782546997, 0.0938488245010376, -0.06289483606815338, 0.016454452648758888, 0.11049578338861465, -0.0610838457942009, 0.18906739354133606, 0.023338040336966515, 0.059748075902462006, 0.10135847330093384, 0.023020878434181213, 0.05550875514745712, 0.08890560269355774, -0.04735654219985008, 0.0057408157736063, -0.06031292304396629, -0.0519653782248497, -0.03570753335952759, 0.04860328882932663, 0.02866898477077484, 0.017514195293188095, -0.12001818418502808, 0.0207703597843647, -0.0020460986997932196, 0.1306864321231842, 0.047713346779346466, -0.12316592037677765, -0.12443074584007263, 0.034394145011901855, -0.04320118576288223, -0.06338301301002502, 0.02939026802778244, 0.06383926421403885, -0.15179601311683655, 0.045845162123441696, -0.006151698529720306, 0.06514135003089905, -0.0934792309999466, 0.017402388155460358, -0.043087396770715714, 0.0030063530430197716, 0.00514926016330719, 0.07056380063295364, -0.13738292455673218, 0.10211633890867233, 0.021995525807142258, 0.04988750070333481, -0.08094493299722672, 0.017999764531850815, -0.01036261860281229, 0.10992495715618134, 0.11468026041984558, 0.04274516552686691, -0.05894257128238678, -0.017109036445617676, -0.04383967071771622, 0.022401487454771996, 0.05814962089061737, -0.07580814510583878, 0.06142909824848175, 0.009659691713750362, 0.006902484223246574, -0.02404838241636753, 0.014690443873405457, -0.13486379384994507, -0.12463110685348511, 0.06080995500087738, -0.07710231095552444, -0.10310989618301392, -0.05631648749113083, -0.0636470690369606, -0.05110549181699753, 0.2158585786819458, -0.11639796197414398, -0.09221436083316803, -0.09912002831697464, -0.011160172522068024, 0.04541023448109627, -0.06480688601732254, 0.046055763959884644, -0.03700965642929077, 0.09253212064504623, -0.04703734070062637, -0.11106805503368378, 0.03479228541254997, -0.1147233098745346, -0.1150444746017456, -0.04381413757801056, 0.10622094571590424, 0.11412858217954636, 0.03741894289851189, 0.012126014567911625, 0.011455873027443886, -0.001963043585419655, -0.116681769490242, 0.01758137159049511, 0.12880262732505798, 0.0004786849021911621, 0.0708901435136795, -0.0601515993475914, 0.02401897683739662, -0.01270265318453312, 0.00008001923561096191, 0.12769117951393127, 0.18335995078086853, -0.06322750449180603, 0.17521889507770538, 0.19873341917991638, -0.10614021867513657, -0.19081497192382812, -0.05343058332800865, -0.00046591460704803467, 0.04709937050938606, 0.04990336671471596, -0.1873219907283783, 0.09123022854328156, 0.030307726934552193, -0.03050638921558857, 0.024787113070487976, -0.22926731407642365, -0.11052519828081131, 0.08804281055927277, 0.05661148577928543, 0.1897178590297699, -0.08223408460617065, -0.04022494703531265, -0.017724452540278435, -0.03962624818086624, 0.042858973145484924, -0.042080357670784, 0.08929131925106049, 0.005617808550596237, -0.027269307523965836, 0.0016666296869516373, -0.030149254947900772, 0.09632641077041626, 0.040267352014780045, 0.023376625031232834, -0.07105208188295364, -0.003524716943502426, 0.1120600700378418, -0.037188149988651276, 0.09695454686880112, 0.03902461379766464, 0.0743291899561882, -0.09992130845785141, -0.059467487037181854, -0.07525160908699036, 0.04527270793914795, -0.04200249910354614, 
-0.05347485467791557, -0.06298989057540894, 0.056766197085380554, 0.03483069688081741, 0.01104926411062479, -0.004832029342651367, -0.037830058485269547, 0.04554229974746704, 0.0930262878537178, 0.08237379044294357, -0.03125900775194168, -0.06993129849433899, -0.04953357204794884, -0.047822970896959305, 0.06764159351587296, -0.09491842985153198, 0.017590390518307686, 0.027740444988012314, 0.008886652998626232, 0.08762980252504349, 0.033625662326812744, -0.13883166015148163, 0.01146508939564228, 0.03514751419425011, -0.12727858126163483, -0.10585515201091766, -0.019329741597175598, 0.024290550500154495, -0.035968340933322906, 0.05486408248543739, 0.1484827697277069, -0.036049362272024155, -0.03164101392030716, -0.049137428402900696, 0.038157396018505096, -0.019805897027254105, 0.05271415412425995, 0.06470315158367157, 0.030378958210349083, -0.07431034743785858, 0.07308214902877808, 0.03903147578239441, -0.0373581126332283, 0.04224138706922531, 0.042105965316295624, -0.09588658064603806, -0.07949930429458618, -0.05860717594623566, 0.09339975565671921, -0.018862968310713768, -0.04973280429840088, -0.0023037921637296677, -0.08332698792219162, 0.06864988058805466, 0.06988106667995453, 0.04726266488432884, 0.03712404891848564, -0.0860494002699852, 0.01608232781291008, -0.05458841100335121, 0.036496054381132126, -0.030800173059105873, -0.004121644422411919, -0.05577019602060318, 0.06798055022954941, 0.06216774508357048, 0.09804035723209381, -0.03455951809883118, -0.07196347415447235, -0.08424142003059387, -0.013839480467140675, -0.06229333579540253, -0.03050241619348526, -0.07437390089035034, -0.0064059412106871605, 0.0012542558833956718, -0.0024997927248477936, 0.021794471889734268, 0.038289111107587814, -0.043405916541814804, -0.01818118244409561, -0.0342160128057003, 0.03943496197462082, -0.0626116693019867, 0.006073039025068283, 0.0150860995054245, -0.03740178048610687, 0.09270031005144119, 0.03633143752813339, -0.012037238106131554, 0.04817207530140877, -0.025232400745153427, 0.03409726545214653, -0.02023885026574135, 0.000562373548746109, -0.022373169660568237, -0.10756704211235046, -0.0032183704897761345, 0.005457624793052673, -0.02592817321419716, 0.010102886706590652, 0.05991566553711891, -0.07212140411138535, 0.0880618691444397, 0.04690470173954964, -0.027592509984970093, -0.07095848768949509, 0.04277421906590462, -0.014664364978671074, 0.028535958379507065, 0.06801846623420715, -0.035885922610759735, 0.05136411264538765, -0.10035040229558945, -0.030010968446731567, 0.0029668041970580816, -0.007758136838674545, -0.009496651589870453, -0.053059171885252, -0.0008571678772568703, 0.005589993670582771, 0.17647065222263336, -0.01991339400410652, 0.039908409118652344, 0.012067110277712345, 0.0069014644250273705, 0.04878302291035652, -0.012573760002851486, 0.06953071057796478, -0.009379040449857712, -0.024812817573547363, -0.013790154829621315, 0.03756359964609146, 0.004220195114612579, 0.006105892360210419, 0.13863107562065125, 0.04272882640361786, 0.08860275149345398, 0.07554429024457932, 0.01686132699251175, 0.01640368439257145, -0.13182227313518524, -0.08804241567850113, 0.005698088556528091, 0.06100412830710411, -0.01893630065023899, 0.010591268539428711, 0.09214341640472412, -0.0871865451335907, 0.07103502750396729, 0.04672864079475403, -0.04902512580156326, -0.12534451484680176, -0.1919160634279251, -0.024836812168359756, -0.030919281765818596, -0.010588730685412884, -0.09169892966747284, 0.016652826219797134, 0.09326072782278061, 0.024461396038532257, -0.01156676933169365, 
0.09445598721504211, -0.10053454339504242, -0.02955867350101471, 0.044668518006801605, -0.028070129454135895, 0.013541141524910927, 0.05168094485998154, 0.02400132268667221, -0.004732389003038406, 0.04335836321115494, 0.03910313919186592, 0.042639147490262985, 0.026461705565452576, 0.05262730270624161, -0.022709015756845474, -0.07272560894489288, -0.03300410509109497, -0.0026961402036249638, 0.05395015701651573, 0.13630425930023193, 0.023225918412208557, -0.07089807093143463, 0.006902310065925121, 0.10647715628147125, -0.030835818499326706, -0.04665761813521385, -0.10784211754798889, 0.23568500578403473, 0.02186743915081024, 0.0020568082109093666, -0.004468734376132488, -0.045272551476955414, 0.006406169384717941, 0.21140697598457336, 0.22580859065055847, 0.00473030237480998, -0.009912845678627491, 0.008986660279333591, -0.011685332283377647, 0.03803219273686409, 0.14678947627544403, 0.005371913313865662, 0.24999424815177917, -0.04740939289331436, 0.03544493392109871, -0.04244364798069, -0.03894621506333351, -0.10202951729297638, 0.07030133157968521, -0.009364401921629906, 0.007561234757304192, -0.0288083478808403, 0.07017713785171509, -0.037211738526821136, -0.18021467328071594, 0.006217350251972675, -0.000004386762157082558, -0.06157704070210457, 0.009593609720468521, 0.002317889593541622, 0.021311769261956215, 0.08257164061069489, -0.019898254424333572, -0.006601531058549881, 0.1330219805240631, 0.01775849051773548, -0.0979849323630333, -0.05726831406354904, 0.11315393447875977, 0.013105412945151329, 0.1380055695772171, 0.010258843190968037, 0.08055436611175537, 0.08700055629014969, 0.021682675927877426, -0.09541970491409302, 0.04168401286005974, -0.019961420446634293, -0.031282536685466766, 0.008030340075492859, 0.10967372357845306, -0.008396560326218605, 0.057540860027074814, 0.02625439316034317, -0.0853181704878807, 0.061823148280382156, 0.012111179530620575, -0.03443768620491028, -0.08028442412614822, 0.08464814722537994, -0.09268665313720703, 0.1548445075750351, 0.12454448640346527, -0.013842699117958546, -0.04575517028570175, -0.029882442206144333, 0.02020675130188465, -0.0003519151359796524, 0.0574553906917572, -0.02572912536561489, -0.1326185166835785, 0.018915293738245964, -0.07883632183074951, 0.028037743642926216, -0.2486521601676941, -0.08732633292675018, 0.027938056737184525, -0.018064269796013832, -0.02055685967206955, 0.052114371210336685, 0.04633092135190964, 0.02737848274409771, -0.036245934665203094, 0.020660892128944397, -0.038182079792022705, 0.059258706867694855, -0.110618457198143, -0.09415209293365479 ]
null
null
transformers
# MultiBERTs Seed 4 Checkpoint 1700k (uncased) Seed 4 intermediate checkpoint 1700k MultiBERTs (pretrained BERT) model on the English language using a masked language modeling (MLM) objective. It was introduced in [this paper](https://arxiv.org/pdf/2106.16163.pdf) and first released in [this repository](https://github.com/google-research/language/tree/master/language/multiberts). This is an intermediate checkpoint. The final checkpoint can be found at [multiberts-seed-4](https://hf.co/multiberts-seed-4). This model is uncased: it does not make a difference between english and English. Disclaimer: The team releasing MultiBERTs did not write a model card for this model so this model card has been written by [gchhablani](https://hf.co/gchhablani). ## Model description MultiBERTs models are transformer models pretrained on a large corpus of English data in a self-supervised fashion. This means it was pretrained on the raw texts only, with no humans labelling them in any way (which is why it can use lots of publicly available data) with an automatic process to generate inputs and labels from those texts. More precisely, it was pretrained with two objectives: - Masked language modeling (MLM): taking a sentence, the model randomly masks 15% of the words in the input, then runs the entire masked sentence through the model and has to predict the masked words. This is different from traditional recurrent neural networks (RNNs) that usually see the words one after the other, or from autoregressive models like GPT which internally mask the future tokens. It allows the model to learn a bidirectional representation of the sentence. - Next sentence prediction (NSP): the model concatenates two masked sentences as inputs during pretraining. Sometimes they correspond to sentences that were next to each other in the original text, sometimes not. The model then has to predict if the two sentences were following each other or not. This way, the model learns an inner representation of the English language that can then be used to extract features useful for downstream tasks: if you have a dataset of labeled sentences for instance, you can train a standard classifier using the features produced by the MultiBERTs model as inputs. ## Intended uses & limitations You can use the raw model for either masked language modeling or next sentence prediction, but it's mostly intended to be fine-tuned on a downstream task. See the [model hub](https://huggingface.co/models?filter=multiberts) to look for fine-tuned versions on a task that interests you. Note that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked) to make decisions, such as sequence classification, token classification or question answering. For tasks such as text generation you should look at models like GPT2. ### How to use Here is how to use this model to get the features of a given text in PyTorch: ```python from transformers import BertTokenizer, BertModel tokenizer = BertTokenizer.from_pretrained('multiberts-seed-4-1700k') model = BertModel.from_pretrained("multiberts-seed-4-1700k") text = "Replace me by any text you'd like." encoded_input = tokenizer(text, return_tensors='pt') output = model(**encoded_input) ``` ### Limitations and bias Even if the training data used for this model could be characterized as fairly neutral, this model can have biased predictions. This bias will also affect all fine-tuned versions of this model. 
For an understanding of bias of this particular checkpoint, please try out this checkpoint with the snippet present in the [Limitations and bias section](https://huggingface.co/bert-base-uncased#limitations-and-bias) of the [bert-base-uncased](https://huggingface.co/bert-base-uncased) checkpoint. ## Training data The MultiBERTs models were pretrained on [BookCorpus](https://yknzhu.wixsite.com/mbweb), a dataset consisting of 11,038 unpublished books and [English Wikipedia](https://en.wikipedia.org/wiki/English_Wikipedia) (excluding lists, tables and headers). ## Training procedure ### Preprocessing The texts are lowercased and tokenized using WordPiece and a vocabulary size of 30,000. The inputs of the model are then of the form: ``` [CLS] Sentence A [SEP] Sentence B [SEP] ``` With probability 0.5, sentence A and sentence B correspond to two consecutive sentences in the original corpus and in the other cases, it's another random sentence in the corpus. Note that what is considered a sentence here is a consecutive span of text usually longer than a single sentence. The only constraint is that the result with the two "sentences" has a combined length of less than 512 tokens. The details of the masking procedure for each sentence are the following: - 15% of the tokens are masked. - In 80% of the cases, the masked tokens are replaced by `[MASK]`. - In 10% of the cases, the masked tokens are replaced by a random token (different from the one they replace). - In the 10% remaining cases, the masked tokens are left as is. ### Pretraining The full model was trained on 16 Cloud TPU v2 chips for two million steps with a batch size of 256. The sequence length was set to 512 throughout. The optimizer used is Adam with a learning rate of 1e-4, \\(\beta_{1} = 0.9\\) and \\(\beta_{2} = 0.999\\), a weight decay of 0.01, learning rate warmup for 10,000 steps and linear decay of the learning rate after. ### BibTeX entry and citation info ```bibtex @article{DBLP:journals/corr/abs-2106-16163, author = {Thibault Sellam and Steve Yadlowsky and Jason Wei and Naomi Saphra and Alexander D'Amour and Tal Linzen and Jasmijn Bastings and Iulia Turc and Jacob Eisenstein and Dipanjan Das and Ian Tenney and Ellie Pavlick}, title = {The MultiBERTs: {BERT} Reproductions for Robustness Analysis}, journal = {CoRR}, volume = {abs/2106.16163}, year = {2021}, url = {https://arxiv.org/abs/2106.16163}, eprinttype = {arXiv}, eprint = {2106.16163}, timestamp = {Mon, 05 Jul 2021 15:15:50 +0200}, biburl = {https://dblp.org/rec/journals/corr/abs-2106-16163.bib}, bibsource = {dblp computer science bibliography, https://dblp.org} } ``` <a href="https://huggingface.co/exbert/?model=multiberts"> <img width="300px" src="https://cdn-media.huggingface.co/exbert/button.png"> </a>
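The Preprocessing section of the card above specifies the masking rule in prose (15% of tokens selected; of those, 80% replaced by `[MASK]`, 10% by a random token, 10% left unchanged). As a hedged illustration only, and not the actual MultiBERTs or BERT data pipeline, the sketch below applies that rule to a list of token ids; special-token handling and whole-word details are deliberately omitted, and the function name and example ids are made up.

```python
# Illustrative sketch of the masking rule described above: mask 15% of positions;
# of those, 80% -> [MASK], 10% -> random token, 10% -> left unchanged.
# NOT the original MultiBERTs pipeline; CLS/SEP handling is omitted for brevity.
import random

def mask_tokens(token_ids, mask_id, vocab_size, mask_prob=0.15):
    inputs = list(token_ids)
    labels = [-100] * len(inputs)          # -100 = position ignored by the MLM loss (Hugging Face convention)
    for i, tok in enumerate(token_ids):
        if random.random() < mask_prob:    # select ~15% of positions
            labels[i] = tok                # the model must predict the original token here
            r = random.random()
            if r < 0.8:                    # 80% of selected positions: replace with [MASK]
                inputs[i] = mask_id
            elif r < 0.9:                  # 10%: replace with a random vocabulary token
                inputs[i] = random.randrange(vocab_size)
            # remaining 10%: leave the token unchanged
    return inputs, labels

# Example with made-up ids (vocabulary size 30,000, as stated in the card):
masked, labels = mask_tokens([101, 2023, 2003, 1037, 7099, 102], mask_id=103, vocab_size=30000)
```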
{"language": "en", "license": "apache-2.0", "tags": ["exbert", "multiberts", "multiberts-seed-4"], "datasets": ["bookcorpus", "wikipedia"]}
null
MultiBertGunjanPatrick/multiberts-seed-4-1700k
[ "transformers", "pytorch", "bert", "pretraining", "exbert", "multiberts", "multiberts-seed-4", "en", "dataset:bookcorpus", "dataset:wikipedia", "arxiv:2106.16163", "license:apache-2.0", "endpoints_compatible", "region:us" ]
2022-03-02T23:29:04+00:00
[ "2106.16163" ]
[ "en" ]
TAGS #transformers #pytorch #bert #pretraining #exbert #multiberts #multiberts-seed-4 #en #dataset-bookcorpus #dataset-wikipedia #arxiv-2106.16163 #license-apache-2.0 #endpoints_compatible #region-us
# MultiBERTs Seed 4 Checkpoint 1700k (uncased) Seed 4 intermediate checkpoint 1700k MultiBERTs (pretrained BERT) model on English language using a masked language modeling (MLM) objective. It was introduced in this paper and first released in this repository. This is an intermediate checkpoint. The final checkpoint can be found at multiberts-seed-4. This model is uncased: it does not make a difference between english and English. Disclaimer: The team releasing MultiBERTs did not write a model card for this model so this model card has been written by gchhablani. ## Model description MultiBERTs models are transformers model pretrained on a large corpus of English data in a self-supervised fashion. This means it was pretrained on the raw texts only, with no humans labelling them in any way (which is why it can use lots of publicly available data) with an automatic process to generate inputs and labels from those texts. More precisely, it was pretrained with two objectives: - Masked language modeling (MLM): taking a sentence, the model randomly masks 15% of the words in the input then run the entire masked sentence through the model and has to predict the masked words. This is different from traditional recurrent neural networks (RNNs) that usually see the words one after the other, or from autoregressive models like GPT which internally mask the future tokens. It allows the model to learn a bidirectional representation of the sentence. - Next sentence prediction (NSP): the models concatenates two masked sentences as inputs during pretraining. Sometimes they correspond to sentences that were next to each other in the original text, sometimes not. The model then has to predict if the two sentences were following each other or not. This way, the model learns an inner representation of the English language that can then be used to extract features useful for downstream tasks: if you have a dataset of labeled sentences for instance, you can train a standard classifier using the features produced by the MultiBERTs model as inputs. ## Intended uses & limitations You can use the raw model for either masked language modeling or next sentence prediction, but it's mostly intended to be fine-tuned on a downstream task. See the model hub to look for fine-tuned versions on a task that interests you. Note that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked) to make decisions, such as sequence classification, token classification or question answering. For tasks such as text generation you should look at model like GPT2. ### How to use Here is how to use this model to get the features of a given text in PyTorch: ### Limitations and bias Even if the training data used for this model could be characterized as fairly neutral, this model can have biased predictions. This bias will also affect all fine-tuned versions of this model. For an understanding of bias of this particular checkpoint, please try out this checkpoint with the snippet present in the Limitation and bias section of the bert-base-uncased checkpoint. ## Training data The MultiBERTs models were pretrained on BookCorpus, a dataset consisting of 11,038 unpublished books and English Wikipedia (excluding lists, tables and headers). ## Training procedure ### Preprocessing The texts are lowercased and tokenized using WordPiece and a vocabulary size of 30,000. 
The inputs of the model are then of the form: With probability 0.5, sentence A and sentence B correspond to two consecutive sentences in the original corpus and in the other cases, it's another random sentence in the corpus. Note that what is considered a sentence here is a consecutive span of text usually longer than a single sentence. The only constrain is that the result with the two "sentences" has a combined length of less than 512 tokens. The details of the masking procedure for each sentence are the following: - 15% of the tokens are masked. - In 80% of the cases, the masked tokens are replaced by '[MASK]'. - In 10% of the cases, the masked tokens are replaced by a random token (different) from the one they replace. - In the 10% remaining cases, the masked tokens are left as is. ### Pretraining The full model was trained on 16 Cloud TPU v2 chips for two million steps with a batch size of 256. The sequence length was set to 512 throughout. The optimizer used is Adam with a learning rate of 1e-4, \\(\beta_{1} = 0.9\\) and \\(\beta_{2} = 0.999\\), a weight decay of 0.01, learning rate warmup for 10,000 steps and linear decay of the learning rate after. ### BibTeX entry and citation info <a href="URL <img width="300px" src="URL </a>
[ "# MultiBERTs Seed 4 Checkpoint 1700k (uncased)\nSeed 4 intermediate checkpoint 1700k MultiBERTs (pretrained BERT) model on English language using a masked language modeling (MLM) objective. It was introduced in\nthis paper and first released in\nthis repository. This is an intermediate checkpoint.\nThe final checkpoint can be found at multiberts-seed-4. This model is uncased: it does not make a difference\nbetween english and English.\n\nDisclaimer: The team releasing MultiBERTs did not write a model card for this model so this model card has been written by gchhablani.", "## Model description\nMultiBERTs models are transformers model pretrained on a large corpus of English data in a self-supervised fashion. This means it\nwas pretrained on the raw texts only, with no humans labelling them in any way (which is why it can use lots of\npublicly available data) with an automatic process to generate inputs and labels from those texts. More precisely, it\nwas pretrained with two objectives:\n- Masked language modeling (MLM): taking a sentence, the model randomly masks 15% of the words in the input then run\n the entire masked sentence through the model and has to predict the masked words. This is different from traditional\n recurrent neural networks (RNNs) that usually see the words one after the other, or from autoregressive models like\n GPT which internally mask the future tokens. It allows the model to learn a bidirectional representation of the\n sentence.\n- Next sentence prediction (NSP): the models concatenates two masked sentences as inputs during pretraining. Sometimes\n they correspond to sentences that were next to each other in the original text, sometimes not. The model then has to\n predict if the two sentences were following each other or not.\nThis way, the model learns an inner representation of the English language that can then be used to extract features\nuseful for downstream tasks: if you have a dataset of labeled sentences for instance, you can train a standard\nclassifier using the features produced by the MultiBERTs model as inputs.", "## Intended uses & limitations\nYou can use the raw model for either masked language modeling or next sentence prediction, but it's mostly intended to\nbe fine-tuned on a downstream task. See the model hub to look for\nfine-tuned versions on a task that interests you.\nNote that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked)\nto make decisions, such as sequence classification, token classification or question answering. For tasks such as text\ngeneration you should look at model like GPT2.", "### How to use\nHere is how to use this model to get the features of a given text in PyTorch:", "### Limitations and bias\nEven if the training data used for this model could be characterized as fairly neutral, this model can have biased\npredictions. This bias will also affect all fine-tuned versions of this model. For an understanding of bias of this particular\ncheckpoint, please try out this checkpoint with the snippet present in the Limitation and bias section of the bert-base-uncased checkpoint.", "## Training data\nThe MultiBERTs models were pretrained on BookCorpus, a dataset consisting of 11,038\nunpublished books and English Wikipedia (excluding lists, tables and\nheaders).", "## Training procedure", "### Preprocessing\nThe texts are lowercased and tokenized using WordPiece and a vocabulary size of 30,000. 
The inputs of the model are\nthen of the form:\n\nWith probability 0.5, sentence A and sentence B correspond to two consecutive sentences in the original corpus and in\nthe other cases, it's another random sentence in the corpus. Note that what is considered a sentence here is a\nconsecutive span of text usually longer than a single sentence. The only constrain is that the result with the two\n\"sentences\" has a combined length of less than 512 tokens.\nThe details of the masking procedure for each sentence are the following:\n- 15% of the tokens are masked.\n- In 80% of the cases, the masked tokens are replaced by '[MASK]'.\n- In 10% of the cases, the masked tokens are replaced by a random token (different) from the one they replace.\n- In the 10% remaining cases, the masked tokens are left as is.", "### Pretraining\nThe full model was trained on 16 Cloud TPU v2 chips for two million steps with a batch size\nof 256. The sequence length was set to 512 throughout. The optimizer\nused is Adam with a learning rate of 1e-4, \\\\(\\beta_{1} = 0.9\\\\) and \\\\(\\beta_{2} = 0.999\\\\), a weight decay of 0.01,\nlearning rate warmup for 10,000 steps and linear decay of the learning rate after.", "### BibTeX entry and citation info\n\n<a href=\"URL\n\t<img width=\"300px\" src=\"URL\n</a>" ]
[ "TAGS\n#transformers #pytorch #bert #pretraining #exbert #multiberts #multiberts-seed-4 #en #dataset-bookcorpus #dataset-wikipedia #arxiv-2106.16163 #license-apache-2.0 #endpoints_compatible #region-us \n", "# MultiBERTs Seed 4 Checkpoint 1700k (uncased)\nSeed 4 intermediate checkpoint 1700k MultiBERTs (pretrained BERT) model on English language using a masked language modeling (MLM) objective. It was introduced in\nthis paper and first released in\nthis repository. This is an intermediate checkpoint.\nThe final checkpoint can be found at multiberts-seed-4. This model is uncased: it does not make a difference\nbetween english and English.\n\nDisclaimer: The team releasing MultiBERTs did not write a model card for this model so this model card has been written by gchhablani.", "## Model description\nMultiBERTs models are transformers model pretrained on a large corpus of English data in a self-supervised fashion. This means it\nwas pretrained on the raw texts only, with no humans labelling them in any way (which is why it can use lots of\npublicly available data) with an automatic process to generate inputs and labels from those texts. More precisely, it\nwas pretrained with two objectives:\n- Masked language modeling (MLM): taking a sentence, the model randomly masks 15% of the words in the input then run\n the entire masked sentence through the model and has to predict the masked words. This is different from traditional\n recurrent neural networks (RNNs) that usually see the words one after the other, or from autoregressive models like\n GPT which internally mask the future tokens. It allows the model to learn a bidirectional representation of the\n sentence.\n- Next sentence prediction (NSP): the models concatenates two masked sentences as inputs during pretraining. Sometimes\n they correspond to sentences that were next to each other in the original text, sometimes not. The model then has to\n predict if the two sentences were following each other or not.\nThis way, the model learns an inner representation of the English language that can then be used to extract features\nuseful for downstream tasks: if you have a dataset of labeled sentences for instance, you can train a standard\nclassifier using the features produced by the MultiBERTs model as inputs.", "## Intended uses & limitations\nYou can use the raw model for either masked language modeling or next sentence prediction, but it's mostly intended to\nbe fine-tuned on a downstream task. See the model hub to look for\nfine-tuned versions on a task that interests you.\nNote that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked)\nto make decisions, such as sequence classification, token classification or question answering. For tasks such as text\ngeneration you should look at model like GPT2.", "### How to use\nHere is how to use this model to get the features of a given text in PyTorch:", "### Limitations and bias\nEven if the training data used for this model could be characterized as fairly neutral, this model can have biased\npredictions. This bias will also affect all fine-tuned versions of this model. 
For an understanding of bias of this particular\ncheckpoint, please try out this checkpoint with the snippet present in the Limitation and bias section of the bert-base-uncased checkpoint.", "## Training data\nThe MultiBERTs models were pretrained on BookCorpus, a dataset consisting of 11,038\nunpublished books and English Wikipedia (excluding lists, tables and\nheaders).", "## Training procedure", "### Preprocessing\nThe texts are lowercased and tokenized using WordPiece and a vocabulary size of 30,000. The inputs of the model are\nthen of the form:\n\nWith probability 0.5, sentence A and sentence B correspond to two consecutive sentences in the original corpus and in\nthe other cases, it's another random sentence in the corpus. Note that what is considered a sentence here is a\nconsecutive span of text usually longer than a single sentence. The only constrain is that the result with the two\n\"sentences\" has a combined length of less than 512 tokens.\nThe details of the masking procedure for each sentence are the following:\n- 15% of the tokens are masked.\n- In 80% of the cases, the masked tokens are replaced by '[MASK]'.\n- In 10% of the cases, the masked tokens are replaced by a random token (different) from the one they replace.\n- In the 10% remaining cases, the masked tokens are left as is.", "### Pretraining\nThe full model was trained on 16 Cloud TPU v2 chips for two million steps with a batch size\nof 256. The sequence length was set to 512 throughout. The optimizer\nused is Adam with a learning rate of 1e-4, \\\\(\\beta_{1} = 0.9\\\\) and \\\\(\\beta_{2} = 0.999\\\\), a weight decay of 0.01,\nlearning rate warmup for 10,000 steps and linear decay of the learning rate after.", "### BibTeX entry and citation info\n\n<a href=\"URL\n\t<img width=\"300px\" src=\"URL\n</a>" ]
[ 71, 145, 335, 134, 25, 95, 48, 3, 222, 111, 34 ]
[ "passage: TAGS\n#transformers #pytorch #bert #pretraining #exbert #multiberts #multiberts-seed-4 #en #dataset-bookcorpus #dataset-wikipedia #arxiv-2106.16163 #license-apache-2.0 #endpoints_compatible #region-us \n# MultiBERTs Seed 4 Checkpoint 1700k (uncased)\nSeed 4 intermediate checkpoint 1700k MultiBERTs (pretrained BERT) model on English language using a masked language modeling (MLM) objective. It was introduced in\nthis paper and first released in\nthis repository. This is an intermediate checkpoint.\nThe final checkpoint can be found at multiberts-seed-4. This model is uncased: it does not make a difference\nbetween english and English.\n\nDisclaimer: The team releasing MultiBERTs did not write a model card for this model so this model card has been written by gchhablani.", "passage: ## Model description\nMultiBERTs models are transformers model pretrained on a large corpus of English data in a self-supervised fashion. This means it\nwas pretrained on the raw texts only, with no humans labelling them in any way (which is why it can use lots of\npublicly available data) with an automatic process to generate inputs and labels from those texts. More precisely, it\nwas pretrained with two objectives:\n- Masked language modeling (MLM): taking a sentence, the model randomly masks 15% of the words in the input then run\n the entire masked sentence through the model and has to predict the masked words. This is different from traditional\n recurrent neural networks (RNNs) that usually see the words one after the other, or from autoregressive models like\n GPT which internally mask the future tokens. It allows the model to learn a bidirectional representation of the\n sentence.\n- Next sentence prediction (NSP): the models concatenates two masked sentences as inputs during pretraining. Sometimes\n they correspond to sentences that were next to each other in the original text, sometimes not. The model then has to\n predict if the two sentences were following each other or not.\nThis way, the model learns an inner representation of the English language that can then be used to extract features\nuseful for downstream tasks: if you have a dataset of labeled sentences for instance, you can train a standard\nclassifier using the features produced by the MultiBERTs model as inputs.## Intended uses & limitations\nYou can use the raw model for either masked language modeling or next sentence prediction, but it's mostly intended to\nbe fine-tuned on a downstream task. See the model hub to look for\nfine-tuned versions on a task that interests you.\nNote that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked)\nto make decisions, such as sequence classification, token classification or question answering. For tasks such as text\ngeneration you should look at model like GPT2.### How to use\nHere is how to use this model to get the features of a given text in PyTorch:### Limitations and bias\nEven if the training data used for this model could be characterized as fairly neutral, this model can have biased\npredictions. This bias will also affect all fine-tuned versions of this model. 
For an understanding of bias of this particular\ncheckpoint, please try out this checkpoint with the snippet present in the Limitation and bias section of the bert-base-uncased checkpoint.## Training data\nThe MultiBERTs models were pretrained on BookCorpus, a dataset consisting of 11,038\nunpublished books and English Wikipedia (excluding lists, tables and\nheaders).## Training procedure" ]
[ -0.08633259683847427, -0.00241061276756227, -0.002083731582388282, 0.07068265974521637, 0.08518430590629578, 0.003648973535746336, 0.11105518043041229, 0.048215799033641815, -0.02797805145382881, 0.02281944826245308, 0.09437261521816254, 0.027909263968467712, 0.04349576309323311, 0.06439106166362762, 0.09635657072067261, -0.25680550932884216, 0.04787800833582878, -0.0618727020919323, 0.05735975503921509, 0.0753442794084549, 0.09805256128311157, -0.07397457957267761, 0.06345444917678833, 0.03622780740261078, -0.08694900572299957, -0.014909304678440094, -0.016970913857221603, -0.03637218102812767, 0.1007922887802124, 0.07079876959323883, 0.061451904475688934, 0.0026765596121549606, 0.058122653514146805, -0.08422937989234924, 0.01605352945625782, 0.044101081788539886, -0.001842845231294632, 0.02454831451177597, -0.008200723677873611, 0.01775110512971878, 0.11241297423839569, 0.0395742803812027, 0.07686381042003632, 0.034733109176158905, -0.09512815624475479, -0.12382400780916214, -0.07987091690301895, 0.10481660068035126, 0.051754120737314224, 0.04006326571106911, -0.005944997072219849, 0.07092253863811493, -0.02965579181909561, 0.07529227435588837, 0.09821157157421112, -0.25945043563842773, -0.010334211401641369, 0.06828974187374115, 0.04622560366988182, 0.05005909502506256, 0.011342252604663372, 0.026938116177916527, 0.007376763969659805, 0.044097427278757095, 0.030911415815353394, -0.02337627485394478, 0.12017455697059631, -0.04271046817302704, -0.15000523626804352, -0.04273361712694168, 0.12072127312421799, -0.007486674934625626, -0.12364836037158966, -0.09916490316390991, -0.02964634820818901, 0.11311300098896027, -0.0034179072827100754, -0.019516726955771446, -0.004956544376909733, 0.01121837180107832, 0.025839626789093018, -0.09279259294271469, -0.08642326295375824, -0.027819542214274406, -0.03494388610124588, 0.12424080073833466, 0.04788073152303696, 0.05385083705186844, -0.035533688962459564, 0.08567403256893158, -0.11460521817207336, -0.038649849593639374, -0.050268933176994324, -0.08414085954427719, -0.01876690611243248, 0.01012613158673048, -0.024591706693172455, -0.07741080969572067, -0.06060493737459183, 0.1167864054441452, 0.03237595409154892, 0.03233501315116882, -0.004941537044942379, 0.04022449254989624, 0.07125301659107208, 0.0930500477552414, -0.039400115609169006, 0.047231242060661316, 0.03452757000923157, -0.02218548208475113, 0.055466629564762115, -0.05075836181640625, -0.09906888753175735, 0.0759659931063652, -0.0028049033135175705, 0.03993351012468338, 0.02545555680990219, 0.03451703116297722, -0.007519591599702835, -0.07119850814342499, 0.1727452278137207, -0.07662352919578552, -0.008947998285293579, -0.016917407512664795, 0.012193791568279266, 0.04431774467229843, 0.033458560705184937, -0.00902530737221241, -0.048163630068302155, -0.008096416480839252, -0.057196155190467834, -0.03017878532409668, -0.055710554122924805, -0.11941544711589813, -0.002611276227980852, -0.03807888180017471, -0.03261765092611313, -0.14090657234191895, -0.21767596900463104, -0.01910571940243244, 0.06218766048550606, -0.00313263526186347, -0.009984900243580341, 0.0260306503623724, 0.01625748537480831, -0.021522022783756256, 0.009957961738109589, -0.042027026414871216, -0.000522279180586338, -0.007230128161609173, -0.030886095017194748, 0.05941614508628845, -0.03752772510051727, 0.02245921827852726, -0.06808117032051086, 0.021091487258672714, -0.2117995172739029, 0.08688533306121826, -0.03551634028553963, 0.004586581140756607, -0.03682298585772514, -0.04559352993965149, 
0.011242911219596863, 0.047935232520103455, -0.011723579838871956, 0.11540161818265915, -0.13247919082641602, -0.05129477381706238, 0.18504513800144196, -0.1603982150554657, -0.0028657466173171997, 0.10334701836109161, -0.04769828915596008, 0.0529584139585495, 0.13475725054740906, 0.09673115611076355, 0.08751194179058075, -0.07106257230043411, 0.01206160057336092, 0.06018729507923126, -0.06977611035108566, 0.05138019472360611, 0.08683981001377106, -0.023188363760709763, -0.14453446865081787, 0.031022902578115463, -0.0715838372707367, -0.008393257856369019, -0.025435490533709526, -0.021433286368846893, 0.0071575455367565155, -0.03781412914395332, 0.02628672495484352, 0.006975941825658083, 0.01772189699113369, -0.04392009973526001, -0.08322200179100037, 0.02040400356054306, 0.07519401609897614, -0.0710410326719284, 0.04157758131623268, -0.07144998759031296, 0.06014474108815193, -0.07248920202255249, -0.0027805021964013577, -0.1643596589565277, -0.020741675049066544, 0.044809602200984955, -0.04427490755915642, 0.0481155663728714, 0.09046892076730728, 0.0042300596833229065, 0.12448979914188385, -0.041483283042907715, 0.0021558692678809166, -0.002874605357646942, -0.009487884119153023, -0.04983033239841461, -0.12453097105026245, -0.08147551119327545, -0.06827117502689362, 0.0976850762963295, -0.07812456786632538, 0.028848929330706596, -0.06880229711532593, -0.022111790254712105, -0.008433561772108078, -0.057358670979738235, -0.00572618655860424, 0.010555056855082512, -0.026053929701447487, -0.04504818096756935, 0.04809395223855972, 0.0500369556248188, -0.057403452694416046, 0.07742788642644882, -0.10531394183635712, -0.06139073148369789, 0.05679842829704285, 0.013664191588759422, -0.08095784485340118, 0.09238987416028976, -0.020724570378661156, -0.01442087534815073, -0.054947130382061005, -0.04508174583315849, 0.18969213962554932, -0.02700038254261017, 0.10138508677482605, -0.09130977839231491, 0.002313640434294939, 0.02962099201977253, -0.04231663793325424, -0.0166713148355484, 0.05934952199459076, 0.04722704738378525, -0.18648965656757355, 0.014130651950836182, 0.049394577741622925, 0.0785427838563919, 0.11393626779317856, 0.028249867260456085, -0.02194744162261486, -0.04557107761502266, -0.012749657966196537, 0.00511400168761611, 0.055559296160936356, -0.02875528112053871, -0.009702072478830814, 0.03099058009684086, 0.056423209607601166, 0.019824765622615814, -0.08043378591537476, 0.036982323974370956, 0.06665860116481781, -0.015190275385975838, -0.04022720083594322, -0.022983456030488014, -0.06032298877835274, 0.061996348202228546, 0.05157419294118881, 0.03438354283571243, 0.024950819090008736, -0.014415843412280083, -0.13559411466121674, 0.18789184093475342, -0.11574537307024002, -0.2581482231616974, -0.10808606445789337, -0.058714888989925385, -0.022201769053936005, 0.040193766355514526, 0.058985307812690735, -0.03428281098604202, -0.044017598032951355, -0.1152079701423645, 0.05837816372513771, -0.0661095455288887, -0.027889486402273178, -0.010676978155970573, -0.052075035870075226, -0.021231679245829582, -0.12616312503814697, -0.011328490450978279, -0.031720761209726334, -0.07443983107805252, 0.007206670939922333, -0.0347924567759037, 0.02818024903535843, 0.13506382703781128, 0.03567638248205185, -0.01898139715194702, -0.01824474334716797, 0.190892294049263, 0.010829620063304901, 0.06199842691421509, 0.11305950582027435, -0.026900915428996086, 0.05248528718948364, 0.044851530343294144, 0.024956652894616127, -0.04843227565288544, 0.01097687054425478, -0.017502475529909134, 
-0.12166742235422134, -0.17547792196273804, -0.07013446092605591, -0.004455994814634323, 0.01024235412478447, 0.021798113361001015, 0.035014666616916656, 0.020659564062952995, 0.040246594697237015, -0.030153710395097733, 0.030734457075595856, -0.01288125291466713, 0.07934172451496124, 0.029961761087179184, -0.07622449100017548, 0.09350287169218063, -0.06237015873193741, 0.018353208899497986, 0.11004043370485306, -0.0604945607483387, 0.19126181304454803, 0.025202730670571327, 0.058055754750967026, 0.10240590572357178, 0.022075105458498, 0.05569896101951599, 0.08682006597518921, -0.04690108448266983, 0.005721377208828926, -0.06006568670272827, -0.05163239687681198, -0.03486741706728935, 0.049534209072589874, 0.028562624007463455, 0.01655655726790428, -0.11972278356552124, 0.021022900938987732, -0.0008115448290482163, 0.1329798549413681, 0.043869078159332275, -0.12229578197002411, -0.12540262937545776, 0.03449750319123268, -0.043937090784311295, -0.06302864104509354, 0.031554196029901505, 0.06202533841133118, -0.15073183178901672, 0.04604107514023781, -0.005617893300950527, 0.06525293737649918, -0.0905851274728775, 0.016667695716023445, -0.04386821389198303, 0.0012385156005620956, 0.004624474328011274, 0.06769433617591858, -0.1284770369529724, 0.10699255764484406, 0.021466925740242004, 0.05044233053922653, -0.08149582147598267, 0.01645861379802227, -0.00990233477205038, 0.11188970506191254, 0.11675392836332321, 0.043721359223127365, -0.061494678258895874, -0.015473658218979836, -0.04320910573005676, 0.021729150786995888, 0.0555248036980629, -0.07472042739391327, 0.05960751324892044, 0.010925966314971447, 0.0059740375727415085, -0.023451833054423332, 0.011930795386433601, -0.13358958065509796, -0.12240041047334671, 0.06030702963471413, -0.07533571124076843, -0.10470942407846451, -0.05675390362739563, -0.06250353902578354, -0.05198432505130768, 0.213779479265213, -0.1164945513010025, -0.09089437127113342, -0.09782712161540985, -0.012244317680597305, 0.046548031270504, -0.06549875438213348, 0.0470753014087677, -0.03655203804373741, 0.09137918055057526, -0.047093816101551056, -0.10883720219135284, 0.03579059988260269, -0.115483857691288, -0.11386112123727798, -0.043521225452423096, 0.10613876581192017, 0.1149861142039299, 0.03621770069003105, 0.012077715247869492, 0.011593657545745373, -0.0019711684435606003, -0.11861558258533478, 0.019067993387579918, 0.1279240995645523, -0.002171168103814125, 0.0708371177315712, -0.057351842522621155, 0.01962866261601448, -0.013829752802848816, -0.0013910960406064987, 0.1272195279598236, 0.18500211834907532, -0.06478419899940491, 0.17482905089855194, 0.20491722226142883, -0.10545532405376434, -0.18958938121795654, -0.052821144461631775, -0.004975683055818081, 0.04552499204874039, 0.04725196957588196, -0.18487897515296936, 0.09121096134185791, 0.03231774643063545, -0.031625986099243164, 0.019887257367372513, -0.23099307715892792, -0.11191669851541519, 0.09228327870368958, 0.05720480531454086, 0.1916225254535675, -0.07893447577953339, -0.03968711197376251, -0.014811001718044281, -0.03777530789375305, 0.043192923069000244, -0.03959948197007179, 0.08799813687801361, 0.004312662407755852, -0.03172824904322624, 0.0017321128398180008, -0.03213920816779137, 0.09463314712047577, 0.041413649916648865, 0.02359103038907051, -0.06961315125226974, -0.0018930453807115555, 0.10917052626609802, -0.037373051047325134, 0.09659469872713089, 0.04117511212825775, 0.07577422261238098, -0.09761348366737366, -0.059526048600673676, -0.07683680206537247, 0.04311797767877579, 
-0.04168517142534256, -0.05364546552300453, -0.06421347707509995, 0.05931626632809639, 0.036950964480638504, 0.009760207496583462, -0.005193429067730904, -0.03796080872416496, 0.047456949949264526, 0.09280700981616974, 0.08055714517831802, -0.03692406043410301, -0.06915214657783508, -0.049473706632852554, -0.04951656982302666, 0.06853669881820679, -0.09435886144638062, 0.01709340140223503, 0.027613012120127678, 0.009481249377131462, 0.08793454617261887, 0.03359152749180794, -0.13801638782024384, 0.009813986718654633, 0.036550115793943405, -0.12623241543769836, -0.10793247073888779, -0.01886444166302681, 0.025606617331504822, -0.03506945073604584, 0.05303766578435898, 0.14496836066246033, -0.036528948694467545, -0.03130124509334564, -0.049136750400066376, 0.03960167244076729, -0.018460823222994804, 0.05020139738917351, 0.06367430090904236, 0.030870821326971054, -0.07372522354125977, 0.07329539954662323, 0.04037363454699516, -0.042307719588279724, 0.04035208374261856, 0.046154022216796875, -0.09666317701339722, -0.07958933711051941, -0.06287848949432373, 0.08861369639635086, -0.02168017439544201, -0.04861932620406151, -0.0000742543488740921, -0.08318386971950531, 0.07027898728847504, 0.0697508305311203, 0.04620254039764404, 0.035634152591228485, -0.08659444749355316, 0.0164480023086071, -0.05462712049484253, 0.03701157122850418, -0.032660551369190216, -0.004774713888764381, -0.05396365374326706, 0.06732092797756195, 0.06400123238563538, 0.09652402997016907, -0.03404742851853371, -0.07261091470718384, -0.08388110995292664, -0.012634805403649807, -0.06409355252981186, -0.032828763127326965, -0.07858674228191376, -0.005397315137088299, 0.00022208597511053085, -0.0023116357624530792, 0.020038627088069916, 0.03732164949178696, -0.04343336820602417, -0.01709752529859543, -0.03190300613641739, 0.03961038216948509, -0.062213268131017685, 0.006444371305406094, 0.0160005372017622, -0.03693028539419174, 0.0933932363986969, 0.03591182827949524, -0.01171557791531086, 0.04746168851852417, -0.018642541021108627, 0.03431575372815132, -0.021631337702274323, 0.0009116784203797579, -0.0220484621822834, -0.10943594574928284, -0.0029570183251053095, 0.004606060683727264, -0.028034666553139687, 0.01178943831473589, 0.058642931282520294, -0.07241059839725494, 0.08736103028059006, 0.04869091510772705, -0.028554152697324753, -0.0717196986079216, 0.04187421500682831, -0.013791633769869804, 0.02631412446498871, 0.06679917871952057, -0.035270288586616516, 0.04953456670045853, -0.10040640830993652, -0.029519695788621902, 0.002017055870965123, -0.007653176784515381, -0.007467806339263916, -0.05096370726823807, -0.0017887791618704796, 0.0047510722652077675, 0.1763155460357666, -0.024188842624425888, 0.03518838435411453, 0.014780130237340927, 0.005627894774079323, 0.043410222977399826, -0.013570524752140045, 0.07059958577156067, -0.008662305772304535, -0.02686968259513378, -0.01196027547121048, 0.03696837276220322, 0.004269735887646675, 0.007815882563591003, 0.1416032910346985, 0.042566534131765366, 0.09491182863712311, 0.07463665306568146, 0.01769503764808178, 0.015853282064199448, -0.13410858809947968, -0.08981321007013321, 0.004486730322241783, 0.06103165075182915, -0.01803448796272278, 0.0040533170104026794, 0.09488208591938019, -0.08647749572992325, 0.07019302248954773, 0.04714718461036682, -0.048513591289520264, -0.12568619847297668, -0.1887986958026886, -0.023464923724532127, -0.030263515189290047, -0.010612105950713158, -0.0913672000169754, 0.015813585370779037, 0.08521305024623871, 0.025368066504597664, 
-0.010875862091779709, 0.09486785531044006, -0.10074083507061005, -0.028303179889917374, 0.04451858997344971, -0.026320725679397583, 0.013664373196661472, 0.05065203830599785, 0.02395249530673027, -0.006420109421014786, 0.0424102321267128, 0.03725191578269005, 0.041435547173023224, 0.02322744019329548, 0.05348215997219086, -0.022420786321163177, -0.07181625068187714, -0.03384352847933769, -0.003443265799432993, 0.05587303638458252, 0.13502582907676697, 0.023934226483106613, -0.06998095661401749, 0.007446477189660072, 0.10696040093898773, -0.030833810567855835, -0.045867156237363815, -0.10830917209386826, 0.24031689763069153, 0.024230249226093292, 0.0006005717441439629, -0.0024458738043904305, -0.043145038187503815, 0.005257515236735344, 0.21253715455532074, 0.22714677453041077, 0.007571790367364883, -0.009641210548579693, 0.009014050476253033, -0.012159155681729317, 0.03642036020755768, 0.14829771220684052, 0.005322301760315895, 0.24988383054733276, -0.04563000798225403, 0.038525812327861786, -0.04229144752025604, -0.040295351296663284, -0.09895649552345276, 0.07180357724428177, -0.007819479331374168, 0.0072032613679766655, -0.02969396486878395, 0.07175060361623764, -0.03836331143975258, -0.17405393719673157, 0.005845443345606327, 0.00023365719243884087, -0.061466582119464874, 0.010479884222149849, 0.001374633051455021, 0.020365377888083458, 0.08211255073547363, -0.01778862252831459, -0.00788678415119648, 0.1298941969871521, 0.01842772774398327, -0.09665494412183762, -0.06298819929361343, 0.11539636552333832, 0.021685641258955002, 0.14252011477947235, 0.010801214724779129, 0.07909679412841797, 0.08792895078659058, 0.021475154906511307, -0.0985717847943306, 0.040265556424856186, -0.02003769762814045, -0.027941493317484856, 0.00665535032749176, 0.1086815893650055, -0.008483940735459328, 0.0597229078412056, 0.026398595422506332, -0.08744294941425323, 0.05976148694753647, 0.008526388555765152, -0.03408726304769516, -0.07896863669157028, 0.08460110425949097, -0.08970282971858978, 0.1567356139421463, 0.12409716844558716, -0.015037980861961842, -0.04666908085346222, -0.028150053694844246, 0.01950971968472004, -0.00012321118265390396, 0.05964803695678711, -0.025300156325101852, -0.1347353160381317, 0.019984692335128784, -0.08041495084762573, 0.029036233201622963, -0.24681660532951355, -0.08836759626865387, 0.02851014956831932, -0.017801113426685333, -0.02023792266845703, 0.052641674876213074, 0.046270374208688736, 0.027129050344228745, -0.035679642111063004, 0.02301427721977234, -0.03861895576119423, 0.057701513171195984, -0.11194823682308197, -0.09407506138086319 ]
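The float list above is the row's entry in the embeddings column, a 768-dimensional vector per model card. How these vectors were produced is not documented in this dump; a common way to obtain a fixed-size embedding of this shape from a BERT-style encoder is to mean-pool the last hidden state over non-padding tokens, sketched below under that assumption (with bert-base-uncased standing in for whichever encoder was actually used).

```python
import torch
from transformers import BertTokenizer, BertModel

# bert-base-uncased is only a stand-in; the encoder behind this column is not stated.
tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertModel.from_pretrained("bert-base-uncased")

def embed(text):
    enc = tokenizer(text, return_tensors="pt", truncation=True, max_length=512)
    with torch.no_grad():
        hidden = model(**enc).last_hidden_state          # shape (1, seq_len, 768)
    mask = enc["attention_mask"].unsqueeze(-1)           # shape (1, seq_len, 1)
    return (hidden * mask).sum(dim=1) / mask.sum(dim=1)  # mean over non-padding tokens -> (1, 768)

print(embed("MultiBERTs Seed 4 Checkpoint 1700k (uncased)").shape)  # torch.Size([1, 768])
```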
null
null
transformers
# MultiBERTs Seed 4 Checkpoint 1800k (uncased)
Seed 4 intermediate checkpoint 1800k MultiBERTs (pretrained BERT) model on the English language using a masked language modeling (MLM) objective. It was introduced in
[this paper](https://arxiv.org/pdf/2106.16163.pdf) and first released in
[this repository](https://github.com/google-research/language/tree/master/language/multiberts). This is an intermediate checkpoint.
The final checkpoint can be found at [multiberts-seed-4](https://hf.co/multberts-seed-4). This model is uncased: it does not make a difference
between english and English.

Disclaimer: The team releasing MultiBERTs did not write a model card for this model so this model card has been written by
[gchhablani](https://hf.co/gchhablani).

## Model description
MultiBERTs models are transformers models pretrained on a large corpus of English data in a self-supervised fashion. This means they
were pretrained on the raw texts only, with no humans labelling them in any way (which is why they can use lots of
publicly available data) with an automatic process to generate inputs and labels from those texts. More precisely, they
were pretrained with two objectives:
- Masked language modeling (MLM): taking a sentence, the model randomly masks 15% of the words in the input, then runs
  the entire masked sentence through the model and has to predict the masked words. This is different from traditional
  recurrent neural networks (RNNs) that usually see the words one after the other, or from autoregressive models like
  GPT which internally mask the future tokens. It allows the model to learn a bidirectional representation of the
  sentence.
- Next sentence prediction (NSP): the model concatenates two masked sentences as inputs during pretraining. Sometimes
  they correspond to sentences that were next to each other in the original text, sometimes not. The model then has to
  predict if the two sentences were following each other or not.
This way, the model learns an inner representation of the English language that can then be used to extract features
useful for downstream tasks: if you have a dataset of labeled sentences for instance, you can train a standard
classifier using the features produced by the MultiBERTs model as inputs.

## Intended uses & limitations
You can use the raw model for either masked language modeling or next sentence prediction, but it's mostly intended to
be fine-tuned on a downstream task. See the [model hub](https://huggingface.co/models?filter=multiberts) to look for
fine-tuned versions on a task that interests you.
Note that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked)
to make decisions, such as sequence classification, token classification or question answering. For tasks such as text
generation you should look at models like GPT2.

### How to use
Here is how to use this model to get the features of a given text in PyTorch:
```python
from transformers import BertTokenizer, BertModel
tokenizer = BertTokenizer.from_pretrained('multiberts-seed-4-1800k')
model = BertModel.from_pretrained("multiberts-seed-4-1800k")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='pt')
output = model(**encoded_input)
```

### Limitations and bias
Even if the training data used for this model could be characterized as fairly neutral, this model can have biased
predictions. This bias will also affect all fine-tuned versions of this model. For an understanding of bias of this particular
checkpoint, please try out this checkpoint with the snippet present in the [Limitations and bias section](https://huggingface.co/bert-base-uncased#limitations-and-bias) of the [bert-base-uncased](https://huggingface.co/bert-base-uncased) checkpoint.

## Training data
The MultiBERTs models were pretrained on [BookCorpus](https://yknzhu.wixsite.com/mbweb), a dataset consisting of 11,038
unpublished books and [English Wikipedia](https://en.wikipedia.org/wiki/English_Wikipedia) (excluding lists, tables and
headers).

## Training procedure

### Preprocessing
The texts are lowercased and tokenized using WordPiece and a vocabulary size of 30,000. The inputs of the model are
then of the form:

```
[CLS] Sentence A [SEP] Sentence B [SEP]
```

With probability 0.5, sentence A and sentence B correspond to two consecutive sentences in the original corpus and in
the other cases, it's another random sentence in the corpus. Note that what is considered a sentence here is a
consecutive span of text usually longer than a single sentence. The only constraint is that the result with the two
"sentences" has a combined length of less than 512 tokens.
The details of the masking procedure for each sentence are the following:
- 15% of the tokens are masked.
- In 80% of the cases, the masked tokens are replaced by `[MASK]`.
- In 10% of the cases, the masked tokens are replaced by a random token (different from the one they replace).
- In the 10% remaining cases, the masked tokens are left as is.

### Pretraining
The full model was trained on 16 Cloud TPU v2 chips for two million steps with a batch size
of 256. The sequence length was set to 512 throughout. The optimizer
used is Adam with a learning rate of 1e-4, \\(\beta_{1} = 0.9\\) and \\(\beta_{2} = 0.999\\), a weight decay of 0.01,
learning rate warmup for 10,000 steps and linear decay of the learning rate after.

### BibTeX entry and citation info
```bibtex
@article{DBLP:journals/corr/abs-2106-16163,
  author     = {Thibault Sellam and
                Steve Yadlowsky and
                Jason Wei and
                Naomi Saphra and
                Alexander D'Amour and
                Tal Linzen and
                Jasmijn Bastings and
                Iulia Turc and
                Jacob Eisenstein and
                Dipanjan Das and
                Ian Tenney and
                Ellie Pavlick},
  title      = {The MultiBERTs: {BERT} Reproductions for Robustness Analysis},
  journal    = {CoRR},
  volume     = {abs/2106.16163},
  year       = {2021},
  url        = {https://arxiv.org/abs/2106.16163},
  eprinttype = {arXiv},
  eprint     = {2106.16163},
  timestamp  = {Mon, 05 Jul 2021 15:15:50 +0200},
  biburl     = {https://dblp.org/rec/journals/corr/abs-2106-16163.bib},
  bibsource  = {dblp computer science bibliography, https://dblp.org}
}
```

<a href="https://huggingface.co/exbert/?model=multiberts">
	<img width="300px" src="https://cdn-media.huggingface.co/exbert/button.png">
</a>
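The optimizer and schedule in the Pretraining section above can be written down in PyTorch roughly as follows. This is only a sketch of the stated hyper-parameters (Adam, learning rate 1e-4, betas 0.9/0.999, weight decay 0.01, 10,000 warmup steps, linear decay over two million steps); the original MultiBERTs checkpoints were trained on Cloud TPUs with a different code base, so this is not the actual training script.

```python
import torch
from transformers import BertConfig, BertForPreTraining, get_linear_schedule_with_warmup

# A randomly initialised BERT-base with MLM + NSP heads, only to have parameters to optimise.
model = BertForPreTraining(BertConfig())

optimizer = torch.optim.AdamW(
    model.parameters(), lr=1e-4, betas=(0.9, 0.999), weight_decay=0.01
)
scheduler = get_linear_schedule_with_warmup(
    optimizer, num_warmup_steps=10_000, num_training_steps=2_000_000
)

# Each training step would then do:
#   loss.backward(); optimizer.step(); scheduler.step(); optimizer.zero_grad()
```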
{"language": "en", "license": "apache-2.0", "tags": ["exbert", "multiberts", "multiberts-seed-4"], "datasets": ["bookcorpus", "wikipedia"]}
null
MultiBertGunjanPatrick/multiberts-seed-4-1800k
[ "transformers", "pytorch", "bert", "pretraining", "exbert", "multiberts", "multiberts-seed-4", "en", "dataset:bookcorpus", "dataset:wikipedia", "arxiv:2106.16163", "license:apache-2.0", "endpoints_compatible", "region:us" ]
2022-03-02T23:29:04+00:00
[ "2106.16163" ]
[ "en" ]
TAGS #transformers #pytorch #bert #pretraining #exbert #multiberts #multiberts-seed-4 #en #dataset-bookcorpus #dataset-wikipedia #arxiv-2106.16163 #license-apache-2.0 #endpoints_compatible #region-us
# MultiBERTs Seed 4 Checkpoint 1800k (uncased) Seed 4 intermediate checkpoint 1800k MultiBERTs (pretrained BERT) model on the English language using a masked language modeling (MLM) objective. It was introduced in this paper and first released in this repository. This is an intermediate checkpoint. The final checkpoint can be found at multiberts-seed-4. This model is uncased: it does not make a difference between english and English. Disclaimer: The team releasing MultiBERTs did not write a model card for this model so this model card has been written by gchhablani. ## Model description MultiBERTs models are transformers models pretrained on a large corpus of English data in a self-supervised fashion. This means they were pretrained on the raw texts only, with no humans labelling them in any way (which is why they can use lots of publicly available data) with an automatic process to generate inputs and labels from those texts. More precisely, they were pretrained with two objectives: - Masked language modeling (MLM): taking a sentence, the model randomly masks 15% of the words in the input, then runs the entire masked sentence through the model and has to predict the masked words. This is different from traditional recurrent neural networks (RNNs) that usually see the words one after the other, or from autoregressive models like GPT which internally mask the future tokens. It allows the model to learn a bidirectional representation of the sentence. - Next sentence prediction (NSP): the model concatenates two masked sentences as inputs during pretraining. Sometimes they correspond to sentences that were next to each other in the original text, sometimes not. The model then has to predict if the two sentences were following each other or not. This way, the model learns an inner representation of the English language that can then be used to extract features useful for downstream tasks: if you have a dataset of labeled sentences for instance, you can train a standard classifier using the features produced by the MultiBERTs model as inputs. ## Intended uses & limitations You can use the raw model for either masked language modeling or next sentence prediction, but it's mostly intended to be fine-tuned on a downstream task. See the model hub to look for fine-tuned versions on a task that interests you. Note that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked) to make decisions, such as sequence classification, token classification or question answering. For tasks such as text generation you should look at models like GPT2. ### How to use Here is how to use this model to get the features of a given text in PyTorch: ### Limitations and bias Even if the training data used for this model could be characterized as fairly neutral, this model can have biased predictions. This bias will also affect all fine-tuned versions of this model. For an understanding of bias of this particular checkpoint, please try out this checkpoint with the snippet present in the Limitations and bias section of the bert-base-uncased checkpoint. ## Training data The MultiBERTs models were pretrained on BookCorpus, a dataset consisting of 11,038 unpublished books and English Wikipedia (excluding lists, tables and headers). ## Training procedure ### Preprocessing The texts are lowercased and tokenized using WordPiece and a vocabulary size of 30,000. 
The inputs of the model are then of the form: With probability 0.5, sentence A and sentence B correspond to two consecutive sentences in the original corpus and in the other cases, it's another random sentence in the corpus. Note that what is considered a sentence here is a consecutive span of text usually longer than a single sentence. The only constraint is that the result with the two "sentences" has a combined length of less than 512 tokens. The details of the masking procedure for each sentence are the following: - 15% of the tokens are masked. - In 80% of the cases, the masked tokens are replaced by '[MASK]'. - In 10% of the cases, the masked tokens are replaced by a random token (different from the one they replace). - In the 10% remaining cases, the masked tokens are left as is. ### Pretraining The full model was trained on 16 Cloud TPU v2 chips for two million steps with a batch size of 256. The sequence length was set to 512 throughout. The optimizer used is Adam with a learning rate of 1e-4, \\(\beta_{1} = 0.9\\) and \\(\beta_{2} = 0.999\\), a weight decay of 0.01, learning rate warmup for 10,000 steps and linear decay of the learning rate after. ### BibTeX entry and citation info <a href="URL <img width="300px" src="URL </a>
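The sentence-pair sampling described above (with probability 0.5, sentence B is the span that actually follows sentence A; otherwise a random span from the corpus) can be sketched as follows. This is an illustration only, not the original data pipeline; `make_nsp_pair` and `spans` are hypothetical names.

```python
import random

def make_nsp_pair(spans, i):
    """Illustrative sketch: build one (sentence A, sentence B, is_next) example from span i."""
    sentence_a = spans[i]
    if random.random() < 0.5 and i + 1 < len(spans):
        return sentence_a, spans[i + 1], True                        # the true next span ("IsNext")
    other = [s for j, s in enumerate(spans) if j not in (i, i + 1)]  # any span that is not A or its successor
    return sentence_a, random.choice(other), False                   # a random span ("NotNext")

spans = ["first span of text", "the span that follows it", "an unrelated span from elsewhere"]
print(make_nsp_pair(spans, 0))
```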
[ "# MultiBERTs Seed 4 Checkpoint 1800k (uncased)\nSeed 4 intermediate checkpoint 1800k MultiBERTs (pretrained BERT) model on English language using a masked language modeling (MLM) objective. It was introduced in\nthis paper and first released in\nthis repository. This is an intermediate checkpoint.\nThe final checkpoint can be found at multiberts-seed-4. This model is uncased: it does not make a difference\nbetween english and English.\n\nDisclaimer: The team releasing MultiBERTs did not write a model card for this model so this model card has been written by gchhablani.", "## Model description\nMultiBERTs models are transformers model pretrained on a large corpus of English data in a self-supervised fashion. This means it\nwas pretrained on the raw texts only, with no humans labelling them in any way (which is why it can use lots of\npublicly available data) with an automatic process to generate inputs and labels from those texts. More precisely, it\nwas pretrained with two objectives:\n- Masked language modeling (MLM): taking a sentence, the model randomly masks 15% of the words in the input then run\n the entire masked sentence through the model and has to predict the masked words. This is different from traditional\n recurrent neural networks (RNNs) that usually see the words one after the other, or from autoregressive models like\n GPT which internally mask the future tokens. It allows the model to learn a bidirectional representation of the\n sentence.\n- Next sentence prediction (NSP): the models concatenates two masked sentences as inputs during pretraining. Sometimes\n they correspond to sentences that were next to each other in the original text, sometimes not. The model then has to\n predict if the two sentences were following each other or not.\nThis way, the model learns an inner representation of the English language that can then be used to extract features\nuseful for downstream tasks: if you have a dataset of labeled sentences for instance, you can train a standard\nclassifier using the features produced by the MultiBERTs model as inputs.", "## Intended uses & limitations\nYou can use the raw model for either masked language modeling or next sentence prediction, but it's mostly intended to\nbe fine-tuned on a downstream task. See the model hub to look for\nfine-tuned versions on a task that interests you.\nNote that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked)\nto make decisions, such as sequence classification, token classification or question answering. For tasks such as text\ngeneration you should look at model like GPT2.", "### How to use\nHere is how to use this model to get the features of a given text in PyTorch:", "### Limitations and bias\nEven if the training data used for this model could be characterized as fairly neutral, this model can have biased\npredictions. This bias will also affect all fine-tuned versions of this model. For an understanding of bias of this particular\ncheckpoint, please try out this checkpoint with the snippet present in the Limitation and bias section of the bert-base-uncased checkpoint.", "## Training data\nThe MultiBERTs models were pretrained on BookCorpus, a dataset consisting of 11,038\nunpublished books and English Wikipedia (excluding lists, tables and\nheaders).", "## Training procedure", "### Preprocessing\nThe texts are lowercased and tokenized using WordPiece and a vocabulary size of 30,000. 
The inputs of the model are\nthen of the form:\n\nWith probability 0.5, sentence A and sentence B correspond to two consecutive sentences in the original corpus and in\nthe other cases, it's another random sentence in the corpus. Note that what is considered a sentence here is a\nconsecutive span of text usually longer than a single sentence. The only constrain is that the result with the two\n\"sentences\" has a combined length of less than 512 tokens.\nThe details of the masking procedure for each sentence are the following:\n- 15% of the tokens are masked.\n- In 80% of the cases, the masked tokens are replaced by '[MASK]'.\n- In 10% of the cases, the masked tokens are replaced by a random token (different) from the one they replace.\n- In the 10% remaining cases, the masked tokens are left as is.", "### Pretraining\nThe full model was trained on 16 Cloud TPU v2 chips for two million steps with a batch size\nof 256. The sequence length was set to 512 throughout. The optimizer\nused is Adam with a learning rate of 1e-4, \\\\(\\beta_{1} = 0.9\\\\) and \\\\(\\beta_{2} = 0.999\\\\), a weight decay of 0.01,\nlearning rate warmup for 10,000 steps and linear decay of the learning rate after.", "### BibTeX entry and citation info\n\n<a href=\"URL\n\t<img width=\"300px\" src=\"URL\n</a>" ]
[ "TAGS\n#transformers #pytorch #bert #pretraining #exbert #multiberts #multiberts-seed-4 #en #dataset-bookcorpus #dataset-wikipedia #arxiv-2106.16163 #license-apache-2.0 #endpoints_compatible #region-us \n", "# MultiBERTs Seed 4 Checkpoint 1800k (uncased)\nSeed 4 intermediate checkpoint 1800k MultiBERTs (pretrained BERT) model on English language using a masked language modeling (MLM) objective. It was introduced in\nthis paper and first released in\nthis repository. This is an intermediate checkpoint.\nThe final checkpoint can be found at multiberts-seed-4. This model is uncased: it does not make a difference\nbetween english and English.\n\nDisclaimer: The team releasing MultiBERTs did not write a model card for this model so this model card has been written by gchhablani.", "## Model description\nMultiBERTs models are transformers model pretrained on a large corpus of English data in a self-supervised fashion. This means it\nwas pretrained on the raw texts only, with no humans labelling them in any way (which is why it can use lots of\npublicly available data) with an automatic process to generate inputs and labels from those texts. More precisely, it\nwas pretrained with two objectives:\n- Masked language modeling (MLM): taking a sentence, the model randomly masks 15% of the words in the input then run\n the entire masked sentence through the model and has to predict the masked words. This is different from traditional\n recurrent neural networks (RNNs) that usually see the words one after the other, or from autoregressive models like\n GPT which internally mask the future tokens. It allows the model to learn a bidirectional representation of the\n sentence.\n- Next sentence prediction (NSP): the models concatenates two masked sentences as inputs during pretraining. Sometimes\n they correspond to sentences that were next to each other in the original text, sometimes not. The model then has to\n predict if the two sentences were following each other or not.\nThis way, the model learns an inner representation of the English language that can then be used to extract features\nuseful for downstream tasks: if you have a dataset of labeled sentences for instance, you can train a standard\nclassifier using the features produced by the MultiBERTs model as inputs.", "## Intended uses & limitations\nYou can use the raw model for either masked language modeling or next sentence prediction, but it's mostly intended to\nbe fine-tuned on a downstream task. See the model hub to look for\nfine-tuned versions on a task that interests you.\nNote that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked)\nto make decisions, such as sequence classification, token classification or question answering. For tasks such as text\ngeneration you should look at model like GPT2.", "### How to use\nHere is how to use this model to get the features of a given text in PyTorch:", "### Limitations and bias\nEven if the training data used for this model could be characterized as fairly neutral, this model can have biased\npredictions. This bias will also affect all fine-tuned versions of this model. 
For an understanding of bias of this particular\ncheckpoint, please try out this checkpoint with the snippet present in the Limitation and bias section of the bert-base-uncased checkpoint.", "## Training data\nThe MultiBERTs models were pretrained on BookCorpus, a dataset consisting of 11,038\nunpublished books and English Wikipedia (excluding lists, tables and\nheaders).", "## Training procedure", "### Preprocessing\nThe texts are lowercased and tokenized using WordPiece and a vocabulary size of 30,000. The inputs of the model are\nthen of the form:\n\nWith probability 0.5, sentence A and sentence B correspond to two consecutive sentences in the original corpus and in\nthe other cases, it's another random sentence in the corpus. Note that what is considered a sentence here is a\nconsecutive span of text usually longer than a single sentence. The only constrain is that the result with the two\n\"sentences\" has a combined length of less than 512 tokens.\nThe details of the masking procedure for each sentence are the following:\n- 15% of the tokens are masked.\n- In 80% of the cases, the masked tokens are replaced by '[MASK]'.\n- In 10% of the cases, the masked tokens are replaced by a random token (different) from the one they replace.\n- In the 10% remaining cases, the masked tokens are left as is.", "### Pretraining\nThe full model was trained on 16 Cloud TPU v2 chips for two million steps with a batch size\nof 256. The sequence length was set to 512 throughout. The optimizer\nused is Adam with a learning rate of 1e-4, \\\\(\\beta_{1} = 0.9\\\\) and \\\\(\\beta_{2} = 0.999\\\\), a weight decay of 0.01,\nlearning rate warmup for 10,000 steps and linear decay of the learning rate after.", "### BibTeX entry and citation info\n\n<a href=\"URL\n\t<img width=\"300px\" src=\"URL\n</a>" ]
[ 71, 145, 335, 134, 25, 95, 48, 3, 222, 111, 34 ]
[ "passage: TAGS\n#transformers #pytorch #bert #pretraining #exbert #multiberts #multiberts-seed-4 #en #dataset-bookcorpus #dataset-wikipedia #arxiv-2106.16163 #license-apache-2.0 #endpoints_compatible #region-us \n# MultiBERTs Seed 4 Checkpoint 1800k (uncased)\nSeed 4 intermediate checkpoint 1800k MultiBERTs (pretrained BERT) model on English language using a masked language modeling (MLM) objective. It was introduced in\nthis paper and first released in\nthis repository. This is an intermediate checkpoint.\nThe final checkpoint can be found at multiberts-seed-4. This model is uncased: it does not make a difference\nbetween english and English.\n\nDisclaimer: The team releasing MultiBERTs did not write a model card for this model so this model card has been written by gchhablani.", "passage: ## Model description\nMultiBERTs models are transformers model pretrained on a large corpus of English data in a self-supervised fashion. This means it\nwas pretrained on the raw texts only, with no humans labelling them in any way (which is why it can use lots of\npublicly available data) with an automatic process to generate inputs and labels from those texts. More precisely, it\nwas pretrained with two objectives:\n- Masked language modeling (MLM): taking a sentence, the model randomly masks 15% of the words in the input then run\n the entire masked sentence through the model and has to predict the masked words. This is different from traditional\n recurrent neural networks (RNNs) that usually see the words one after the other, or from autoregressive models like\n GPT which internally mask the future tokens. It allows the model to learn a bidirectional representation of the\n sentence.\n- Next sentence prediction (NSP): the models concatenates two masked sentences as inputs during pretraining. Sometimes\n they correspond to sentences that were next to each other in the original text, sometimes not. The model then has to\n predict if the two sentences were following each other or not.\nThis way, the model learns an inner representation of the English language that can then be used to extract features\nuseful for downstream tasks: if you have a dataset of labeled sentences for instance, you can train a standard\nclassifier using the features produced by the MultiBERTs model as inputs.## Intended uses & limitations\nYou can use the raw model for either masked language modeling or next sentence prediction, but it's mostly intended to\nbe fine-tuned on a downstream task. See the model hub to look for\nfine-tuned versions on a task that interests you.\nNote that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked)\nto make decisions, such as sequence classification, token classification or question answering. For tasks such as text\ngeneration you should look at model like GPT2.### How to use\nHere is how to use this model to get the features of a given text in PyTorch:### Limitations and bias\nEven if the training data used for this model could be characterized as fairly neutral, this model can have biased\npredictions. This bias will also affect all fine-tuned versions of this model. 
For an understanding of bias of this particular\ncheckpoint, please try out this checkpoint with the snippet present in the Limitation and bias section of the bert-base-uncased checkpoint.## Training data\nThe MultiBERTs models were pretrained on BookCorpus, a dataset consisting of 11,038\nunpublished books and English Wikipedia (excluding lists, tables and\nheaders).## Training procedure" ]
[ -0.08563409745693207, 0.00227005360648036, -0.0021693019662052393, 0.06940070539712906, 0.08615590631961823, 0.0037581603974103928, 0.11503088474273682, 0.04819919541478157, -0.03000318445265293, 0.022443164139986038, 0.09238952398300171, 0.025966107845306396, 0.04205109924077988, 0.06334489583969116, 0.09689292311668396, -0.25395825505256653, 0.04728521406650543, -0.06170041114091873, 0.053917206823825836, 0.07509445399045944, 0.09765145182609558, -0.07323043048381805, 0.0630718395113945, 0.0370325967669487, -0.08469798415899277, -0.015426460653543472, -0.017173273488879204, -0.03661799058318138, 0.10087592899799347, 0.06910355389118195, 0.06362412869930267, 0.0010578855872154236, 0.057822078466415405, -0.08579722046852112, 0.015836721286177635, 0.044247861951589584, -0.0009356155060231686, 0.0246018897742033, -0.008535485714673996, 0.014876006171107292, 0.10524030029773712, 0.03954041749238968, 0.07771719992160797, 0.03508773073554039, -0.0960797518491745, -0.11771398037672043, -0.07906945049762726, 0.09967023134231567, 0.05216611921787262, 0.04428771138191223, -0.006010730750858784, 0.07184164971113205, -0.030338671058416367, 0.07345101237297058, 0.10097122192382812, -0.256465882062912, -0.00899023748934269, 0.06710166484117508, 0.04281757026910782, 0.04715544730424881, 0.011095773428678513, 0.026299020275473595, 0.006781719624996185, 0.04424705728888512, 0.029208678752183914, -0.02350446954369545, 0.12009537220001221, -0.04146545007824898, -0.1494438350200653, -0.04212883114814758, 0.12149098515510559, -0.0072004534304142, -0.12465141713619232, -0.09781046211719513, -0.028938226401805878, 0.1098608672618866, -0.0037242313846945763, -0.016986191272735596, -0.0030240113846957684, 0.0102243572473526, 0.023066626861691475, -0.09203504025936127, -0.08639761060476303, -0.025853734463453293, -0.03574115037918091, 0.12515553832054138, 0.04678088426589966, 0.052210018038749695, -0.03514530509710312, 0.08498989045619965, -0.11717691272497177, -0.03923386335372925, -0.04962807148694992, -0.08259710669517517, -0.019803237169981003, 0.01014375127851963, -0.029110757634043694, -0.08091162145137787, -0.06103429198265076, 0.11907683312892914, 0.03140923008322716, 0.031353261321783066, -0.0022595147602260113, 0.03975947946310043, 0.0715734139084816, 0.0940445065498352, -0.03988358750939369, 0.05091856047511101, 0.032715924084186554, -0.023738093674182892, 0.05568024516105652, -0.05121755599975586, -0.10054199397563934, 0.07909321039915085, -0.0024837134405970573, 0.03842375427484512, 0.024132611230015755, 0.03415047377347946, -0.010039947926998138, -0.07268911600112915, 0.16400985419750214, -0.07799428701400757, -0.010062369517982006, -0.017368951812386513, 0.011783154681324959, 0.04317630082368851, 0.03294560685753822, -0.006902665831148624, -0.0483301505446434, -0.005005409941077232, -0.05589494854211807, -0.028753917664289474, -0.05688650161027908, -0.11766806244850159, -0.0024044732563197613, -0.04099828004837036, -0.03164854273200035, -0.14023609459400177, -0.21564960479736328, -0.019097043201327324, 0.06532291322946548, -0.00264376075938344, -0.010872555896639824, 0.026095028966665268, 0.015080509707331657, -0.02165299281477928, 0.009905476123094559, -0.044839318841695786, -0.0005807168781757355, -0.008614934049546719, -0.031237516552209854, 0.05744248628616333, -0.038685061037540436, 0.02393329329788685, -0.06876834481954575, 0.020221581682562828, -0.21284058690071106, 0.0857897475361824, -0.03351830691099167, 0.006577173247933388, -0.03599495440721512, -0.045109692960977554, 
0.010199209675192833, 0.04729072004556656, -0.010938170365989208, 0.11600667238235474, -0.13174736499786377, -0.051433660089969635, 0.17747855186462402, -0.15783578157424927, -0.0018944032490253448, 0.10076872259378433, -0.04925444722175598, 0.052616070955991745, 0.133223295211792, 0.10027272999286652, 0.08533817529678345, -0.07071191817522049, 0.012071198783814907, 0.058577630668878555, -0.0683411955833435, 0.05436497926712036, 0.08797390758991241, -0.024936670437455177, -0.1388717144727707, 0.030283741652965546, -0.0730421170592308, -0.008379347622394562, -0.02468801848590374, -0.022275744006037712, 0.007891066372394562, -0.038426656275987625, 0.027956482023000717, 0.007049198728054762, 0.017929568886756897, -0.04308294132351875, -0.08358180522918701, 0.02384893037378788, 0.07475665211677551, -0.0703710988163948, 0.04194197803735733, -0.07238620519638062, 0.05919624865055084, -0.07425626367330551, -0.004489368759095669, -0.1659591794013977, -0.02428033947944641, 0.04316503927111626, -0.044715236872434616, 0.05082293227314949, 0.09068740904331207, 0.003827276173979044, 0.12145474553108215, -0.04072465002536774, 0.0018832694040611386, -0.006577039137482643, -0.009814491495490074, -0.04789906367659569, -0.12348677217960358, -0.08058638870716095, -0.06713850796222687, 0.10210267454385757, -0.07489418238401413, 0.02915346249938011, -0.06812278926372528, -0.022680681198835373, -0.008687233552336693, -0.058024484664201736, -0.005723251961171627, 0.010574725456535816, -0.027350839227437973, -0.04534482955932617, 0.048601292073726654, 0.051533475518226624, -0.056945838034152985, 0.07751922309398651, -0.1038648933172226, -0.060597069561481476, 0.057417429983615875, 0.010300812311470509, -0.08098912984132767, 0.0950096994638443, -0.019799398258328438, -0.0135567095130682, -0.05522196739912033, -0.043175142258405685, 0.19686084985733032, -0.025566257536411285, 0.10167200863361359, -0.09016573429107666, 0.0016788949724286795, 0.029151376336812973, -0.04388202354311943, -0.01565055549144745, 0.058349139988422394, 0.04835319519042969, -0.17811337113380432, 0.014270763844251633, 0.050880447030067444, 0.07718337327241898, 0.11249297857284546, 0.0272434763610363, -0.022644167765975, -0.045150019228458405, -0.01381070539355278, 0.005515693686902523, 0.056898392736911774, -0.02728286013007164, -0.009786690585315228, 0.031164247542619705, 0.05715899169445038, 0.019186314195394516, -0.0802125334739685, 0.03665352612733841, 0.06737753748893738, -0.01473408006131649, -0.03851655498147011, -0.026154961436986923, -0.06009042635560036, 0.0609453022480011, 0.05382322520017624, 0.034778617322444916, 0.02633046917617321, -0.014877327717840672, -0.1357225626707077, 0.1877109855413437, -0.11556632816791534, -0.25854989886283875, -0.10906937718391418, -0.06289386749267578, -0.024091044440865517, 0.040950506925582886, 0.057516977190971375, -0.03538420423865318, -0.04454997926950455, -0.1149262860417366, 0.05741409957408905, -0.06610694527626038, -0.029837720096111298, -0.009590506553649902, -0.05152148753404617, -0.019053157418966293, -0.1256234347820282, -0.011511024087667465, -0.030716627836227417, -0.07429628074169159, 0.007985908538103104, -0.03270586580038071, 0.02949686348438263, 0.13748612999916077, 0.03492136672139168, -0.018675463274121284, -0.01681416854262352, 0.18987809121608734, 0.00898917019367218, 0.06063128262758255, 0.11271354556083679, -0.02799913100898266, 0.05164635553956032, 0.04723730683326721, 0.02569885179400444, -0.04738463833928108, 0.011860719881951809, -0.014740736223757267, 
-0.12045559287071228, -0.17756712436676025, -0.071871317923069, -0.00389622850343585, 0.006903244182467461, 0.020976562052965164, 0.0354783833026886, 0.025519706308841705, 0.04017006978392601, -0.028528843075037003, 0.029081804677844048, -0.014364361763000488, 0.08053304255008698, 0.030924513936042786, -0.0747973695397377, 0.09304337203502655, -0.061700910329818726, 0.01526452973484993, 0.10957139730453491, -0.06166977435350418, 0.19295522570610046, 0.025684703141450882, 0.061743274331092834, 0.10193954408168793, 0.019899748265743256, 0.05612610653042793, 0.08574538677930832, -0.04645709693431854, 0.005045220255851746, -0.05966751277446747, -0.05118953064084053, -0.03318747505545616, 0.04961513727903366, 0.026581674814224243, 0.018059976398944855, -0.11965392529964447, 0.01915920153260231, -0.0008357820333912969, 0.1356239765882492, 0.04645426198840141, -0.12234009802341461, -0.12330947816371918, 0.033573903143405914, -0.04338807985186577, -0.062112413346767426, 0.03147370368242264, 0.05706392601132393, -0.15173447132110596, 0.04576646164059639, -0.0069944364950060844, 0.06552991271018982, -0.09036438912153244, 0.017472954466938972, -0.040184251964092255, -0.0010947193950414658, 0.004512821789830923, 0.06915009766817093, -0.13234968483448029, 0.10472521185874939, 0.021023519337177277, 0.0491340234875679, -0.07958108186721802, 0.016363127157092094, -0.009109897539019585, 0.11037970334291458, 0.11576943099498749, 0.04359506443142891, -0.05455242097377777, -0.01437947154045105, -0.04352042078971863, 0.02150920405983925, 0.05730899050831795, -0.07604549825191498, 0.061071377247571945, 0.01068480871617794, 0.0059869312681257725, -0.023206066340208054, 0.013141339644789696, -0.1325201690196991, -0.12375444173812866, 0.06159321963787079, -0.07549513131380081, -0.10080646723508835, -0.05652065575122833, -0.06375379115343094, -0.05411727726459503, 0.21517281234264374, -0.1109459400177002, -0.09054513275623322, -0.0983750969171524, -0.01506802812218666, 0.04575251787900925, -0.06446859985589981, 0.04834058880805969, -0.036283280700445175, 0.09341870248317719, -0.04618379473686218, -0.10908912122249603, 0.03441299498081207, -0.11398054659366608, -0.11212240159511566, -0.04376033693552017, 0.10577333718538284, 0.11362339556217194, 0.03731876239180565, 0.010010122321546078, 0.013578562065958977, -0.0036317892372608185, -0.11760623753070831, 0.017277171835303307, 0.12558813393115997, 0.00017271749675273895, 0.07249908149242401, -0.05952204018831253, 0.02364230901002884, -0.015779204666614532, -0.0009690877050161362, 0.1297471970319748, 0.18236494064331055, -0.06391270458698273, 0.17487525939941406, 0.20249241590499878, -0.1052977666258812, -0.1903698742389679, -0.05146821215748787, -0.004407123662531376, 0.04545726627111435, 0.04844788834452629, -0.18710362911224365, 0.09278438240289688, 0.029976369813084602, -0.031142374500632286, 0.01909301057457924, -0.22938869893550873, -0.11110634356737137, 0.09124064445495605, 0.056506916880607605, 0.18897846341133118, -0.07971633970737457, -0.03970952332019806, -0.0159887857735157, -0.04130467772483826, 0.04311664029955864, -0.04088625684380531, 0.0883423313498497, 0.006038336083292961, -0.027979178354144096, 0.0013187369331717491, -0.03181987255811691, 0.09449299424886703, 0.04294399172067642, 0.0225227028131485, -0.07031343132257462, -0.005166895687580109, 0.11048775911331177, -0.038186851888895035, 0.09705811738967896, 0.04353891313076019, 0.0744962990283966, -0.0979899987578392, -0.059190839529037476, -0.07537662982940674, 0.0438220240175724, 
-0.04183182865381241, -0.05492149293422699, -0.0644126832485199, 0.059355273842811584, 0.03654433414340019, 0.010902010835707188, -0.002671627327799797, -0.037639014422893524, 0.04549011215567589, 0.0888984203338623, 0.0818091556429863, -0.03847871348261833, -0.07354674488306046, -0.04946880787611008, -0.048405926674604416, 0.06617455184459686, -0.09466615319252014, 0.017670156434178352, 0.028859853744506836, 0.009988471865653992, 0.08933413773775101, 0.034227028489112854, -0.13816168904304504, 0.009488312527537346, 0.03574744239449501, -0.1260794699192047, -0.10350845754146576, -0.02105695940554142, 0.027494460344314575, -0.035413675010204315, 0.05444186553359032, 0.145705908536911, -0.036425571888685226, -0.031197700649499893, -0.04783123731613159, 0.037873417139053345, -0.018169503659009933, 0.048445913940668106, 0.06493334472179413, 0.030588887631893158, -0.0739128589630127, 0.07471486926078796, 0.04043737053871155, -0.04104230925440788, 0.04147026315331459, 0.045365575700998306, -0.0969076082110405, -0.07967530190944672, -0.06084420904517174, 0.09304626286029816, -0.021203013136982918, -0.04810003191232681, -0.0033113155514001846, -0.08353212475776672, 0.06991855800151825, 0.06926418840885162, 0.04708937928080559, 0.034891046583652496, -0.0869489386677742, 0.014861207455396652, -0.05480199307203293, 0.036704786121845245, -0.032882705330848694, -0.00376986525952816, -0.055036760866642, 0.06521224975585938, 0.06475171446800232, 0.09752345830202103, -0.033703047782182693, -0.07462954521179199, -0.082718625664711, -0.011754236184060574, -0.06110750883817673, -0.03403562307357788, -0.07498003542423248, -0.006169326603412628, 0.00012525450438261032, -0.0014289841055870056, 0.019360100850462914, 0.03730273246765137, -0.043339841067790985, -0.018216023221611977, -0.03294891119003296, 0.03756387531757355, -0.06024010106921196, 0.007507196627557278, 0.015220848843455315, -0.03639623895287514, 0.09230463206768036, 0.035080038011074066, -0.012049638666212559, 0.04700206592679024, -0.01862107217311859, 0.033265963196754456, -0.022827625274658203, 0.0004432559944689274, -0.02289080061018467, -0.11135406792163849, -0.00415798369795084, 0.004684740677475929, -0.026585813611745834, 0.010289293713867664, 0.05694889277219772, -0.07252946496009827, 0.0868014395236969, 0.04846711456775665, -0.028584156185388565, -0.07243877649307251, 0.04060978442430496, -0.014481335878372192, 0.026967378333210945, 0.06840464472770691, -0.033493801951408386, 0.05207255482673645, -0.09961047768592834, -0.02921202778816223, 0.002075592754408717, -0.0055097416043281555, -0.010839508846402168, -0.05239320546388626, -0.0030830129981040955, 0.004492412321269512, 0.17555372416973114, -0.022819768637418747, 0.03472822159528732, 0.014743523672223091, 0.006334725767374039, 0.04458502307534218, -0.012185433879494667, 0.07072173058986664, -0.007301293313503265, -0.027052361518144608, -0.012750177644193172, 0.036811165511608124, 0.004034580662846565, 0.006747357547283173, 0.13949990272521973, 0.045382313430309296, 0.09645644575357437, 0.07454150915145874, 0.017528846859931946, 0.015942949801683426, -0.13050603866577148, -0.08437510579824448, 0.005121488124132156, 0.06016233190894127, -0.01636585406959057, 0.009690087288618088, 0.09413596987724304, -0.08625638484954834, 0.07201897352933884, 0.04623861610889435, -0.048315804451704025, -0.12382026761770248, -0.18537135422229767, -0.02360994927585125, -0.031821802258491516, -0.01107656117528677, -0.09188114106655121, 0.014925661496818066, 0.09322504699230194, 0.025149747729301453, 
-0.010996508412063122, 0.09691451489925385, -0.10490620136260986, -0.029896032065153122, 0.04404717683792114, -0.02750665508210659, 0.014373253099620342, 0.04930519685149193, 0.02481013350188732, -0.007011041045188904, 0.04333491995930672, 0.038803331553936005, 0.042916543781757355, 0.025893284007906914, 0.053190141916275024, -0.021451158449053764, -0.07286722958087921, -0.03235971927642822, -0.00443303631618619, 0.055270373821258545, 0.13446013629436493, 0.021661609411239624, -0.06949999928474426, 0.007257902063429356, 0.10846520215272903, -0.03134332224726677, -0.04812786355614662, -0.1092165932059288, 0.23644661903381348, 0.02376263588666916, 0.00024923565797507763, -0.004184025805443525, -0.04512973129749298, 0.005930908024311066, 0.21397756040096283, 0.22658708691596985, 0.0050736526027321815, -0.010180405341088772, 0.011372802779078484, -0.011917751282453537, 0.03677719458937645, 0.14625656604766846, 0.005730358883738518, 0.25053760409355164, -0.04745651036500931, 0.03749299794435501, -0.042403772473335266, -0.03960262984037399, -0.1007813811302185, 0.07276517897844315, -0.007830746471881866, 0.0074933553114533424, -0.029773706570267677, 0.07202380895614624, -0.03827481344342232, -0.17218206822872162, 0.004856039769947529, -0.003637766931205988, -0.062339212745428085, 0.01045563630759716, 0.0016283141449093819, 0.01880304515361786, 0.08283799886703491, -0.017866160720586777, -0.008954200893640518, 0.13055044412612915, 0.018774041905999184, -0.09624449163675308, -0.06146926432847977, 0.115774966776371, 0.017285803332924843, 0.14165863394737244, 0.010666786693036556, 0.07836934179067612, 0.08825366944074631, 0.021473323926329613, -0.0970347672700882, 0.04111386835575104, -0.020428812131285667, -0.02980787307024002, 0.00732828164473176, 0.10802339762449265, -0.00807476881891489, 0.05894557386636734, 0.026240874081850052, -0.08731481432914734, 0.06074083596467972, 0.00980064645409584, -0.033152349293231964, -0.07931486517190933, 0.08375070989131927, -0.09182435274124146, 0.15592828392982483, 0.12299783527851105, -0.014349951408803463, -0.04644537717103958, -0.028295207768678665, 0.018809396773576736, -0.0009228982962667942, 0.056721072643995285, -0.024856364354491234, -0.13444536924362183, 0.020954636856913567, -0.08076721429824829, 0.027613097801804543, -0.2539139688014984, -0.08808885514736176, 0.030370764434337616, -0.017425835132598877, -0.019218046218156815, 0.051264215260744095, 0.04833482205867767, 0.027576953172683716, -0.03660662844777107, 0.02837611921131611, -0.03755667433142662, 0.05849800258874893, -0.11135183274745941, -0.09463202953338623 ]
null
null
transformers
# MultiBERTs Seed 4 Checkpoint 180k (uncased) Seed 4 intermediate checkpoint 180k MultiBERTs (pretrained BERT) model on the English language using a masked language modeling (MLM) objective. It was introduced in [this paper](https://arxiv.org/pdf/2106.16163.pdf) and first released in [this repository](https://github.com/google-research/language/tree/master/language/multiberts). This is an intermediate checkpoint. The final checkpoint can be found at [multiberts-seed-4](https://hf.co/multberts-seed-4). This model is uncased: it does not make a difference between english and English. Disclaimer: The team releasing MultiBERTs did not write a model card for this model, so this model card has been written by [gchhablani](https://hf.co/gchhablani). ## Model description MultiBERTs models are transformer models pretrained on a large corpus of English data in a self-supervised fashion. This means they were pretrained on the raw texts only, with no humans labelling them in any way (which is why they can use lots of publicly available data) with an automatic process to generate inputs and labels from those texts. More precisely, they were pretrained with two objectives: - Masked language modeling (MLM): taking a sentence, the model randomly masks 15% of the words in the input, then runs the entire masked sentence through the model and has to predict the masked words. This is different from traditional recurrent neural networks (RNNs) that usually see the words one after the other, or from autoregressive models like GPT which internally mask the future tokens. It allows the model to learn a bidirectional representation of the sentence. - Next sentence prediction (NSP): the model concatenates two masked sentences as inputs during pretraining. Sometimes they correspond to sentences that were next to each other in the original text, sometimes not. The model then has to predict if the two sentences were following each other or not. This way, the model learns an inner representation of the English language that can then be used to extract features useful for downstream tasks: if you have a dataset of labeled sentences, for instance, you can train a standard classifier using the features produced by the MultiBERTs model as inputs. ## Intended uses & limitations You can use the raw model for either masked language modeling or next sentence prediction, but it's mostly intended to be fine-tuned on a downstream task. See the [model hub](https://huggingface.co/models?filter=multiberts) to look for fine-tuned versions on a task that interests you. Note that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked) to make decisions, such as sequence classification, token classification or question answering. For tasks such as text generation, you should look at a model like GPT2. ### How to use Here is how to use this model to get the features of a given text in PyTorch: ```python from transformers import BertTokenizer, BertModel tokenizer = BertTokenizer.from_pretrained('multiberts-seed-4-180k') model = BertModel.from_pretrained("multiberts-seed-4-180k") text = "Replace me by any text you'd like." encoded_input = tokenizer(text, return_tensors='pt') output = model(**encoded_input) ``` ### Limitations and bias Even if the training data used for this model could be characterized as fairly neutral, this model can have biased predictions. This bias will also affect all fine-tuned versions of this model. 
For an understanding of bias of this particular checkpoint, please try out this checkpoint with the snippet present in the [Limitation and bias section](https://huggingface.co/bert-base-uncased#limitations-and-bias) of the [bert-base-uncased](https://huggingface.co/bert-base-uncased) checkpoint. ## Training data The MultiBERTs models were pretrained on [BookCorpus](https://yknzhu.wixsite.com/mbweb), a dataset consisting of 11,038 unpublished books and [English Wikipedia](https://en.wikipedia.org/wiki/English_Wikipedia) (excluding lists, tables and headers). ## Training procedure ### Preprocessing The texts are lowercased and tokenized using WordPiece and a vocabulary size of 30,000. The inputs of the model are then of the form: ``` [CLS] Sentence A [SEP] Sentence B [SEP] ``` With probability 0.5, sentence A and sentence B correspond to two consecutive sentences in the original corpus and in the other cases, it's another random sentence in the corpus. Note that what is considered a sentence here is a consecutive span of text usually longer than a single sentence. The only constraint is that the result with the two "sentences" has a combined length of less than 512 tokens. The details of the masking procedure for each sentence are the following: - 15% of the tokens are masked. - In 80% of the cases, the masked tokens are replaced by `[MASK]`. - In 10% of the cases, the masked tokens are replaced by a random token (different from the one they replace). - In the remaining 10% of cases, the masked tokens are left as is. ### Pretraining The full model was trained on 16 Cloud TPU v2 chips for two million steps with a batch size of 256. The sequence length was set to 512 throughout. The optimizer used is Adam with a learning rate of 1e-4, \\(\beta_{1} = 0.9\\) and \\(\beta_{2} = 0.999\\), a weight decay of 0.01, learning rate warmup for 10,000 steps and linear decay of the learning rate after. ### BibTeX entry and citation info ```bibtex @article{DBLP:journals/corr/abs-2106-16163, author = {Thibault Sellam and Steve Yadlowsky and Jason Wei and Naomi Saphra and Alexander D'Amour and Tal Linzen and Jasmijn Bastings and Iulia Turc and Jacob Eisenstein and Dipanjan Das and Ian Tenney and Ellie Pavlick}, title = {The MultiBERTs: {BERT} Reproductions for Robustness Analysis}, journal = {CoRR}, volume = {abs/2106.16163}, year = {2021}, url = {https://arxiv.org/abs/2106.16163}, eprinttype = {arXiv}, eprint = {2106.16163}, timestamp = {Mon, 05 Jul 2021 15:15:50 +0200}, biburl = {https://dblp.org/rec/journals/corr/abs-2106-16163.bib}, bibsource = {dblp computer science bibliography, https://dblp.org} } ``` <a href="https://huggingface.co/exbert/?model=multiberts"> <img width="300px" src="https://cdn-media.huggingface.co/exbert/button.png"> </a>
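The usage snippet in the card above only extracts hidden states. Since the checkpoint is trained with a masked-language-modeling head, a quick fill-mask probe is a natural complement; the sketch below is illustrative and reuses the short checkpoint id from the card's snippet, which may need the full hub prefix (e.g. `MultiBertGunjanPatrick/multiberts-seed-4-180k`) to resolve.

```python
import torch
from transformers import BertTokenizer, BertForMaskedLM

# Short id mirroring the card's snippet; adjust to the actual hub id if needed.
checkpoint = "multiberts-seed-4-180k"

tokenizer = BertTokenizer.from_pretrained(checkpoint)
model = BertForMaskedLM.from_pretrained(checkpoint)

text = f"The capital of France is {tokenizer.mask_token}."
inputs = tokenizer(text, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

# Locate the [MASK] position and take the highest-scoring vocabulary entry for it.
mask_positions = (inputs.input_ids == tokenizer.mask_token_id).nonzero(as_tuple=True)[1]
predicted_ids = logits[0, mask_positions].argmax(dim=-1)
print(tokenizer.decode(predicted_ids))
```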
{"language": "en", "license": "apache-2.0", "tags": ["exbert", "multiberts", "multiberts-seed-4"], "datasets": ["bookcorpus", "wikipedia"]}
null
MultiBertGunjanPatrick/multiberts-seed-4-180k
[ "transformers", "pytorch", "bert", "pretraining", "exbert", "multiberts", "multiberts-seed-4", "en", "dataset:bookcorpus", "dataset:wikipedia", "arxiv:2106.16163", "license:apache-2.0", "endpoints_compatible", "region:us" ]
2022-03-02T23:29:04+00:00
[ "2106.16163" ]
[ "en" ]
TAGS #transformers #pytorch #bert #pretraining #exbert #multiberts #multiberts-seed-4 #en #dataset-bookcorpus #dataset-wikipedia #arxiv-2106.16163 #license-apache-2.0 #endpoints_compatible #region-us
# MultiBERTs Seed 4 Checkpoint 180k (uncased) Seed 4 intermediate checkpoint 180k MultiBERTs (pretrained BERT) model on English language using a masked language modeling (MLM) objective. It was introduced in this paper and first released in this repository. This is an intermediate checkpoint. The final checkpoint can be found at multiberts-seed-4. This model is uncased: it does not make a difference between english and English. Disclaimer: The team releasing MultiBERTs did not write a model card for this model so this model card has been written by gchhablani. ## Model description MultiBERTs models are transformers model pretrained on a large corpus of English data in a self-supervised fashion. This means it was pretrained on the raw texts only, with no humans labelling them in any way (which is why it can use lots of publicly available data) with an automatic process to generate inputs and labels from those texts. More precisely, it was pretrained with two objectives: - Masked language modeling (MLM): taking a sentence, the model randomly masks 15% of the words in the input then run the entire masked sentence through the model and has to predict the masked words. This is different from traditional recurrent neural networks (RNNs) that usually see the words one after the other, or from autoregressive models like GPT which internally mask the future tokens. It allows the model to learn a bidirectional representation of the sentence. - Next sentence prediction (NSP): the models concatenates two masked sentences as inputs during pretraining. Sometimes they correspond to sentences that were next to each other in the original text, sometimes not. The model then has to predict if the two sentences were following each other or not. This way, the model learns an inner representation of the English language that can then be used to extract features useful for downstream tasks: if you have a dataset of labeled sentences for instance, you can train a standard classifier using the features produced by the MultiBERTs model as inputs. ## Intended uses & limitations You can use the raw model for either masked language modeling or next sentence prediction, but it's mostly intended to be fine-tuned on a downstream task. See the model hub to look for fine-tuned versions on a task that interests you. Note that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked) to make decisions, such as sequence classification, token classification or question answering. For tasks such as text generation you should look at model like GPT2. ### How to use Here is how to use this model to get the features of a given text in PyTorch: ### Limitations and bias Even if the training data used for this model could be characterized as fairly neutral, this model can have biased predictions. This bias will also affect all fine-tuned versions of this model. For an understanding of bias of this particular checkpoint, please try out this checkpoint with the snippet present in the Limitation and bias section of the bert-base-uncased checkpoint. ## Training data The MultiBERTs models were pretrained on BookCorpus, a dataset consisting of 11,038 unpublished books and English Wikipedia (excluding lists, tables and headers). ## Training procedure ### Preprocessing The texts are lowercased and tokenized using WordPiece and a vocabulary size of 30,000. 
The inputs of the model are then of the form: With probability 0.5, sentence A and sentence B correspond to two consecutive sentences in the original corpus and in the other cases, it's another random sentence in the corpus. Note that what is considered a sentence here is a consecutive span of text usually longer than a single sentence. The only constrain is that the result with the two "sentences" has a combined length of less than 512 tokens. The details of the masking procedure for each sentence are the following: - 15% of the tokens are masked. - In 80% of the cases, the masked tokens are replaced by '[MASK]'. - In 10% of the cases, the masked tokens are replaced by a random token (different) from the one they replace. - In the 10% remaining cases, the masked tokens are left as is. ### Pretraining The full model was trained on 16 Cloud TPU v2 chips for two million steps with a batch size of 256. The sequence length was set to 512 throughout. The optimizer used is Adam with a learning rate of 1e-4, \\(\beta_{1} = 0.9\\) and \\(\beta_{2} = 0.999\\), a weight decay of 0.01, learning rate warmup for 10,000 steps and linear decay of the learning rate after. ### BibTeX entry and citation info <a href="URL <img width="300px" src="URL </a>
[ "# MultiBERTs Seed 4 Checkpoint 180k (uncased)\nSeed 4 intermediate checkpoint 180k MultiBERTs (pretrained BERT) model on English language using a masked language modeling (MLM) objective. It was introduced in\nthis paper and first released in\nthis repository. This is an intermediate checkpoint.\nThe final checkpoint can be found at multiberts-seed-4. This model is uncased: it does not make a difference\nbetween english and English.\n\nDisclaimer: The team releasing MultiBERTs did not write a model card for this model so this model card has been written by gchhablani.", "## Model description\nMultiBERTs models are transformers model pretrained on a large corpus of English data in a self-supervised fashion. This means it\nwas pretrained on the raw texts only, with no humans labelling them in any way (which is why it can use lots of\npublicly available data) with an automatic process to generate inputs and labels from those texts. More precisely, it\nwas pretrained with two objectives:\n- Masked language modeling (MLM): taking a sentence, the model randomly masks 15% of the words in the input then run\n the entire masked sentence through the model and has to predict the masked words. This is different from traditional\n recurrent neural networks (RNNs) that usually see the words one after the other, or from autoregressive models like\n GPT which internally mask the future tokens. It allows the model to learn a bidirectional representation of the\n sentence.\n- Next sentence prediction (NSP): the models concatenates two masked sentences as inputs during pretraining. Sometimes\n they correspond to sentences that were next to each other in the original text, sometimes not. The model then has to\n predict if the two sentences were following each other or not.\nThis way, the model learns an inner representation of the English language that can then be used to extract features\nuseful for downstream tasks: if you have a dataset of labeled sentences for instance, you can train a standard\nclassifier using the features produced by the MultiBERTs model as inputs.", "## Intended uses & limitations\nYou can use the raw model for either masked language modeling or next sentence prediction, but it's mostly intended to\nbe fine-tuned on a downstream task. See the model hub to look for\nfine-tuned versions on a task that interests you.\nNote that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked)\nto make decisions, such as sequence classification, token classification or question answering. For tasks such as text\ngeneration you should look at model like GPT2.", "### How to use\nHere is how to use this model to get the features of a given text in PyTorch:", "### Limitations and bias\nEven if the training data used for this model could be characterized as fairly neutral, this model can have biased\npredictions. This bias will also affect all fine-tuned versions of this model. For an understanding of bias of this particular\ncheckpoint, please try out this checkpoint with the snippet present in the Limitation and bias section of the bert-base-uncased checkpoint.", "## Training data\nThe MultiBERTs models were pretrained on BookCorpus, a dataset consisting of 11,038\nunpublished books and English Wikipedia (excluding lists, tables and\nheaders).", "## Training procedure", "### Preprocessing\nThe texts are lowercased and tokenized using WordPiece and a vocabulary size of 30,000. 
The inputs of the model are\nthen of the form:\n\nWith probability 0.5, sentence A and sentence B correspond to two consecutive sentences in the original corpus and in\nthe other cases, it's another random sentence in the corpus. Note that what is considered a sentence here is a\nconsecutive span of text usually longer than a single sentence. The only constrain is that the result with the two\n\"sentences\" has a combined length of less than 512 tokens.\nThe details of the masking procedure for each sentence are the following:\n- 15% of the tokens are masked.\n- In 80% of the cases, the masked tokens are replaced by '[MASK]'.\n- In 10% of the cases, the masked tokens are replaced by a random token (different) from the one they replace.\n- In the 10% remaining cases, the masked tokens are left as is.", "### Pretraining\nThe full model was trained on 16 Cloud TPU v2 chips for two million steps with a batch size\nof 256. The sequence length was set to 512 throughout. The optimizer\nused is Adam with a learning rate of 1e-4, \\\\(\\beta_{1} = 0.9\\\\) and \\\\(\\beta_{2} = 0.999\\\\), a weight decay of 0.01,\nlearning rate warmup for 10,000 steps and linear decay of the learning rate after.", "### BibTeX entry and citation info\n\n<a href=\"URL\n\t<img width=\"300px\" src=\"URL\n</a>" ]
[ "TAGS\n#transformers #pytorch #bert #pretraining #exbert #multiberts #multiberts-seed-4 #en #dataset-bookcorpus #dataset-wikipedia #arxiv-2106.16163 #license-apache-2.0 #endpoints_compatible #region-us \n", "# MultiBERTs Seed 4 Checkpoint 180k (uncased)\nSeed 4 intermediate checkpoint 180k MultiBERTs (pretrained BERT) model on English language using a masked language modeling (MLM) objective. It was introduced in\nthis paper and first released in\nthis repository. This is an intermediate checkpoint.\nThe final checkpoint can be found at multiberts-seed-4. This model is uncased: it does not make a difference\nbetween english and English.\n\nDisclaimer: The team releasing MultiBERTs did not write a model card for this model so this model card has been written by gchhablani.", "## Model description\nMultiBERTs models are transformers model pretrained on a large corpus of English data in a self-supervised fashion. This means it\nwas pretrained on the raw texts only, with no humans labelling them in any way (which is why it can use lots of\npublicly available data) with an automatic process to generate inputs and labels from those texts. More precisely, it\nwas pretrained with two objectives:\n- Masked language modeling (MLM): taking a sentence, the model randomly masks 15% of the words in the input then run\n the entire masked sentence through the model and has to predict the masked words. This is different from traditional\n recurrent neural networks (RNNs) that usually see the words one after the other, or from autoregressive models like\n GPT which internally mask the future tokens. It allows the model to learn a bidirectional representation of the\n sentence.\n- Next sentence prediction (NSP): the models concatenates two masked sentences as inputs during pretraining. Sometimes\n they correspond to sentences that were next to each other in the original text, sometimes not. The model then has to\n predict if the two sentences were following each other or not.\nThis way, the model learns an inner representation of the English language that can then be used to extract features\nuseful for downstream tasks: if you have a dataset of labeled sentences for instance, you can train a standard\nclassifier using the features produced by the MultiBERTs model as inputs.", "## Intended uses & limitations\nYou can use the raw model for either masked language modeling or next sentence prediction, but it's mostly intended to\nbe fine-tuned on a downstream task. See the model hub to look for\nfine-tuned versions on a task that interests you.\nNote that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked)\nto make decisions, such as sequence classification, token classification or question answering. For tasks such as text\ngeneration you should look at model like GPT2.", "### How to use\nHere is how to use this model to get the features of a given text in PyTorch:", "### Limitations and bias\nEven if the training data used for this model could be characterized as fairly neutral, this model can have biased\npredictions. This bias will also affect all fine-tuned versions of this model. 
For an understanding of bias of this particular\ncheckpoint, please try out this checkpoint with the snippet present in the Limitation and bias section of the bert-base-uncased checkpoint.", "## Training data\nThe MultiBERTs models were pretrained on BookCorpus, a dataset consisting of 11,038\nunpublished books and English Wikipedia (excluding lists, tables and\nheaders).", "## Training procedure", "### Preprocessing\nThe texts are lowercased and tokenized using WordPiece and a vocabulary size of 30,000. The inputs of the model are\nthen of the form:\n\nWith probability 0.5, sentence A and sentence B correspond to two consecutive sentences in the original corpus and in\nthe other cases, it's another random sentence in the corpus. Note that what is considered a sentence here is a\nconsecutive span of text usually longer than a single sentence. The only constrain is that the result with the two\n\"sentences\" has a combined length of less than 512 tokens.\nThe details of the masking procedure for each sentence are the following:\n- 15% of the tokens are masked.\n- In 80% of the cases, the masked tokens are replaced by '[MASK]'.\n- In 10% of the cases, the masked tokens are replaced by a random token (different) from the one they replace.\n- In the 10% remaining cases, the masked tokens are left as is.", "### Pretraining\nThe full model was trained on 16 Cloud TPU v2 chips for two million steps with a batch size\nof 256. The sequence length was set to 512 throughout. The optimizer\nused is Adam with a learning rate of 1e-4, \\\\(\\beta_{1} = 0.9\\\\) and \\\\(\\beta_{2} = 0.999\\\\), a weight decay of 0.01,\nlearning rate warmup for 10,000 steps and linear decay of the learning rate after.", "### BibTeX entry and citation info\n\n<a href=\"URL\n\t<img width=\"300px\" src=\"URL\n</a>" ]
[ 71, 145, 335, 134, 25, 95, 48, 3, 222, 111, 34 ]
[ "passage: TAGS\n#transformers #pytorch #bert #pretraining #exbert #multiberts #multiberts-seed-4 #en #dataset-bookcorpus #dataset-wikipedia #arxiv-2106.16163 #license-apache-2.0 #endpoints_compatible #region-us \n# MultiBERTs Seed 4 Checkpoint 180k (uncased)\nSeed 4 intermediate checkpoint 180k MultiBERTs (pretrained BERT) model on English language using a masked language modeling (MLM) objective. It was introduced in\nthis paper and first released in\nthis repository. This is an intermediate checkpoint.\nThe final checkpoint can be found at multiberts-seed-4. This model is uncased: it does not make a difference\nbetween english and English.\n\nDisclaimer: The team releasing MultiBERTs did not write a model card for this model so this model card has been written by gchhablani.", "passage: ## Model description\nMultiBERTs models are transformers model pretrained on a large corpus of English data in a self-supervised fashion. This means it\nwas pretrained on the raw texts only, with no humans labelling them in any way (which is why it can use lots of\npublicly available data) with an automatic process to generate inputs and labels from those texts. More precisely, it\nwas pretrained with two objectives:\n- Masked language modeling (MLM): taking a sentence, the model randomly masks 15% of the words in the input then run\n the entire masked sentence through the model and has to predict the masked words. This is different from traditional\n recurrent neural networks (RNNs) that usually see the words one after the other, or from autoregressive models like\n GPT which internally mask the future tokens. It allows the model to learn a bidirectional representation of the\n sentence.\n- Next sentence prediction (NSP): the models concatenates two masked sentences as inputs during pretraining. Sometimes\n they correspond to sentences that were next to each other in the original text, sometimes not. The model then has to\n predict if the two sentences were following each other or not.\nThis way, the model learns an inner representation of the English language that can then be used to extract features\nuseful for downstream tasks: if you have a dataset of labeled sentences for instance, you can train a standard\nclassifier using the features produced by the MultiBERTs model as inputs.## Intended uses & limitations\nYou can use the raw model for either masked language modeling or next sentence prediction, but it's mostly intended to\nbe fine-tuned on a downstream task. See the model hub to look for\nfine-tuned versions on a task that interests you.\nNote that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked)\nto make decisions, such as sequence classification, token classification or question answering. For tasks such as text\ngeneration you should look at model like GPT2.### How to use\nHere is how to use this model to get the features of a given text in PyTorch:### Limitations and bias\nEven if the training data used for this model could be characterized as fairly neutral, this model can have biased\npredictions. This bias will also affect all fine-tuned versions of this model. 
For an understanding of bias of this particular\ncheckpoint, please try out this checkpoint with the snippet present in the Limitation and bias section of the bert-base-uncased checkpoint.## Training data\nThe MultiBERTs models were pretrained on BookCorpus, a dataset consisting of 11,038\nunpublished books and English Wikipedia (excluding lists, tables and\nheaders).## Training procedure" ]
[ -0.08491192758083344, -0.002291853539645672, -0.0021992907859385014, 0.06951545923948288, 0.08458632230758667, 0.002994856331497431, 0.11486030369997025, 0.04828564450144768, -0.03236996382474899, 0.024468207731842995, 0.09307612478733063, 0.03082793951034546, 0.041427768766880035, 0.06622768938541412, 0.0963861495256424, -0.2584333121776581, 0.04895610362291336, -0.06171862408518791, 0.052182428538799286, 0.07562986761331558, 0.09814631193876266, -0.0723581314086914, 0.06368423998355865, 0.036832112818956375, -0.08401891589164734, -0.016848573461174965, -0.0174407921731472, -0.03405112028121948, 0.10166414082050323, 0.07174473255872726, 0.06221022456884384, 0.0013237334787845612, 0.05707256495952606, -0.08665936440229416, 0.016315069049596786, 0.04487959295511246, -0.00209901574999094, 0.025771990418434143, -0.007227636873722076, 0.01474524661898613, 0.10875744372606277, 0.0360335111618042, 0.07611698657274246, 0.034476399421691895, -0.09578533470630646, -0.11996029317378998, -0.07974118739366531, 0.10072940587997437, 0.05143072083592415, 0.043614938855171204, -0.006988901644945145, 0.0713275820016861, -0.030536532402038574, 0.07353582978248596, 0.10530023276805878, -0.2605888545513153, -0.009784238412976265, 0.06261888891458511, 0.04550793021917343, 0.043856553733348846, 0.010892735794186592, 0.02805253677070141, 0.007295254617929459, 0.044698163866996765, 0.03136252239346504, -0.02379443123936653, 0.12364956736564636, -0.04375129193067551, -0.15095430612564087, -0.04177459329366684, 0.11967291682958603, -0.006179800257086754, -0.1244506910443306, -0.10124972462654114, -0.030134985223412514, 0.11455301940441132, -0.003374227322638035, -0.01734696514904499, -0.0041129677556455135, 0.009967661462724209, 0.024126874282956123, -0.09258106350898743, -0.08723652362823486, -0.026644304394721985, -0.03742671385407448, 0.12833498418331146, 0.047386087477207184, 0.05123216658830643, -0.03448616713285446, 0.08554863184690475, -0.11448948085308075, -0.03934899717569351, -0.050469834357500076, -0.08409235626459122, -0.019209273159503937, 0.011543015949428082, -0.027722854167222977, -0.07990925759077072, -0.0594366230070591, 0.11928188800811768, 0.031230919063091278, 0.03085431456565857, -0.0026748981326818466, 0.0395498052239418, 0.07202649116516113, 0.09461811184883118, -0.040435902774333954, 0.05157620459794998, 0.03266322240233421, -0.022807862609624863, 0.056532714515924454, -0.051404714584350586, -0.09978610277175903, 0.07591976225376129, -0.0017196843400597572, 0.03925872594118118, 0.023906540125608444, 0.035828735679388046, -0.009951194748282433, -0.0725109875202179, 0.16950176656246185, -0.07720430940389633, -0.009991956874728203, -0.015403487719595432, 0.012305820360779762, 0.0491475909948349, 0.031903401017189026, -0.008757837116718292, -0.047436006367206573, -0.005822323262691498, -0.05550619959831238, -0.028535351157188416, -0.055559538304805756, -0.11869341135025024, -0.0010593957267701626, -0.04020000249147415, -0.03135452792048454, -0.14102748036384583, -0.21172478795051575, -0.019376659765839577, 0.06362178176641464, -0.0026811882853507996, -0.00861318502575159, 0.025772182270884514, 0.01601114310324192, -0.0202158335596323, 0.009651999920606613, -0.041649527847766876, -0.0011389032006263733, -0.0073393527418375015, -0.03136131912469864, 0.05831360071897507, -0.03958970308303833, 0.023540813475847244, -0.06883707642555237, 0.021451229229569435, -0.21085944771766663, 0.0863536074757576, -0.03338398411870003, 0.0027100909501314163, -0.0377843976020813, -0.045054689049720764, 
0.00946088321506977, 0.04823684319853783, -0.010658347979187965, 0.11487942934036255, -0.13630186021327972, -0.05177989602088928, 0.18017184734344482, -0.15851403772830963, -0.0019623711705207825, 0.10196706652641296, -0.04907645657658577, 0.053150929510593414, 0.13345949351787567, 0.09863905608654022, 0.0828525722026825, -0.07328245788812637, 0.011559314094483852, 0.05914202332496643, -0.06919777393341064, 0.0571223646402359, 0.08888828009366989, -0.023878395557403564, -0.1365097463130951, 0.03009413555264473, -0.07259465754032135, -0.009403835982084274, -0.024817081168293953, -0.02070767991244793, 0.008033081889152527, -0.03735584765672684, 0.030058953911066055, 0.006241948343813419, 0.01707741990685463, -0.04187072813510895, -0.0845845639705658, 0.025514919310808182, 0.07494562864303589, -0.07270582020282745, 0.04154660180211067, -0.07051005214452744, 0.061701614409685135, -0.07370974123477936, -0.003077802713960409, -0.16625019907951355, -0.025872530415654182, 0.043021053075790405, -0.04667548090219498, 0.050384070724248886, 0.09374454617500305, 0.004385635256767273, 0.12265591323375702, -0.04005036503076553, 0.0021189649123698473, -0.0044557563960552216, -0.010262448340654373, -0.04981169104576111, -0.12420961260795593, -0.0822504460811615, -0.06737580895423889, 0.10185819864273071, -0.07671850919723511, 0.027841128408908844, -0.06923948973417282, -0.020887576043605804, -0.008914647623896599, -0.05601576343178749, -0.004393673501908779, 0.01004842109978199, -0.027267441153526306, -0.045144207775592804, 0.049261368811130524, 0.04962776228785515, -0.05810847878456116, 0.07845935225486755, -0.10680606961250305, -0.06045469269156456, 0.056966088712215424, 0.010657792910933495, -0.0803775042295456, 0.09176936745643616, -0.019915178418159485, -0.013607746921479702, -0.05633552372455597, -0.04410066828131676, 0.19364333152770996, -0.02417827397584915, 0.10096216201782227, -0.09013485908508301, 0.0018046790501102805, 0.02819160372018814, -0.04522085562348366, -0.015644079074263573, 0.06148236244916916, 0.04403276741504669, -0.18438489735126495, 0.014560792595148087, 0.054479584097862244, 0.07796112447977066, 0.11388114839792252, 0.026782020926475525, -0.02410952001810074, -0.04574224725365639, -0.011780684813857079, 0.005545051768422127, 0.05550874024629593, -0.023370638489723206, -0.010025817900896072, 0.03237493336200714, 0.057409465312957764, 0.018081754446029663, -0.08026880025863647, 0.03599568456411362, 0.06632889807224274, -0.016051534563302994, -0.03872004523873329, -0.02382485568523407, -0.06024434044957161, 0.061575211584568024, 0.053498052060604095, 0.03452085331082344, 0.026497716084122658, -0.014043403789401054, -0.13474473357200623, 0.1885431408882141, -0.11403527855873108, -0.2579498887062073, -0.10735563933849335, -0.058687157928943634, -0.02570831961929798, 0.040177300572395325, 0.0583273246884346, -0.033045388758182526, -0.04329581558704376, -0.11409418284893036, 0.06339962780475616, -0.06568938493728638, -0.030212586745619774, -0.01366439275443554, -0.0520353838801384, -0.01965654268860817, -0.12640878558158875, -0.011678928509354591, -0.031107468530535698, -0.07463246583938599, 0.007744426839053631, -0.03507240116596222, 0.03092697449028492, 0.1347733438014984, 0.036206845194101334, -0.019877325743436813, -0.017444200813770294, 0.19050270318984985, 0.008983330801129341, 0.06074293330311775, 0.11517711728811264, -0.027937723323702812, 0.052517637610435486, 0.044094037264585495, 0.02562873065471649, -0.047693319618701935, 0.011502831242978573, -0.01382629293948412, 
-0.12076147645711899, -0.1761145442724228, -0.07064796984195709, -0.003915620967745781, 0.007504187524318695, 0.02137141488492489, 0.035099826753139496, 0.023707803338766098, 0.040437061339616776, -0.030651841312646866, 0.03020583465695381, -0.012588579207658768, 0.07993178069591522, 0.031217586249113083, -0.07410167902708054, 0.09452775120735168, -0.062306150794029236, 0.01516128983348608, 0.10925731807947159, -0.05958741903305054, 0.19182443618774414, 0.024671191349625587, 0.06336992979049683, 0.10229888558387756, 0.020865637809038162, 0.055889926850795746, 0.08878292143344879, -0.04655768722295761, 0.005410953424870968, -0.06120670214295387, -0.052184127271175385, -0.035772647708654404, 0.049611084163188934, 0.02793055586516857, 0.01598692685365677, -0.12071669846773148, 0.01824098452925682, -0.0008939778199419379, 0.1367553472518921, 0.04797817021608353, -0.12201297283172607, -0.12385378777980804, 0.034014374017715454, -0.04349493980407715, -0.0618458166718483, 0.03073231875896454, 0.05693431943655014, -0.15218539535999298, 0.04551481828093529, -0.006914823316037655, 0.06650125980377197, -0.09459716081619263, 0.017832260578870773, -0.04482205957174301, 0.0005979025736451149, 0.004338955506682396, 0.07035814225673676, -0.1346028596162796, 0.10605759918689728, 0.02099459618330002, 0.04970027878880501, -0.08081662654876709, 0.01541608851402998, -0.009441438131034374, 0.11015400290489197, 0.11636696010828018, 0.04207129403948784, -0.058613620698451996, -0.017273783683776855, -0.04369749501347542, 0.022029194980859756, 0.05839705839753151, -0.07673610746860504, 0.060889340937137604, 0.0094442805275321, 0.0059369876980781555, -0.023449759930372238, 0.016663260757923126, -0.13352254033088684, -0.12281835824251175, 0.0613136887550354, -0.07790745794773102, -0.09902257472276688, -0.05587293952703476, -0.0638580173254013, -0.05320820212364197, 0.21114876866340637, -0.10991114377975464, -0.09076832234859467, -0.09880133718252182, -0.01656869426369667, 0.04652277007699013, -0.06608033925294876, 0.045483678579330444, -0.03752385079860687, 0.09265589714050293, -0.04696612060070038, -0.11086331307888031, 0.035276856273412704, -0.11384335160255432, -0.11455206573009491, -0.04266989603638649, 0.10683317482471466, 0.11470404267311096, 0.037308353930711746, 0.012235358357429504, 0.01230209693312645, -0.003274058923125267, -0.11761853098869324, 0.01799982227385044, 0.12646526098251343, -0.00018166936933994293, 0.06912270188331604, -0.06077469140291214, 0.02607192099094391, -0.015902627259492874, 0.0010491441935300827, 0.12932321429252625, 0.18159712851047516, -0.06354251503944397, 0.1735788881778717, 0.20301303267478943, -0.10372462868690491, -0.19030947983264923, -0.05505860596895218, -0.0030491529032588005, 0.045976437628269196, 0.04832646995782852, -0.18733489513397217, 0.09074311703443527, 0.0335528589785099, -0.031055061146616936, 0.02122577652335167, -0.22847825288772583, -0.11038986593484879, 0.09107988327741623, 0.054989587515592575, 0.18792042136192322, -0.08096842467784882, -0.03920505940914154, -0.01742699183523655, -0.04007688909769058, 0.04696987569332123, -0.044215429574251175, 0.08991632610559464, 0.0062230490148067474, -0.027883760631084442, 0.000675693154335022, -0.030776843428611755, 0.09546589851379395, 0.04142399877309799, 0.02268364652991295, -0.07005082815885544, -0.003893805667757988, 0.10642900317907333, -0.038742274045944214, 0.09846200048923492, 0.04070292413234711, 0.07350347936153412, -0.09764546155929565, -0.05998866260051727, -0.07708875834941864, 0.0446726530790329, 
-0.04205017536878586, -0.05453874543309212, -0.0633661225438118, 0.058730773627758026, 0.03545556217432022, 0.011396616697311401, -0.0005128681659698486, -0.03845445066690445, 0.045140452682971954, 0.08864745497703552, 0.08241399377584457, -0.03338153660297394, -0.07474566996097565, -0.050756532698869705, -0.04880831018090248, 0.06782029569149017, -0.09582721441984177, 0.018051333725452423, 0.02834325097501278, 0.009101161733269691, 0.0892556831240654, 0.03420434892177582, -0.13837623596191406, 0.011068766936659813, 0.033651839941740036, -0.12512162327766418, -0.10790164768695831, -0.019257131963968277, 0.02270839922130108, -0.03449557349085808, 0.05690779909491539, 0.14792945981025696, -0.03582167625427246, -0.03139021620154381, -0.04792989045381546, 0.03700364753603935, -0.019972119480371475, 0.05013079196214676, 0.06502772867679596, 0.03039282187819481, -0.07390686124563217, 0.07186456769704819, 0.038675203919410706, -0.03701566159725189, 0.041943661868572235, 0.043011777102947235, -0.09622325003147125, -0.08049824088811874, -0.061629921197891235, 0.0908329039812088, -0.022672628983855247, -0.04917365312576294, -0.00018246658146381378, -0.08383098244667053, 0.06815893203020096, 0.07027546316385269, 0.04756103828549385, 0.0362851619720459, -0.08602814376354218, 0.015324876643717289, -0.053444717079401016, 0.035656996071338654, -0.03160008043050766, -0.004304951056838036, -0.05393963307142258, 0.06623343378305435, 0.06406913697719574, 0.0983799397945404, -0.03467297554016113, -0.07404454797506332, -0.08423928916454315, -0.013149229809641838, -0.06609521061182022, -0.034447915852069855, -0.07653448730707169, -0.006533103063702583, 0.00014888495206832886, -0.0016204044222831726, 0.022391900420188904, 0.0364043265581131, -0.04304742068052292, -0.017881622537970543, -0.03336804732680321, 0.0387980192899704, -0.06358476728200912, 0.006682721897959709, 0.014746792614459991, -0.0373031310737133, 0.09260296821594238, 0.03618572652339935, -0.011298919096589088, 0.04818575829267502, -0.0216398723423481, 0.03383217379450798, -0.022286441177129745, -0.00046058977022767067, -0.022212382405996323, -0.11167937517166138, -0.005703425500541925, 0.004247847944498062, -0.026024892926216125, 0.010020162910223007, 0.05563656985759735, -0.0724439024925232, 0.08747157454490662, 0.04764040187001228, -0.029796939343214035, -0.07100775092840195, 0.042222071439027786, -0.01627304218709469, 0.028198104351758957, 0.06963972747325897, -0.03311498090624809, 0.05326908826828003, -0.09893353283405304, -0.02906697988510132, 0.003125435207039118, -0.007486604154109955, -0.009425714612007141, -0.05323313921689987, -0.002175828441977501, 0.004201135598123074, 0.17303888499736786, -0.024398960173130035, 0.03866486996412277, 0.012894412502646446, 0.00681634247303009, 0.04542358219623566, -0.011749818921089172, 0.06988701224327087, -0.007590054534375668, -0.02566189132630825, -0.01402301900088787, 0.03717200458049774, 0.002654523588716984, 0.008498009294271469, 0.1402265876531601, 0.04466680437326431, 0.09302658587694168, 0.07543342560529709, 0.018229996785521507, 0.017066072672605515, -0.1349097043275833, -0.08783917874097824, 0.00760485976934433, 0.05993782356381416, -0.017057955265045166, 0.009823262691497803, 0.09392151236534119, -0.08616340160369873, 0.07086215913295746, 0.046714287251234055, -0.04858050495386124, -0.12419409304857254, -0.19084206223487854, -0.024434229359030724, -0.02956891991198063, -0.01170441135764122, -0.09084023535251617, 0.015338636003434658, 0.09357309341430664, 0.025845417752861977, 
-0.01114923506975174, 0.09621201455593109, -0.10060432553291321, -0.030273715034127235, 0.04490529000759125, -0.026847390457987785, 0.01464185118675232, 0.05127369612455368, 0.02550937607884407, -0.005762025713920593, 0.04131964594125748, 0.038992833346128464, 0.04268958419561386, 0.0255826935172081, 0.05334896594285965, -0.0237804614007473, -0.07402569055557251, -0.03270964324474335, -0.0020327921956777573, 0.05506740137934685, 0.13748741149902344, 0.023294813930988312, -0.07020213454961777, 0.006546959280967712, 0.10704189538955688, -0.03038971498608589, -0.048014454543590546, -0.10914275050163269, 0.2396414875984192, 0.021239232271909714, 0.00191321293823421, -0.004944946616888046, -0.04426993802189827, 0.006550230085849762, 0.21327272057533264, 0.22663646936416626, 0.004043320659548044, -0.009331338107585907, 0.010488157160580158, -0.011337390169501305, 0.03779269754886627, 0.14617186784744263, 0.005601475015282631, 0.2534278631210327, -0.04817771911621094, 0.037319548428058624, -0.04303974658250809, -0.03950494900345802, -0.10155351459980011, 0.07188970595598221, -0.007698242086917162, 0.007290931884199381, -0.02932560257613659, 0.07203661650419235, -0.038401179015636444, -0.1751886010169983, 0.0021911077201366425, -0.00007873354479670525, -0.06271830201148987, 0.009626142680644989, 0.00032248254865407944, 0.020884530618786812, 0.08468437194824219, -0.018549837172031403, -0.00856386125087738, 0.1349450945854187, 0.01829608529806137, -0.09565794467926025, -0.05826811492443085, 0.115190789103508, 0.020230503752827644, 0.13763397932052612, 0.011040360666811466, 0.0792110413312912, 0.08807373046875, 0.021609943360090256, -0.09545309841632843, 0.04239262267947197, -0.019302183762192726, -0.030017539858818054, 0.008260454051196575, 0.10774149000644684, -0.007870739325881004, 0.05732467770576477, 0.02746986597776413, -0.0892849788069725, 0.06241866573691368, 0.01239217072725296, -0.03441233187913895, -0.07880192995071411, 0.0864432156085968, -0.09142141789197922, 0.15618306398391724, 0.12544693052768707, -0.013619010336697102, -0.04679019749164581, -0.029041556641459465, 0.019158929586410522, -0.00024076644331216812, 0.05545513331890106, -0.02622680366039276, -0.132284477353096, 0.019493307918310165, -0.07924307882785797, 0.026858076453208923, -0.25020793080329895, -0.08763094246387482, 0.02991514466702938, -0.017751242965459824, -0.018305890262126923, 0.0496523380279541, 0.04529111459851265, 0.026857100427150726, -0.036369211971759796, 0.024607757106423378, -0.037824153900146484, 0.057361453771591187, -0.11056400835514069, -0.09368984401226044 ]
null
null
transformers
# MultiBERTs Seed 4 Checkpoint 1900k (uncased) Seed 4 intermediate checkpoint 1900k MultiBERTs (pretrained BERT) model on the English language using a masked language modeling (MLM) objective. It was introduced in [this paper](https://arxiv.org/pdf/2106.16163.pdf) and first released in [this repository](https://github.com/google-research/language/tree/master/language/multiberts). This is an intermediate checkpoint. The final checkpoint can be found at [multiberts-seed-4](https://hf.co/multberts-seed-4). This model is uncased: it does not make a difference between english and English. Disclaimer: The team releasing MultiBERTs did not write a model card for this model, so this model card has been written by [gchhablani](https://hf.co/gchhablani). ## Model description MultiBERTs models are transformer models pretrained on a large corpus of English data in a self-supervised fashion. This means they were pretrained on the raw texts only, with no humans labelling them in any way (which is why they can use lots of publicly available data) with an automatic process to generate inputs and labels from those texts. More precisely, they were pretrained with two objectives: - Masked language modeling (MLM): taking a sentence, the model randomly masks 15% of the words in the input, then runs the entire masked sentence through the model and has to predict the masked words. This is different from traditional recurrent neural networks (RNNs) that usually see the words one after the other, or from autoregressive models like GPT which internally mask the future tokens. It allows the model to learn a bidirectional representation of the sentence. - Next sentence prediction (NSP): the model concatenates two masked sentences as inputs during pretraining. Sometimes they correspond to sentences that were next to each other in the original text, sometimes not. The model then has to predict if the two sentences were following each other or not. This way, the model learns an inner representation of the English language that can then be used to extract features useful for downstream tasks: if you have a dataset of labeled sentences, for instance, you can train a standard classifier using the features produced by the MultiBERTs model as inputs. ## Intended uses & limitations You can use the raw model for either masked language modeling or next sentence prediction, but it's mostly intended to be fine-tuned on a downstream task. See the [model hub](https://huggingface.co/models?filter=multiberts) to look for fine-tuned versions on a task that interests you. Note that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked) to make decisions, such as sequence classification, token classification or question answering. For tasks such as text generation, you should look at a model like GPT2. ### How to use Here is how to use this model to get the features of a given text in PyTorch: ```python from transformers import BertTokenizer, BertModel tokenizer = BertTokenizer.from_pretrained('multiberts-seed-4-1900k') model = BertModel.from_pretrained("multiberts-seed-4-1900k") text = "Replace me by any text you'd like." encoded_input = tokenizer(text, return_tensors='pt') output = model(**encoded_input) ``` ### Limitations and bias Even if the training data used for this model could be characterized as fairly neutral, this model can have biased predictions. This bias will also affect all fine-tuned versions of this model. 
For an understanding of bias of this particular checkpoint, please try out this checkpoint with the snippet present in the [Limitation and bias section](https://huggingface.co/bert-base-uncased#limitations-and-bias) of the [bert-base-uncased](https://huggingface.co/bert-base-uncased) checkpoint. ## Training data The MultiBERTs models were pretrained on [BookCorpus](https://yknzhu.wixsite.com/mbweb), a dataset consisting of 11,038 unpublished books and [English Wikipedia](https://en.wikipedia.org/wiki/English_Wikipedia) (excluding lists, tables and headers). ## Training procedure ### Preprocessing The texts are lowercased and tokenized using WordPiece and a vocabulary size of 30,000. The inputs of the model are then of the form: ``` [CLS] Sentence A [SEP] Sentence B [SEP] ``` With probability 0.5, sentence A and sentence B correspond to two consecutive sentences in the original corpus and in the other cases, it's another random sentence in the corpus. Note that what is considered a sentence here is a consecutive span of text usually longer than a single sentence. The only constraint is that the result with the two "sentences" has a combined length of less than 512 tokens. The details of the masking procedure for each sentence are the following: - 15% of the tokens are masked. - In 80% of the cases, the masked tokens are replaced by `[MASK]`. - In 10% of the cases, the masked tokens are replaced by a random token (different from the one they replace). - In the remaining 10% of cases, the masked tokens are left as is. ### Pretraining The full model was trained on 16 Cloud TPU v2 chips for two million steps with a batch size of 256. The sequence length was set to 512 throughout. The optimizer used is Adam with a learning rate of 1e-4, \\(\beta_{1} = 0.9\\) and \\(\beta_{2} = 0.999\\), a weight decay of 0.01, learning rate warmup for 10,000 steps and linear decay of the learning rate after. ### BibTeX entry and citation info ```bibtex @article{DBLP:journals/corr/abs-2106-16163, author = {Thibault Sellam and Steve Yadlowsky and Jason Wei and Naomi Saphra and Alexander D'Amour and Tal Linzen and Jasmijn Bastings and Iulia Turc and Jacob Eisenstein and Dipanjan Das and Ian Tenney and Ellie Pavlick}, title = {The MultiBERTs: {BERT} Reproductions for Robustness Analysis}, journal = {CoRR}, volume = {abs/2106.16163}, year = {2021}, url = {https://arxiv.org/abs/2106.16163}, eprinttype = {arXiv}, eprint = {2106.16163}, timestamp = {Mon, 05 Jul 2021 15:15:50 +0200}, biburl = {https://dblp.org/rec/journals/corr/abs-2106-16163.bib}, bibsource = {dblp computer science bibliography, https://dblp.org} } ``` <a href="https://huggingface.co/exbert/?model=multiberts"> <img width="300px" src="https://cdn-media.huggingface.co/exbert/button.png"> </a>
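The 15% / 80-10-10 masking rule described in the preprocessing section above can be made concrete with a short sketch. The function below is an illustrative PyTorch re-implementation of that rule for a single tokenized sequence; it is not the original MultiBERTs preprocessing code, and the helper name `mask_tokens` is ours.

```python
import torch
from transformers import BertTokenizer

def mask_tokens(input_ids: torch.Tensor, tokenizer, mlm_probability: float = 0.15):
    """Apply the 80/10/10 masking rule to a 1-D tensor of token ids (sketch only)."""
    labels = input_ids.clone()

    # Choose 15% of the (non-special) tokens as prediction targets.
    probs = torch.full(labels.shape, mlm_probability)
    special = torch.tensor(
        tokenizer.get_special_tokens_mask(labels.tolist(), already_has_special_tokens=True),
        dtype=torch.bool,
    )
    probs.masked_fill_(special, 0.0)
    masked = torch.bernoulli(probs).bool()
    labels[~masked] = -100  # loss is computed only on masked positions

    # 80% of the selected tokens become [MASK].
    use_mask_token = torch.bernoulli(torch.full(labels.shape, 0.8)).bool() & masked
    input_ids[use_mask_token] = tokenizer.mask_token_id

    # 10% become a random vocabulary token (half of the remaining 20%).
    # (The original rule also requires the random token to differ from the one it replaces;
    # that check is omitted here for brevity.)
    use_random = torch.bernoulli(torch.full(labels.shape, 0.5)).bool() & masked & ~use_mask_token
    input_ids[use_random] = torch.randint(len(tokenizer), labels.shape, dtype=torch.long)[use_random]

    # The remaining 10% keep their original token.
    return input_ids, labels

# Example with a generic WordPiece tokenizer (any ~30,000-entry BERT vocabulary behaves the same way).
tok = BertTokenizer.from_pretrained("bert-base-uncased")
ids = tok("The quick brown fox jumps over the lazy dog.", return_tensors="pt").input_ids[0]
masked_ids, labels = mask_tokens(ids, tok)
```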
{"language": "en", "license": "apache-2.0", "tags": ["exbert", "multiberts", "multiberts-seed-4"], "datasets": ["bookcorpus", "wikipedia"]}
null
MultiBertGunjanPatrick/multiberts-seed-4-1900k
[ "transformers", "pytorch", "bert", "pretraining", "exbert", "multiberts", "multiberts-seed-4", "en", "dataset:bookcorpus", "dataset:wikipedia", "arxiv:2106.16163", "license:apache-2.0", "endpoints_compatible", "region:us" ]
2022-03-02T23:29:04+00:00
[ "2106.16163" ]
[ "en" ]
TAGS #transformers #pytorch #bert #pretraining #exbert #multiberts #multiberts-seed-4 #en #dataset-bookcorpus #dataset-wikipedia #arxiv-2106.16163 #license-apache-2.0 #endpoints_compatible #region-us
# MultiBERTs Seed 4 Checkpoint 1900k (uncased) Seed 4 intermediate checkpoint 1900k MultiBERTs (pretrained BERT) model on English language using a masked language modeling (MLM) objective. It was introduced in this paper and first released in this repository. This is an intermediate checkpoint. The final checkpoint can be found at multiberts-seed-4. This model is uncased: it does not make a difference between english and English. Disclaimer: The team releasing MultiBERTs did not write a model card for this model so this model card has been written by gchhablani. ## Model description MultiBERTs models are transformers model pretrained on a large corpus of English data in a self-supervised fashion. This means it was pretrained on the raw texts only, with no humans labelling them in any way (which is why it can use lots of publicly available data) with an automatic process to generate inputs and labels from those texts. More precisely, it was pretrained with two objectives: - Masked language modeling (MLM): taking a sentence, the model randomly masks 15% of the words in the input then run the entire masked sentence through the model and has to predict the masked words. This is different from traditional recurrent neural networks (RNNs) that usually see the words one after the other, or from autoregressive models like GPT which internally mask the future tokens. It allows the model to learn a bidirectional representation of the sentence. - Next sentence prediction (NSP): the models concatenates two masked sentences as inputs during pretraining. Sometimes they correspond to sentences that were next to each other in the original text, sometimes not. The model then has to predict if the two sentences were following each other or not. This way, the model learns an inner representation of the English language that can then be used to extract features useful for downstream tasks: if you have a dataset of labeled sentences for instance, you can train a standard classifier using the features produced by the MultiBERTs model as inputs. ## Intended uses & limitations You can use the raw model for either masked language modeling or next sentence prediction, but it's mostly intended to be fine-tuned on a downstream task. See the model hub to look for fine-tuned versions on a task that interests you. Note that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked) to make decisions, such as sequence classification, token classification or question answering. For tasks such as text generation you should look at model like GPT2. ### How to use Here is how to use this model to get the features of a given text in PyTorch: ### Limitations and bias Even if the training data used for this model could be characterized as fairly neutral, this model can have biased predictions. This bias will also affect all fine-tuned versions of this model. For an understanding of bias of this particular checkpoint, please try out this checkpoint with the snippet present in the Limitation and bias section of the bert-base-uncased checkpoint. ## Training data The MultiBERTs models were pretrained on BookCorpus, a dataset consisting of 11,038 unpublished books and English Wikipedia (excluding lists, tables and headers). ## Training procedure ### Preprocessing The texts are lowercased and tokenized using WordPiece and a vocabulary size of 30,000. 
The inputs of the model are then of the form: With probability 0.5, sentence A and sentence B correspond to two consecutive sentences in the original corpus and in the other cases, it's another random sentence in the corpus. Note that what is considered a sentence here is a consecutive span of text usually longer than a single sentence. The only constrain is that the result with the two "sentences" has a combined length of less than 512 tokens. The details of the masking procedure for each sentence are the following: - 15% of the tokens are masked. - In 80% of the cases, the masked tokens are replaced by '[MASK]'. - In 10% of the cases, the masked tokens are replaced by a random token (different) from the one they replace. - In the 10% remaining cases, the masked tokens are left as is. ### Pretraining The full model was trained on 16 Cloud TPU v2 chips for two million steps with a batch size of 256. The sequence length was set to 512 throughout. The optimizer used is Adam with a learning rate of 1e-4, \\(\beta_{1} = 0.9\\) and \\(\beta_{2} = 0.999\\), a weight decay of 0.01, learning rate warmup for 10,000 steps and linear decay of the learning rate after. ### BibTeX entry and citation info <a href="URL <img width="300px" src="URL </a>
[ "# MultiBERTs Seed 4 Checkpoint 1900k (uncased)\nSeed 4 intermediate checkpoint 1900k MultiBERTs (pretrained BERT) model on English language using a masked language modeling (MLM) objective. It was introduced in\nthis paper and first released in\nthis repository. This is an intermediate checkpoint.\nThe final checkpoint can be found at multiberts-seed-4. This model is uncased: it does not make a difference\nbetween english and English.\n\nDisclaimer: The team releasing MultiBERTs did not write a model card for this model so this model card has been written by gchhablani.", "## Model description\nMultiBERTs models are transformers model pretrained on a large corpus of English data in a self-supervised fashion. This means it\nwas pretrained on the raw texts only, with no humans labelling them in any way (which is why it can use lots of\npublicly available data) with an automatic process to generate inputs and labels from those texts. More precisely, it\nwas pretrained with two objectives:\n- Masked language modeling (MLM): taking a sentence, the model randomly masks 15% of the words in the input then run\n the entire masked sentence through the model and has to predict the masked words. This is different from traditional\n recurrent neural networks (RNNs) that usually see the words one after the other, or from autoregressive models like\n GPT which internally mask the future tokens. It allows the model to learn a bidirectional representation of the\n sentence.\n- Next sentence prediction (NSP): the models concatenates two masked sentences as inputs during pretraining. Sometimes\n they correspond to sentences that were next to each other in the original text, sometimes not. The model then has to\n predict if the two sentences were following each other or not.\nThis way, the model learns an inner representation of the English language that can then be used to extract features\nuseful for downstream tasks: if you have a dataset of labeled sentences for instance, you can train a standard\nclassifier using the features produced by the MultiBERTs model as inputs.", "## Intended uses & limitations\nYou can use the raw model for either masked language modeling or next sentence prediction, but it's mostly intended to\nbe fine-tuned on a downstream task. See the model hub to look for\nfine-tuned versions on a task that interests you.\nNote that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked)\nto make decisions, such as sequence classification, token classification or question answering. For tasks such as text\ngeneration you should look at model like GPT2.", "### How to use\nHere is how to use this model to get the features of a given text in PyTorch:", "### Limitations and bias\nEven if the training data used for this model could be characterized as fairly neutral, this model can have biased\npredictions. This bias will also affect all fine-tuned versions of this model. For an understanding of bias of this particular\ncheckpoint, please try out this checkpoint with the snippet present in the Limitation and bias section of the bert-base-uncased checkpoint.", "## Training data\nThe MultiBERTs models were pretrained on BookCorpus, a dataset consisting of 11,038\nunpublished books and English Wikipedia (excluding lists, tables and\nheaders).", "## Training procedure", "### Preprocessing\nThe texts are lowercased and tokenized using WordPiece and a vocabulary size of 30,000. 
The inputs of the model are\nthen of the form:\n\nWith probability 0.5, sentence A and sentence B correspond to two consecutive sentences in the original corpus and in\nthe other cases, it's another random sentence in the corpus. Note that what is considered a sentence here is a\nconsecutive span of text usually longer than a single sentence. The only constrain is that the result with the two\n\"sentences\" has a combined length of less than 512 tokens.\nThe details of the masking procedure for each sentence are the following:\n- 15% of the tokens are masked.\n- In 80% of the cases, the masked tokens are replaced by '[MASK]'.\n- In 10% of the cases, the masked tokens are replaced by a random token (different) from the one they replace.\n- In the 10% remaining cases, the masked tokens are left as is.", "### Pretraining\nThe full model was trained on 16 Cloud TPU v2 chips for two million steps with a batch size\nof 256. The sequence length was set to 512 throughout. The optimizer\nused is Adam with a learning rate of 1e-4, \\\\(\\beta_{1} = 0.9\\\\) and \\\\(\\beta_{2} = 0.999\\\\), a weight decay of 0.01,\nlearning rate warmup for 10,000 steps and linear decay of the learning rate after.", "### BibTeX entry and citation info\n\n<a href=\"URL\n\t<img width=\"300px\" src=\"URL\n</a>" ]
[ "TAGS\n#transformers #pytorch #bert #pretraining #exbert #multiberts #multiberts-seed-4 #en #dataset-bookcorpus #dataset-wikipedia #arxiv-2106.16163 #license-apache-2.0 #endpoints_compatible #region-us \n", "# MultiBERTs Seed 4 Checkpoint 1900k (uncased)\nSeed 4 intermediate checkpoint 1900k MultiBERTs (pretrained BERT) model on English language using a masked language modeling (MLM) objective. It was introduced in\nthis paper and first released in\nthis repository. This is an intermediate checkpoint.\nThe final checkpoint can be found at multiberts-seed-4. This model is uncased: it does not make a difference\nbetween english and English.\n\nDisclaimer: The team releasing MultiBERTs did not write a model card for this model so this model card has been written by gchhablani.", "## Model description\nMultiBERTs models are transformers model pretrained on a large corpus of English data in a self-supervised fashion. This means it\nwas pretrained on the raw texts only, with no humans labelling them in any way (which is why it can use lots of\npublicly available data) with an automatic process to generate inputs and labels from those texts. More precisely, it\nwas pretrained with two objectives:\n- Masked language modeling (MLM): taking a sentence, the model randomly masks 15% of the words in the input then run\n the entire masked sentence through the model and has to predict the masked words. This is different from traditional\n recurrent neural networks (RNNs) that usually see the words one after the other, or from autoregressive models like\n GPT which internally mask the future tokens. It allows the model to learn a bidirectional representation of the\n sentence.\n- Next sentence prediction (NSP): the models concatenates two masked sentences as inputs during pretraining. Sometimes\n they correspond to sentences that were next to each other in the original text, sometimes not. The model then has to\n predict if the two sentences were following each other or not.\nThis way, the model learns an inner representation of the English language that can then be used to extract features\nuseful for downstream tasks: if you have a dataset of labeled sentences for instance, you can train a standard\nclassifier using the features produced by the MultiBERTs model as inputs.", "## Intended uses & limitations\nYou can use the raw model for either masked language modeling or next sentence prediction, but it's mostly intended to\nbe fine-tuned on a downstream task. See the model hub to look for\nfine-tuned versions on a task that interests you.\nNote that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked)\nto make decisions, such as sequence classification, token classification or question answering. For tasks such as text\ngeneration you should look at model like GPT2.", "### How to use\nHere is how to use this model to get the features of a given text in PyTorch:", "### Limitations and bias\nEven if the training data used for this model could be characterized as fairly neutral, this model can have biased\npredictions. This bias will also affect all fine-tuned versions of this model. 
For an understanding of bias of this particular\ncheckpoint, please try out this checkpoint with the snippet present in the Limitation and bias section of the bert-base-uncased checkpoint.", "## Training data\nThe MultiBERTs models were pretrained on BookCorpus, a dataset consisting of 11,038\nunpublished books and English Wikipedia (excluding lists, tables and\nheaders).", "## Training procedure", "### Preprocessing\nThe texts are lowercased and tokenized using WordPiece and a vocabulary size of 30,000. The inputs of the model are\nthen of the form:\n\nWith probability 0.5, sentence A and sentence B correspond to two consecutive sentences in the original corpus and in\nthe other cases, it's another random sentence in the corpus. Note that what is considered a sentence here is a\nconsecutive span of text usually longer than a single sentence. The only constrain is that the result with the two\n\"sentences\" has a combined length of less than 512 tokens.\nThe details of the masking procedure for each sentence are the following:\n- 15% of the tokens are masked.\n- In 80% of the cases, the masked tokens are replaced by '[MASK]'.\n- In 10% of the cases, the masked tokens are replaced by a random token (different) from the one they replace.\n- In the 10% remaining cases, the masked tokens are left as is.", "### Pretraining\nThe full model was trained on 16 Cloud TPU v2 chips for two million steps with a batch size\nof 256. The sequence length was set to 512 throughout. The optimizer\nused is Adam with a learning rate of 1e-4, \\\\(\\beta_{1} = 0.9\\\\) and \\\\(\\beta_{2} = 0.999\\\\), a weight decay of 0.01,\nlearning rate warmup for 10,000 steps and linear decay of the learning rate after.", "### BibTeX entry and citation info\n\n<a href=\"URL\n\t<img width=\"300px\" src=\"URL\n</a>" ]
[ 71, 145, 335, 134, 25, 95, 48, 3, 222, 111, 34 ]
[ "passage: TAGS\n#transformers #pytorch #bert #pretraining #exbert #multiberts #multiberts-seed-4 #en #dataset-bookcorpus #dataset-wikipedia #arxiv-2106.16163 #license-apache-2.0 #endpoints_compatible #region-us \n# MultiBERTs Seed 4 Checkpoint 1900k (uncased)\nSeed 4 intermediate checkpoint 1900k MultiBERTs (pretrained BERT) model on English language using a masked language modeling (MLM) objective. It was introduced in\nthis paper and first released in\nthis repository. This is an intermediate checkpoint.\nThe final checkpoint can be found at multiberts-seed-4. This model is uncased: it does not make a difference\nbetween english and English.\n\nDisclaimer: The team releasing MultiBERTs did not write a model card for this model so this model card has been written by gchhablani.", "passage: ## Model description\nMultiBERTs models are transformers model pretrained on a large corpus of English data in a self-supervised fashion. This means it\nwas pretrained on the raw texts only, with no humans labelling them in any way (which is why it can use lots of\npublicly available data) with an automatic process to generate inputs and labels from those texts. More precisely, it\nwas pretrained with two objectives:\n- Masked language modeling (MLM): taking a sentence, the model randomly masks 15% of the words in the input then run\n the entire masked sentence through the model and has to predict the masked words. This is different from traditional\n recurrent neural networks (RNNs) that usually see the words one after the other, or from autoregressive models like\n GPT which internally mask the future tokens. It allows the model to learn a bidirectional representation of the\n sentence.\n- Next sentence prediction (NSP): the models concatenates two masked sentences as inputs during pretraining. Sometimes\n they correspond to sentences that were next to each other in the original text, sometimes not. The model then has to\n predict if the two sentences were following each other or not.\nThis way, the model learns an inner representation of the English language that can then be used to extract features\nuseful for downstream tasks: if you have a dataset of labeled sentences for instance, you can train a standard\nclassifier using the features produced by the MultiBERTs model as inputs.## Intended uses & limitations\nYou can use the raw model for either masked language modeling or next sentence prediction, but it's mostly intended to\nbe fine-tuned on a downstream task. See the model hub to look for\nfine-tuned versions on a task that interests you.\nNote that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked)\nto make decisions, such as sequence classification, token classification or question answering. For tasks such as text\ngeneration you should look at model like GPT2.### How to use\nHere is how to use this model to get the features of a given text in PyTorch:### Limitations and bias\nEven if the training data used for this model could be characterized as fairly neutral, this model can have biased\npredictions. This bias will also affect all fine-tuned versions of this model. 
For an understanding of bias of this particular\ncheckpoint, please try out this checkpoint with the snippet present in the Limitation and bias section of the bert-base-uncased checkpoint.## Training data\nThe MultiBERTs models were pretrained on BookCorpus, a dataset consisting of 11,038\nunpublished books and English Wikipedia (excluding lists, tables and\nheaders).## Training procedure" ]
[ -0.08485902845859528, -0.0009186000097543001, -0.0021204622462391853, 0.06805619597434998, 0.08424533158540726, 0.0049111912958323956, 0.11630930751562119, 0.047049932181835175, -0.03177739679813385, 0.022838812321424484, 0.09413622319698334, 0.026004254817962646, 0.04279307276010513, 0.061921343207359314, 0.09387364238500595, -0.25679031014442444, 0.047480080276727676, -0.06103172153234482, 0.05697474628686905, 0.07542110979557037, 0.09747299551963806, -0.07438331842422485, 0.06226326897740364, 0.03684407100081444, -0.08728303015232086, -0.015235391445457935, -0.01677888259291649, -0.03724659979343414, 0.10131664574146271, 0.07123234868049622, 0.06376270204782486, 0.001074487343430519, 0.05868661403656006, -0.08401069790124893, 0.01610535755753517, 0.04436898231506348, -0.002500051399692893, 0.025657620280981064, -0.009894562885165215, 0.01455535925924778, 0.11120319366455078, 0.0391797311604023, 0.07751746475696564, 0.033334653824567795, -0.09634315967559814, -0.12073273956775665, -0.08037178963422775, 0.10380786657333374, 0.051279760897159576, 0.04147903621196747, -0.005120361223816872, 0.07227912545204163, -0.03158219903707504, 0.0744185745716095, 0.1021810919046402, -0.2560326159000397, -0.010712514631450176, 0.06410402804613113, 0.046012453734874725, 0.04503846913576126, 0.008222171105444431, 0.026740025728940964, 0.005343377590179443, 0.04426940530538559, 0.03210975602269173, -0.025282811373472214, 0.11463364958763123, -0.0413937047123909, -0.15027636289596558, -0.041444651782512665, 0.12035083770751953, -0.008577588945627213, -0.1246892437338829, -0.10023806244134903, -0.029850060120224953, 0.1129768043756485, -0.005697413347661495, -0.016823910176753998, -0.001960516907274723, 0.009301682934165001, 0.024433474987745285, -0.09381814301013947, -0.0867253839969635, -0.02621551975607872, -0.03750159963965416, 0.13088081777095795, 0.04654749482870102, 0.052774496376514435, -0.036272771656513214, 0.08494797348976135, -0.11464640498161316, -0.03940201923251152, -0.049371518194675446, -0.08314517885446548, -0.018337827175855637, 0.010058744810521603, -0.028561394661664963, -0.08100780844688416, -0.06130596995353699, 0.1171632632613182, 0.03090083599090576, 0.03081323578953743, -0.000992068089544773, 0.04007096216082573, 0.07251304388046265, 0.0938415676355362, -0.03971642628312111, 0.044181227684020996, 0.03430943936109543, -0.023301467299461365, 0.056355107575654984, -0.051129020750522614, -0.09937600791454315, 0.07766802608966827, -0.000563216395676136, 0.03784477710723877, 0.023444142192602158, 0.03453860431909561, -0.007442262955009937, -0.0698101744055748, 0.16543710231781006, -0.07884791493415833, -0.008962169289588928, -0.016655676066875458, 0.011844413354992867, 0.04583274573087692, 0.032147329300642014, -0.007900470867753029, -0.047323763370513916, -0.0061990246176719666, -0.05782704055309296, -0.027272896841168404, -0.05527830123901367, -0.11965513229370117, -0.002421462442725897, -0.038787029683589935, -0.03178771212697029, -0.13903874158859253, -0.21698936820030212, -0.019669272005558014, 0.06394770741462708, -0.0032650423236191273, -0.008250859566032887, 0.027591602876782417, 0.015537137165665627, -0.020526889711618423, 0.011143558658659458, -0.041854649782180786, -0.0009528882801532745, -0.007568026892840862, -0.03230014070868492, 0.057899728417396545, -0.04094637930393219, 0.022948553785681725, -0.0671769380569458, 0.020540162920951843, -0.21440500020980835, 0.08601407706737518, -0.03226344287395477, 0.00493817962706089, -0.03521198779344559, -0.0445551723241806, 
0.010229550302028656, 0.048350196331739426, -0.012622807174921036, 0.11663033813238144, -0.1310347467660904, -0.052557572722435, 0.18282920122146606, -0.15786275267601013, -0.001246720552444458, 0.09833519160747528, -0.049974266439676285, 0.054127439856529236, 0.13477499783039093, 0.09822011739015579, 0.08100885897874832, -0.07483309507369995, 0.013401911593973637, 0.06012989580631256, -0.06887176632881165, 0.05477035790681839, 0.0870971530675888, -0.0243089497089386, -0.14139428734779358, 0.03040822222828865, -0.07252936065196991, -0.008857105858623981, -0.025295274332165718, -0.02050713822245598, 0.010275905951857567, -0.03774068132042885, 0.030915260314941406, 0.0070414249785244465, 0.01823931187391281, -0.04177290201187134, -0.08265296369791031, 0.03010391816496849, 0.07369740307331085, -0.07188934832811356, 0.04055566340684891, -0.07068415731191635, 0.0591939352452755, -0.07272429019212723, -0.004605711437761784, -0.16638799011707306, -0.02449042908847332, 0.043095629662275314, -0.04613455384969711, 0.05100524425506592, 0.09105861932039261, 0.004570678807795048, 0.12458935379981995, -0.0415068194270134, 0.004044005181640387, -0.004877820611000061, -0.009657710790634155, -0.05124303698539734, -0.12481257319450378, -0.081380695104599, -0.06720668077468872, 0.10533950477838516, -0.07921908050775528, 0.029055416584014893, -0.06821385025978088, -0.02164405956864357, -0.00872357003390789, -0.05688665062189102, -0.005422705784440041, 0.01054234430193901, -0.026214176788926125, -0.04476618394255638, 0.047808535397052765, 0.051464736461639404, -0.05704948306083679, 0.07752954959869385, -0.10561928898096085, -0.06121468171477318, 0.057401418685913086, 0.012044641189277172, -0.08304707705974579, 0.08845438063144684, -0.019736824557185173, -0.013407830148935318, -0.05614908039569855, -0.04632589966058731, 0.19625264406204224, -0.02568509429693222, 0.10336804389953613, -0.09160315990447998, -0.000006620539352297783, 0.028866557404398918, -0.04383359104394913, -0.01526323240250349, 0.055810846388339996, 0.04675992950797081, -0.18502362072467804, 0.01557915285229683, 0.04834932088851929, 0.07593366503715515, 0.11665424704551697, 0.027028216049075127, -0.023053346201777458, -0.046461157500743866, -0.015242164023220539, 0.0035494642797857523, 0.05972633138298988, -0.031063754111528397, -0.011277148500084877, 0.03116171807050705, 0.05403946712613106, 0.01921680197119713, -0.08038681745529175, 0.03727893531322479, 0.06596975773572922, -0.014314768835902214, -0.03984351456165314, -0.0260293148458004, -0.05884580686688423, 0.06249208748340607, 0.05272458493709564, 0.03532125800848007, 0.024623829871416092, -0.014486405998468399, -0.1366710513830185, 0.18810558319091797, -0.11571797728538513, -0.2567839026451111, -0.10895708948373795, -0.061981216073036194, -0.02451561577618122, 0.040467020124197006, 0.05772276222705841, -0.029446233063936234, -0.04338817298412323, -0.11606530100107193, 0.057240672409534454, -0.0630171149969101, -0.028737958520650864, -0.011655217036604881, -0.05240411311388016, -0.02047843486070633, -0.12725424766540527, -0.0107242651283741, -0.03233650326728821, -0.07047174125909805, 0.007255692966282368, -0.03243040665984154, 0.02892594411969185, 0.13647598028182983, 0.03560418635606766, -0.019182167947292328, -0.01869410276412964, 0.19233113527297974, 0.008124630898237228, 0.061986930668354034, 0.11129145324230194, -0.028308918699622154, 0.052392881363630295, 0.0442582331597805, 0.026715479791164398, -0.04657109081745148, 0.009789437055587769, -0.01741745136678219, 
-0.12057308852672577, -0.17449358105659485, -0.07096100598573685, -0.004925584886223078, 0.002414108719676733, 0.02098744735121727, 0.03432495892047882, 0.028352944180369377, 0.039199430495500565, -0.02914883941411972, 0.03358234465122223, -0.01106572151184082, 0.08005869388580322, 0.030022867023944855, -0.07461660355329514, 0.09298641234636307, -0.06184447929263115, 0.015449566766619682, 0.10904053598642349, -0.06460049748420715, 0.1918155997991562, 0.025017572566866875, 0.055889640003442764, 0.10198964178562164, 0.017986100167036057, 0.057052552700042725, 0.08861452341079712, -0.04707692563533783, 0.005673653446137905, -0.05969466269016266, -0.05261353403329849, -0.03414229676127434, 0.04979177564382553, 0.028633197769522667, 0.016150489449501038, -0.118449866771698, 0.012740593403577805, -0.0020217332057654858, 0.13907390832901, 0.04589380323886871, -0.12264770269393921, -0.12326620519161224, 0.03292589634656906, -0.04598524421453476, -0.06226826459169388, 0.02864152193069458, 0.06055702269077301, -0.1528390645980835, 0.04850515350699425, -0.006101342849433422, 0.06445697695016861, -0.0896361768245697, 0.016095569357275963, -0.04162910580635071, 0.0008979253470897675, 0.0043494002893567085, 0.0670325830578804, -0.1251741498708725, 0.10655152052640915, 0.022907141596078873, 0.050334177911281586, -0.08135245740413666, 0.016377747058868408, -0.007886195555329323, 0.10958976298570633, 0.11442217975854874, 0.04294656217098236, -0.05067683011293411, -0.015188752673566341, -0.04289254546165466, 0.019641408696770668, 0.05863213539123535, -0.07475322484970093, 0.06283444166183472, 0.009823315776884556, 0.005427069962024689, -0.023419084027409554, 0.012470772489905357, -0.13081616163253784, -0.12418815493583679, 0.06048082560300827, -0.07842542231082916, -0.10171657800674438, -0.05672276392579079, -0.06291058659553528, -0.05153285712003708, 0.20796968042850494, -0.11518829315900803, -0.09043632447719574, -0.09984350204467773, -0.014705296605825424, 0.04653140529990196, -0.06435361504554749, 0.04819881543517113, -0.03650087118148804, 0.09438556432723999, -0.04615373536944389, -0.10897167026996613, 0.03352968394756317, -0.11359339952468872, -0.11459380388259888, -0.043855778872966766, 0.10706157982349396, 0.11492681503295898, 0.036627449095249176, 0.012858548201620579, 0.013742408715188503, -0.002420905977487564, -0.1189454197883606, 0.016950150951743126, 0.12939618527889252, 0.0033805128186941147, 0.07219938188791275, -0.06088782846927643, 0.024384532123804092, -0.013488482683897018, 0.0018684528768062592, 0.12958413362503052, 0.18402889370918274, -0.06429212540388107, 0.17606791853904724, 0.20490524172782898, -0.10538244247436523, -0.19202256202697754, -0.05302400887012482, -0.002755391411483288, 0.045750588178634644, 0.047924160957336426, -0.18625444173812866, 0.09218905866146088, 0.03499713912606239, -0.02997894398868084, 0.01887647807598114, -0.2299688756465912, -0.1113327294588089, 0.09008334577083588, 0.057398471981287, 0.1874593198299408, -0.08071612566709518, -0.03993293270468712, -0.0164845809340477, -0.036991819739341736, 0.04344448447227478, -0.04680371284484863, 0.08908539265394211, 0.005917787551879883, -0.0308005902916193, 0.0009607933461666107, -0.03133216127753258, 0.09586533904075623, 0.03947556018829346, 0.02384825423359871, -0.07003720104694366, -0.007699960842728615, 0.11646085977554321, -0.03853423148393631, 0.09638334810733795, 0.04178012162446976, 0.07380316406488419, -0.09598301351070404, -0.05924755334854126, -0.0760788768529892, 0.0450403094291687, -0.042110372334718704, 
-0.053343676030635834, -0.06532742828130722, 0.05770500749349594, 0.03624753654003143, 0.008971850387752056, -0.002795010805130005, -0.03886812925338745, 0.04709753021597862, 0.09266498684883118, 0.08352744579315186, -0.037262722849845886, -0.07336446642875671, -0.0495799295604229, -0.048642102628946304, 0.06612436473369598, -0.09237314015626907, 0.017494719475507736, 0.02719050645828247, 0.012153574265539646, 0.09152311086654663, 0.03354763239622116, -0.13702668249607086, 0.010510187596082687, 0.035481590777635574, -0.12584735453128815, -0.10562912374734879, -0.020060256123542786, 0.024274852126836777, -0.0380762480199337, 0.05272062122821808, 0.14556989073753357, -0.03515757620334625, -0.03143515810370445, -0.048919565975666046, 0.036880236119031906, -0.01908097043633461, 0.04991094768047333, 0.06426065415143967, 0.03054616041481495, -0.07477246969938278, 0.07193420827388763, 0.03952230513095856, -0.03625665232539177, 0.04064624384045601, 0.0463559664785862, -0.09565890580415726, -0.07901237905025482, -0.06329887360334396, 0.09397108107805252, -0.02155853994190693, -0.04706275463104248, -0.0013352613896131516, -0.0845881924033165, 0.06854000687599182, 0.07558158785104752, 0.04543242231011391, 0.0367102175951004, -0.08696477860212326, 0.01626855507493019, -0.054527390748262405, 0.03559364378452301, -0.033361539244651794, -0.004689168184995651, -0.05259792506694794, 0.07030992209911346, 0.06510578095912933, 0.09611041098833084, -0.03501589596271515, -0.07458866387605667, -0.08353931456804276, -0.013323143124580383, -0.0658581331372261, -0.03328144550323486, -0.0790121853351593, -0.006257747299969196, 0.00034839892759919167, -0.0016162339597940445, 0.01973729208111763, 0.035232625901699066, -0.04361915588378906, -0.01625032164156437, -0.03187531977891922, 0.038264039903879166, -0.06135530397295952, 0.007667881436645985, 0.014966248534619808, -0.03804901987314224, 0.09256505966186523, 0.0370144322514534, -0.010522478260099888, 0.048322953283786774, -0.014381209388375282, 0.03283467888832092, -0.022388942539691925, 0.000721425749361515, -0.02255147323012352, -0.10919561982154846, -0.004883588291704655, 0.004494024440646172, -0.0251072496175766, 0.012288345955312252, 0.05680768936872482, -0.07160822302103043, 0.08739059418439865, 0.048057205975055695, -0.029815029352903366, -0.07262124121189117, 0.041010525077581406, -0.011678872630000114, 0.02680877223610878, 0.06845149397850037, -0.03333153575658798, 0.05129165202379227, -0.09993274509906769, -0.02947491779923439, 0.0013596811331808567, -0.006926774978637695, -0.00724136084318161, -0.05354287475347519, -0.0021605845540761948, 0.006353546865284443, 0.18076945841312408, -0.021136973053216934, 0.03371889889240265, 0.014143327251076698, 0.0068249814212322235, 0.04759257286787033, -0.012200457975268364, 0.07139350473880768, -0.006449057720601559, -0.02747960574924946, -0.015228886157274246, 0.03804977238178253, 0.004077821969985962, 0.0050184763967990875, 0.14319398999214172, 0.04454576596617699, 0.09411142021417618, 0.07570239901542664, 0.017168398946523666, 0.014624600298702717, -0.12920384109020233, -0.08904620260000229, 0.007266804575920105, 0.0601976178586483, -0.01858048141002655, 0.011844146996736526, 0.09448438882827759, -0.08969607949256897, 0.07218780368566513, 0.049138814210891724, -0.04939500242471695, -0.12427915632724762, -0.18356549739837646, -0.02282950095832348, -0.028943559154868126, -0.011666735634207726, -0.09208358824253082, 0.01514358725398779, 0.0872541218996048, 0.024678803980350494, -0.009877047501504421, 
0.09379775077104568, -0.10400369763374329, -0.031527914106845856, 0.04535374045372009, -0.02498658187687397, 0.014182736165821552, 0.0466841384768486, 0.023012233898043633, -0.006874037906527519, 0.041446756571531296, 0.03903595358133316, 0.04220055416226387, 0.02593737654387951, 0.05155596882104874, -0.022865567356348038, -0.07287610322237015, -0.031157691031694412, -0.0035365954972803593, 0.05579838901758194, 0.13013476133346558, 0.022179100662469864, -0.06968629360198975, 0.006447579246014357, 0.10806490480899811, -0.03193306922912598, -0.0500449538230896, -0.10898585617542267, 0.237114816904068, 0.024415653198957443, 0.0010423571802675724, -0.0046262070536613464, -0.04574098438024521, 0.006812898442149162, 0.21534651517868042, 0.22809875011444092, 0.0057989866472780704, -0.0090575460344553, 0.009741728194057941, -0.012013951316475868, 0.037963490933179855, 0.1470451056957245, 0.004613000899553299, 0.2563280463218689, -0.04611658304929733, 0.03963275998830795, -0.04209794104099274, -0.039532873779535294, -0.10008148849010468, 0.07646524906158447, -0.008135970681905746, 0.004890614189207554, -0.030247114598751068, 0.07135770469903946, -0.037852879613637924, -0.17747001349925995, 0.00686455424875021, -0.0023752539418637753, -0.06309996545314789, 0.010846510529518127, -0.0018698647618293762, 0.020461130887269974, 0.08290375769138336, -0.01566537842154503, -0.008873392827808857, 0.13351741433143616, 0.01839398592710495, -0.09837321192026138, -0.06267499923706055, 0.11834734678268433, 0.014423429034650326, 0.14186851680278778, 0.009305964224040508, 0.08058943599462509, 0.08724832534790039, 0.021477311849594116, -0.0950445830821991, 0.04150506108999252, -0.019551534205675125, -0.032821621745824814, 0.007852419279515743, 0.10803452879190445, -0.007742111571133137, 0.05637229233980179, 0.025377528741955757, -0.08728067576885223, 0.06175098940730095, 0.009808942675590515, -0.03756112605333328, -0.07890333980321884, 0.08158855885267258, -0.0906001627445221, 0.15644389390945435, 0.12454462796449661, -0.014435740187764168, -0.044804759323596954, -0.026788776740431786, 0.018004532903432846, 0.0017801695503294468, 0.05477916821837425, -0.025320574641227722, -0.13631269335746765, 0.019617268815636635, -0.0838509052991867, 0.027321353554725647, -0.2480958104133606, -0.08849802613258362, 0.030026327818632126, -0.017608432099223137, -0.017712902277708054, 0.05112801119685173, 0.04741193726658821, 0.027259759604930878, -0.03605832904577255, 0.025116832926869392, -0.03843658044934273, 0.059489957988262177, -0.11035656929016113, -0.09326966851949692 ]
null
null
transformers
# MultiBERTs Seed 4 Checkpoint 2000k (uncased) Seed 4 intermediate checkpoint 2000k MultiBERTs (pretrained BERT) model on the English language using a masked language modeling (MLM) objective. It was introduced in [this paper](https://arxiv.org/pdf/2106.16163.pdf) and first released in [this repository](https://github.com/google-research/language/tree/master/language/multiberts). This is an intermediate checkpoint. The final checkpoint can be found at [multiberts-seed-4](https://hf.co/multiberts-seed-4). This model is uncased: it does not make a difference between english and English. Disclaimer: The team releasing MultiBERTs did not write a model card for this model, so this model card has been written by [gchhablani](https://hf.co/gchhablani). ## Model description MultiBERTs models are transformer models pretrained on a large corpus of English data in a self-supervised fashion. This means it was pretrained on the raw texts only, with no humans labelling them in any way (which is why it can use lots of publicly available data), with an automatic process to generate inputs and labels from those texts. More precisely, it was pretrained with two objectives: - Masked language modeling (MLM): taking a sentence, the model randomly masks 15% of the words in the input, then runs the entire masked sentence through the model and has to predict the masked words. This is different from traditional recurrent neural networks (RNNs) that usually see the words one after the other, or from autoregressive models like GPT which internally mask the future tokens. It allows the model to learn a bidirectional representation of the sentence. - Next sentence prediction (NSP): the model concatenates two masked sentences as inputs during pretraining. Sometimes they correspond to sentences that were next to each other in the original text, sometimes not. The model then has to predict if the two sentences were following each other or not. This way, the model learns an inner representation of the English language that can then be used to extract features useful for downstream tasks: if you have a dataset of labeled sentences for instance, you can train a standard classifier using the features produced by the MultiBERTs model as inputs. ## Intended uses & limitations You can use the raw model for either masked language modeling or next sentence prediction, but it's mostly intended to be fine-tuned on a downstream task. See the [model hub](https://huggingface.co/models?filter=multiberts) to look for fine-tuned versions on a task that interests you. Note that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked) to make decisions, such as sequence classification, token classification or question answering. For tasks such as text generation you should look at a model like GPT2. ### How to use Here is how to use this model to get the features of a given text in PyTorch: ```python from transformers import BertTokenizer, BertModel tokenizer = BertTokenizer.from_pretrained('multiberts-seed-4-2000k') model = BertModel.from_pretrained("multiberts-seed-4-2000k") text = "Replace me by any text you'd like." encoded_input = tokenizer(text, return_tensors='pt') output = model(**encoded_input) ``` ### Limitations and bias Even if the training data used for this model could be characterized as fairly neutral, this model can have biased predictions. This bias will also affect all fine-tuned versions of this model.
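As a hedged sketch (not part of the original card) of the feature-extraction use described above, here is one way the pooled `[CLS]` features from this checkpoint could feed a standard classifier; the checkpoint id mirrors the snippet above, and the sentences, labels, and choice of logistic regression are placeholders:

```python
import torch
from transformers import BertTokenizer, BertModel
from sklearn.linear_model import LogisticRegression

# Assumes the same checkpoint id as the snippet above; adjust to the full Hub id if needed.
tokenizer = BertTokenizer.from_pretrained('multiberts-seed-4-2000k')
model = BertModel.from_pretrained('multiberts-seed-4-2000k')
model.eval()

sentences = ["A placeholder positive sentence.", "A placeholder negative sentence."]
labels = [1, 0]  # illustrative labels only

with torch.no_grad():
    enc = tokenizer(sentences, padding=True, return_tensors='pt')
    # Use the [CLS] token's last hidden state as a fixed-size sentence feature.
    feats = model(**enc).last_hidden_state[:, 0, :].numpy()

clf = LogisticRegression(max_iter=1000).fit(feats, labels)
```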
For an understanding of bias of this particular checkpoint, please try out this checkpoint with the snippet present in the [Limitations and bias section](https://huggingface.co/bert-base-uncased#limitations-and-bias) of the [bert-base-uncased](https://huggingface.co/bert-base-uncased) checkpoint. ## Training data The MultiBERTs models were pretrained on [BookCorpus](https://yknzhu.wixsite.com/mbweb), a dataset consisting of 11,038 unpublished books and [English Wikipedia](https://en.wikipedia.org/wiki/English_Wikipedia) (excluding lists, tables and headers). ## Training procedure ### Preprocessing The texts are lowercased and tokenized using WordPiece and a vocabulary size of 30,000. The inputs of the model are then of the form: ``` [CLS] Sentence A [SEP] Sentence B [SEP] ``` With probability 0.5, sentence A and sentence B correspond to two consecutive sentences in the original corpus, and in the other cases, it's another random sentence in the corpus. Note that what is considered a sentence here is a consecutive span of text usually longer than a single sentence. The only constraint is that the result with the two "sentences" has a combined length of less than 512 tokens. The details of the masking procedure for each sentence are the following: - 15% of the tokens are masked. - In 80% of the cases, the masked tokens are replaced by `[MASK]`. - In 10% of the cases, the masked tokens are replaced by a random token (different from the one they replace). - In the remaining 10% of cases, the masked tokens are left as is. ### Pretraining The full model was trained on 16 Cloud TPU v2 chips for two million steps with a batch size of 256. The sequence length was set to 512 throughout. The optimizer used is Adam with a learning rate of 1e-4, \\(\beta_{1} = 0.9\\) and \\(\beta_{2} = 0.999\\), a weight decay of 0.01, learning rate warmup for 10,000 steps and linear decay of the learning rate after. ### BibTeX entry and citation info ```bibtex @article{DBLP:journals/corr/abs-2106-16163, author = {Thibault Sellam and Steve Yadlowsky and Jason Wei and Naomi Saphra and Alexander D'Amour and Tal Linzen and Jasmijn Bastings and Iulia Turc and Jacob Eisenstein and Dipanjan Das and Ian Tenney and Ellie Pavlick}, title = {The MultiBERTs: {BERT} Reproductions for Robustness Analysis}, journal = {CoRR}, volume = {abs/2106.16163}, year = {2021}, url = {https://arxiv.org/abs/2106.16163}, eprinttype = {arXiv}, eprint = {2106.16163}, timestamp = {Mon, 05 Jul 2021 15:15:50 +0200}, biburl = {https://dblp.org/rec/journals/corr/abs-2106-16163.bib}, bibsource = {dblp computer science bibliography, https://dblp.org} } ``` <a href="https://huggingface.co/exbert/?model=multiberts"> <img width="300px" src="https://cdn-media.huggingface.co/exbert/button.png"> </a>
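As a hedged sketch (not from the original card), the optimizer and schedule described in the Pretraining section expressed with common PyTorch/transformers helpers; using `AdamW` as a stand-in for "Adam with weight decay" and a stock BERT-base configuration are assumptions, while the learning rate, betas, weight decay, warmup, and total steps follow the card:

```python
import torch
from transformers import BertConfig, BertForPreTraining, get_linear_schedule_with_warmup

model = BertForPreTraining(BertConfig())  # stand-in BERT-base-sized module

# Adam with lr 1e-4, betas (0.9, 0.999), weight decay 0.01, per the card.
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4,
                              betas=(0.9, 0.999), weight_decay=0.01)

# 10,000 warmup steps, then linear decay over the two million training steps.
scheduler = get_linear_schedule_with_warmup(optimizer,
                                            num_warmup_steps=10_000,
                                            num_training_steps=2_000_000)
```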
{"language": "en", "license": "apache-2.0", "tags": ["exbert", "multiberts", "multiberts-seed-4"], "datasets": ["bookcorpus", "wikipedia"]}
null
MultiBertGunjanPatrick/multiberts-seed-4-2000k
[ "transformers", "pytorch", "bert", "pretraining", "exbert", "multiberts", "multiberts-seed-4", "en", "dataset:bookcorpus", "dataset:wikipedia", "arxiv:2106.16163", "license:apache-2.0", "endpoints_compatible", "region:us" ]
2022-03-02T23:29:04+00:00
[ "2106.16163" ]
[ "en" ]
TAGS #transformers #pytorch #bert #pretraining #exbert #multiberts #multiberts-seed-4 #en #dataset-bookcorpus #dataset-wikipedia #arxiv-2106.16163 #license-apache-2.0 #endpoints_compatible #region-us
# MultiBERTs Seed 4 Checkpoint 2000k (uncased) Seed 4 intermediate checkpoint 2000k MultiBERTs (pretrained BERT) model on English language using a masked language modeling (MLM) objective. It was introduced in this paper and first released in this repository. This is an intermediate checkpoint. The final checkpoint can be found at multiberts-seed-4. This model is uncased: it does not make a difference between english and English. Disclaimer: The team releasing MultiBERTs did not write a model card for this model so this model card has been written by gchhablani. ## Model description MultiBERTs models are transformers model pretrained on a large corpus of English data in a self-supervised fashion. This means it was pretrained on the raw texts only, with no humans labelling them in any way (which is why it can use lots of publicly available data) with an automatic process to generate inputs and labels from those texts. More precisely, it was pretrained with two objectives: - Masked language modeling (MLM): taking a sentence, the model randomly masks 15% of the words in the input then run the entire masked sentence through the model and has to predict the masked words. This is different from traditional recurrent neural networks (RNNs) that usually see the words one after the other, or from autoregressive models like GPT which internally mask the future tokens. It allows the model to learn a bidirectional representation of the sentence. - Next sentence prediction (NSP): the models concatenates two masked sentences as inputs during pretraining. Sometimes they correspond to sentences that were next to each other in the original text, sometimes not. The model then has to predict if the two sentences were following each other or not. This way, the model learns an inner representation of the English language that can then be used to extract features useful for downstream tasks: if you have a dataset of labeled sentences for instance, you can train a standard classifier using the features produced by the MultiBERTs model as inputs. ## Intended uses & limitations You can use the raw model for either masked language modeling or next sentence prediction, but it's mostly intended to be fine-tuned on a downstream task. See the model hub to look for fine-tuned versions on a task that interests you. Note that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked) to make decisions, such as sequence classification, token classification or question answering. For tasks such as text generation you should look at model like GPT2. ### How to use Here is how to use this model to get the features of a given text in PyTorch: ### Limitations and bias Even if the training data used for this model could be characterized as fairly neutral, this model can have biased predictions. This bias will also affect all fine-tuned versions of this model. For an understanding of bias of this particular checkpoint, please try out this checkpoint with the snippet present in the Limitation and bias section of the bert-base-uncased checkpoint. ## Training data The MultiBERTs models were pretrained on BookCorpus, a dataset consisting of 11,038 unpublished books and English Wikipedia (excluding lists, tables and headers). ## Training procedure ### Preprocessing The texts are lowercased and tokenized using WordPiece and a vocabulary size of 30,000. 
The inputs of the model are then of the form: With probability 0.5, sentence A and sentence B correspond to two consecutive sentences in the original corpus and in the other cases, it's another random sentence in the corpus. Note that what is considered a sentence here is a consecutive span of text usually longer than a single sentence. The only constrain is that the result with the two "sentences" has a combined length of less than 512 tokens. The details of the masking procedure for each sentence are the following: - 15% of the tokens are masked. - In 80% of the cases, the masked tokens are replaced by '[MASK]'. - In 10% of the cases, the masked tokens are replaced by a random token (different) from the one they replace. - In the 10% remaining cases, the masked tokens are left as is. ### Pretraining The full model was trained on 16 Cloud TPU v2 chips for two million steps with a batch size of 256. The sequence length was set to 512 throughout. The optimizer used is Adam with a learning rate of 1e-4, \\(\beta_{1} = 0.9\\) and \\(\beta_{2} = 0.999\\), a weight decay of 0.01, learning rate warmup for 10,000 steps and linear decay of the learning rate after. ### BibTeX entry and citation info <a href="URL <img width="300px" src="URL </a>
[ "# MultiBERTs Seed 4 Checkpoint 2000k (uncased)\nSeed 4 intermediate checkpoint 2000k MultiBERTs (pretrained BERT) model on English language using a masked language modeling (MLM) objective. It was introduced in\nthis paper and first released in\nthis repository. This is an intermediate checkpoint.\nThe final checkpoint can be found at multiberts-seed-4. This model is uncased: it does not make a difference\nbetween english and English.\n\nDisclaimer: The team releasing MultiBERTs did not write a model card for this model so this model card has been written by gchhablani.", "## Model description\nMultiBERTs models are transformers model pretrained on a large corpus of English data in a self-supervised fashion. This means it\nwas pretrained on the raw texts only, with no humans labelling them in any way (which is why it can use lots of\npublicly available data) with an automatic process to generate inputs and labels from those texts. More precisely, it\nwas pretrained with two objectives:\n- Masked language modeling (MLM): taking a sentence, the model randomly masks 15% of the words in the input then run\n the entire masked sentence through the model and has to predict the masked words. This is different from traditional\n recurrent neural networks (RNNs) that usually see the words one after the other, or from autoregressive models like\n GPT which internally mask the future tokens. It allows the model to learn a bidirectional representation of the\n sentence.\n- Next sentence prediction (NSP): the models concatenates two masked sentences as inputs during pretraining. Sometimes\n they correspond to sentences that were next to each other in the original text, sometimes not. The model then has to\n predict if the two sentences were following each other or not.\nThis way, the model learns an inner representation of the English language that can then be used to extract features\nuseful for downstream tasks: if you have a dataset of labeled sentences for instance, you can train a standard\nclassifier using the features produced by the MultiBERTs model as inputs.", "## Intended uses & limitations\nYou can use the raw model for either masked language modeling or next sentence prediction, but it's mostly intended to\nbe fine-tuned on a downstream task. See the model hub to look for\nfine-tuned versions on a task that interests you.\nNote that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked)\nto make decisions, such as sequence classification, token classification or question answering. For tasks such as text\ngeneration you should look at model like GPT2.", "### How to use\nHere is how to use this model to get the features of a given text in PyTorch:", "### Limitations and bias\nEven if the training data used for this model could be characterized as fairly neutral, this model can have biased\npredictions. This bias will also affect all fine-tuned versions of this model. For an understanding of bias of this particular\ncheckpoint, please try out this checkpoint with the snippet present in the Limitation and bias section of the bert-base-uncased checkpoint.", "## Training data\nThe MultiBERTs models were pretrained on BookCorpus, a dataset consisting of 11,038\nunpublished books and English Wikipedia (excluding lists, tables and\nheaders).", "## Training procedure", "### Preprocessing\nThe texts are lowercased and tokenized using WordPiece and a vocabulary size of 30,000. 
The inputs of the model are\nthen of the form:\n\nWith probability 0.5, sentence A and sentence B correspond to two consecutive sentences in the original corpus and in\nthe other cases, it's another random sentence in the corpus. Note that what is considered a sentence here is a\nconsecutive span of text usually longer than a single sentence. The only constrain is that the result with the two\n\"sentences\" has a combined length of less than 512 tokens.\nThe details of the masking procedure for each sentence are the following:\n- 15% of the tokens are masked.\n- In 80% of the cases, the masked tokens are replaced by '[MASK]'.\n- In 10% of the cases, the masked tokens are replaced by a random token (different) from the one they replace.\n- In the 10% remaining cases, the masked tokens are left as is.", "### Pretraining\nThe full model was trained on 16 Cloud TPU v2 chips for two million steps with a batch size\nof 256. The sequence length was set to 512 throughout. The optimizer\nused is Adam with a learning rate of 1e-4, \\\\(\\beta_{1} = 0.9\\\\) and \\\\(\\beta_{2} = 0.999\\\\), a weight decay of 0.01,\nlearning rate warmup for 10,000 steps and linear decay of the learning rate after.", "### BibTeX entry and citation info\n\n<a href=\"URL\n\t<img width=\"300px\" src=\"URL\n</a>" ]
[ "TAGS\n#transformers #pytorch #bert #pretraining #exbert #multiberts #multiberts-seed-4 #en #dataset-bookcorpus #dataset-wikipedia #arxiv-2106.16163 #license-apache-2.0 #endpoints_compatible #region-us \n", "# MultiBERTs Seed 4 Checkpoint 2000k (uncased)\nSeed 4 intermediate checkpoint 2000k MultiBERTs (pretrained BERT) model on English language using a masked language modeling (MLM) objective. It was introduced in\nthis paper and first released in\nthis repository. This is an intermediate checkpoint.\nThe final checkpoint can be found at multiberts-seed-4. This model is uncased: it does not make a difference\nbetween english and English.\n\nDisclaimer: The team releasing MultiBERTs did not write a model card for this model so this model card has been written by gchhablani.", "## Model description\nMultiBERTs models are transformers model pretrained on a large corpus of English data in a self-supervised fashion. This means it\nwas pretrained on the raw texts only, with no humans labelling them in any way (which is why it can use lots of\npublicly available data) with an automatic process to generate inputs and labels from those texts. More precisely, it\nwas pretrained with two objectives:\n- Masked language modeling (MLM): taking a sentence, the model randomly masks 15% of the words in the input then run\n the entire masked sentence through the model and has to predict the masked words. This is different from traditional\n recurrent neural networks (RNNs) that usually see the words one after the other, or from autoregressive models like\n GPT which internally mask the future tokens. It allows the model to learn a bidirectional representation of the\n sentence.\n- Next sentence prediction (NSP): the models concatenates two masked sentences as inputs during pretraining. Sometimes\n they correspond to sentences that were next to each other in the original text, sometimes not. The model then has to\n predict if the two sentences were following each other or not.\nThis way, the model learns an inner representation of the English language that can then be used to extract features\nuseful for downstream tasks: if you have a dataset of labeled sentences for instance, you can train a standard\nclassifier using the features produced by the MultiBERTs model as inputs.", "## Intended uses & limitations\nYou can use the raw model for either masked language modeling or next sentence prediction, but it's mostly intended to\nbe fine-tuned on a downstream task. See the model hub to look for\nfine-tuned versions on a task that interests you.\nNote that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked)\nto make decisions, such as sequence classification, token classification or question answering. For tasks such as text\ngeneration you should look at model like GPT2.", "### How to use\nHere is how to use this model to get the features of a given text in PyTorch:", "### Limitations and bias\nEven if the training data used for this model could be characterized as fairly neutral, this model can have biased\npredictions. This bias will also affect all fine-tuned versions of this model. 
For an understanding of bias of this particular\ncheckpoint, please try out this checkpoint with the snippet present in the Limitation and bias section of the bert-base-uncased checkpoint.", "## Training data\nThe MultiBERTs models were pretrained on BookCorpus, a dataset consisting of 11,038\nunpublished books and English Wikipedia (excluding lists, tables and\nheaders).", "## Training procedure", "### Preprocessing\nThe texts are lowercased and tokenized using WordPiece and a vocabulary size of 30,000. The inputs of the model are\nthen of the form:\n\nWith probability 0.5, sentence A and sentence B correspond to two consecutive sentences in the original corpus and in\nthe other cases, it's another random sentence in the corpus. Note that what is considered a sentence here is a\nconsecutive span of text usually longer than a single sentence. The only constrain is that the result with the two\n\"sentences\" has a combined length of less than 512 tokens.\nThe details of the masking procedure for each sentence are the following:\n- 15% of the tokens are masked.\n- In 80% of the cases, the masked tokens are replaced by '[MASK]'.\n- In 10% of the cases, the masked tokens are replaced by a random token (different) from the one they replace.\n- In the 10% remaining cases, the masked tokens are left as is.", "### Pretraining\nThe full model was trained on 16 Cloud TPU v2 chips for two million steps with a batch size\nof 256. The sequence length was set to 512 throughout. The optimizer\nused is Adam with a learning rate of 1e-4, \\\\(\\beta_{1} = 0.9\\\\) and \\\\(\\beta_{2} = 0.999\\\\), a weight decay of 0.01,\nlearning rate warmup for 10,000 steps and linear decay of the learning rate after.", "### BibTeX entry and citation info\n\n<a href=\"URL\n\t<img width=\"300px\" src=\"URL\n</a>" ]
[ 71, 145, 335, 134, 25, 95, 48, 3, 222, 111, 34 ]
[ "passage: TAGS\n#transformers #pytorch #bert #pretraining #exbert #multiberts #multiberts-seed-4 #en #dataset-bookcorpus #dataset-wikipedia #arxiv-2106.16163 #license-apache-2.0 #endpoints_compatible #region-us \n# MultiBERTs Seed 4 Checkpoint 2000k (uncased)\nSeed 4 intermediate checkpoint 2000k MultiBERTs (pretrained BERT) model on English language using a masked language modeling (MLM) objective. It was introduced in\nthis paper and first released in\nthis repository. This is an intermediate checkpoint.\nThe final checkpoint can be found at multiberts-seed-4. This model is uncased: it does not make a difference\nbetween english and English.\n\nDisclaimer: The team releasing MultiBERTs did not write a model card for this model so this model card has been written by gchhablani.", "passage: ## Model description\nMultiBERTs models are transformers model pretrained on a large corpus of English data in a self-supervised fashion. This means it\nwas pretrained on the raw texts only, with no humans labelling them in any way (which is why it can use lots of\npublicly available data) with an automatic process to generate inputs and labels from those texts. More precisely, it\nwas pretrained with two objectives:\n- Masked language modeling (MLM): taking a sentence, the model randomly masks 15% of the words in the input then run\n the entire masked sentence through the model and has to predict the masked words. This is different from traditional\n recurrent neural networks (RNNs) that usually see the words one after the other, or from autoregressive models like\n GPT which internally mask the future tokens. It allows the model to learn a bidirectional representation of the\n sentence.\n- Next sentence prediction (NSP): the models concatenates two masked sentences as inputs during pretraining. Sometimes\n they correspond to sentences that were next to each other in the original text, sometimes not. The model then has to\n predict if the two sentences were following each other or not.\nThis way, the model learns an inner representation of the English language that can then be used to extract features\nuseful for downstream tasks: if you have a dataset of labeled sentences for instance, you can train a standard\nclassifier using the features produced by the MultiBERTs model as inputs.## Intended uses & limitations\nYou can use the raw model for either masked language modeling or next sentence prediction, but it's mostly intended to\nbe fine-tuned on a downstream task. See the model hub to look for\nfine-tuned versions on a task that interests you.\nNote that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked)\nto make decisions, such as sequence classification, token classification or question answering. For tasks such as text\ngeneration you should look at model like GPT2.### How to use\nHere is how to use this model to get the features of a given text in PyTorch:### Limitations and bias\nEven if the training data used for this model could be characterized as fairly neutral, this model can have biased\npredictions. This bias will also affect all fine-tuned versions of this model. 
For an understanding of bias of this particular\ncheckpoint, please try out this checkpoint with the snippet present in the Limitation and bias section of the bert-base-uncased checkpoint.## Training data\nThe MultiBERTs models were pretrained on BookCorpus, a dataset consisting of 11,038\nunpublished books and English Wikipedia (excluding lists, tables and\nheaders).## Training procedure" ]
[ -0.0852508544921875, 0.005606972612440586, -0.0020704411435872316, 0.06860895454883575, 0.08485543727874756, 0.0043593766167759895, 0.117061547935009, 0.047630034387111664, -0.033690277487039566, 0.022502969950437546, 0.09610021114349365, 0.025485556572675705, 0.040925733745098114, 0.06296315789222717, 0.09465186297893524, -0.2582114338874817, 0.05043751001358032, -0.06242441385984421, 0.057024791836738586, 0.07494020462036133, 0.0987866148352623, -0.07406128942966461, 0.06252847611904144, 0.036335013806819916, -0.0882357582449913, -0.016453521326184273, -0.01808246038854122, -0.034810151904821396, 0.09991468489170074, 0.07074379920959473, 0.06441234797239304, 0.00251179002225399, 0.0575614757835865, -0.0851832777261734, 0.016426412388682365, 0.04361102730035782, -0.001554886344820261, 0.025690041482448578, -0.008154511451721191, 0.015832580626010895, 0.10806240141391754, 0.0382566824555397, 0.07733301818370819, 0.03244664520025253, -0.09473618865013123, -0.11842580139636993, -0.08163123577833176, 0.1019941121339798, 0.051926665008068085, 0.04184190183877945, -0.005604330450296402, 0.07347579300403595, -0.03095180355012417, 0.07473316788673401, 0.10095681250095367, -0.2585602104663849, -0.010511951521039009, 0.06687980145215988, 0.04541604220867157, 0.04444611817598343, 0.009473508223891258, 0.027142129838466644, 0.005420137196779251, 0.04356808960437775, 0.02924957126379013, -0.02433234453201294, 0.11221306025981903, -0.04188016429543495, -0.15020537376403809, -0.041540972888469696, 0.1244642585515976, -0.007037371397018433, -0.12455275654792786, -0.10028134286403656, -0.028043827041983604, 0.11513835191726685, -0.004253986291587353, -0.016549166291952133, -0.0022075711749494076, 0.008887186646461487, 0.020328231155872345, -0.09353187680244446, -0.08630210161209106, -0.026765208691358566, -0.03750142827630043, 0.13034549355506897, 0.04661554843187332, 0.051511846482753754, -0.03330884501338005, 0.08452346175909042, -0.11829537898302078, -0.03799969702959061, -0.05286702513694763, -0.08241404592990875, -0.0207123514264822, 0.01123868953436613, -0.03180379047989845, -0.08148307353258133, -0.0577247329056263, 0.11808334290981293, 0.035856351256370544, 0.030613206326961517, 0.0000670161098241806, 0.03983527421951294, 0.0718480721116066, 0.09786219894886017, -0.040663909167051315, 0.04708336293697357, 0.03548513352870941, -0.02409037947654724, 0.05624539405107498, -0.05120693892240524, -0.10200244188308716, 0.0792645514011383, -0.0010564019903540611, 0.03787912428379059, 0.023764528334140778, 0.036859557032585144, -0.007495506666600704, -0.07123056799173355, 0.1632673442363739, -0.07954802364110947, -0.00929277390241623, -0.016252251341938972, 0.010305752977728844, 0.04459541290998459, 0.03243362158536911, -0.006204245612025261, -0.047099921852350235, -0.008557463996112347, -0.05589967221021652, -0.027616754174232483, -0.057577185332775116, -0.12068386375904083, -0.0012554656714200974, -0.03612983971834183, -0.030756523832678795, -0.1382685750722885, -0.21569684147834778, -0.020995208993554115, 0.06497301161289215, -0.0008856602944433689, -0.009137572720646858, 0.024017395451664925, 0.013711614534258842, -0.020978115499019623, 0.010684904642403126, -0.04065970331430435, 0.0008553564548492432, -0.008052150718867779, -0.032852839678525925, 0.05484654754400253, -0.04469417780637741, 0.024075986817479134, -0.06707242131233215, 0.02054728753864765, -0.20338872075080872, 0.08693031221628189, -0.03318442031741142, 0.00269908644258976, -0.03577718511223793, -0.04509560763835907, 0.01101171225309372, 
0.04807315766811371, -0.010544558987021446, 0.11378755420446396, -0.1349736750125885, -0.05085016041994095, 0.1783398985862732, -0.1575455367565155, -0.0030447766184806824, 0.09741803258657455, -0.05149226635694504, 0.054650481790304184, 0.13464918732643127, 0.09717763215303421, 0.0807088240981102, -0.06969457864761353, 0.011893666349351406, 0.059084489941596985, -0.06644176691770554, 0.054429277777671814, 0.0896630510687828, -0.022748958319425583, -0.13655054569244385, 0.03199192136526108, -0.07720981538295746, -0.009739353321492672, -0.024952787905931473, -0.02078927494585514, 0.008541103452444077, -0.03812585026025772, 0.026995085179805756, 0.006971358321607113, 0.018737493082880974, -0.04041985049843788, -0.0830783098936081, 0.03140769898891449, 0.07521510124206543, -0.07336302846670151, 0.04042991250753403, -0.06985297799110413, 0.059035107493400574, -0.07557738572359085, -0.004561505280435085, -0.16697284579277039, -0.02821188047528267, 0.04404229670763016, -0.04654465988278389, 0.051163092255592346, 0.09053002297878265, 0.0036954625975340605, 0.12196449190378189, -0.04039166867733002, 0.0036126687191426754, -0.005377292633056641, -0.00916379876434803, -0.049549028277397156, -0.12242992222309113, -0.08209313452243805, -0.06835081428289413, 0.10560188442468643, -0.07758806645870209, 0.02844581939280033, -0.06747711449861526, -0.022598331794142723, -0.010564029216766357, -0.057992301881313324, -0.003924120217561722, 0.010575526393949986, -0.027843132615089417, -0.046893540769815445, 0.04846762865781784, 0.05182606726884842, -0.0602380633354187, 0.07388109713792801, -0.10263825953006744, -0.059021852910518646, 0.057059839367866516, 0.017653506249189377, -0.07999730855226517, 0.09082421660423279, -0.019603818655014038, -0.012792147696018219, -0.05359262973070145, -0.0415826290845871, 0.19810566306114197, -0.025443483144044876, 0.1024804413318634, -0.09221181273460388, 0.000962881080340594, 0.026876993477344513, -0.04408466815948486, -0.015701401978731155, 0.05817031487822533, 0.04488351568579674, -0.18594549596309662, 0.015325088053941727, 0.046682268381118774, 0.07380357384681702, 0.11289148032665253, 0.02608385495841503, -0.022832147777080536, -0.04648952558636665, -0.010765855200588703, 0.004575381055474281, 0.058377236127853394, -0.02741403505206108, -0.010859103873372078, 0.03243470937013626, 0.05481728911399841, 0.019924497231841087, -0.08130453526973724, 0.03709658607840538, 0.06446093320846558, -0.015076510608196259, -0.041167251765728, -0.02523666061460972, -0.06001593917608261, 0.06150316447019577, 0.053870365023612976, 0.03592213988304138, 0.02677733823657036, -0.015526359900832176, -0.13544584810733795, 0.18958425521850586, -0.11707510054111481, -0.2585238516330719, -0.10616879910230637, -0.06163869798183441, -0.02892855741083622, 0.03983919322490692, 0.05716492980718613, -0.029772618785500526, -0.044148195534944534, -0.1146201565861702, 0.05645973980426788, -0.06655138731002808, -0.029153913259506226, -0.01107039861381054, -0.05211936682462692, -0.020104309543967247, -0.12677118182182312, -0.009850108996033669, -0.032026879489421844, -0.07400732487440109, 0.006311643868684769, -0.03356313705444336, 0.030079549178481102, 0.1354275494813919, 0.03562750294804573, -0.017331000417470932, -0.017559032887220383, 0.19090357422828674, 0.009022912010550499, 0.06024182215332985, 0.11050423234701157, -0.030365245416760445, 0.05148422718048096, 0.045595403760671616, 0.0261368528008461, -0.046680085361003876, 0.009523630142211914, -0.01835332065820694, -0.12011183798313141, 
-0.17420484125614166, -0.07123396545648575, -0.005479513667523861, -0.0007510941941291094, 0.019640618935227394, 0.0362422801554203, 0.02519930899143219, 0.04009381681680679, -0.029585672542452812, 0.029198436066508293, -0.010428421199321747, 0.08098696172237396, 0.028795868158340454, -0.07478246837854385, 0.09353017807006836, -0.0621534027159214, 0.016759686172008514, 0.10853195190429688, -0.06271049380302429, 0.19298359751701355, 0.02590511366724968, 0.05672464519739151, 0.10379701107740402, 0.01970857009291649, 0.05686883628368378, 0.08823096752166748, -0.04870002716779709, 0.006112468428909779, -0.060380809009075165, -0.052101604640483856, -0.03315851464867592, 0.05068827047944069, 0.027564184740185738, 0.01648663356900215, -0.11804857850074768, 0.012565534561872482, -0.0025624565314501524, 0.1359979808330536, 0.04650140181183815, -0.12473168969154358, -0.12359194457530975, 0.03353934735059738, -0.04413846880197525, -0.0635494664311409, 0.027761418372392654, 0.058344729244709015, -0.15233716368675232, 0.04606079310178757, -0.005977722816169262, 0.0653127133846283, -0.09203201532363892, 0.015522398054599762, -0.041400305926799774, 0.0022296980023384094, 0.006008162163197994, 0.07020096480846405, -0.1293165236711502, 0.10706331580877304, 0.021518677473068237, 0.05014422535896301, -0.0808480829000473, 0.01726718433201313, -0.009563401341438293, 0.10931755602359772, 0.11445893347263336, 0.042698293924331665, -0.05468571186065674, -0.014123110100626945, -0.04198765009641647, 0.021946003660559654, 0.060137778520584106, -0.0738511011004448, 0.06267494708299637, 0.010514331981539726, 0.006621042266488075, -0.021591879427433014, 0.014666762202978134, -0.13262195885181427, -0.1267392784357071, 0.0638304278254509, -0.0743284747004509, -0.09790273010730743, -0.05738876387476921, -0.06366167962551117, -0.0527786910533905, 0.21587252616882324, -0.1177176833152771, -0.09061168879270554, -0.10070979595184326, -0.01036316528916359, 0.04892445728182793, -0.06457039713859558, 0.04894997179508209, -0.03876669332385063, 0.09371651709079742, -0.04563237726688385, -0.10967318713665009, 0.03252169117331505, -0.1127767562866211, -0.11360732465982437, -0.042762391269207, 0.10493189096450806, 0.11258558183908463, 0.03594478964805603, 0.01236155815422535, 0.015220747329294682, -0.0034376420080661774, -0.11836066097021103, 0.013110069558024406, 0.1320936679840088, 0.00017946772277355194, 0.07061607390642166, -0.06343049556016922, 0.028585202991962433, -0.012867247685790062, 0.0016131922602653503, 0.1320783793926239, 0.1822603940963745, -0.06452609598636627, 0.1753823161125183, 0.20018619298934937, -0.10557456314563751, -0.19127291440963745, -0.05493304505944252, -0.0004360564053058624, 0.047074586153030396, 0.04543724283576012, -0.1884351372718811, 0.0911770612001419, 0.03280656412243843, -0.030795056372880936, 0.015620917081832886, -0.2278042435646057, -0.10948897898197174, 0.08982677012681961, 0.05651916190981865, 0.18727457523345947, -0.08105394244194031, -0.039973802864551544, -0.017232872545719147, -0.031362906098365784, 0.04436840862035751, -0.042711496353149414, 0.09059102088212967, 0.006546815857291222, -0.028754938393831253, 0.0003669802099466324, -0.03112151473760605, 0.09663712978363037, 0.040088873356580734, 0.023074042052030563, -0.06923361867666245, -0.008261583745479584, 0.11186531186103821, -0.03870176523923874, 0.09768323600292206, 0.04317725449800491, 0.07558146119117737, -0.09565870463848114, -0.05885433033108711, -0.07514302432537079, 0.04392888396978378, -0.04108544811606407, 
-0.05277373641729355, -0.06431706994771957, 0.05885155498981476, 0.03724661469459534, 0.010821695439517498, -0.0007140487432479858, -0.03797847032546997, 0.042926087975502014, 0.08721211552619934, 0.0821162760257721, -0.03362347185611725, -0.0714152604341507, -0.05074906349182129, -0.04830821231007576, 0.0652577355504036, -0.09467290341854095, 0.016581639647483826, 0.02739780768752098, 0.011939264833927155, 0.0905621275305748, 0.03410693630576134, -0.13737329840660095, 0.00975446030497551, 0.0336526483297348, -0.12479458004236221, -0.1041094958782196, -0.019209060817956924, 0.026386279612779617, -0.03780057653784752, 0.05436079576611519, 0.14706362783908844, -0.035689905285835266, -0.031194262206554413, -0.04877185821533203, 0.03663024306297302, -0.01865614578127861, 0.04920025169849396, 0.06469959020614624, 0.02942037768661976, -0.07531234622001648, 0.07278822362422943, 0.04097851365804672, -0.033581189811229706, 0.04190994054079056, 0.04144993796944618, -0.09643886238336563, -0.07948189228773117, -0.06099429726600647, 0.09314555674791336, -0.019026001915335655, -0.049382295459508896, -0.0013853851705789566, -0.08353249728679657, 0.06771580129861832, 0.07248318940401077, 0.047249436378479004, 0.03693205863237381, -0.086959108710289, 0.014396128244698048, -0.05364278703927994, 0.03623994439840317, -0.03097355179488659, -0.004611184820532799, -0.05434556305408478, 0.06374616175889969, 0.06475155055522919, 0.09425266832113266, -0.034245047718286514, -0.0753931924700737, -0.08381102234125137, -0.013694720342755318, -0.061007581651210785, -0.03317459300160408, -0.0750180184841156, -0.005908668041229248, 0.0013529779389500618, -0.0032317079603672028, 0.019702373072504997, 0.03710033744573593, -0.04444318264722824, -0.018101103603839874, -0.03439575061202049, 0.038805026561021805, -0.06038527935743332, 0.006227185018360615, 0.013505038805305958, -0.03753764182329178, 0.09218062460422516, 0.0370749868452549, -0.011468857526779175, 0.04840681329369545, -0.024192342534661293, 0.032318148761987686, -0.021584533154964447, 0.002011593198403716, -0.022223837673664093, -0.11013486981391907, -0.004966124426573515, 0.007196443155407906, -0.024499814957380295, 0.011197317391633987, 0.05994175374507904, -0.07047681510448456, 0.08199744671583176, 0.045673713088035583, -0.027069080621004105, -0.07128465175628662, 0.04187513142824173, -0.013758944347500801, 0.03095516748726368, 0.06886573135852814, -0.03305324912071228, 0.05272848904132843, -0.09860360622406006, -0.02809831127524376, 0.0010408139787614346, -0.008410301059484482, -0.013601589947938919, -0.05499876290559769, -0.0016944706439971924, 0.006478847935795784, 0.18156130611896515, -0.022995326668024063, 0.035739973187446594, 0.012255383655428886, 0.010163982398808002, 0.050944652408361435, -0.011642197147011757, 0.0729440301656723, -0.004701657220721245, -0.025172503665089607, -0.013524788431823254, 0.03856224566698074, 0.002319982275366783, 0.0022135376930236816, 0.13778477907180786, 0.04751020669937134, 0.09434006363153458, 0.0760226845741272, 0.017831899225711823, 0.015767313539981842, -0.1221356987953186, -0.0844038724899292, 0.007692950777709484, 0.060417935252189636, -0.019903574138879776, 0.014051873236894608, 0.09082789719104767, -0.08906476944684982, 0.07098347693681717, 0.050270818173885345, -0.048226743936538696, -0.12301041930913925, -0.18868982791900635, -0.026594344526529312, -0.029250385239720345, -0.011303926818072796, -0.09264791011810303, 0.0138383898884058, 0.0885639414191246, 0.024844685569405556, -0.00921213161200285, 
0.09222816675901413, -0.1038065180182457, -0.030724497511982918, 0.043476786464452744, -0.027199091389775276, 0.01497408002614975, 0.04821092635393143, 0.02259877324104309, -0.0038940701633691788, 0.042691778391599655, 0.040593504905700684, 0.04301945120096207, 0.02929437905550003, 0.05160215497016907, -0.023098234087228775, -0.07302267849445343, -0.03216485679149628, -0.0028572529554367065, 0.05569600313901901, 0.13256336748600006, 0.02216258831322193, -0.07230435311794281, 0.00660280417650938, 0.10452037304639816, -0.030977463349699974, -0.05045345425605774, -0.10915923118591309, 0.23727427423000336, 0.021244212985038757, 0.000049598515033721924, -0.0047526429407298565, -0.045461587607860565, 0.006131729111075401, 0.21587373316287994, 0.22691798210144043, 0.0037343460135161877, -0.010406707413494587, 0.010911471210420132, -0.011778242886066437, 0.03568446263670921, 0.14561007916927338, 0.004199335351586342, 0.25133296847343445, -0.044469453394412994, 0.03649817407131195, -0.04245595633983612, -0.039360567927360535, -0.09968419373035431, 0.07526511698961258, -0.008210955187678337, 0.0037920945324003696, -0.031160864979028702, 0.07163500040769577, -0.036763619631528854, -0.17661070823669434, 0.004499754868447781, -0.002164125442504883, -0.06487882882356644, 0.010642590932548046, 0.0012606317177414894, 0.020693670958280563, 0.08271251618862152, -0.017863444983959198, -0.007325812242925167, 0.1314564198255539, 0.018601739779114723, -0.09725736081600189, -0.05915086716413498, 0.11770888417959213, 0.012789213098585606, 0.14066606760025024, 0.01077721081674099, 0.08384665846824646, 0.0881313756108284, 0.0207015722990036, -0.09579510986804962, 0.042169950902462006, -0.019757285714149475, -0.033204954117536545, 0.009819903410971165, 0.11020427942276001, -0.008579986169934273, 0.0582604818046093, 0.025363972410559654, -0.0845678299665451, 0.06320050358772278, 0.01005510613322258, -0.038520462810993195, -0.07985766232013702, 0.08364799618721008, -0.09153734892606735, 0.1566961407661438, 0.12455782294273376, -0.014016354456543922, -0.04462199658155441, -0.0283501036465168, 0.021317727863788605, 0.00014499248936772346, 0.05611777678132057, -0.024550314992666245, -0.13501393795013428, 0.019733594730496407, -0.08658012747764587, 0.026486024260520935, -0.24905768036842346, -0.08756352961063385, 0.028000250458717346, -0.018166471272706985, -0.016582202166318893, 0.05150424316525459, 0.04808611422777176, 0.024382025003433228, -0.036645952612161636, 0.02507924474775791, -0.03733894228935242, 0.05630626529455185, -0.10895645618438721, -0.09317014366388321 ]
null
null
transformers
# MultiBERTs Seed 4 Checkpoint 200k (uncased)
Seed 4 intermediate checkpoint 200k MultiBERTs (pretrained BERT) model on the English language using a masked language modeling (MLM) objective. It was introduced in [this paper](https://arxiv.org/pdf/2106.16163.pdf) and first released in [this repository](https://github.com/google-research/language/tree/master/language/multiberts). This is an intermediate checkpoint. The final checkpoint can be found at [multiberts-seed-4](https://hf.co/multberts-seed-4). This model is uncased: it does not make a difference between english and English.

Disclaimer: The team releasing MultiBERTs did not write a model card for this model, so this model card has been written by [gchhablani](https://hf.co/gchhablani).

## Model description
MultiBERTs models are transformer models pretrained on a large corpus of English data in a self-supervised fashion. This means they were pretrained on the raw texts only, with no humans labelling them in any way (which is why they can use lots of publicly available data), with an automatic process to generate inputs and labels from those texts. More precisely, they were pretrained with two objectives:
- Masked language modeling (MLM): taking a sentence, the model randomly masks 15% of the words in the input, then runs the entire masked sentence through the model and has to predict the masked words. This is different from traditional recurrent neural networks (RNNs) that usually see the words one after the other, or from autoregressive models like GPT which internally mask the future tokens. It allows the model to learn a bidirectional representation of the sentence.
- Next sentence prediction (NSP): the model concatenates two masked sentences as inputs during pretraining. Sometimes they correspond to sentences that were next to each other in the original text, sometimes not. The model then has to predict if the two sentences were following each other or not.

This way, the model learns an inner representation of the English language that can then be used to extract features useful for downstream tasks: if you have a dataset of labeled sentences, for instance, you can train a standard classifier using the features produced by the MultiBERTs model as inputs.

## Intended uses & limitations
You can use the raw model for either masked language modeling or next sentence prediction, but it's mostly intended to be fine-tuned on a downstream task. See the [model hub](https://huggingface.co/models?filter=multiberts) to look for fine-tuned versions on a task that interests you. Note that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked) to make decisions, such as sequence classification, token classification or question answering. For tasks such as text generation you should look at models like GPT2.

### How to use
Here is how to use this model to get the features of a given text in PyTorch:

```python
from transformers import BertTokenizer, BertModel
tokenizer = BertTokenizer.from_pretrained('multiberts-seed-4-200k')
model = BertModel.from_pretrained("multiberts-seed-4-200k")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='pt')
output = model(**encoded_input)
```

### Limitations and bias
Even if the training data used for this model could be characterized as fairly neutral, this model can have biased predictions. This bias will also affect all fine-tuned versions of this model.
For an understanding of the bias of this particular checkpoint, please try out this checkpoint with the snippet present in the [Limitations and bias section](https://huggingface.co/bert-base-uncased#limitations-and-bias) of the [bert-base-uncased](https://huggingface.co/bert-base-uncased) checkpoint.

## Training data
The MultiBERTs models were pretrained on [BookCorpus](https://yknzhu.wixsite.com/mbweb), a dataset consisting of 11,038 unpublished books, and [English Wikipedia](https://en.wikipedia.org/wiki/English_Wikipedia) (excluding lists, tables and headers).

## Training procedure

### Preprocessing
The texts are lowercased and tokenized using WordPiece and a vocabulary size of 30,000. The inputs of the model are then of the form:

```
[CLS] Sentence A [SEP] Sentence B [SEP]
```

With probability 0.5, sentence A and sentence B correspond to two consecutive sentences in the original corpus; in the other cases, sentence B is another random sentence in the corpus. Note that what is considered a sentence here is a consecutive span of text usually longer than a single sentence. The only constraint is that the result with the two "sentences" has a combined length of less than 512 tokens.

The details of the masking procedure for each sentence are the following:
- 15% of the tokens are masked.
- In 80% of the cases, the masked tokens are replaced by `[MASK]`.
- In 10% of the cases, the masked tokens are replaced by a random token (different from the one they replace).
- In the 10% remaining cases, the masked tokens are left as is.

A minimal illustrative sketch of this masking step is given after the citation info below.

### Pretraining
The full model was trained on 16 Cloud TPU v2 chips for two million steps with a batch size of 256. The sequence length was set to 512 throughout. The optimizer used is Adam with a learning rate of 1e-4, \\(\beta_{1} = 0.9\\) and \\(\beta_{2} = 0.999\\), a weight decay of 0.01, learning rate warmup for 10,000 steps and linear decay of the learning rate after.

### BibTeX entry and citation info
```bibtex
@article{DBLP:journals/corr/abs-2106-16163,
  author     = {Thibault Sellam and Steve Yadlowsky and Jason Wei and Naomi Saphra and Alexander D'Amour and Tal Linzen and Jasmijn Bastings and Iulia Turc and Jacob Eisenstein and Dipanjan Das and Ian Tenney and Ellie Pavlick},
  title      = {The MultiBERTs: {BERT} Reproductions for Robustness Analysis},
  journal    = {CoRR},
  volume     = {abs/2106.16163},
  year       = {2021},
  url        = {https://arxiv.org/abs/2106.16163},
  eprinttype = {arXiv},
  eprint     = {2106.16163},
  timestamp  = {Mon, 05 Jul 2021 15:15:50 +0200},
  biburl     = {https://dblp.org/rec/journals/corr/abs-2106-16163.bib},
  bibsource  = {dblp computer science bibliography, https://dblp.org}
}
```

<a href="https://huggingface.co/exbert/?model=multiberts">
	<img width="300px" src="https://cdn-media.huggingface.co/exbert/button.png">
</a>
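The masking procedure described in the Preprocessing section can be made concrete with a short, self-contained sketch. This is not the original pretraining code: the toy vocabulary, the `mask_tokens` helper, and its arguments are placeholders used only to illustrate the 15% selection rate and the 80%/10%/10% replacement split described above.

```python
import random

def mask_tokens(tokens, vocab, mask_token="[MASK]", mlm_prob=0.15, seed=None):
    """Toy sketch of BERT-style masking: select ~15% of tokens, then apply 80/10/10."""
    rng = random.Random(seed)
    masked = list(tokens)
    labels = [None] * len(tokens)  # prediction targets exist only at selected positions
    for i, tok in enumerate(tokens):
        if rng.random() < mlm_prob:        # select roughly 15% of the tokens
            labels[i] = tok
            r = rng.random()
            if r < 0.8:                    # 80% of selected tokens -> [MASK]
                masked[i] = mask_token
            elif r < 0.9:                  # 10% of selected tokens -> a random token
                masked[i] = rng.choice(vocab)
            # remaining 10% of selected tokens are left unchanged
    return masked, labels

example = "the quick brown fox jumps over the lazy dog".split()
print(mask_tokens(example, vocab=example, seed=0))
```

In the actual pretraining pipeline the same scheme operates on WordPiece tokens rather than whole words, and special tokens such as [CLS] and [SEP] are not selected for masking.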
{"language": "en", "license": "apache-2.0", "tags": ["exbert", "multiberts", "multiberts-seed-4"], "datasets": ["bookcorpus", "wikipedia"]}
null
MultiBertGunjanPatrick/multiberts-seed-4-200k
[ "transformers", "pytorch", "bert", "pretraining", "exbert", "multiberts", "multiberts-seed-4", "en", "dataset:bookcorpus", "dataset:wikipedia", "arxiv:2106.16163", "license:apache-2.0", "endpoints_compatible", "region:us" ]
2022-03-02T23:29:04+00:00
[ "2106.16163" ]
[ "en" ]
TAGS #transformers #pytorch #bert #pretraining #exbert #multiberts #multiberts-seed-4 #en #dataset-bookcorpus #dataset-wikipedia #arxiv-2106.16163 #license-apache-2.0 #endpoints_compatible #region-us
# MultiBERTs Seed 4 Checkpoint 200k (uncased) Seed 4 intermediate checkpoint 200k MultiBERTs (pretrained BERT) model on English language using a masked language modeling (MLM) objective. It was introduced in this paper and first released in this repository. This is an intermediate checkpoint. The final checkpoint can be found at multiberts-seed-4. This model is uncased: it does not make a difference between english and English. Disclaimer: The team releasing MultiBERTs did not write a model card for this model so this model card has been written by gchhablani. ## Model description MultiBERTs models are transformers model pretrained on a large corpus of English data in a self-supervised fashion. This means it was pretrained on the raw texts only, with no humans labelling them in any way (which is why it can use lots of publicly available data) with an automatic process to generate inputs and labels from those texts. More precisely, it was pretrained with two objectives: - Masked language modeling (MLM): taking a sentence, the model randomly masks 15% of the words in the input then run the entire masked sentence through the model and has to predict the masked words. This is different from traditional recurrent neural networks (RNNs) that usually see the words one after the other, or from autoregressive models like GPT which internally mask the future tokens. It allows the model to learn a bidirectional representation of the sentence. - Next sentence prediction (NSP): the models concatenates two masked sentences as inputs during pretraining. Sometimes they correspond to sentences that were next to each other in the original text, sometimes not. The model then has to predict if the two sentences were following each other or not. This way, the model learns an inner representation of the English language that can then be used to extract features useful for downstream tasks: if you have a dataset of labeled sentences for instance, you can train a standard classifier using the features produced by the MultiBERTs model as inputs. ## Intended uses & limitations You can use the raw model for either masked language modeling or next sentence prediction, but it's mostly intended to be fine-tuned on a downstream task. See the model hub to look for fine-tuned versions on a task that interests you. Note that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked) to make decisions, such as sequence classification, token classification or question answering. For tasks such as text generation you should look at model like GPT2. ### How to use Here is how to use this model to get the features of a given text in PyTorch: ### Limitations and bias Even if the training data used for this model could be characterized as fairly neutral, this model can have biased predictions. This bias will also affect all fine-tuned versions of this model. For an understanding of bias of this particular checkpoint, please try out this checkpoint with the snippet present in the Limitation and bias section of the bert-base-uncased checkpoint. ## Training data The MultiBERTs models were pretrained on BookCorpus, a dataset consisting of 11,038 unpublished books and English Wikipedia (excluding lists, tables and headers). ## Training procedure ### Preprocessing The texts are lowercased and tokenized using WordPiece and a vocabulary size of 30,000. 
The inputs of the model are then of the form: With probability 0.5, sentence A and sentence B correspond to two consecutive sentences in the original corpus and in the other cases, it's another random sentence in the corpus. Note that what is considered a sentence here is a consecutive span of text usually longer than a single sentence. The only constrain is that the result with the two "sentences" has a combined length of less than 512 tokens. The details of the masking procedure for each sentence are the following: - 15% of the tokens are masked. - In 80% of the cases, the masked tokens are replaced by '[MASK]'. - In 10% of the cases, the masked tokens are replaced by a random token (different) from the one they replace. - In the 10% remaining cases, the masked tokens are left as is. ### Pretraining The full model was trained on 16 Cloud TPU v2 chips for two million steps with a batch size of 256. The sequence length was set to 512 throughout. The optimizer used is Adam with a learning rate of 1e-4, \\(\beta_{1} = 0.9\\) and \\(\beta_{2} = 0.999\\), a weight decay of 0.01, learning rate warmup for 10,000 steps and linear decay of the learning rate after. ### BibTeX entry and citation info <a href="URL <img width="300px" src="URL </a>
[ "# MultiBERTs Seed 4 Checkpoint 200k (uncased)\nSeed 4 intermediate checkpoint 200k MultiBERTs (pretrained BERT) model on English language using a masked language modeling (MLM) objective. It was introduced in\nthis paper and first released in\nthis repository. This is an intermediate checkpoint.\nThe final checkpoint can be found at multiberts-seed-4. This model is uncased: it does not make a difference\nbetween english and English.\n\nDisclaimer: The team releasing MultiBERTs did not write a model card for this model so this model card has been written by gchhablani.", "## Model description\nMultiBERTs models are transformers model pretrained on a large corpus of English data in a self-supervised fashion. This means it\nwas pretrained on the raw texts only, with no humans labelling them in any way (which is why it can use lots of\npublicly available data) with an automatic process to generate inputs and labels from those texts. More precisely, it\nwas pretrained with two objectives:\n- Masked language modeling (MLM): taking a sentence, the model randomly masks 15% of the words in the input then run\n the entire masked sentence through the model and has to predict the masked words. This is different from traditional\n recurrent neural networks (RNNs) that usually see the words one after the other, or from autoregressive models like\n GPT which internally mask the future tokens. It allows the model to learn a bidirectional representation of the\n sentence.\n- Next sentence prediction (NSP): the models concatenates two masked sentences as inputs during pretraining. Sometimes\n they correspond to sentences that were next to each other in the original text, sometimes not. The model then has to\n predict if the two sentences were following each other or not.\nThis way, the model learns an inner representation of the English language that can then be used to extract features\nuseful for downstream tasks: if you have a dataset of labeled sentences for instance, you can train a standard\nclassifier using the features produced by the MultiBERTs model as inputs.", "## Intended uses & limitations\nYou can use the raw model for either masked language modeling or next sentence prediction, but it's mostly intended to\nbe fine-tuned on a downstream task. See the model hub to look for\nfine-tuned versions on a task that interests you.\nNote that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked)\nto make decisions, such as sequence classification, token classification or question answering. For tasks such as text\ngeneration you should look at model like GPT2.", "### How to use\nHere is how to use this model to get the features of a given text in PyTorch:", "### Limitations and bias\nEven if the training data used for this model could be characterized as fairly neutral, this model can have biased\npredictions. This bias will also affect all fine-tuned versions of this model. For an understanding of bias of this particular\ncheckpoint, please try out this checkpoint with the snippet present in the Limitation and bias section of the bert-base-uncased checkpoint.", "## Training data\nThe MultiBERTs models were pretrained on BookCorpus, a dataset consisting of 11,038\nunpublished books and English Wikipedia (excluding lists, tables and\nheaders).", "## Training procedure", "### Preprocessing\nThe texts are lowercased and tokenized using WordPiece and a vocabulary size of 30,000. 
The inputs of the model are\nthen of the form:\n\nWith probability 0.5, sentence A and sentence B correspond to two consecutive sentences in the original corpus and in\nthe other cases, it's another random sentence in the corpus. Note that what is considered a sentence here is a\nconsecutive span of text usually longer than a single sentence. The only constrain is that the result with the two\n\"sentences\" has a combined length of less than 512 tokens.\nThe details of the masking procedure for each sentence are the following:\n- 15% of the tokens are masked.\n- In 80% of the cases, the masked tokens are replaced by '[MASK]'.\n- In 10% of the cases, the masked tokens are replaced by a random token (different) from the one they replace.\n- In the 10% remaining cases, the masked tokens are left as is.", "### Pretraining\nThe full model was trained on 16 Cloud TPU v2 chips for two million steps with a batch size\nof 256. The sequence length was set to 512 throughout. The optimizer\nused is Adam with a learning rate of 1e-4, \\\\(\\beta_{1} = 0.9\\\\) and \\\\(\\beta_{2} = 0.999\\\\), a weight decay of 0.01,\nlearning rate warmup for 10,000 steps and linear decay of the learning rate after.", "### BibTeX entry and citation info\n\n<a href=\"URL\n\t<img width=\"300px\" src=\"URL\n</a>" ]
[ "TAGS\n#transformers #pytorch #bert #pretraining #exbert #multiberts #multiberts-seed-4 #en #dataset-bookcorpus #dataset-wikipedia #arxiv-2106.16163 #license-apache-2.0 #endpoints_compatible #region-us \n", "# MultiBERTs Seed 4 Checkpoint 200k (uncased)\nSeed 4 intermediate checkpoint 200k MultiBERTs (pretrained BERT) model on English language using a masked language modeling (MLM) objective. It was introduced in\nthis paper and first released in\nthis repository. This is an intermediate checkpoint.\nThe final checkpoint can be found at multiberts-seed-4. This model is uncased: it does not make a difference\nbetween english and English.\n\nDisclaimer: The team releasing MultiBERTs did not write a model card for this model so this model card has been written by gchhablani.", "## Model description\nMultiBERTs models are transformers model pretrained on a large corpus of English data in a self-supervised fashion. This means it\nwas pretrained on the raw texts only, with no humans labelling them in any way (which is why it can use lots of\npublicly available data) with an automatic process to generate inputs and labels from those texts. More precisely, it\nwas pretrained with two objectives:\n- Masked language modeling (MLM): taking a sentence, the model randomly masks 15% of the words in the input then run\n the entire masked sentence through the model and has to predict the masked words. This is different from traditional\n recurrent neural networks (RNNs) that usually see the words one after the other, or from autoregressive models like\n GPT which internally mask the future tokens. It allows the model to learn a bidirectional representation of the\n sentence.\n- Next sentence prediction (NSP): the models concatenates two masked sentences as inputs during pretraining. Sometimes\n they correspond to sentences that were next to each other in the original text, sometimes not. The model then has to\n predict if the two sentences were following each other or not.\nThis way, the model learns an inner representation of the English language that can then be used to extract features\nuseful for downstream tasks: if you have a dataset of labeled sentences for instance, you can train a standard\nclassifier using the features produced by the MultiBERTs model as inputs.", "## Intended uses & limitations\nYou can use the raw model for either masked language modeling or next sentence prediction, but it's mostly intended to\nbe fine-tuned on a downstream task. See the model hub to look for\nfine-tuned versions on a task that interests you.\nNote that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked)\nto make decisions, such as sequence classification, token classification or question answering. For tasks such as text\ngeneration you should look at model like GPT2.", "### How to use\nHere is how to use this model to get the features of a given text in PyTorch:", "### Limitations and bias\nEven if the training data used for this model could be characterized as fairly neutral, this model can have biased\npredictions. This bias will also affect all fine-tuned versions of this model. 
For an understanding of bias of this particular\ncheckpoint, please try out this checkpoint with the snippet present in the Limitation and bias section of the bert-base-uncased checkpoint.", "## Training data\nThe MultiBERTs models were pretrained on BookCorpus, a dataset consisting of 11,038\nunpublished books and English Wikipedia (excluding lists, tables and\nheaders).", "## Training procedure", "### Preprocessing\nThe texts are lowercased and tokenized using WordPiece and a vocabulary size of 30,000. The inputs of the model are\nthen of the form:\n\nWith probability 0.5, sentence A and sentence B correspond to two consecutive sentences in the original corpus and in\nthe other cases, it's another random sentence in the corpus. Note that what is considered a sentence here is a\nconsecutive span of text usually longer than a single sentence. The only constrain is that the result with the two\n\"sentences\" has a combined length of less than 512 tokens.\nThe details of the masking procedure for each sentence are the following:\n- 15% of the tokens are masked.\n- In 80% of the cases, the masked tokens are replaced by '[MASK]'.\n- In 10% of the cases, the masked tokens are replaced by a random token (different) from the one they replace.\n- In the 10% remaining cases, the masked tokens are left as is.", "### Pretraining\nThe full model was trained on 16 Cloud TPU v2 chips for two million steps with a batch size\nof 256. The sequence length was set to 512 throughout. The optimizer\nused is Adam with a learning rate of 1e-4, \\\\(\\beta_{1} = 0.9\\\\) and \\\\(\\beta_{2} = 0.999\\\\), a weight decay of 0.01,\nlearning rate warmup for 10,000 steps and linear decay of the learning rate after.", "### BibTeX entry and citation info\n\n<a href=\"URL\n\t<img width=\"300px\" src=\"URL\n</a>" ]
[ 71, 145, 335, 134, 25, 95, 48, 3, 222, 111, 34 ]
[ "passage: TAGS\n#transformers #pytorch #bert #pretraining #exbert #multiberts #multiberts-seed-4 #en #dataset-bookcorpus #dataset-wikipedia #arxiv-2106.16163 #license-apache-2.0 #endpoints_compatible #region-us \n# MultiBERTs Seed 4 Checkpoint 200k (uncased)\nSeed 4 intermediate checkpoint 200k MultiBERTs (pretrained BERT) model on English language using a masked language modeling (MLM) objective. It was introduced in\nthis paper and first released in\nthis repository. This is an intermediate checkpoint.\nThe final checkpoint can be found at multiberts-seed-4. This model is uncased: it does not make a difference\nbetween english and English.\n\nDisclaimer: The team releasing MultiBERTs did not write a model card for this model so this model card has been written by gchhablani.", "passage: ## Model description\nMultiBERTs models are transformers model pretrained on a large corpus of English data in a self-supervised fashion. This means it\nwas pretrained on the raw texts only, with no humans labelling them in any way (which is why it can use lots of\npublicly available data) with an automatic process to generate inputs and labels from those texts. More precisely, it\nwas pretrained with two objectives:\n- Masked language modeling (MLM): taking a sentence, the model randomly masks 15% of the words in the input then run\n the entire masked sentence through the model and has to predict the masked words. This is different from traditional\n recurrent neural networks (RNNs) that usually see the words one after the other, or from autoregressive models like\n GPT which internally mask the future tokens. It allows the model to learn a bidirectional representation of the\n sentence.\n- Next sentence prediction (NSP): the models concatenates two masked sentences as inputs during pretraining. Sometimes\n they correspond to sentences that were next to each other in the original text, sometimes not. The model then has to\n predict if the two sentences were following each other or not.\nThis way, the model learns an inner representation of the English language that can then be used to extract features\nuseful for downstream tasks: if you have a dataset of labeled sentences for instance, you can train a standard\nclassifier using the features produced by the MultiBERTs model as inputs.## Intended uses & limitations\nYou can use the raw model for either masked language modeling or next sentence prediction, but it's mostly intended to\nbe fine-tuned on a downstream task. See the model hub to look for\nfine-tuned versions on a task that interests you.\nNote that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked)\nto make decisions, such as sequence classification, token classification or question answering. For tasks such as text\ngeneration you should look at model like GPT2.### How to use\nHere is how to use this model to get the features of a given text in PyTorch:### Limitations and bias\nEven if the training data used for this model could be characterized as fairly neutral, this model can have biased\npredictions. This bias will also affect all fine-tuned versions of this model. 
For an understanding of bias of this particular\ncheckpoint, please try out this checkpoint with the snippet present in the Limitation and bias section of the bert-base-uncased checkpoint.## Training data\nThe MultiBERTs models were pretrained on BookCorpus, a dataset consisting of 11,038\nunpublished books and English Wikipedia (excluding lists, tables and\nheaders).## Training procedure" ]
[ -0.08622348308563232, 0.003286927007138729, -0.002174131106585264, 0.06940005719661713, 0.08657844364643097, 0.00396447628736496, 0.11819275468587875, 0.04914819076657295, -0.027575742453336716, 0.024968955665826797, 0.09253846108913422, 0.02921254187822342, 0.042117830365896225, 0.06692838668823242, 0.09679655730724335, -0.2586340308189392, 0.049310699105262756, -0.06205134838819504, 0.05275755375623703, 0.07503627985715866, 0.09747849404811859, -0.07297720015048981, 0.06305214017629623, 0.03481964021921158, -0.0876607596874237, -0.016708645969629288, -0.016193555667996407, -0.03464863821864128, 0.10094577074050903, 0.06999410688877106, 0.061116576194763184, 0.0011701509356498718, 0.05865934491157532, -0.08530396223068237, 0.01616833172738552, 0.04388735070824623, -0.0018991404213011265, 0.025364167988300323, -0.006996070966124535, 0.017103902995586395, 0.10857035964727402, 0.03748545050621033, 0.07587124407291412, 0.03451868146657944, -0.09452249109745026, -0.12003053724765778, -0.0798531249165535, 0.1016584113240242, 0.05288891866803169, 0.04250534251332283, -0.006794478744268417, 0.0714229866862297, -0.029699094593524933, 0.07288187742233276, 0.1012774109840393, -0.2620943486690521, -0.009645795449614525, 0.0659429132938385, 0.04594047740101814, 0.04428376257419586, 0.010900025255978107, 0.027194801717996597, 0.006765890866518021, 0.04542560875415802, 0.028614934533834457, -0.02427605167031288, 0.12024139612913132, -0.044344402849674225, -0.150264710187912, -0.04152502492070198, 0.12432078272104263, -0.005922917276620865, -0.12555353343486786, -0.09987244755029678, -0.029636401683092117, 0.11789844930171967, -0.003458913415670395, -0.017744218930602074, -0.0031065833754837513, 0.010805916041135788, 0.023320013657212257, -0.09202456474304199, -0.08643824607133865, -0.026390913873910904, -0.03639810532331467, 0.12733346223831177, 0.04699812829494476, 0.0516020767390728, -0.034392841160297394, 0.08587664365768433, -0.11459772288799286, -0.03906518593430519, -0.05041888728737831, -0.08339989930391312, -0.017846304923295975, 0.011512704193592072, -0.0280141681432724, -0.08077068626880646, -0.05854194983839989, 0.11634942889213562, 0.03731037676334381, 0.030610762536525726, -0.0007871678099036217, 0.040524378418922424, 0.07133867591619492, 0.09433547407388687, -0.039875585585832596, 0.04990551620721817, 0.033948395401239395, -0.02249065786600113, 0.05682912841439247, -0.05145305022597313, -0.10243187844753265, 0.07738511264324188, -0.0005689095705747604, 0.03844829648733139, 0.023740477859973907, 0.03444778546690941, -0.010348903946578503, -0.07176018506288528, 0.16227173805236816, -0.07880683243274689, -0.008131870068609715, -0.016849882900714874, 0.011462315917015076, 0.04652515798807144, 0.030842574313282967, -0.00896439328789711, -0.048266444355249405, -0.007334686815738678, -0.056913912296295166, -0.0277998223900795, -0.05537910759449005, -0.11937525868415833, -0.0009915735572576523, -0.03867284953594208, -0.031567223370075226, -0.14041638374328613, -0.21082869172096252, -0.019921831786632538, 0.06404189765453339, -0.0030224695801734924, -0.010445588268339634, 0.024654073640704155, 0.014103906229138374, -0.020607708021998405, 0.010632401332259178, -0.045564159750938416, -0.000633876770734787, -0.0070806834846735, -0.031603895127773285, 0.058487676084041595, -0.03955822438001633, 0.022895777598023415, -0.06760356575250626, 0.021860651671886444, -0.20905843377113342, 0.08745954185724258, -0.032648470252752304, 0.002070862799882889, -0.03679147735238075, -0.045599762350320816, 
0.010696984827518463, 0.04788936674594879, -0.010370269417762756, 0.11738908290863037, -0.13917827606201172, -0.051853034645318985, 0.18461480736732483, -0.15787294507026672, -0.00164007768034935, 0.10026940703392029, -0.04885135591030121, 0.055418215692043304, 0.1347508728504181, 0.09832726418972015, 0.08343332260847092, -0.07213785499334335, 0.009888830594718456, 0.06069611385464668, -0.0671110674738884, 0.05448182672262192, 0.08826418220996857, -0.02378173917531967, -0.1376483142375946, 0.03034902736544609, -0.07138730585575104, -0.010168340988457203, -0.02522461675107479, -0.020722949877381325, 0.007193481549620628, -0.03707123175263405, 0.030976898968219757, 0.00576202804222703, 0.017141226679086685, -0.04183032736182213, -0.08309602737426758, 0.028267206624150276, 0.07485830783843994, -0.07260863482952118, 0.040129080414772034, -0.07163839787244797, 0.06225244328379631, -0.07625462114810944, -0.003048669546842575, -0.165713369846344, -0.02649623155593872, 0.044682636857032776, -0.046495020389556885, 0.05046164244413376, 0.09495490789413452, 0.003305048681795597, 0.12403570115566254, -0.039956554770469666, 0.0019252784550189972, -0.005813790485262871, -0.010635226964950562, -0.05126301199197769, -0.12284822762012482, -0.08287914097309113, -0.06736433506011963, 0.10131892561912537, -0.0784490555524826, 0.0288766548037529, -0.06735312193632126, -0.02123192884027958, -0.00914798490703106, -0.05688665062189102, -0.004296883940696716, 0.010181507095694542, -0.02670317329466343, -0.04709918424487114, 0.048570893704891205, 0.050373464822769165, -0.05959024280309677, 0.07523885369300842, -0.10500375926494598, -0.057130541652441025, 0.05717244744300842, 0.0163794606924057, -0.0792207419872284, 0.09343735128641129, -0.01976984180510044, -0.013413958251476288, -0.05638155713677406, -0.04361368343234062, 0.19281446933746338, -0.02420167252421379, 0.10068537294864655, -0.09045426547527313, 0.0023158255498856306, 0.02877383679151535, -0.045361194759607315, -0.015541824512183666, 0.05917590111494064, 0.04639395698904991, -0.18464511632919312, 0.014827415347099304, 0.05133061110973358, 0.07844582945108414, 0.11213590204715729, 0.026473745703697205, -0.023418940603733063, -0.04682539030909538, -0.010984924621880054, 0.005434890277683735, 0.05759883671998978, -0.024040963500738144, -0.009510945528745651, 0.03154043108224869, 0.056497592478990555, 0.01749272271990776, -0.08143100142478943, 0.03568020090460777, 0.06535889208316803, -0.016429120674729347, -0.039967041462659836, -0.024077581241726875, -0.06035008281469345, 0.06280981749296188, 0.05316423252224922, 0.03538861125707626, 0.027089089155197144, -0.014547266997396946, -0.13562379777431488, 0.18899008631706238, -0.11621976643800735, -0.2583657205104828, -0.10819332301616669, -0.05663147568702698, -0.025729043409228325, 0.04134406894445419, 0.057598598301410675, -0.03311603516340256, -0.0428350567817688, -0.11451072990894318, 0.05895475670695305, -0.06516556441783905, -0.030086208134889603, -0.013063138350844383, -0.051669180393218994, -0.0186460018157959, -0.12644222378730774, -0.011797280982136726, -0.029672373086214066, -0.07542919367551804, 0.007550175301730633, -0.03344550356268883, 0.02930106222629547, 0.13613204658031464, 0.035720616579055786, -0.019422294571995735, -0.016701065003871918, 0.19350865483283997, 0.010147660970687866, 0.06119236350059509, 0.113144651055336, -0.027797598391771317, 0.05301825702190399, 0.046402767300605774, 0.025910811498761177, -0.04927130043506622, 0.011085638776421547, -0.016147976741194725, 
-0.11964716017246246, -0.17603307962417603, -0.0706561952829361, -0.004378503654152155, 0.005774720571935177, 0.021008944138884544, 0.0363004095852375, 0.022717159241437912, 0.040644895285367966, -0.02993280254304409, 0.031952306628227234, -0.013264376670122147, 0.080132395029068, 0.02965674176812172, -0.07476337999105453, 0.09375922381877899, -0.06199220195412636, 0.01583808660507202, 0.10897386819124222, -0.06009119004011154, 0.1932709813117981, 0.025985974818468094, 0.057538073509931564, 0.10190850496292114, 0.022033516317605972, 0.055848199874162674, 0.08846937119960785, -0.04530758410692215, 0.006311981938779354, -0.06053426116704941, -0.05190437287092209, -0.03415577858686447, 0.05037732049822807, 0.029004134237766266, 0.017335906624794006, -0.12141018360853195, 0.018101368099451065, -0.0025532369036227465, 0.1356804370880127, 0.04930468276143074, -0.12220902740955353, -0.12423796951770782, 0.03541097044944763, -0.04480384290218353, -0.06366545706987381, 0.03005034476518631, 0.06350825726985931, -0.15353602170944214, 0.04470815882086754, -0.006628962233662605, 0.06558685004711151, -0.09344208240509033, 0.016576118767261505, -0.04391905292868614, 0.0006823176518082619, 0.005468117538839579, 0.07033121585845947, -0.13190312683582306, 0.10473033040761948, 0.021358348429203033, 0.05046817287802696, -0.08033283054828644, 0.01612187549471855, -0.009711311198771, 0.10909195989370346, 0.1155506819486618, 0.04313546046614647, -0.060019440948963165, -0.018092909827828407, -0.04443148523569107, 0.022526763379573822, 0.059096015989780426, -0.07723385840654373, 0.061371587216854095, 0.009508629329502583, 0.005759834311902523, -0.02353748120367527, 0.014646720141172409, -0.13344760239124298, -0.12344778329133987, 0.06184897571802139, -0.07628211379051208, -0.09638472646474838, -0.05663527548313141, -0.0639905259013176, -0.04663559049367905, 0.2120227813720703, -0.11553625762462616, -0.09138008952140808, -0.0983266830444336, -0.014201540499925613, 0.047823719680309296, -0.06534470617771149, 0.04657737910747528, -0.03587854281067848, 0.09347634017467499, -0.04550779610872269, -0.11071175336837769, 0.03419427201151848, -0.11312773823738098, -0.11441007256507874, -0.0435832217335701, 0.10689664632081985, 0.11476290971040726, 0.037274375557899475, 0.012845399789512157, 0.012688135728240013, -0.0006659906357526779, -0.11770069599151611, 0.01760830543935299, 0.1298196166753769, 0.002641601487994194, 0.06913459300994873, -0.06254330277442932, 0.029268406331539154, -0.015798494219779968, 0.0009686723351478577, 0.12979133427143097, 0.18209651112556458, -0.06464625895023346, 0.17373093962669373, 0.20045530796051025, -0.10560768842697144, -0.18999767303466797, -0.05392228439450264, -0.002340042032301426, 0.047068461775779724, 0.04657281935214996, -0.18842114508152008, 0.09185858815908432, 0.03251737728714943, -0.030668003484606743, 0.024418242275714874, -0.23295724391937256, -0.11044941842556, 0.08883287012577057, 0.05597352981567383, 0.1911259889602661, -0.08103131502866745, -0.04030386358499527, -0.017640337347984314, -0.036668263375759125, 0.04525527358055115, -0.04169020429253578, 0.0904671847820282, 0.006107144057750702, -0.02783060446381569, 0.0013536633923649788, -0.02973673678934574, 0.09698246419429779, 0.042105332016944885, 0.022563859820365906, -0.07027895748615265, -0.005874956026673317, 0.10763833671808243, -0.03827628120779991, 0.09789100289344788, 0.040305059403181076, 0.07343527674674988, -0.09780366718769073, -0.058887872844934464, -0.07593314349651337, 0.04350924491882324, 
-0.04182805120944977, -0.053689125925302505, -0.06272758543491364, 0.05803856998682022, 0.03625980019569397, 0.011100295931100845, -0.005575379356741905, -0.03712543845176697, 0.04698445647954941, 0.08667019754648209, 0.08311334997415543, -0.03429431468248367, -0.07651926577091217, -0.0501248836517334, -0.04851183667778969, 0.06804497539997101, -0.09334786236286163, 0.016917049884796143, 0.028629396110773087, 0.00941291730850935, 0.08969460427761078, 0.034447986632585526, -0.1370311826467514, 0.010993974283337593, 0.03435252234339714, -0.12505608797073364, -0.11096848547458649, -0.019373435527086258, 0.02570003643631935, -0.0373893678188324, 0.05551738664507866, 0.1474928855895996, -0.03665837645530701, -0.032453421503305435, -0.048861950635910034, 0.03696514293551445, -0.020642347633838654, 0.051039814949035645, 0.0653325542807579, 0.030969785526394844, -0.07362858951091766, 0.0726422518491745, 0.03882313147187233, -0.03809764236211777, 0.04306216537952423, 0.042160291224718094, -0.09637517482042313, -0.07983961701393127, -0.06015407294034958, 0.09136384725570679, -0.018264317885041237, -0.04826216399669647, -0.0003052968531847, -0.0839899480342865, 0.06793435662984848, 0.07098876684904099, 0.04639528691768646, 0.03748198598623276, -0.08794334530830383, 0.014258391223847866, -0.05418149754405022, 0.034958839416503906, -0.028777865692973137, -0.00506041944026947, -0.05649716407060623, 0.06651319563388824, 0.06336259841918945, 0.0984584391117096, -0.03420143574476242, -0.07507295906543732, -0.08464175462722778, -0.014095397666096687, -0.06562359631061554, -0.0324392169713974, -0.07493524253368378, -0.006804816424846649, 0.0003934181295335293, -0.001918848603963852, 0.022557826712727547, 0.03623066842556, -0.04453909397125244, -0.018360860645771027, -0.03382427990436554, 0.038879215717315674, -0.06146387383341789, 0.006670927628874779, 0.014885951764881611, -0.03839491680264473, 0.09233096241950989, 0.03712637722492218, -0.012178135104477406, 0.047315653413534164, -0.018232012167572975, 0.033401008695364, -0.020532185211777687, 0.00023880880326032639, -0.022171396762132645, -0.11009073257446289, -0.004349926486611366, 0.0045462436974048615, -0.024413876235485077, 0.010499056428670883, 0.057564206421375275, -0.07136102020740509, 0.08500529825687408, 0.04575216770172119, -0.02957175299525261, -0.07223349809646606, 0.041900914162397385, -0.013502230867743492, 0.02905070036649704, 0.06969279795885086, -0.03493950888514519, 0.05296633392572403, -0.09962378442287445, -0.029232703149318695, 0.0023969952017068863, -0.0073312073945999146, -0.009088132530450821, -0.05366145446896553, -0.0017131650820374489, 0.00543354544788599, 0.17269130051136017, -0.024077963083982468, 0.03717585653066635, 0.012258167378604412, 0.0066552432253956795, 0.045760076493024826, -0.01121150515973568, 0.07106973230838776, -0.006150882691144943, -0.025785990059375763, -0.014454146847128868, 0.037580836564302444, 0.002824355848133564, 0.005117066204547882, 0.13927602767944336, 0.043832115828990936, 0.09240712970495224, 0.07635508477687836, 0.017352644354104996, 0.016371851786971092, -0.13433659076690674, -0.08756815642118454, 0.006798348389565945, 0.06111996993422508, -0.018405985087156296, 0.013626089319586754, 0.09114868938922882, -0.08755528926849365, 0.0708867534995079, 0.046989720314741135, -0.04954688251018524, -0.1248539537191391, -0.1914266049861908, -0.02498740889132023, -0.030778052285313606, -0.011622708290815353, -0.09104464948177338, 0.014384130947291851, 0.09120580554008484, 0.024441028013825417, 
-0.011518527753651142, 0.09339523315429688, -0.10126295685768127, -0.030308611690998077, 0.04493202269077301, -0.026801742613315582, 0.013984817080199718, 0.04992258548736572, 0.024738332256674767, -0.006507314741611481, 0.04283341020345688, 0.03981509804725647, 0.04345190152525902, 0.026363149285316467, 0.052337631583213806, -0.024645401164889336, -0.07290232926607132, -0.03247467428445816, -0.0019342503510415554, 0.0542454719543457, 0.13449309766292572, 0.023423010483384132, -0.07282975316047668, 0.00680386321619153, 0.10723767429590225, -0.03190773352980614, -0.049354188144207, -0.10841131955385208, 0.24102894961833954, 0.020650357007980347, 0.000897538848221302, -0.00502792140468955, -0.04518917575478554, 0.005211029201745987, 0.21142858266830444, 0.2261984646320343, 0.004757422022521496, -0.010280057787895203, 0.009773130528628826, -0.011919593438506126, 0.036406613886356354, 0.14816297590732574, 0.00474722683429718, 0.25434496998786926, -0.0477265939116478, 0.039758630096912384, -0.04229751601815224, -0.03898752108216286, -0.10212548077106476, 0.07337890565395355, -0.009890934452414513, 0.0070238723419606686, -0.030171548947691917, 0.07217764854431152, -0.03768005594611168, -0.17284975945949554, 0.0045737335458397865, 0.0001762574538588524, -0.0632634162902832, 0.010678764432668686, -0.001951027661561966, 0.02043808437883854, 0.08458445966243744, -0.019128747284412384, -0.007613047491759062, 0.13490688800811768, 0.017971590161323547, -0.09734339267015457, -0.059051983058452606, 0.11573486030101776, 0.014931861311197281, 0.13944001495838165, 0.010955617763102055, 0.08060058951377869, 0.08739428222179413, 0.02199418842792511, -0.09525012969970703, 0.04094497114419937, -0.019719231873750687, -0.028807835653424263, 0.008918553590774536, 0.1085231676697731, -0.0085200946778059, 0.057110778987407684, 0.02684382162988186, -0.08734969049692154, 0.060529716312885284, 0.008907817304134369, -0.03401715308427811, -0.08076588064432144, 0.08537180721759796, -0.09232088178396225, 0.15630486607551575, 0.1242566704750061, -0.015492425300180912, -0.0463908314704895, -0.027824092656373978, 0.019741715863347054, 0.000604609027504921, 0.05443766340613365, -0.024966992437839508, -0.13307073712348938, 0.018858712166547775, -0.08311472833156586, 0.02706804871559143, -0.2502993047237396, -0.08893497288227081, 0.029493248090147972, -0.01770782843232155, -0.019068501889705658, 0.05182018131017685, 0.04584948718547821, 0.025978384539484978, -0.0373416543006897, 0.02180994115769863, -0.03824833780527115, 0.05878620222210884, -0.10996033251285553, -0.09357210993766785 ]
null
null
transformers
# MultiBERTs Seed 4 Checkpoint 20k (uncased) Seed 4 intermediate checkpoint 20k MultiBERTs (pretrained BERT) model on the English language using a masked language modeling (MLM) objective. It was introduced in [this paper](https://arxiv.org/pdf/2106.16163.pdf) and first released in [this repository](https://github.com/google-research/language/tree/master/language/multiberts). This is an intermediate checkpoint. The final checkpoint can be found at [multiberts-seed-4](https://hf.co/multberts-seed-4). This model is uncased: it does not make a difference between english and English. Disclaimer: The team releasing MultiBERTs did not write a model card for this model, so this model card has been written by [gchhablani](https://hf.co/gchhablani). ## Model description MultiBERTs models are transformers models pretrained on a large corpus of English data in a self-supervised fashion. This means they were pretrained on the raw texts only, with no humans labelling them in any way (which is why they can use lots of publicly available data), with an automatic process to generate inputs and labels from those texts. More precisely, they were pretrained with two objectives: - Masked language modeling (MLM): taking a sentence, the model randomly masks 15% of the words in the input, then runs the entire masked sentence through the model and has to predict the masked words. This is different from traditional recurrent neural networks (RNNs) that usually see the words one after the other, or from autoregressive models like GPT which internally mask the future tokens. It allows the model to learn a bidirectional representation of the sentence. - Next sentence prediction (NSP): the model concatenates two masked sentences as inputs during pretraining. Sometimes they correspond to sentences that were next to each other in the original text, sometimes not. The model then has to predict whether the two sentences followed each other or not. This way, the model learns an inner representation of the English language that can then be used to extract features useful for downstream tasks: if you have a dataset of labeled sentences, for instance, you can train a standard classifier using the features produced by the MultiBERTs model as inputs. ## Intended uses & limitations You can use the raw model for either masked language modeling or next sentence prediction, but it's mostly intended to be fine-tuned on a downstream task. See the [model hub](https://huggingface.co/models?filter=multiberts) to look for fine-tuned versions on a task that interests you. Note that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked) to make decisions, such as sequence classification, token classification or question answering. For tasks such as text generation you should look at models like GPT2. ### How to use Here is how to use this model to get the features of a given text in PyTorch: ```python from transformers import BertTokenizer, BertModel tokenizer = BertTokenizer.from_pretrained('multiberts-seed-4-20k') model = BertModel.from_pretrained("multiberts-seed-4-20k") text = "Replace me by any text you'd like." encoded_input = tokenizer(text, return_tensors='pt') output = model(**encoded_input) ``` ### Limitations and bias Even if the training data used for this model could be characterized as fairly neutral, this model can have biased predictions. This bias will also affect all fine-tuned versions of this model. 
For an understanding of bias of this particular checkpoint, please try out this checkpoint with the snippet present in the [Limitations and bias section](https://huggingface.co/bert-base-uncased#limitations-and-bias) of the [bert-base-uncased](https://huggingface.co/bert-base-uncased) checkpoint. ## Training data The MultiBERTs models were pretrained on [BookCorpus](https://yknzhu.wixsite.com/mbweb), a dataset consisting of 11,038 unpublished books, and [English Wikipedia](https://en.wikipedia.org/wiki/English_Wikipedia) (excluding lists, tables and headers). ## Training procedure ### Preprocessing The texts are lowercased and tokenized using WordPiece and a vocabulary size of 30,000. The inputs of the model are then of the form: ``` [CLS] Sentence A [SEP] Sentence B [SEP] ``` With probability 0.5, sentence A and sentence B correspond to two consecutive sentences in the original corpus; in the other cases, sentence B is another random sentence in the corpus. Note that what is considered a sentence here is a consecutive span of text usually longer than a single sentence. The only constraint is that the two "sentences" have a combined length of less than 512 tokens. The details of the masking procedure for each sentence are the following: - 15% of the tokens are masked. - In 80% of the cases, the masked tokens are replaced by `[MASK]`. - In 10% of the cases, the masked tokens are replaced by a random token (different from the one they replace). - In the remaining 10% of cases, the masked tokens are left as is. ### Pretraining The full model was trained on 16 Cloud TPU v2 chips for two million steps with a batch size of 256. The sequence length was set to 512 throughout. The optimizer used is Adam with a learning rate of 1e-4, \\(\beta_{1} = 0.9\\) and \\(\beta_{2} = 0.999\\), a weight decay of 0.01, learning rate warmup for 10,000 steps and linear decay of the learning rate after. ### BibTeX entry and citation info ```bibtex @article{DBLP:journals/corr/abs-2106-16163, author = {Thibault Sellam and Steve Yadlowsky and Jason Wei and Naomi Saphra and Alexander D'Amour and Tal Linzen and Jasmijn Bastings and Iulia Turc and Jacob Eisenstein and Dipanjan Das and Ian Tenney and Ellie Pavlick}, title = {The MultiBERTs: {BERT} Reproductions for Robustness Analysis}, journal = {CoRR}, volume = {abs/2106.16163}, year = {2021}, url = {https://arxiv.org/abs/2106.16163}, eprinttype = {arXiv}, eprint = {2106.16163}, timestamp = {Mon, 05 Jul 2021 15:15:50 +0200}, biburl = {https://dblp.org/rec/journals/corr/abs-2106-16163.bib}, bibsource = {dblp computer science bibliography, https://dblp.org} } ``` <a href="https://huggingface.co/exbert/?model=multiberts"> <img width="300px" src="https://cdn-media.huggingface.co/exbert/button.png"> </a>
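The 15% / 80-10-10 masking rule from the Preprocessing section above can be made concrete with a short PyTorch sketch. This is an illustrative re-implementation only, assuming the checkpoint name from the usage snippet above; the helper name `mask_tokens` is invented for this example, and the actual MultiBERTs pretraining used its own TensorFlow preprocessing pipeline, so treat this as a sketch of the idea rather than the original code.

```python
import torch
from transformers import BertTokenizer

# Illustrative sketch of the masking rule above: select 15% of the (non-special)
# tokens, then replace 80% of them with [MASK], 10% with a random token, and
# leave the remaining 10% unchanged. Loss is only computed on selected positions.
tokenizer = BertTokenizer.from_pretrained('multiberts-seed-4-20k')

def mask_tokens(input_ids, mlm_probability=0.15):
    labels = input_ids.clone()
    # Select 15% of the tokens, never masking special tokens like [CLS] / [SEP].
    probability_matrix = torch.full(labels.shape, mlm_probability)
    special_tokens_mask = torch.tensor(
        tokenizer.get_special_tokens_mask(labels.tolist(), already_has_special_tokens=True),
        dtype=torch.bool,
    )
    probability_matrix.masked_fill_(special_tokens_mask, value=0.0)
    masked_indices = torch.bernoulli(probability_matrix).bool()
    labels[~masked_indices] = -100  # ignored by the MLM loss

    # 80% of the selected tokens -> [MASK]
    indices_replaced = torch.bernoulli(torch.full(labels.shape, 0.8)).bool() & masked_indices
    input_ids[indices_replaced] = tokenizer.mask_token_id

    # 10% of the selected tokens -> a random token from the vocabulary
    indices_random = (
        torch.bernoulli(torch.full(labels.shape, 0.5)).bool() & masked_indices & ~indices_replaced
    )
    random_tokens = torch.randint(len(tokenizer), labels.shape, dtype=torch.long)
    input_ids[indices_random] = random_tokens[indices_random]

    # The remaining 10% of the selected tokens are left as is.
    return input_ids, labels

ids = tokenizer("Replace me by any text you'd like.", return_tensors='pt')['input_ids'][0]
masked_ids, labels = mask_tokens(ids.clone())
```

This mirrors the standard BERT-style masking: only the positions kept in `labels` contribute to the MLM loss, while the corrupted `input_ids` are what the model actually sees.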
{"language": "en", "license": "apache-2.0", "tags": ["exbert", "multiberts", "multiberts-seed-4"], "datasets": ["bookcorpus", "wikipedia"]}
null
MultiBertGunjanPatrick/multiberts-seed-4-20k
[ "transformers", "pytorch", "bert", "pretraining", "exbert", "multiberts", "multiberts-seed-4", "en", "dataset:bookcorpus", "dataset:wikipedia", "arxiv:2106.16163", "license:apache-2.0", "endpoints_compatible", "region:us" ]
2022-03-02T23:29:04+00:00
[ "2106.16163" ]
[ "en" ]
TAGS #transformers #pytorch #bert #pretraining #exbert #multiberts #multiberts-seed-4 #en #dataset-bookcorpus #dataset-wikipedia #arxiv-2106.16163 #license-apache-2.0 #endpoints_compatible #region-us
# MultiBERTs Seed 4 Checkpoint 20k (uncased) Seed 4 intermediate checkpoint 20k MultiBERTs (pretrained BERT) model on English language using a masked language modeling (MLM) objective. It was introduced in this paper and first released in this repository. This is an intermediate checkpoint. The final checkpoint can be found at multiberts-seed-4. This model is uncased: it does not make a difference between english and English. Disclaimer: The team releasing MultiBERTs did not write a model card for this model so this model card has been written by gchhablani. ## Model description MultiBERTs models are transformers model pretrained on a large corpus of English data in a self-supervised fashion. This means it was pretrained on the raw texts only, with no humans labelling them in any way (which is why it can use lots of publicly available data) with an automatic process to generate inputs and labels from those texts. More precisely, it was pretrained with two objectives: - Masked language modeling (MLM): taking a sentence, the model randomly masks 15% of the words in the input then run the entire masked sentence through the model and has to predict the masked words. This is different from traditional recurrent neural networks (RNNs) that usually see the words one after the other, or from autoregressive models like GPT which internally mask the future tokens. It allows the model to learn a bidirectional representation of the sentence. - Next sentence prediction (NSP): the models concatenates two masked sentences as inputs during pretraining. Sometimes they correspond to sentences that were next to each other in the original text, sometimes not. The model then has to predict if the two sentences were following each other or not. This way, the model learns an inner representation of the English language that can then be used to extract features useful for downstream tasks: if you have a dataset of labeled sentences for instance, you can train a standard classifier using the features produced by the MultiBERTs model as inputs. ## Intended uses & limitations You can use the raw model for either masked language modeling or next sentence prediction, but it's mostly intended to be fine-tuned on a downstream task. See the model hub to look for fine-tuned versions on a task that interests you. Note that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked) to make decisions, such as sequence classification, token classification or question answering. For tasks such as text generation you should look at model like GPT2. ### How to use Here is how to use this model to get the features of a given text in PyTorch: ### Limitations and bias Even if the training data used for this model could be characterized as fairly neutral, this model can have biased predictions. This bias will also affect all fine-tuned versions of this model. For an understanding of bias of this particular checkpoint, please try out this checkpoint with the snippet present in the Limitation and bias section of the bert-base-uncased checkpoint. ## Training data The MultiBERTs models were pretrained on BookCorpus, a dataset consisting of 11,038 unpublished books and English Wikipedia (excluding lists, tables and headers). ## Training procedure ### Preprocessing The texts are lowercased and tokenized using WordPiece and a vocabulary size of 30,000. 
The inputs of the model are then of the form: With probability 0.5, sentence A and sentence B correspond to two consecutive sentences in the original corpus and in the other cases, it's another random sentence in the corpus. Note that what is considered a sentence here is a consecutive span of text usually longer than a single sentence. The only constrain is that the result with the two "sentences" has a combined length of less than 512 tokens. The details of the masking procedure for each sentence are the following: - 15% of the tokens are masked. - In 80% of the cases, the masked tokens are replaced by '[MASK]'. - In 10% of the cases, the masked tokens are replaced by a random token (different) from the one they replace. - In the 10% remaining cases, the masked tokens are left as is. ### Pretraining The full model was trained on 16 Cloud TPU v2 chips for two million steps with a batch size of 256. The sequence length was set to 512 throughout. The optimizer used is Adam with a learning rate of 1e-4, \\(\beta_{1} = 0.9\\) and \\(\beta_{2} = 0.999\\), a weight decay of 0.01, learning rate warmup for 10,000 steps and linear decay of the learning rate after. ### BibTeX entry and citation info <a href="URL <img width="300px" src="URL </a>
[ "# MultiBERTs Seed 4 Checkpoint 20k (uncased)\nSeed 4 intermediate checkpoint 20k MultiBERTs (pretrained BERT) model on English language using a masked language modeling (MLM) objective. It was introduced in\nthis paper and first released in\nthis repository. This is an intermediate checkpoint.\nThe final checkpoint can be found at multiberts-seed-4. This model is uncased: it does not make a difference\nbetween english and English.\n\nDisclaimer: The team releasing MultiBERTs did not write a model card for this model so this model card has been written by gchhablani.", "## Model description\nMultiBERTs models are transformers model pretrained on a large corpus of English data in a self-supervised fashion. This means it\nwas pretrained on the raw texts only, with no humans labelling them in any way (which is why it can use lots of\npublicly available data) with an automatic process to generate inputs and labels from those texts. More precisely, it\nwas pretrained with two objectives:\n- Masked language modeling (MLM): taking a sentence, the model randomly masks 15% of the words in the input then run\n the entire masked sentence through the model and has to predict the masked words. This is different from traditional\n recurrent neural networks (RNNs) that usually see the words one after the other, or from autoregressive models like\n GPT which internally mask the future tokens. It allows the model to learn a bidirectional representation of the\n sentence.\n- Next sentence prediction (NSP): the models concatenates two masked sentences as inputs during pretraining. Sometimes\n they correspond to sentences that were next to each other in the original text, sometimes not. The model then has to\n predict if the two sentences were following each other or not.\nThis way, the model learns an inner representation of the English language that can then be used to extract features\nuseful for downstream tasks: if you have a dataset of labeled sentences for instance, you can train a standard\nclassifier using the features produced by the MultiBERTs model as inputs.", "## Intended uses & limitations\nYou can use the raw model for either masked language modeling or next sentence prediction, but it's mostly intended to\nbe fine-tuned on a downstream task. See the model hub to look for\nfine-tuned versions on a task that interests you.\nNote that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked)\nto make decisions, such as sequence classification, token classification or question answering. For tasks such as text\ngeneration you should look at model like GPT2.", "### How to use\nHere is how to use this model to get the features of a given text in PyTorch:", "### Limitations and bias\nEven if the training data used for this model could be characterized as fairly neutral, this model can have biased\npredictions. This bias will also affect all fine-tuned versions of this model. For an understanding of bias of this particular\ncheckpoint, please try out this checkpoint with the snippet present in the Limitation and bias section of the bert-base-uncased checkpoint.", "## Training data\nThe MultiBERTs models were pretrained on BookCorpus, a dataset consisting of 11,038\nunpublished books and English Wikipedia (excluding lists, tables and\nheaders).", "## Training procedure", "### Preprocessing\nThe texts are lowercased and tokenized using WordPiece and a vocabulary size of 30,000. 
The inputs of the model are\nthen of the form:\n\nWith probability 0.5, sentence A and sentence B correspond to two consecutive sentences in the original corpus and in\nthe other cases, it's another random sentence in the corpus. Note that what is considered a sentence here is a\nconsecutive span of text usually longer than a single sentence. The only constrain is that the result with the two\n\"sentences\" has a combined length of less than 512 tokens.\nThe details of the masking procedure for each sentence are the following:\n- 15% of the tokens are masked.\n- In 80% of the cases, the masked tokens are replaced by '[MASK]'.\n- In 10% of the cases, the masked tokens are replaced by a random token (different) from the one they replace.\n- In the 10% remaining cases, the masked tokens are left as is.", "### Pretraining\nThe full model was trained on 16 Cloud TPU v2 chips for two million steps with a batch size\nof 256. The sequence length was set to 512 throughout. The optimizer\nused is Adam with a learning rate of 1e-4, \\\\(\\beta_{1} = 0.9\\\\) and \\\\(\\beta_{2} = 0.999\\\\), a weight decay of 0.01,\nlearning rate warmup for 10,000 steps and linear decay of the learning rate after.", "### BibTeX entry and citation info\n\n<a href=\"URL\n\t<img width=\"300px\" src=\"URL\n</a>" ]
[ "TAGS\n#transformers #pytorch #bert #pretraining #exbert #multiberts #multiberts-seed-4 #en #dataset-bookcorpus #dataset-wikipedia #arxiv-2106.16163 #license-apache-2.0 #endpoints_compatible #region-us \n", "# MultiBERTs Seed 4 Checkpoint 20k (uncased)\nSeed 4 intermediate checkpoint 20k MultiBERTs (pretrained BERT) model on English language using a masked language modeling (MLM) objective. It was introduced in\nthis paper and first released in\nthis repository. This is an intermediate checkpoint.\nThe final checkpoint can be found at multiberts-seed-4. This model is uncased: it does not make a difference\nbetween english and English.\n\nDisclaimer: The team releasing MultiBERTs did not write a model card for this model so this model card has been written by gchhablani.", "## Model description\nMultiBERTs models are transformers model pretrained on a large corpus of English data in a self-supervised fashion. This means it\nwas pretrained on the raw texts only, with no humans labelling them in any way (which is why it can use lots of\npublicly available data) with an automatic process to generate inputs and labels from those texts. More precisely, it\nwas pretrained with two objectives:\n- Masked language modeling (MLM): taking a sentence, the model randomly masks 15% of the words in the input then run\n the entire masked sentence through the model and has to predict the masked words. This is different from traditional\n recurrent neural networks (RNNs) that usually see the words one after the other, or from autoregressive models like\n GPT which internally mask the future tokens. It allows the model to learn a bidirectional representation of the\n sentence.\n- Next sentence prediction (NSP): the models concatenates two masked sentences as inputs during pretraining. Sometimes\n they correspond to sentences that were next to each other in the original text, sometimes not. The model then has to\n predict if the two sentences were following each other or not.\nThis way, the model learns an inner representation of the English language that can then be used to extract features\nuseful for downstream tasks: if you have a dataset of labeled sentences for instance, you can train a standard\nclassifier using the features produced by the MultiBERTs model as inputs.", "## Intended uses & limitations\nYou can use the raw model for either masked language modeling or next sentence prediction, but it's mostly intended to\nbe fine-tuned on a downstream task. See the model hub to look for\nfine-tuned versions on a task that interests you.\nNote that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked)\nto make decisions, such as sequence classification, token classification or question answering. For tasks such as text\ngeneration you should look at model like GPT2.", "### How to use\nHere is how to use this model to get the features of a given text in PyTorch:", "### Limitations and bias\nEven if the training data used for this model could be characterized as fairly neutral, this model can have biased\npredictions. This bias will also affect all fine-tuned versions of this model. 
For an understanding of bias of this particular\ncheckpoint, please try out this checkpoint with the snippet present in the Limitation and bias section of the bert-base-uncased checkpoint.", "## Training data\nThe MultiBERTs models were pretrained on BookCorpus, a dataset consisting of 11,038\nunpublished books and English Wikipedia (excluding lists, tables and\nheaders).", "## Training procedure", "### Preprocessing\nThe texts are lowercased and tokenized using WordPiece and a vocabulary size of 30,000. The inputs of the model are\nthen of the form:\n\nWith probability 0.5, sentence A and sentence B correspond to two consecutive sentences in the original corpus and in\nthe other cases, it's another random sentence in the corpus. Note that what is considered a sentence here is a\nconsecutive span of text usually longer than a single sentence. The only constrain is that the result with the two\n\"sentences\" has a combined length of less than 512 tokens.\nThe details of the masking procedure for each sentence are the following:\n- 15% of the tokens are masked.\n- In 80% of the cases, the masked tokens are replaced by '[MASK]'.\n- In 10% of the cases, the masked tokens are replaced by a random token (different) from the one they replace.\n- In the 10% remaining cases, the masked tokens are left as is.", "### Pretraining\nThe full model was trained on 16 Cloud TPU v2 chips for two million steps with a batch size\nof 256. The sequence length was set to 512 throughout. The optimizer\nused is Adam with a learning rate of 1e-4, \\\\(\\beta_{1} = 0.9\\\\) and \\\\(\\beta_{2} = 0.999\\\\), a weight decay of 0.01,\nlearning rate warmup for 10,000 steps and linear decay of the learning rate after.", "### BibTeX entry and citation info\n\n<a href=\"URL\n\t<img width=\"300px\" src=\"URL\n</a>" ]
[ 71, 145, 335, 134, 25, 95, 48, 3, 222, 111, 34 ]
[ "passage: TAGS\n#transformers #pytorch #bert #pretraining #exbert #multiberts #multiberts-seed-4 #en #dataset-bookcorpus #dataset-wikipedia #arxiv-2106.16163 #license-apache-2.0 #endpoints_compatible #region-us \n# MultiBERTs Seed 4 Checkpoint 20k (uncased)\nSeed 4 intermediate checkpoint 20k MultiBERTs (pretrained BERT) model on English language using a masked language modeling (MLM) objective. It was introduced in\nthis paper and first released in\nthis repository. This is an intermediate checkpoint.\nThe final checkpoint can be found at multiberts-seed-4. This model is uncased: it does not make a difference\nbetween english and English.\n\nDisclaimer: The team releasing MultiBERTs did not write a model card for this model so this model card has been written by gchhablani.", "passage: ## Model description\nMultiBERTs models are transformers model pretrained on a large corpus of English data in a self-supervised fashion. This means it\nwas pretrained on the raw texts only, with no humans labelling them in any way (which is why it can use lots of\npublicly available data) with an automatic process to generate inputs and labels from those texts. More precisely, it\nwas pretrained with two objectives:\n- Masked language modeling (MLM): taking a sentence, the model randomly masks 15% of the words in the input then run\n the entire masked sentence through the model and has to predict the masked words. This is different from traditional\n recurrent neural networks (RNNs) that usually see the words one after the other, or from autoregressive models like\n GPT which internally mask the future tokens. It allows the model to learn a bidirectional representation of the\n sentence.\n- Next sentence prediction (NSP): the models concatenates two masked sentences as inputs during pretraining. Sometimes\n they correspond to sentences that were next to each other in the original text, sometimes not. The model then has to\n predict if the two sentences were following each other or not.\nThis way, the model learns an inner representation of the English language that can then be used to extract features\nuseful for downstream tasks: if you have a dataset of labeled sentences for instance, you can train a standard\nclassifier using the features produced by the MultiBERTs model as inputs.## Intended uses & limitations\nYou can use the raw model for either masked language modeling or next sentence prediction, but it's mostly intended to\nbe fine-tuned on a downstream task. See the model hub to look for\nfine-tuned versions on a task that interests you.\nNote that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked)\nto make decisions, such as sequence classification, token classification or question answering. For tasks such as text\ngeneration you should look at model like GPT2.### How to use\nHere is how to use this model to get the features of a given text in PyTorch:### Limitations and bias\nEven if the training data used for this model could be characterized as fairly neutral, this model can have biased\npredictions. This bias will also affect all fine-tuned versions of this model. 
For an understanding of bias of this particular\ncheckpoint, please try out this checkpoint with the snippet present in the Limitation and bias section of the bert-base-uncased checkpoint.## Training data\nThe MultiBERTs models were pretrained on BookCorpus, a dataset consisting of 11,038\nunpublished books and English Wikipedia (excluding lists, tables and\nheaders).## Training procedure" ]
[ -0.08594229072332382, 0.004612954333424568, -0.0021477213595062494, 0.0664951354265213, 0.08504636585712433, 0.0023087449371814728, 0.12183714658021927, 0.04967987537384033, -0.028890544548630714, 0.027976583689451218, 0.09371118247509003, 0.030521363019943237, 0.04132340848445892, 0.06935522705316544, 0.09572342038154602, -0.25935015082359314, 0.04866451025009155, -0.062212374061346054, 0.057110704481601715, 0.07556314766407013, 0.09815099090337753, -0.07387229800224304, 0.0626583844423294, 0.0358097180724144, -0.08952683210372925, -0.015777509659528732, -0.016275273635983467, -0.03555938974022865, 0.10082664340734482, 0.07010892778635025, 0.06090053915977478, 0.0017290972173213959, 0.05647370591759682, -0.08543431758880615, 0.01593475416302681, 0.045205920934677124, -0.004299291409552097, 0.026113241910934448, -0.0053372737020254135, 0.014493482187390327, 0.11340074241161346, 0.03190429136157036, 0.07699031382799149, 0.03388379514217377, -0.09466783702373505, -0.12286324799060822, -0.08010122925043106, 0.10214810073375702, 0.05250707268714905, 0.04281654208898544, -0.007737443782389164, 0.07317014038562775, -0.03091968223452568, 0.07591499388217926, 0.10651755332946777, -0.2617068290710449, -0.009949255734682083, 0.06473489850759506, 0.0465816929936409, 0.046432510018348694, 0.011251425370573997, 0.027076097205281258, 0.006276857107877731, 0.04386591911315918, 0.026883382350206375, -0.02443676069378853, 0.12397922575473785, -0.042959317564964294, -0.15100979804992676, -0.042474500834941864, 0.12891212105751038, -0.006252674385905266, -0.12361959367990494, -0.10328961908817291, -0.030834702774882317, 0.11804705858230591, -0.005287247244268656, -0.017502358183264732, -0.0016958187334239483, 0.009771626442670822, 0.022250976413488388, -0.08854148536920547, -0.0860174149274826, -0.0258396714925766, -0.0348384827375412, 0.12666165828704834, 0.04740629345178604, 0.049917250871658325, -0.03724037855863571, 0.08447067439556122, -0.12016147375106812, -0.03912964463233948, -0.049199387431144714, -0.08049421012401581, -0.018163491040468216, 0.01016581803560257, -0.030468899756669998, -0.08561442792415619, -0.05925905331969261, 0.12055335938930511, 0.03564740717411041, 0.03218107298016548, -0.003941414877772331, 0.040586911141872406, 0.07251004129648209, 0.09442131221294403, -0.0399826280772686, 0.048431336879730225, 0.033969007432460785, -0.02600919082760811, 0.05963239073753357, -0.05212670564651489, -0.10166288912296295, 0.07925474643707275, -0.0004400024190545082, 0.0370243564248085, 0.023777009919285774, 0.03548620268702507, -0.00903739221394062, -0.07161551713943481, 0.1618770956993103, -0.07937216758728027, -0.00809476524591446, -0.01578059047460556, 0.011573223397135735, 0.04543731361627579, 0.031977735459804535, -0.009053029119968414, -0.04676887392997742, -0.00828787125647068, -0.058161742985248566, -0.028482913970947266, -0.05439864471554756, -0.11978752911090851, -0.0005593998357653618, -0.03714799880981445, -0.03049464151263237, -0.13666626811027527, -0.21444179117679596, -0.02098572812974453, 0.06328938901424408, -0.0019795107655227184, -0.010229677893221378, 0.02347196824848652, 0.013777846470475197, -0.01995813101530075, 0.011286336928606033, -0.04403667524456978, -0.0003565708175301552, -0.007434982806444168, -0.03051617741584778, 0.057703301310539246, -0.039634738117456436, 0.023287495598196983, -0.06615212559700012, 0.022811464965343475, -0.21096673607826233, 0.08790403604507446, -0.03393871337175369, 0.0023673418909311295, -0.03612278029322624, -0.04489351436495781, 
0.014214243739843369, 0.04801921546459198, -0.01120172068476677, 0.11494788527488708, -0.13793551921844482, -0.05518545210361481, 0.1888207495212555, -0.15807059407234192, 0.0014312416315078735, 0.09829063713550568, -0.04907134175300598, 0.0549437515437603, 0.1347237527370453, 0.10180962830781937, 0.08059266209602356, -0.07017072290182114, 0.009516674093902111, 0.059921327978372574, -0.06685300916433334, 0.05554558336734772, 0.0895891934633255, -0.02248743176460266, -0.1333768218755722, 0.030764922499656677, -0.0723673403263092, -0.011358209885656834, -0.02467319183051586, -0.0200638547539711, 0.006350280717015266, -0.034282032400369644, 0.028313878923654556, 0.00767996720969677, 0.01768648624420166, -0.04384850338101387, -0.08333419263362885, 0.029484350234270096, 0.0757734626531601, -0.07303176820278168, 0.03868266940116882, -0.07199408859014511, 0.062032848596572876, -0.07487073540687561, -0.003063398413360119, -0.1661149561405182, -0.026248376816511154, 0.04455246403813362, -0.046764012426137924, 0.05173199251294136, 0.09664097428321838, 0.0042823911644518375, 0.12433111667633057, -0.03824686259031296, 0.0020424723625183105, -0.004445789381861687, -0.010891208425164223, -0.052134737372398376, -0.127055823802948, -0.08046084642410278, -0.06812439113855362, 0.10626862943172455, -0.081901416182518, 0.02890222892165184, -0.06426279991865158, -0.019204718992114067, -0.009822212159633636, -0.05806028097867966, -0.003609134815633297, 0.008376234211027622, -0.02709590457379818, -0.04770734906196594, 0.04776983708143234, 0.04997105523943901, -0.0596034899353981, 0.07395727187395096, -0.1092296689748764, -0.05917544290423393, 0.05764336138963699, 0.01618262380361557, -0.07948431372642517, 0.09076859056949615, -0.02140640653669834, -0.01359085738658905, -0.05253307521343231, -0.04139664024114609, 0.1937037855386734, -0.02530486136674881, 0.10192285478115082, -0.09009543061256409, 0.00046861404553055763, 0.029452864080667496, -0.045040570199489594, -0.016339754685759544, 0.06117045879364014, 0.04384082555770874, -0.18789902329444885, 0.01623981073498726, 0.05139759182929993, 0.07903236895799637, 0.1126871258020401, 0.025235263630747795, -0.02509411796927452, -0.04739885777235031, -0.009529121220111847, 0.0049114287830889225, 0.05684872344136238, -0.028727952390909195, -0.012686198577284813, 0.03255164623260498, 0.054539699107408524, 0.018183428794145584, -0.08178016543388367, 0.03633669391274452, 0.06524033099412918, -0.01495342142879963, -0.0431675985455513, -0.023884231224656105, -0.061123479157686234, 0.0618743896484375, 0.05311407893896103, 0.034899283200502396, 0.027889501303434372, -0.015249188058078289, -0.13552989065647125, 0.18831580877304077, -0.11666166037321091, -0.25749966502189636, -0.10726013779640198, -0.06039382517337799, -0.027411239221692085, 0.040545664727687836, 0.058723390102386475, -0.033559154719114304, -0.04465271532535553, -0.11316405236721039, 0.0564555749297142, -0.06786833703517914, -0.030669452622532845, -0.01258028857409954, -0.05243664234876633, -0.019547661766409874, -0.12634900212287903, -0.011589374393224716, -0.02973603457212448, -0.07401376962661743, 0.008612864650785923, -0.033723846077919006, 0.027220945805311203, 0.13459257781505585, 0.03781384974718094, -0.01764272153377533, -0.016642436385154724, 0.1917487382888794, 0.008138811215758324, 0.05860964208841324, 0.11304531246423721, -0.0268593430519104, 0.05239245668053627, 0.04424335062503815, 0.025031397119164467, -0.048819806426763535, 0.00932212546467781, -0.015444946475327015, -0.11877171695232391, 
-0.17736366391181946, -0.07094156742095947, -0.004122176207602024, 0.005091194063425064, 0.02169005572795868, 0.036389514803886414, 0.02745348960161209, 0.03898877650499344, -0.030597252771258354, 0.0348597913980484, -0.01259654387831688, 0.08157683908939362, 0.031397461891174316, -0.07733426243066788, 0.09383110702037811, -0.061111290007829666, 0.01673872396349907, 0.10930126160383224, -0.06253562867641449, 0.1910472810268402, 0.028795968741178513, 0.06426817178726196, 0.10171853005886078, 0.020872991532087326, 0.05471936985850334, 0.08724010735750198, -0.04306121915578842, 0.004837050102651119, -0.06158975884318352, -0.052598919719457626, -0.033606767654418945, 0.05267217755317688, 0.027483662590384483, 0.0153280571103096, -0.11938122659921646, 0.016444042325019836, -0.0015053467359393835, 0.13584408164024353, 0.0459347665309906, -0.12504661083221436, -0.12374308705329895, 0.035102926194667816, -0.04692868888378143, -0.06376266479492188, 0.02954167127609253, 0.061181165277957916, -0.1531972885131836, 0.047182533890008926, -0.007108419202268124, 0.06477202475070953, -0.09289726614952087, 0.017062658444046974, -0.0478392094373703, 0.0030340980738401413, 0.0058501968160271645, 0.07160922884941101, -0.13444118201732635, 0.10328424721956253, 0.02246527001261711, 0.04766475781798363, -0.0826299786567688, 0.016652263700962067, -0.009953339584171772, 0.10970886796712875, 0.11695681512355804, 0.04223978891968727, -0.06234150379896164, -0.014010039158165455, -0.04506849870085716, 0.022206779569387436, 0.061518967151641846, -0.07737340778112411, 0.06133256107568741, 0.009590772911906242, 0.005765111185610294, -0.022228963673114777, 0.015116695314645767, -0.12968990206718445, -0.12392310798168182, 0.06408587843179703, -0.07491987198591232, -0.09860463440418243, -0.05855586752295494, -0.06363241374492645, -0.05164027959108353, 0.2179553061723709, -0.11647263169288635, -0.08995424211025238, -0.1001734510064125, -0.009849194437265396, 0.046209003776311874, -0.06563954055309296, 0.04635213315486908, -0.036038633435964584, 0.09682130813598633, -0.04621288925409317, -0.11065457761287689, 0.03591380640864372, -0.11350025236606598, -0.11415871977806091, -0.04409394413232803, 0.10837150365114212, 0.11326088756322861, 0.03747519105672836, 0.010956114158034325, 0.013948138803243637, -0.003026876598596573, -0.11633818596601486, 0.018128057941794395, 0.13290779292583466, 0.0012603402137756348, 0.06773283332586288, -0.05883203446865082, 0.03270404040813446, -0.01472078263759613, 0.0013973768800497055, 0.13144904375076294, 0.1820579469203949, -0.06369403749704361, 0.17478200793266296, 0.19666481018066406, -0.10363493859767914, -0.18811607360839844, -0.052477531135082245, -0.0024106521159410477, 0.046371541917324066, 0.04595966637134552, -0.18914419412612915, 0.09198743849992752, 0.032471634447574615, -0.030593642964959145, 0.023812230676412582, -0.23081529140472412, -0.10914382338523865, 0.09075625985860825, 0.056613679975271225, 0.1890176236629486, -0.08093901723623276, -0.041744183748960495, -0.01701304316520691, -0.029795058071613312, 0.045120351016521454, -0.04943208023905754, 0.09025035798549652, 0.007052924484014511, -0.027245793491601944, 0.0012993523851037025, -0.030922498553991318, 0.09494714438915253, 0.04190535843372345, 0.02390027791261673, -0.06998123228549957, -0.005440933629870415, 0.11202478408813477, -0.03812742233276367, 0.09726922959089279, 0.03966128081083298, 0.0737927109003067, -0.09576182067394257, -0.05827312916517258, -0.07580125331878662, 0.04405183345079422, -0.041859742254018784, 
-0.05265417322516441, -0.06307252496480942, 0.05631587654352188, 0.036082006990909576, 0.011571667157113552, -0.000445658341050148, -0.03657990321516991, 0.04527915269136429, 0.08772125095129013, 0.08163826912641525, -0.03389652073383331, -0.07558776438236237, -0.05214095488190651, -0.047337401658296585, 0.06700624525547028, -0.09647616744041443, 0.015118742361664772, 0.027186965569853783, 0.01224537007510662, 0.09194336831569672, 0.03322480618953705, -0.13927464187145233, 0.009385999292135239, 0.033919282257556915, -0.12760981917381287, -0.11223702877759933, -0.020130645483732224, 0.027717089280486107, -0.03533603996038437, 0.056071583181619644, 0.14762087166309357, -0.034702591598033905, -0.032103970646858215, -0.04874344915151596, 0.03624047711491585, -0.02002873457968235, 0.04895045980811119, 0.06374108791351318, 0.03055533580482006, -0.07636158168315887, 0.07097331434488297, 0.04034443944692612, -0.038054462522268295, 0.04426388442516327, 0.044413454830646515, -0.09616721421480179, -0.07864493131637573, -0.059839729219675064, 0.09493937343358994, -0.018145708367228508, -0.05000082775950432, -0.0028523150831460953, -0.08204799145460129, 0.06768244504928589, 0.07248101383447647, 0.045734841376543045, 0.03685527294874191, -0.08780916035175323, 0.015048336237668991, -0.05464102327823639, 0.034885816276073456, -0.03355822712182999, -0.003681028261780739, -0.05644100159406662, 0.07059034705162048, 0.06385140120983124, 0.09642478823661804, -0.0338684543967247, -0.07546348869800568, -0.0837840884923935, -0.014780336059629917, -0.06080115586519241, -0.03423033654689789, -0.07642711699008942, -0.0071864319033920765, 0.0010532303713262081, -0.00032182037830352783, 0.02260078303515911, 0.03613788262009621, -0.04455316811800003, -0.01846487820148468, -0.032924309372901917, 0.04047035425901413, -0.06313209235668182, 0.0082189766690135, 0.013548830524086952, -0.03834458813071251, 0.09245586395263672, 0.040208496153354645, -0.010641569271683693, 0.048432983458042145, -0.020182214677333832, 0.03340425342321396, -0.020017419010400772, -0.0010100812651216984, -0.02030884101986885, -0.10950631648302078, -0.005787387024611235, 0.0042520854622125626, -0.026609808206558228, 0.010206961072981358, 0.052069611847400665, -0.07082899659872055, 0.084559366106987, 0.04610063135623932, -0.03117411583662033, -0.07357773184776306, 0.04251296445727348, -0.016641434282064438, 0.026729686185717583, 0.067267045378685, -0.03431243821978569, 0.052498310804367065, -0.09942933917045593, -0.029496531933546066, 0.0020041237585246563, -0.008226688951253891, -0.008359316736459732, -0.05371173471212387, -0.0021388307213783264, 0.0036167455837130547, 0.17671164870262146, -0.02359829843044281, 0.03566262125968933, 0.011952492408454418, 0.008022932335734367, 0.047109510749578476, -0.013098219409584999, 0.07116873562335968, -0.004752166569232941, -0.024450261145830154, -0.014255575835704803, 0.03868842497467995, 0.00396430678665638, 0.0019947607070207596, 0.13658598065376282, 0.04342203587293625, 0.08959838002920151, 0.07562734186649323, 0.017977964133024216, 0.014109563082456589, -0.13584676384925842, -0.08102525025606155, 0.007195139303803444, 0.06238074228167534, -0.01792101562023163, 0.014077723026275635, 0.09345882385969162, -0.08978751301765442, 0.07102230936288834, 0.04897361993789673, -0.048707515001297, -0.12339461594820023, -0.19212087988853455, -0.025950627401471138, -0.032433733344078064, -0.011747412383556366, -0.09188570827245712, 0.01671459712088108, 0.08585305511951447, 0.024494413286447525, -0.011615986935794353, 
0.09472571313381195, -0.09958402812480927, -0.03116540051996708, 0.043599918484687805, -0.02688290737569332, 0.012225380167365074, 0.04573880881071091, 0.024888955056667328, -0.005505714565515518, 0.041229248046875, 0.039819151163101196, 0.04150151461362839, 0.030032947659492493, 0.052730001509189606, -0.024546382948756218, -0.07288376986980438, -0.03160576522350311, -0.002082564402371645, 0.05356975644826889, 0.12925666570663452, 0.02220083400607109, -0.07336239516735077, 0.006127233617007732, 0.10392403602600098, -0.03192643076181412, -0.04555689916014671, -0.10742351412773132, 0.24469399452209473, 0.019623128697276115, -0.00011055264621973038, -0.005210967268794775, -0.045543160289525986, 0.007169906049966812, 0.21033786237239838, 0.2236010730266571, 0.008162403479218483, -0.009628774598240852, 0.010404037311673164, -0.012148873880505562, 0.03414159268140793, 0.14283522963523865, 0.004378184676170349, 0.25496384501457214, -0.04681897163391113, 0.03941398859024048, -0.041406724601984024, -0.040079545229673386, -0.10067939758300781, 0.07369383424520493, -0.006693498697131872, 0.005443636327981949, -0.027941780164837837, 0.07094884663820267, -0.0359623022377491, -0.17389436066150665, 0.004357331432402134, -0.00032378872856497765, -0.06294062733650208, 0.010762108489871025, -0.0014476282522082329, 0.021623533219099045, 0.08400727808475494, -0.018303629010915756, -0.00949376355856657, 0.13207271695137024, 0.018193302676081657, -0.0986582338809967, -0.05779052525758743, 0.1163460835814476, 0.010896699503064156, 0.1387062966823578, 0.00966203585267067, 0.08122441172599792, 0.08859986066818237, 0.021335210651159286, -0.09321685135364532, 0.04044690355658531, -0.018968889489769936, -0.03056744858622551, 0.006703902967274189, 0.1119304969906807, -0.008326340466737747, 0.05779546499252319, 0.027715610340237617, -0.08803273737430573, 0.0614563412964344, 0.01031263917684555, -0.0354166142642498, -0.08159787952899933, 0.084856316447258, -0.09167574346065521, 0.15635551512241364, 0.12542229890823364, -0.015216696076095104, -0.04599202424287796, -0.029555154964327812, 0.020678047090768814, 0.0014086929149925709, 0.05578670650720596, -0.024019520729780197, -0.13278818130493164, 0.02055480144917965, -0.08053654432296753, 0.02809477224946022, -0.2499670684337616, -0.08874385058879852, 0.02942831441760063, -0.01840851828455925, -0.018890082836151123, 0.04863130673766136, 0.0474715530872345, 0.025049256160855293, -0.03756848722696304, 0.02420744113624096, -0.037906814366579056, 0.05853758752346039, -0.10967207700014114, -0.09226873517036438 ]
null
null
transformers
# MultiBERTs Seed 4 Checkpoint 300k (uncased) Seed 4 intermediate checkpoint 300k MultiBERTs (pretrained BERT) model on the English language using a masked language modeling (MLM) objective. It was introduced in [this paper](https://arxiv.org/pdf/2106.16163.pdf) and first released in [this repository](https://github.com/google-research/language/tree/master/language/multiberts). This is an intermediate checkpoint. The final checkpoint can be found at [multiberts-seed-4](https://hf.co/multberts-seed-4). This model is uncased: it does not make a difference between english and English. Disclaimer: The team releasing MultiBERTs did not write a model card for this model, so this model card has been written by [gchhablani](https://hf.co/gchhablani). ## Model description MultiBERTs models are transformers models pretrained on a large corpus of English data in a self-supervised fashion. This means they were pretrained on the raw texts only, with no humans labelling them in any way (which is why they can use lots of publicly available data), with an automatic process to generate inputs and labels from those texts. More precisely, they were pretrained with two objectives: - Masked language modeling (MLM): taking a sentence, the model randomly masks 15% of the words in the input, then runs the entire masked sentence through the model and has to predict the masked words. This is different from traditional recurrent neural networks (RNNs) that usually see the words one after the other, or from autoregressive models like GPT which internally mask the future tokens. It allows the model to learn a bidirectional representation of the sentence. - Next sentence prediction (NSP): the model concatenates two masked sentences as inputs during pretraining. Sometimes they correspond to sentences that were next to each other in the original text, sometimes not. The model then has to predict whether the two sentences followed each other or not. This way, the model learns an inner representation of the English language that can then be used to extract features useful for downstream tasks: if you have a dataset of labeled sentences, for instance, you can train a standard classifier using the features produced by the MultiBERTs model as inputs. ## Intended uses & limitations You can use the raw model for either masked language modeling or next sentence prediction, but it's mostly intended to be fine-tuned on a downstream task. See the [model hub](https://huggingface.co/models?filter=multiberts) to look for fine-tuned versions on a task that interests you. Note that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked) to make decisions, such as sequence classification, token classification or question answering. For tasks such as text generation you should look at models like GPT2. ### How to use Here is how to use this model to get the features of a given text in PyTorch: ```python from transformers import BertTokenizer, BertModel tokenizer = BertTokenizer.from_pretrained('multiberts-seed-4-300k') model = BertModel.from_pretrained("multiberts-seed-4-300k") text = "Replace me by any text you'd like." encoded_input = tokenizer(text, return_tensors='pt') output = model(**encoded_input) ``` ### Limitations and bias Even if the training data used for this model could be characterized as fairly neutral, this model can have biased predictions. This bias will also affect all fine-tuned versions of this model. 
For an understanding of bias of this particular checkpoint, please try out this checkpoint with the snippet present in the [Limitations and bias section](https://huggingface.co/bert-base-uncased#limitations-and-bias) of the [bert-base-uncased](https://huggingface.co/bert-base-uncased) checkpoint. ## Training data The MultiBERTs models were pretrained on [BookCorpus](https://yknzhu.wixsite.com/mbweb), a dataset consisting of 11,038 unpublished books, and [English Wikipedia](https://en.wikipedia.org/wiki/English_Wikipedia) (excluding lists, tables and headers). ## Training procedure ### Preprocessing The texts are lowercased and tokenized using WordPiece and a vocabulary size of 30,000. The inputs of the model are then of the form: ``` [CLS] Sentence A [SEP] Sentence B [SEP] ``` With probability 0.5, sentence A and sentence B correspond to two consecutive sentences in the original corpus; in the other cases, sentence B is another random sentence in the corpus. Note that what is considered a sentence here is a consecutive span of text usually longer than a single sentence. The only constraint is that the two "sentences" have a combined length of less than 512 tokens. The details of the masking procedure for each sentence are the following: - 15% of the tokens are masked. - In 80% of the cases, the masked tokens are replaced by `[MASK]`. - In 10% of the cases, the masked tokens are replaced by a random token (different from the one they replace). - In the remaining 10% of cases, the masked tokens are left as is. ### Pretraining The full model was trained on 16 Cloud TPU v2 chips for two million steps with a batch size of 256. The sequence length was set to 512 throughout. The optimizer used is Adam with a learning rate of 1e-4, \\(\beta_{1} = 0.9\\) and \\(\beta_{2} = 0.999\\), a weight decay of 0.01, learning rate warmup for 10,000 steps and linear decay of the learning rate after. ### BibTeX entry and citation info ```bibtex @article{DBLP:journals/corr/abs-2106-16163, author = {Thibault Sellam and Steve Yadlowsky and Jason Wei and Naomi Saphra and Alexander D'Amour and Tal Linzen and Jasmijn Bastings and Iulia Turc and Jacob Eisenstein and Dipanjan Das and Ian Tenney and Ellie Pavlick}, title = {The MultiBERTs: {BERT} Reproductions for Robustness Analysis}, journal = {CoRR}, volume = {abs/2106.16163}, year = {2021}, url = {https://arxiv.org/abs/2106.16163}, eprinttype = {arXiv}, eprint = {2106.16163}, timestamp = {Mon, 05 Jul 2021 15:15:50 +0200}, biburl = {https://dblp.org/rec/journals/corr/abs-2106-16163.bib}, bibsource = {dblp computer science bibliography, https://dblp.org} } ``` <a href="https://huggingface.co/exbert/?model=multiberts"> <img width="300px" src="https://cdn-media.huggingface.co/exbert/button.png"> </a>
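As a rough illustration of the optimization setup described in the Pretraining section (Adam with learning rate 1e-4, \\(\beta_{1} = 0.9\\), \\(\beta_{2} = 0.999\\), weight decay 0.01, 10,000 warmup steps and linear decay over the two million training steps), here is a minimal PyTorch sketch. The original runs used TensorFlow on TPUs, so `torch.optim.AdamW` and `get_linear_schedule_with_warmup` are stand-ins chosen for illustration, not the exact optimizer implementation used for pretraining; the checkpoint name follows the usage snippet above.

```python
import torch
from transformers import BertForPreTraining, get_linear_schedule_with_warmup

# Sketch only: approximates the described schedule (Adam, lr 1e-4,
# betas (0.9, 0.999), weight decay 0.01, 10k warmup steps, linear decay
# over 2M steps) with PyTorch equivalents.
model = BertForPreTraining.from_pretrained("multiberts-seed-4-300k")

num_training_steps = 2_000_000
optimizer = torch.optim.AdamW(
    model.parameters(),
    lr=1e-4,
    betas=(0.9, 0.999),
    weight_decay=0.01,
)
scheduler = get_linear_schedule_with_warmup(
    optimizer,
    num_warmup_steps=10_000,
    num_training_steps=num_training_steps,
)

# Inside a training loop, each step would do:
#   loss.backward(); optimizer.step(); scheduler.step(); optimizer.zero_grad()
```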
{"language": "en", "license": "apache-2.0", "tags": ["exbert", "multiberts", "multiberts-seed-4"], "datasets": ["bookcorpus", "wikipedia"]}
null
MultiBertGunjanPatrick/multiberts-seed-4-300k
[ "transformers", "pytorch", "bert", "pretraining", "exbert", "multiberts", "multiberts-seed-4", "en", "dataset:bookcorpus", "dataset:wikipedia", "arxiv:2106.16163", "license:apache-2.0", "endpoints_compatible", "region:us" ]
2022-03-02T23:29:04+00:00
[ "2106.16163" ]
[ "en" ]
TAGS #transformers #pytorch #bert #pretraining #exbert #multiberts #multiberts-seed-4 #en #dataset-bookcorpus #dataset-wikipedia #arxiv-2106.16163 #license-apache-2.0 #endpoints_compatible #region-us
# MultiBERTs Seed 4 Checkpoint 300k (uncased) Seed 4 intermediate checkpoint 300k MultiBERTs (pretrained BERT) model on English language using a masked language modeling (MLM) objective. It was introduced in this paper and first released in this repository. This is an intermediate checkpoint. The final checkpoint can be found at multiberts-seed-4. This model is uncased: it does not make a difference between english and English. Disclaimer: The team releasing MultiBERTs did not write a model card for this model so this model card has been written by gchhablani. ## Model description MultiBERTs models are transformers model pretrained on a large corpus of English data in a self-supervised fashion. This means it was pretrained on the raw texts only, with no humans labelling them in any way (which is why it can use lots of publicly available data) with an automatic process to generate inputs and labels from those texts. More precisely, it was pretrained with two objectives: - Masked language modeling (MLM): taking a sentence, the model randomly masks 15% of the words in the input then run the entire masked sentence through the model and has to predict the masked words. This is different from traditional recurrent neural networks (RNNs) that usually see the words one after the other, or from autoregressive models like GPT which internally mask the future tokens. It allows the model to learn a bidirectional representation of the sentence. - Next sentence prediction (NSP): the models concatenates two masked sentences as inputs during pretraining. Sometimes they correspond to sentences that were next to each other in the original text, sometimes not. The model then has to predict if the two sentences were following each other or not. This way, the model learns an inner representation of the English language that can then be used to extract features useful for downstream tasks: if you have a dataset of labeled sentences for instance, you can train a standard classifier using the features produced by the MultiBERTs model as inputs. ## Intended uses & limitations You can use the raw model for either masked language modeling or next sentence prediction, but it's mostly intended to be fine-tuned on a downstream task. See the model hub to look for fine-tuned versions on a task that interests you. Note that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked) to make decisions, such as sequence classification, token classification or question answering. For tasks such as text generation you should look at model like GPT2. ### How to use Here is how to use this model to get the features of a given text in PyTorch: ### Limitations and bias Even if the training data used for this model could be characterized as fairly neutral, this model can have biased predictions. This bias will also affect all fine-tuned versions of this model. For an understanding of bias of this particular checkpoint, please try out this checkpoint with the snippet present in the Limitation and bias section of the bert-base-uncased checkpoint. ## Training data The MultiBERTs models were pretrained on BookCorpus, a dataset consisting of 11,038 unpublished books and English Wikipedia (excluding lists, tables and headers). ## Training procedure ### Preprocessing The texts are lowercased and tokenized using WordPiece and a vocabulary size of 30,000. 
The inputs of the model are then of the form: With probability 0.5, sentence A and sentence B correspond to two consecutive sentences in the original corpus and in the other cases, it's another random sentence in the corpus. Note that what is considered a sentence here is a consecutive span of text usually longer than a single sentence. The only constrain is that the result with the two "sentences" has a combined length of less than 512 tokens. The details of the masking procedure for each sentence are the following: - 15% of the tokens are masked. - In 80% of the cases, the masked tokens are replaced by '[MASK]'. - In 10% of the cases, the masked tokens are replaced by a random token (different) from the one they replace. - In the 10% remaining cases, the masked tokens are left as is. ### Pretraining The full model was trained on 16 Cloud TPU v2 chips for two million steps with a batch size of 256. The sequence length was set to 512 throughout. The optimizer used is Adam with a learning rate of 1e-4, \\(\beta_{1} = 0.9\\) and \\(\beta_{2} = 0.999\\), a weight decay of 0.01, learning rate warmup for 10,000 steps and linear decay of the learning rate after. ### BibTeX entry and citation info <a href="URL <img width="300px" src="URL </a>
[ "# MultiBERTs Seed 4 Checkpoint 300k (uncased)\nSeed 4 intermediate checkpoint 300k MultiBERTs (pretrained BERT) model on English language using a masked language modeling (MLM) objective. It was introduced in\nthis paper and first released in\nthis repository. This is an intermediate checkpoint.\nThe final checkpoint can be found at multiberts-seed-4. This model is uncased: it does not make a difference\nbetween english and English.\n\nDisclaimer: The team releasing MultiBERTs did not write a model card for this model so this model card has been written by gchhablani.", "## Model description\nMultiBERTs models are transformers model pretrained on a large corpus of English data in a self-supervised fashion. This means it\nwas pretrained on the raw texts only, with no humans labelling them in any way (which is why it can use lots of\npublicly available data) with an automatic process to generate inputs and labels from those texts. More precisely, it\nwas pretrained with two objectives:\n- Masked language modeling (MLM): taking a sentence, the model randomly masks 15% of the words in the input then run\n the entire masked sentence through the model and has to predict the masked words. This is different from traditional\n recurrent neural networks (RNNs) that usually see the words one after the other, or from autoregressive models like\n GPT which internally mask the future tokens. It allows the model to learn a bidirectional representation of the\n sentence.\n- Next sentence prediction (NSP): the models concatenates two masked sentences as inputs during pretraining. Sometimes\n they correspond to sentences that were next to each other in the original text, sometimes not. The model then has to\n predict if the two sentences were following each other or not.\nThis way, the model learns an inner representation of the English language that can then be used to extract features\nuseful for downstream tasks: if you have a dataset of labeled sentences for instance, you can train a standard\nclassifier using the features produced by the MultiBERTs model as inputs.", "## Intended uses & limitations\nYou can use the raw model for either masked language modeling or next sentence prediction, but it's mostly intended to\nbe fine-tuned on a downstream task. See the model hub to look for\nfine-tuned versions on a task that interests you.\nNote that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked)\nto make decisions, such as sequence classification, token classification or question answering. For tasks such as text\ngeneration you should look at model like GPT2.", "### How to use\nHere is how to use this model to get the features of a given text in PyTorch:", "### Limitations and bias\nEven if the training data used for this model could be characterized as fairly neutral, this model can have biased\npredictions. This bias will also affect all fine-tuned versions of this model. For an understanding of bias of this particular\ncheckpoint, please try out this checkpoint with the snippet present in the Limitation and bias section of the bert-base-uncased checkpoint.", "## Training data\nThe MultiBERTs models were pretrained on BookCorpus, a dataset consisting of 11,038\nunpublished books and English Wikipedia (excluding lists, tables and\nheaders).", "## Training procedure", "### Preprocessing\nThe texts are lowercased and tokenized using WordPiece and a vocabulary size of 30,000. 
The inputs of the model are\nthen of the form:\n\nWith probability 0.5, sentence A and sentence B correspond to two consecutive sentences in the original corpus and in\nthe other cases, it's another random sentence in the corpus. Note that what is considered a sentence here is a\nconsecutive span of text usually longer than a single sentence. The only constrain is that the result with the two\n\"sentences\" has a combined length of less than 512 tokens.\nThe details of the masking procedure for each sentence are the following:\n- 15% of the tokens are masked.\n- In 80% of the cases, the masked tokens are replaced by '[MASK]'.\n- In 10% of the cases, the masked tokens are replaced by a random token (different) from the one they replace.\n- In the 10% remaining cases, the masked tokens are left as is.", "### Pretraining\nThe full model was trained on 16 Cloud TPU v2 chips for two million steps with a batch size\nof 256. The sequence length was set to 512 throughout. The optimizer\nused is Adam with a learning rate of 1e-4, \\\\(\\beta_{1} = 0.9\\\\) and \\\\(\\beta_{2} = 0.999\\\\), a weight decay of 0.01,\nlearning rate warmup for 10,000 steps and linear decay of the learning rate after.", "### BibTeX entry and citation info\n\n<a href=\"URL\n\t<img width=\"300px\" src=\"URL\n</a>" ]
[ "TAGS\n#transformers #pytorch #bert #pretraining #exbert #multiberts #multiberts-seed-4 #en #dataset-bookcorpus #dataset-wikipedia #arxiv-2106.16163 #license-apache-2.0 #endpoints_compatible #region-us \n", "# MultiBERTs Seed 4 Checkpoint 300k (uncased)\nSeed 4 intermediate checkpoint 300k MultiBERTs (pretrained BERT) model on English language using a masked language modeling (MLM) objective. It was introduced in\nthis paper and first released in\nthis repository. This is an intermediate checkpoint.\nThe final checkpoint can be found at multiberts-seed-4. This model is uncased: it does not make a difference\nbetween english and English.\n\nDisclaimer: The team releasing MultiBERTs did not write a model card for this model so this model card has been written by gchhablani.", "## Model description\nMultiBERTs models are transformers model pretrained on a large corpus of English data in a self-supervised fashion. This means it\nwas pretrained on the raw texts only, with no humans labelling them in any way (which is why it can use lots of\npublicly available data) with an automatic process to generate inputs and labels from those texts. More precisely, it\nwas pretrained with two objectives:\n- Masked language modeling (MLM): taking a sentence, the model randomly masks 15% of the words in the input then run\n the entire masked sentence through the model and has to predict the masked words. This is different from traditional\n recurrent neural networks (RNNs) that usually see the words one after the other, or from autoregressive models like\n GPT which internally mask the future tokens. It allows the model to learn a bidirectional representation of the\n sentence.\n- Next sentence prediction (NSP): the models concatenates two masked sentences as inputs during pretraining. Sometimes\n they correspond to sentences that were next to each other in the original text, sometimes not. The model then has to\n predict if the two sentences were following each other or not.\nThis way, the model learns an inner representation of the English language that can then be used to extract features\nuseful for downstream tasks: if you have a dataset of labeled sentences for instance, you can train a standard\nclassifier using the features produced by the MultiBERTs model as inputs.", "## Intended uses & limitations\nYou can use the raw model for either masked language modeling or next sentence prediction, but it's mostly intended to\nbe fine-tuned on a downstream task. See the model hub to look for\nfine-tuned versions on a task that interests you.\nNote that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked)\nto make decisions, such as sequence classification, token classification or question answering. For tasks such as text\ngeneration you should look at model like GPT2.", "### How to use\nHere is how to use this model to get the features of a given text in PyTorch:", "### Limitations and bias\nEven if the training data used for this model could be characterized as fairly neutral, this model can have biased\npredictions. This bias will also affect all fine-tuned versions of this model. 
For an understanding of bias of this particular\ncheckpoint, please try out this checkpoint with the snippet present in the Limitation and bias section of the bert-base-uncased checkpoint.", "## Training data\nThe MultiBERTs models were pretrained on BookCorpus, a dataset consisting of 11,038\nunpublished books and English Wikipedia (excluding lists, tables and\nheaders).", "## Training procedure", "### Preprocessing\nThe texts are lowercased and tokenized using WordPiece and a vocabulary size of 30,000. The inputs of the model are\nthen of the form:\n\nWith probability 0.5, sentence A and sentence B correspond to two consecutive sentences in the original corpus and in\nthe other cases, it's another random sentence in the corpus. Note that what is considered a sentence here is a\nconsecutive span of text usually longer than a single sentence. The only constrain is that the result with the two\n\"sentences\" has a combined length of less than 512 tokens.\nThe details of the masking procedure for each sentence are the following:\n- 15% of the tokens are masked.\n- In 80% of the cases, the masked tokens are replaced by '[MASK]'.\n- In 10% of the cases, the masked tokens are replaced by a random token (different) from the one they replace.\n- In the 10% remaining cases, the masked tokens are left as is.", "### Pretraining\nThe full model was trained on 16 Cloud TPU v2 chips for two million steps with a batch size\nof 256. The sequence length was set to 512 throughout. The optimizer\nused is Adam with a learning rate of 1e-4, \\\\(\\beta_{1} = 0.9\\\\) and \\\\(\\beta_{2} = 0.999\\\\), a weight decay of 0.01,\nlearning rate warmup for 10,000 steps and linear decay of the learning rate after.", "### BibTeX entry and citation info\n\n<a href=\"URL\n\t<img width=\"300px\" src=\"URL\n</a>" ]
[ 71, 145, 335, 134, 25, 95, 48, 3, 222, 111, 34 ]
[ "passage: TAGS\n#transformers #pytorch #bert #pretraining #exbert #multiberts #multiberts-seed-4 #en #dataset-bookcorpus #dataset-wikipedia #arxiv-2106.16163 #license-apache-2.0 #endpoints_compatible #region-us \n# MultiBERTs Seed 4 Checkpoint 300k (uncased)\nSeed 4 intermediate checkpoint 300k MultiBERTs (pretrained BERT) model on English language using a masked language modeling (MLM) objective. It was introduced in\nthis paper and first released in\nthis repository. This is an intermediate checkpoint.\nThe final checkpoint can be found at multiberts-seed-4. This model is uncased: it does not make a difference\nbetween english and English.\n\nDisclaimer: The team releasing MultiBERTs did not write a model card for this model so this model card has been written by gchhablani.", "passage: ## Model description\nMultiBERTs models are transformers model pretrained on a large corpus of English data in a self-supervised fashion. This means it\nwas pretrained on the raw texts only, with no humans labelling them in any way (which is why it can use lots of\npublicly available data) with an automatic process to generate inputs and labels from those texts. More precisely, it\nwas pretrained with two objectives:\n- Masked language modeling (MLM): taking a sentence, the model randomly masks 15% of the words in the input then run\n the entire masked sentence through the model and has to predict the masked words. This is different from traditional\n recurrent neural networks (RNNs) that usually see the words one after the other, or from autoregressive models like\n GPT which internally mask the future tokens. It allows the model to learn a bidirectional representation of the\n sentence.\n- Next sentence prediction (NSP): the models concatenates two masked sentences as inputs during pretraining. Sometimes\n they correspond to sentences that were next to each other in the original text, sometimes not. The model then has to\n predict if the two sentences were following each other or not.\nThis way, the model learns an inner representation of the English language that can then be used to extract features\nuseful for downstream tasks: if you have a dataset of labeled sentences for instance, you can train a standard\nclassifier using the features produced by the MultiBERTs model as inputs.## Intended uses & limitations\nYou can use the raw model for either masked language modeling or next sentence prediction, but it's mostly intended to\nbe fine-tuned on a downstream task. See the model hub to look for\nfine-tuned versions on a task that interests you.\nNote that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked)\nto make decisions, such as sequence classification, token classification or question answering. For tasks such as text\ngeneration you should look at model like GPT2.### How to use\nHere is how to use this model to get the features of a given text in PyTorch:### Limitations and bias\nEven if the training data used for this model could be characterized as fairly neutral, this model can have biased\npredictions. This bias will also affect all fine-tuned versions of this model. 
For an understanding of bias of this particular\ncheckpoint, please try out this checkpoint with the snippet present in the Limitation and bias section of the bert-base-uncased checkpoint.## Training data\nThe MultiBERTs models were pretrained on BookCorpus, a dataset consisting of 11,038\nunpublished books and English Wikipedia (excluding lists, tables and\nheaders).## Training procedure" ]
[ -0.08484818041324615, -0.002812077524140477, -0.002051102463155985, 0.07067913562059402, 0.08720719814300537, 0.004260686691850424, 0.11818145215511322, 0.04844910651445389, -0.02968214452266693, 0.02316218987107277, 0.09414537250995636, 0.02935227006673813, 0.041807107627391815, 0.06308606266975403, 0.09637437760829926, -0.25802937150001526, 0.04949968680739403, -0.06242756545543671, 0.0560988113284111, 0.07669562846422195, 0.09856560826301575, -0.07223016023635864, 0.06392848491668701, 0.033982664346694946, -0.08480101078748703, -0.014875303953886032, -0.016941100358963013, -0.03461911529302597, 0.09971868246793747, 0.06944698095321655, 0.061690084636211395, 0.0003619566559791565, 0.060533277690410614, -0.08636236935853958, 0.01580357737839222, 0.04278777167201042, -0.0012476281262934208, 0.02449878118932247, -0.009033866226673126, 0.016084419563412666, 0.10665804147720337, 0.0366256944835186, 0.07747089117765427, 0.034478869289159775, -0.09587179124355316, -0.11568030714988708, -0.08176179230213165, 0.1008123829960823, 0.05387537553906441, 0.04359640181064606, -0.006513386033475399, 0.07453246414661407, -0.0319112166762352, 0.07400581240653992, 0.10698047280311584, -0.2607364058494568, -0.009179998189210892, 0.06882543861865997, 0.046815454959869385, 0.045025840401649475, 0.011069905012845993, 0.026713216677308083, 0.005719631910324097, 0.04528331756591797, 0.030993297696113586, -0.024491842836141586, 0.12126447260379791, -0.044659893959760666, -0.15103371441364288, -0.041364558041095734, 0.12465594708919525, -0.004839945584535599, -0.12515586614608765, -0.1041988804936409, -0.02960849367082119, 0.11364555358886719, -0.003048091195523739, -0.018490135669708252, -0.0029050479643046856, 0.009613681584596634, 0.02235390804708004, -0.09166333079338074, -0.08547431230545044, -0.027720950543880463, -0.03648628666996956, 0.13064441084861755, 0.04601083695888519, 0.0523136742413044, -0.033525172621011734, 0.08679255843162537, -0.11185918748378754, -0.040328189730644226, -0.05041980743408203, -0.08305753022432327, -0.017231415957212448, 0.010673646815121174, -0.02766578271985054, -0.08246137946844101, -0.05919564887881279, 0.11954408884048462, 0.03691989928483963, 0.030883856117725372, 0.0015200104098767042, 0.039815329015254974, 0.07313719391822815, 0.0965864285826683, -0.04057087376713753, 0.04438617452979088, 0.03420769050717354, -0.02041441574692726, 0.05868507921695709, -0.05131176859140396, -0.1009746640920639, 0.07752171158790588, -0.0015823766589164734, 0.039295606315135956, 0.02337944507598877, 0.03508201986551285, -0.011641545221209526, -0.07200146466493607, 0.1666339933872223, -0.07995235919952393, -0.009176385588943958, -0.01806533709168434, 0.012328371405601501, 0.04930311441421509, 0.0313078947365284, -0.008643525652587414, -0.048798006027936935, -0.0070840343832969666, -0.05540946125984192, -0.028902098536491394, -0.055854927748441696, -0.11946944147348404, -0.002847793512046337, -0.03549601137638092, -0.031406618654727936, -0.13971516489982605, -0.2137090563774109, -0.019759658724069595, 0.06441017240285873, -0.002643352374434471, -0.008948433212935925, 0.024697374552488327, 0.014988487586379051, -0.020000746473670006, 0.010195162147283554, -0.04568487033247948, -0.000547211617231369, -0.007128940895199776, -0.028366316109895706, 0.057857826352119446, -0.0417591854929924, 0.022485466673970222, -0.06820083409547806, 0.021833207458257675, -0.20759961009025574, 0.08748731017112732, -0.031871646642684937, 0.0020531639456748962, -0.03618449717760086, -0.0472484827041626, 
0.010949213057756424, 0.0486060231924057, -0.00930027011781931, 0.11672621965408325, -0.13272884488105774, -0.050557587295770645, 0.18075504899024963, -0.15704762935638428, -0.002328634262084961, 0.10001397877931595, -0.049215108156204224, 0.05395101383328438, 0.1346825212240219, 0.09586407244205475, 0.08069393038749695, -0.07272680848836899, 0.010926797054708004, 0.05999703332781792, -0.0669458732008934, 0.05392809957265854, 0.08905179053544998, -0.02352961152791977, -0.13439872860908508, 0.030525121837854385, -0.07346056401729584, -0.010807836428284645, -0.025338707491755486, -0.021150173619389534, 0.007876923307776451, -0.03805393725633621, 0.028414558619260788, 0.005679646972566843, 0.016721492633223534, -0.04264192655682564, -0.08288447558879852, 0.028208497911691666, 0.074565589427948, -0.07316285371780396, 0.04132203385233879, -0.07039850950241089, 0.06184231489896774, -0.07702088356018066, -0.004451951943337917, -0.16558507084846497, -0.027408115565776825, 0.04436158016324043, -0.043615665286779404, 0.05081203207373619, 0.09191080927848816, 0.003589396830648184, 0.12330038845539093, -0.04144037514925003, 0.0016938104527071118, -0.007577503100037575, -0.01029745303094387, -0.0499257817864418, -0.12029880285263062, -0.08461670577526093, -0.06756534427404404, 0.09778156131505966, -0.0736435204744339, 0.028353210538625717, -0.06754450500011444, -0.022715410217642784, -0.00919889286160469, -0.05780809372663498, -0.004557047970592976, 0.011091186664998531, -0.027237478643655777, -0.04717162251472473, 0.04809706658124924, 0.04992985352873802, -0.05910434201359749, 0.0726635530591011, -0.10618036985397339, -0.05656849592924118, 0.05722443759441376, 0.015033523552119732, -0.08122535049915314, 0.09109511226415634, -0.019216051325201988, -0.012647859752178192, -0.05728643387556076, -0.045647189021110535, 0.19336536526679993, -0.023664318025112152, 0.1008533388376236, -0.0907646119594574, 0.0011658977018669248, 0.02681991085410118, -0.045205920934677124, -0.016006678342819214, 0.057128939777612686, 0.04586479067802429, -0.18621444702148438, 0.015465345233678818, 0.051405034959316254, 0.07741257548332214, 0.11299967765808105, 0.027074195444583893, -0.023745547980070114, -0.04669421911239624, -0.012964564375579357, 0.0047087594866752625, 0.05799083784222603, -0.021314769983291626, -0.01009396929293871, 0.03112686797976494, 0.057316940277814865, 0.017542343586683273, -0.07943842560052872, 0.0360587053000927, 0.06600664556026459, -0.016246873885393143, -0.037644632160663605, -0.024605127051472664, -0.06120621785521507, 0.06244239956140518, 0.05417042225599289, 0.035891301929950714, 0.026479395106434822, -0.015599335543811321, -0.13516585528850555, 0.18842975795269012, -0.11390692740678787, -0.25634655356407166, -0.1089668720960617, -0.059088632464408875, -0.02791496552526951, 0.040449488908052444, 0.056245170533657074, -0.031735967844724655, -0.04363761469721794, -0.1156272143125534, 0.060393065214157104, -0.06399110704660416, -0.030666502192616463, -0.010540807619690895, -0.05258934199810028, -0.01975461281836033, -0.1271132081747055, -0.013008488342165947, -0.030636131763458252, -0.07619055360555649, 0.006575639359652996, -0.03455142676830292, 0.03072432428598404, 0.13347125053405762, 0.03625553846359253, -0.018291691318154335, -0.017368510365486145, 0.19403335452079773, 0.009873813018202782, 0.06056822091341019, 0.11419697850942612, -0.02723279781639576, 0.052101194858551025, 0.04442726820707321, 0.02671489678323269, -0.05039598047733307, 0.011621377430856228, -0.016533302143216133, 
-0.12147141993045807, -0.17327217757701874, -0.0707806944847107, -0.005385342054069042, 0.0032543514389544725, 0.02078128792345524, 0.03485904261469841, 0.02018558792769909, 0.04039663448929787, -0.03009677864611149, 0.029180996119976044, -0.012234408408403397, 0.08096496015787125, 0.028706461191177368, -0.07362575829029083, 0.09480156749486923, -0.062340907752513885, 0.014701824635267258, 0.10873018950223923, -0.05670895427465439, 0.18879786133766174, 0.025496194139122963, 0.05846317857503891, 0.10149981826543808, 0.020391665399074554, 0.05451352894306183, 0.09029935300350189, -0.045314911752939224, 0.005510170944035053, -0.060449469834566116, -0.051918841898441315, -0.03565090522170067, 0.048477157950401306, 0.029557932168245316, 0.019858695566654205, -0.12016192078590393, 0.016964416950941086, -0.001023847726173699, 0.13949334621429443, 0.04823329299688339, -0.12465463578701019, -0.12290346622467041, 0.035661481320858, -0.04503867030143738, -0.06156302988529205, 0.030836958438158035, 0.05767882242798805, -0.1545005440711975, 0.04793732613325119, -0.00574128795415163, 0.06564503908157349, -0.09481416642665863, 0.016439151018857956, -0.04173383489251137, 0.00032366812229156494, 0.00524137681350112, 0.06864647567272186, -0.13307902216911316, 0.1054409071803093, 0.021645396947860718, 0.049967627972364426, -0.08082546293735504, 0.016380146145820618, -0.010277587920427322, 0.10629252344369888, 0.11646752059459686, 0.04233565926551819, -0.052111413329839706, -0.01834462396800518, -0.04435395821928978, 0.02169475145637989, 0.05896570906043053, -0.07618403434753418, 0.06264367699623108, 0.00853346660733223, 0.0064948308281600475, -0.023474255576729774, 0.016580983996391296, -0.13219058513641357, -0.1250084489583969, 0.060186468064785004, -0.07635186612606049, -0.10014539957046509, -0.05599776655435562, -0.06277480721473694, -0.048651665449142456, 0.21392309665679932, -0.11418728530406952, -0.0918785110116005, -0.0991545170545578, -0.014248214662075043, 0.04649670794606209, -0.06445807963609695, 0.04531216621398926, -0.035771533846855164, 0.09177535027265549, -0.04549216479063034, -0.11042438447475433, 0.03439799323678017, -0.11384362727403641, -0.11421482264995575, -0.04262922704219818, 0.10710003972053528, 0.11479063332080841, 0.03720283880829811, 0.012797465547919273, 0.013203267008066177, 0.000026943162083625793, -0.11870777606964111, 0.015492675825953484, 0.1280156672000885, 0.0007294211536645889, 0.07312589138746262, -0.06149093061685562, 0.027402300387620926, -0.01504925824701786, 0.0007166620343923569, 0.13062553107738495, 0.1824355125427246, -0.06266164779663086, 0.1730835735797882, 0.2019830346107483, -0.10385765135288239, -0.1896473467350006, -0.055453747510910034, -0.002820582129061222, 0.04778303951025009, 0.049473147839307785, -0.18500611186027527, 0.091218501329422, 0.03253595158457756, -0.030749265104532242, 0.01725737378001213, -0.23333144187927246, -0.11066460609436035, 0.0892944484949112, 0.05868982896208763, 0.1898106336593628, -0.08189253509044647, -0.038669999688863754, -0.01960703916847706, -0.03910069167613983, 0.04523869603872299, -0.03982139006257057, 0.09105588495731354, 0.0050668660551309586, -0.02849767915904522, 0.0007162326946854591, -0.030836421996355057, 0.09607957303524017, 0.04098077118396759, 0.023999426513910294, -0.07134898006916046, -0.004750240594148636, 0.11267448961734772, -0.03805065155029297, 0.09981251507997513, 0.03903637081384659, 0.07362835109233856, -0.09645894169807434, -0.06015891581773758, -0.07593606412410736, 0.041651345789432526, 
-0.0418216846883297, -0.05490359291434288, -0.0640377402305603, 0.057791050523519516, 0.037290289998054504, 0.009953595697879791, -0.004486653953790665, -0.03784819692373276, 0.04562433436512947, 0.08641549944877625, 0.08547724783420563, -0.03381767123937607, -0.0764903798699379, -0.050882574170827866, -0.048843249678611755, 0.06702402234077454, -0.0884607583284378, 0.016132906079292297, 0.02779955416917801, 0.009410817176103592, 0.08946691453456879, 0.034778520464897156, -0.13582077622413635, 0.010520920157432556, 0.03472018614411354, -0.12360027432441711, -0.10524359345436096, -0.01896071434020996, 0.024231549352407455, -0.038474082946777344, 0.054302141070365906, 0.14740806818008423, -0.03482336550951004, -0.03232529014348984, -0.04812111333012581, 0.037260107696056366, -0.022521190345287323, 0.051173292100429535, 0.06445707380771637, 0.030701614916324615, -0.07387928664684296, 0.0724475309252739, 0.03742649778723717, -0.03344839811325073, 0.04225050285458565, 0.04495767131447792, -0.09587547928094864, -0.07869628071784973, -0.06019626185297966, 0.08947198837995529, -0.022544698789715767, -0.046255923807621, -0.0022878535091876984, -0.08363054692745209, 0.06725133955478668, 0.06789959222078323, 0.04855174943804741, 0.03614383563399315, -0.08806396275758743, 0.014703962951898575, -0.05376185476779938, 0.03364298492670059, -0.030923211947083473, -0.004832642152905464, -0.05573011934757233, 0.06090811640024185, 0.06332281976938248, 0.09520875662565231, -0.034781888127326965, -0.07436680793762207, -0.08553847670555115, -0.012301590293645859, -0.06114017963409424, -0.03485424816608429, -0.07589633017778397, -0.005849550478160381, -0.0001447773538529873, -0.0014289859682321548, 0.02160194143652916, 0.036824215203523636, -0.04276925325393677, -0.01884806714951992, -0.033294156193733215, 0.037753209471702576, -0.061531465500593185, 0.007243563421070576, 0.0149424122646451, -0.0388612225651741, 0.09181932359933853, 0.03492975980043411, -0.012549031525850296, 0.046898506581783295, -0.019875578582286835, 0.03243354707956314, -0.02029738947749138, 0.0009686027187854052, -0.02423998713493347, -0.1086479127407074, -0.0048563191667199135, 0.006562450900673866, -0.02356540784239769, 0.011835089884698391, 0.05914546549320221, -0.07177934795618057, 0.08533815294504166, 0.04430358111858368, -0.02898704633116722, -0.07192865014076233, 0.04239088296890259, -0.01029195636510849, 0.029110809788107872, 0.06994645297527313, -0.035206139087677, 0.052273571491241455, -0.10035095363855362, -0.028749998658895493, 0.0031931251287460327, -0.007249191403388977, -0.013827243819832802, -0.05312687158584595, -0.0014014411717653275, 0.006567320786416531, 0.17801432311534882, -0.024310991168022156, 0.03512377291917801, 0.013669967651367188, 0.01116091012954712, 0.04910578206181526, -0.01220761053264141, 0.0713447779417038, -0.006950953975319862, -0.026667719706892967, -0.013659033924341202, 0.037600331008434296, 0.0029812324792146683, 0.0037114322185516357, 0.14104565978050232, 0.04442758858203888, 0.0920226201415062, 0.07628051936626434, 0.016811292618513107, 0.01769338734447956, -0.12984102964401245, -0.09215257316827774, 0.008449411019682884, 0.0586409717798233, -0.017028329893946648, 0.00961388275027275, 0.09176363050937653, -0.08781534433364868, 0.07169423997402191, 0.048028070479631424, -0.04918195679783821, -0.1244959682226181, -0.1904374659061432, -0.024412330240011215, -0.029216626659035683, -0.011187205091118813, -0.09211724996566772, 0.014020588248968124, 0.09133822470903397, 0.02485336735844612, 
-0.010841868817806244, 0.09634706377983093, -0.1027526706457138, -0.02996665984392166, 0.043660394847393036, -0.027402177453041077, 0.01404177863150835, 0.05093711242079735, 0.024695076048374176, -0.007123993709683418, 0.04244135320186615, 0.040728095918893814, 0.04304449260234833, 0.022914867848157883, 0.05087900906801224, -0.023537874221801758, -0.07329386472702026, -0.0329301580786705, -0.0028981147333979607, 0.054936908185482025, 0.13892021775245667, 0.022425618022680283, -0.07206925004720688, 0.006365131586790085, 0.10469301044940948, -0.03218519687652588, -0.05082684010267258, -0.10896255075931549, 0.23961439728736877, 0.02198002301156521, 0.0009060737211257219, -0.005875993520021439, -0.04451346769928932, 0.00503934733569622, 0.21075034141540527, 0.22919070720672607, 0.00202842615544796, -0.009848405607044697, 0.009450486861169338, -0.011377980932593346, 0.0369274765253067, 0.14763516187667847, 0.0037797167897224426, 0.25440800189971924, -0.04721829295158386, 0.03976402431726456, -0.04198553040623665, -0.04045983403921127, -0.09900431334972382, 0.0724330022931099, -0.007653985638171434, 0.007085876539349556, -0.030858464539051056, 0.07190310209989548, -0.036693405359983444, -0.1692478060722351, 0.004144738428294659, -0.002740542870014906, -0.06306110322475433, 0.010171355679631233, -0.005183835048228502, 0.020147589966654778, 0.08422143757343292, -0.017342619597911835, -0.006564714480191469, 0.13179755210876465, 0.018265999853610992, -0.09859578311443329, -0.05897323042154312, 0.11649846285581589, 0.017354849725961685, 0.13906928896903992, 0.012040911242365837, 0.08149681985378265, 0.08770734071731567, 0.020343361422419548, -0.09404180198907852, 0.043539859354496, -0.019850468263030052, -0.027783658355474472, 0.00836987979710102, 0.1082700714468956, -0.008286705240607262, 0.057590924203395844, 0.02459113858640194, -0.08885565400123596, 0.06130059435963631, 0.01117660105228424, -0.034875448793172836, -0.08113481849431992, 0.08439920842647552, -0.09164425730705261, 0.15791159868240356, 0.12381770461797714, -0.014881952665746212, -0.046097368001937866, -0.027032887563109398, 0.02100498601794243, 0.001076927874237299, 0.05442177131772041, -0.02546955645084381, -0.13436856865882874, 0.01786068268120289, -0.08679132908582687, 0.02522014081478119, -0.2506351172924042, -0.09054712951183319, 0.02958245947957039, -0.018170125782489777, -0.0206788070499897, 0.05148937180638313, 0.045056574046611786, 0.02647014707326889, -0.03621948882937431, 0.02368725836277008, -0.03802517056465149, 0.05814141407608986, -0.11240387707948685, -0.09402873367071152 ]
null
null
transformers
# MultiBERTs Seed 4 Checkpoint 400k (uncased)
Seed 4 intermediate checkpoint 400k MultiBERTs (pretrained BERT) model on English language using a masked language modeling (MLM) objective. It was introduced in
[this paper](https://arxiv.org/pdf/2106.16163.pdf) and first released in
[this repository](https://github.com/google-research/language/tree/master/language/multiberts). This is an intermediate checkpoint.
The final checkpoint can be found at [multiberts-seed-4](https://hf.co/multberts-seed-4). This model is uncased: it does not make a difference
between english and English.

Disclaimer: The team releasing MultiBERTs did not write a model card for this model, so this model card has been written by
[gchhablani](https://hf.co/gchhablani).

## Model description
MultiBERTs models are transformer models pretrained on a large corpus of English data in a self-supervised fashion. This means they
were pretrained on the raw texts only, with no humans labelling them in any way (which is why they can use lots of
publicly available data) with an automatic process to generate inputs and labels from those texts. More precisely, they
were pretrained with two objectives:
- Masked language modeling (MLM): taking a sentence, the model randomly masks 15% of the words in the input, then runs
  the entire masked sentence through the model and has to predict the masked words. This is different from traditional
  recurrent neural networks (RNNs) that usually see the words one after the other, or from autoregressive models like
  GPT which internally mask the future tokens. It allows the model to learn a bidirectional representation of the
  sentence.
- Next sentence prediction (NSP): the model concatenates two masked sentences as inputs during pretraining. Sometimes
  they correspond to sentences that were next to each other in the original text, sometimes not. The model then has to
  predict if the two sentences were following each other or not.
This way, the model learns an inner representation of the English language that can then be used to extract features
useful for downstream tasks: if you have a dataset of labeled sentences for instance, you can train a standard
classifier using the features produced by the MultiBERTs model as inputs.

## Intended uses & limitations
You can use the raw model for either masked language modeling or next sentence prediction, but it's mostly intended to
be fine-tuned on a downstream task. See the [model hub](https://huggingface.co/models?filter=multiberts) to look for
fine-tuned versions on a task that interests you.
Note that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked)
to make decisions, such as sequence classification, token classification or question answering. For tasks such as text
generation you should look at models like GPT2.

### How to use
Here is how to use this model to get the features of a given text in PyTorch:
```python
from transformers import BertTokenizer, BertModel
tokenizer = BertTokenizer.from_pretrained('multiberts-seed-4-400k')
model = BertModel.from_pretrained("multiberts-seed-4-400k")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='pt')
output = model(**encoded_input)
```

### Limitations and bias
Even if the training data used for this model could be characterized as fairly neutral, this model can have biased
predictions. This bias will also affect all fine-tuned versions of this model.
For an understanding of bias of this particular checkpoint, please try out this checkpoint with the snippet present in the
[Limitation and bias section](https://huggingface.co/bert-base-uncased#limitations-and-bias) of the
[bert-base-uncased](https://huggingface.co/bert-base-uncased) checkpoint.

## Training data
The MultiBERTs models were pretrained on [BookCorpus](https://yknzhu.wixsite.com/mbweb), a dataset consisting of 11,038
unpublished books and [English Wikipedia](https://en.wikipedia.org/wiki/English_Wikipedia) (excluding lists, tables and
headers).

## Training procedure

### Preprocessing
The texts are lowercased and tokenized using WordPiece and a vocabulary size of 30,000. The inputs of the model are
then of the form:

```
[CLS] Sentence A [SEP] Sentence B [SEP]
```

With probability 0.5, sentence A and sentence B correspond to two consecutive sentences in the original corpus, and in
the other cases sentence B is another random sentence from the corpus. Note that what is considered a sentence here is a
consecutive span of text usually longer than a single sentence. The only constraint is that the two
"sentences" have a combined length of less than 512 tokens.
The details of the masking procedure for each sentence are the following:
- 15% of the tokens are masked.
- In 80% of the cases, the masked tokens are replaced by `[MASK]`.
- In 10% of the cases, the masked tokens are replaced by a random token (different from the one they replace).
- In the remaining 10% of cases, the masked tokens are left as is.

### Pretraining
The full model was trained on 16 Cloud TPU v2 chips for two million steps with a batch size
of 256. The sequence length was set to 512 throughout. The optimizer
used is Adam with a learning rate of 1e-4, \\(\beta_{1} = 0.9\\) and \\(\beta_{2} = 0.999\\), a weight decay of 0.01,
learning rate warmup for 10,000 steps and linear decay of the learning rate after.

### BibTeX entry and citation info

```bibtex
@article{DBLP:journals/corr/abs-2106-16163,
  author     = {Thibault Sellam and
                Steve Yadlowsky and
                Jason Wei and
                Naomi Saphra and
                Alexander D'Amour and
                Tal Linzen and
                Jasmijn Bastings and
                Iulia Turc and
                Jacob Eisenstein and
                Dipanjan Das and
                Ian Tenney and
                Ellie Pavlick},
  title      = {The MultiBERTs: {BERT} Reproductions for Robustness Analysis},
  journal    = {CoRR},
  volume     = {abs/2106.16163},
  year       = {2021},
  url        = {https://arxiv.org/abs/2106.16163},
  eprinttype = {arXiv},
  eprint     = {2106.16163},
  timestamp  = {Mon, 05 Jul 2021 15:15:50 +0200},
  biburl     = {https://dblp.org/rec/journals/corr/abs-2106-16163.bib},
  bibsource  = {dblp computer science bibliography, https://dblp.org}
}
```

<a href="https://huggingface.co/exbert/?model=multiberts">
	<img width="300px" src="https://cdn-media.huggingface.co/exbert/button.png">
</a>
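The masking procedure listed under Preprocessing can be illustrated with a short, self-contained sketch. This is only a simplified illustration of the 15%/80%/10%/10% scheme described above, not the original data pipeline; the function name and the toy vocabulary are hypothetical.

```python
import random

def mask_tokens(tokens, vocab, mask_token="[MASK]", mlm_prob=0.15, seed=None):
    """Sketch of the masking scheme: 15% of tokens are selected for prediction;
    of those, 80% become [MASK], 10% become a different random token, 10% are left as is."""
    rng = random.Random(seed)
    inputs, labels = list(tokens), [None] * len(tokens)
    for i, tok in enumerate(tokens):
        if rng.random() < mlm_prob:
            labels[i] = tok              # the model has to predict the original token here
            roll = rng.random()
            if roll < 0.8:
                inputs[i] = mask_token   # 80% of masked positions
            elif roll < 0.9:
                inputs[i] = rng.choice([t for t in vocab if t != tok])  # 10%: a different random token
            # else: remaining 10% keep the original token
    return inputs, labels

# Toy usage:
# mask_tokens(["the", "cat", "sat", "on", "the", "mat"],
#             vocab=["the", "cat", "sat", "on", "mat", "dog"], seed=0)
```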
{"language": "en", "license": "apache-2.0", "tags": ["exbert", "multiberts", "multiberts-seed-4"], "datasets": ["bookcorpus", "wikipedia"]}
null
MultiBertGunjanPatrick/multiberts-seed-4-400k
[ "transformers", "pytorch", "bert", "pretraining", "exbert", "multiberts", "multiberts-seed-4", "en", "dataset:bookcorpus", "dataset:wikipedia", "arxiv:2106.16163", "license:apache-2.0", "endpoints_compatible", "region:us" ]
2022-03-02T23:29:04+00:00
[ "2106.16163" ]
[ "en" ]
TAGS #transformers #pytorch #bert #pretraining #exbert #multiberts #multiberts-seed-4 #en #dataset-bookcorpus #dataset-wikipedia #arxiv-2106.16163 #license-apache-2.0 #endpoints_compatible #region-us
# MultiBERTs Seed 4 Checkpoint 400k (uncased) Seed 4 intermediate checkpoint 400k MultiBERTs (pretrained BERT) model on English language using a masked language modeling (MLM) objective. It was introduced in this paper and first released in this repository. This is an intermediate checkpoint. The final checkpoint can be found at multiberts-seed-4. This model is uncased: it does not make a difference between english and English. Disclaimer: The team releasing MultiBERTs did not write a model card for this model, so this model card has been written by gchhablani. ## Model description MultiBERTs models are transformer models pretrained on a large corpus of English data in a self-supervised fashion. This means they were pretrained on the raw texts only, with no humans labelling them in any way (which is why they can use lots of publicly available data) with an automatic process to generate inputs and labels from those texts. More precisely, they were pretrained with two objectives: - Masked language modeling (MLM): taking a sentence, the model randomly masks 15% of the words in the input, then runs the entire masked sentence through the model and has to predict the masked words. This is different from traditional recurrent neural networks (RNNs) that usually see the words one after the other, or from autoregressive models like GPT which internally mask the future tokens. It allows the model to learn a bidirectional representation of the sentence. - Next sentence prediction (NSP): the model concatenates two masked sentences as inputs during pretraining. Sometimes they correspond to sentences that were next to each other in the original text, sometimes not. The model then has to predict if the two sentences were following each other or not. This way, the model learns an inner representation of the English language that can then be used to extract features useful for downstream tasks: if you have a dataset of labeled sentences for instance, you can train a standard classifier using the features produced by the MultiBERTs model as inputs. ## Intended uses & limitations You can use the raw model for either masked language modeling or next sentence prediction, but it's mostly intended to be fine-tuned on a downstream task. See the model hub to look for fine-tuned versions on a task that interests you. Note that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked) to make decisions, such as sequence classification, token classification or question answering. For tasks such as text generation you should look at models like GPT2. ### How to use Here is how to use this model to get the features of a given text in PyTorch: ### Limitations and bias Even if the training data used for this model could be characterized as fairly neutral, this model can have biased predictions. This bias will also affect all fine-tuned versions of this model. For an understanding of bias of this particular checkpoint, please try out this checkpoint with the snippet present in the Limitation and bias section of the bert-base-uncased checkpoint. ## Training data The MultiBERTs models were pretrained on BookCorpus, a dataset consisting of 11,038 unpublished books and English Wikipedia (excluding lists, tables and headers). ## Training procedure ### Preprocessing The texts are lowercased and tokenized using WordPiece and a vocabulary size of 30,000. 
The inputs of the model are then of the form: With probability 0.5, sentence A and sentence B correspond to two consecutive sentences in the original corpus, and in the other cases sentence B is another random sentence from the corpus. Note that what is considered a sentence here is a consecutive span of text usually longer than a single sentence. The only constraint is that the two "sentences" have a combined length of less than 512 tokens. The details of the masking procedure for each sentence are the following: - 15% of the tokens are masked. - In 80% of the cases, the masked tokens are replaced by '[MASK]'. - In 10% of the cases, the masked tokens are replaced by a random token (different from the one they replace). - In the remaining 10% of cases, the masked tokens are left as is. ### Pretraining The full model was trained on 16 Cloud TPU v2 chips for two million steps with a batch size of 256. The sequence length was set to 512 throughout. The optimizer used is Adam with a learning rate of 1e-4, \\(\beta_{1} = 0.9\\) and \\(\beta_{2} = 0.999\\), a weight decay of 0.01, learning rate warmup for 10,000 steps and linear decay of the learning rate after. ### BibTeX entry and citation info <a href="URL <img width="300px" src="URL </a>
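As a companion to the sentence-pair description above, the following sketch shows one way NSP examples of this form could be assembled. It is a hypothetical simplification (plain token lists instead of WordPiece ids, a naive truncation loop, and it assumes at least two documents each containing at least two spans), shown only to make the probability-0.5 pairing and the 512-token constraint concrete.

```python
import random

def make_nsp_example(docs, max_len=512, p_next=0.5, seed=0):
    """Sketch: with probability 0.5 sentence B is the span that follows sentence A in the
    same document; otherwise it is a random span from a different document.
    `docs` is a list of documents, each a list of token lists (one per "sentence" span)."""
    rng = random.Random(seed)
    doc = rng.choice([d for d in docs if len(d) > 1])
    i = rng.randrange(len(doc) - 1)
    sent_a = list(doc[i])
    if rng.random() < p_next:
        sent_b, is_next = list(doc[i + 1]), True
    else:
        other = rng.choice([d for d in docs if d is not doc])
        sent_b, is_next = list(rng.choice(other)), False
    budget = max_len - 3                       # room for [CLS] and the two [SEP] tokens
    while len(sent_a) + len(sent_b) > budget:  # enforce the combined-length constraint
        (sent_a if len(sent_a) > len(sent_b) else sent_b).pop()
    return ["[CLS]"] + sent_a + ["[SEP]"] + sent_b + ["[SEP]"], is_next
```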
[ "# MultiBERTs Seed 4 Checkpoint 400k (uncased)\nSeed 4 intermediate checkpoint 400k MultiBERTs (pretrained BERT) model on English language using a masked language modeling (MLM) objective. It was introduced in\nthis paper and first released in\nthis repository. This is an intermediate checkpoint.\nThe final checkpoint can be found at multiberts-seed-4. This model is uncased: it does not make a difference\nbetween english and English.\n\nDisclaimer: The team releasing MultiBERTs did not write a model card for this model so this model card has been written by gchhablani.", "## Model description\nMultiBERTs models are transformers model pretrained on a large corpus of English data in a self-supervised fashion. This means it\nwas pretrained on the raw texts only, with no humans labelling them in any way (which is why it can use lots of\npublicly available data) with an automatic process to generate inputs and labels from those texts. More precisely, it\nwas pretrained with two objectives:\n- Masked language modeling (MLM): taking a sentence, the model randomly masks 15% of the words in the input then run\n the entire masked sentence through the model and has to predict the masked words. This is different from traditional\n recurrent neural networks (RNNs) that usually see the words one after the other, or from autoregressive models like\n GPT which internally mask the future tokens. It allows the model to learn a bidirectional representation of the\n sentence.\n- Next sentence prediction (NSP): the models concatenates two masked sentences as inputs during pretraining. Sometimes\n they correspond to sentences that were next to each other in the original text, sometimes not. The model then has to\n predict if the two sentences were following each other or not.\nThis way, the model learns an inner representation of the English language that can then be used to extract features\nuseful for downstream tasks: if you have a dataset of labeled sentences for instance, you can train a standard\nclassifier using the features produced by the MultiBERTs model as inputs.", "## Intended uses & limitations\nYou can use the raw model for either masked language modeling or next sentence prediction, but it's mostly intended to\nbe fine-tuned on a downstream task. See the model hub to look for\nfine-tuned versions on a task that interests you.\nNote that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked)\nto make decisions, such as sequence classification, token classification or question answering. For tasks such as text\ngeneration you should look at model like GPT2.", "### How to use\nHere is how to use this model to get the features of a given text in PyTorch:", "### Limitations and bias\nEven if the training data used for this model could be characterized as fairly neutral, this model can have biased\npredictions. This bias will also affect all fine-tuned versions of this model. For an understanding of bias of this particular\ncheckpoint, please try out this checkpoint with the snippet present in the Limitation and bias section of the bert-base-uncased checkpoint.", "## Training data\nThe MultiBERTs models were pretrained on BookCorpus, a dataset consisting of 11,038\nunpublished books and English Wikipedia (excluding lists, tables and\nheaders).", "## Training procedure", "### Preprocessing\nThe texts are lowercased and tokenized using WordPiece and a vocabulary size of 30,000. 
The inputs of the model are\nthen of the form:\n\nWith probability 0.5, sentence A and sentence B correspond to two consecutive sentences in the original corpus and in\nthe other cases, it's another random sentence in the corpus. Note that what is considered a sentence here is a\nconsecutive span of text usually longer than a single sentence. The only constrain is that the result with the two\n\"sentences\" has a combined length of less than 512 tokens.\nThe details of the masking procedure for each sentence are the following:\n- 15% of the tokens are masked.\n- In 80% of the cases, the masked tokens are replaced by '[MASK]'.\n- In 10% of the cases, the masked tokens are replaced by a random token (different) from the one they replace.\n- In the 10% remaining cases, the masked tokens are left as is.", "### Pretraining\nThe full model was trained on 16 Cloud TPU v2 chips for two million steps with a batch size\nof 256. The sequence length was set to 512 throughout. The optimizer\nused is Adam with a learning rate of 1e-4, \\\\(\\beta_{1} = 0.9\\\\) and \\\\(\\beta_{2} = 0.999\\\\), a weight decay of 0.01,\nlearning rate warmup for 10,000 steps and linear decay of the learning rate after.", "### BibTeX entry and citation info\n\n<a href=\"URL\n\t<img width=\"300px\" src=\"URL\n</a>" ]
[ "TAGS\n#transformers #pytorch #bert #pretraining #exbert #multiberts #multiberts-seed-4 #en #dataset-bookcorpus #dataset-wikipedia #arxiv-2106.16163 #license-apache-2.0 #endpoints_compatible #region-us \n", "# MultiBERTs Seed 4 Checkpoint 400k (uncased)\nSeed 4 intermediate checkpoint 400k MultiBERTs (pretrained BERT) model on English language using a masked language modeling (MLM) objective. It was introduced in\nthis paper and first released in\nthis repository. This is an intermediate checkpoint.\nThe final checkpoint can be found at multiberts-seed-4. This model is uncased: it does not make a difference\nbetween english and English.\n\nDisclaimer: The team releasing MultiBERTs did not write a model card for this model so this model card has been written by gchhablani.", "## Model description\nMultiBERTs models are transformers model pretrained on a large corpus of English data in a self-supervised fashion. This means it\nwas pretrained on the raw texts only, with no humans labelling them in any way (which is why it can use lots of\npublicly available data) with an automatic process to generate inputs and labels from those texts. More precisely, it\nwas pretrained with two objectives:\n- Masked language modeling (MLM): taking a sentence, the model randomly masks 15% of the words in the input then run\n the entire masked sentence through the model and has to predict the masked words. This is different from traditional\n recurrent neural networks (RNNs) that usually see the words one after the other, or from autoregressive models like\n GPT which internally mask the future tokens. It allows the model to learn a bidirectional representation of the\n sentence.\n- Next sentence prediction (NSP): the models concatenates two masked sentences as inputs during pretraining. Sometimes\n they correspond to sentences that were next to each other in the original text, sometimes not. The model then has to\n predict if the two sentences were following each other or not.\nThis way, the model learns an inner representation of the English language that can then be used to extract features\nuseful for downstream tasks: if you have a dataset of labeled sentences for instance, you can train a standard\nclassifier using the features produced by the MultiBERTs model as inputs.", "## Intended uses & limitations\nYou can use the raw model for either masked language modeling or next sentence prediction, but it's mostly intended to\nbe fine-tuned on a downstream task. See the model hub to look for\nfine-tuned versions on a task that interests you.\nNote that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked)\nto make decisions, such as sequence classification, token classification or question answering. For tasks such as text\ngeneration you should look at model like GPT2.", "### How to use\nHere is how to use this model to get the features of a given text in PyTorch:", "### Limitations and bias\nEven if the training data used for this model could be characterized as fairly neutral, this model can have biased\npredictions. This bias will also affect all fine-tuned versions of this model. 
For an understanding of bias of this particular\ncheckpoint, please try out this checkpoint with the snippet present in the Limitation and bias section of the bert-base-uncased checkpoint.", "## Training data\nThe MultiBERTs models were pretrained on BookCorpus, a dataset consisting of 11,038\nunpublished books and English Wikipedia (excluding lists, tables and\nheaders).", "## Training procedure", "### Preprocessing\nThe texts are lowercased and tokenized using WordPiece and a vocabulary size of 30,000. The inputs of the model are\nthen of the form:\n\nWith probability 0.5, sentence A and sentence B correspond to two consecutive sentences in the original corpus and in\nthe other cases, it's another random sentence in the corpus. Note that what is considered a sentence here is a\nconsecutive span of text usually longer than a single sentence. The only constrain is that the result with the two\n\"sentences\" has a combined length of less than 512 tokens.\nThe details of the masking procedure for each sentence are the following:\n- 15% of the tokens are masked.\n- In 80% of the cases, the masked tokens are replaced by '[MASK]'.\n- In 10% of the cases, the masked tokens are replaced by a random token (different) from the one they replace.\n- In the 10% remaining cases, the masked tokens are left as is.", "### Pretraining\nThe full model was trained on 16 Cloud TPU v2 chips for two million steps with a batch size\nof 256. The sequence length was set to 512 throughout. The optimizer\nused is Adam with a learning rate of 1e-4, \\\\(\\beta_{1} = 0.9\\\\) and \\\\(\\beta_{2} = 0.999\\\\), a weight decay of 0.01,\nlearning rate warmup for 10,000 steps and linear decay of the learning rate after.", "### BibTeX entry and citation info\n\n<a href=\"URL\n\t<img width=\"300px\" src=\"URL\n</a>" ]
[ 71, 145, 335, 134, 25, 95, 48, 3, 222, 111, 34 ]
[ "passage: TAGS\n#transformers #pytorch #bert #pretraining #exbert #multiberts #multiberts-seed-4 #en #dataset-bookcorpus #dataset-wikipedia #arxiv-2106.16163 #license-apache-2.0 #endpoints_compatible #region-us \n# MultiBERTs Seed 4 Checkpoint 400k (uncased)\nSeed 4 intermediate checkpoint 400k MultiBERTs (pretrained BERT) model on English language using a masked language modeling (MLM) objective. It was introduced in\nthis paper and first released in\nthis repository. This is an intermediate checkpoint.\nThe final checkpoint can be found at multiberts-seed-4. This model is uncased: it does not make a difference\nbetween english and English.\n\nDisclaimer: The team releasing MultiBERTs did not write a model card for this model so this model card has been written by gchhablani.", "passage: ## Model description\nMultiBERTs models are transformers model pretrained on a large corpus of English data in a self-supervised fashion. This means it\nwas pretrained on the raw texts only, with no humans labelling them in any way (which is why it can use lots of\npublicly available data) with an automatic process to generate inputs and labels from those texts. More precisely, it\nwas pretrained with two objectives:\n- Masked language modeling (MLM): taking a sentence, the model randomly masks 15% of the words in the input then run\n the entire masked sentence through the model and has to predict the masked words. This is different from traditional\n recurrent neural networks (RNNs) that usually see the words one after the other, or from autoregressive models like\n GPT which internally mask the future tokens. It allows the model to learn a bidirectional representation of the\n sentence.\n- Next sentence prediction (NSP): the models concatenates two masked sentences as inputs during pretraining. Sometimes\n they correspond to sentences that were next to each other in the original text, sometimes not. The model then has to\n predict if the two sentences were following each other or not.\nThis way, the model learns an inner representation of the English language that can then be used to extract features\nuseful for downstream tasks: if you have a dataset of labeled sentences for instance, you can train a standard\nclassifier using the features produced by the MultiBERTs model as inputs.## Intended uses & limitations\nYou can use the raw model for either masked language modeling or next sentence prediction, but it's mostly intended to\nbe fine-tuned on a downstream task. See the model hub to look for\nfine-tuned versions on a task that interests you.\nNote that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked)\nto make decisions, such as sequence classification, token classification or question answering. For tasks such as text\ngeneration you should look at model like GPT2.### How to use\nHere is how to use this model to get the features of a given text in PyTorch:### Limitations and bias\nEven if the training data used for this model could be characterized as fairly neutral, this model can have biased\npredictions. This bias will also affect all fine-tuned versions of this model. 
For an understanding of bias of this particular\ncheckpoint, please try out this checkpoint with the snippet present in the Limitation and bias section of the bert-base-uncased checkpoint.## Training data\nThe MultiBERTs models were pretrained on BookCorpus, a dataset consisting of 11,038\nunpublished books and English Wikipedia (excluding lists, tables and\nheaders).## Training procedure" ]
[ -0.08529366552829742, -0.0016898156609386206, -0.002126299776136875, 0.07095761597156525, 0.08648521453142166, 0.005272936541587114, 0.11664549261331558, 0.048907481133937836, -0.030552353709936142, 0.02348989062011242, 0.09147587418556213, 0.030458979308605194, 0.043114278465509415, 0.06445097178220749, 0.09555783867835999, -0.25881174206733704, 0.047556713223457336, -0.06245053559541702, 0.053898900747299194, 0.07504203170537949, 0.09793489426374435, -0.07251158356666565, 0.06474631279706955, 0.034942660480737686, -0.08488740772008896, -0.016234474256634712, -0.016082877293229103, -0.03563971817493439, 0.10135465860366821, 0.0706818699836731, 0.06179475039243698, 0.0007414482533931732, 0.0600692443549633, -0.08452305197715759, 0.01621193252503872, 0.04290226474404335, -0.002221066039055586, 0.026054851710796356, -0.007430605590343475, 0.017467843368649483, 0.10493409633636475, 0.03596970811486244, 0.07606920599937439, 0.03393244370818138, -0.0955297201871872, -0.12095440179109573, -0.08137376606464386, 0.10088241845369339, 0.0531180240213871, 0.04138541221618652, -0.006578458473086357, 0.07081816345453262, -0.030828513205051422, 0.07383370399475098, 0.09960167109966278, -0.26470786333084106, -0.009443139657378197, 0.06659575551748276, 0.0449441596865654, 0.04682466387748718, 0.009474145248532295, 0.028270015493035316, 0.00672728568315506, 0.04538664594292641, 0.028755847364664078, -0.02464430034160614, 0.12227056920528412, -0.04338447004556656, -0.14939165115356445, -0.040155354887247086, 0.12659698724746704, -0.00560910627245903, -0.1254308670759201, -0.10079146921634674, -0.028401771560311317, 0.11580862104892731, -0.003821294754743576, -0.01713949255645275, -0.002972656860947609, 0.009688058868050575, 0.02236224338412285, -0.09020673483610153, -0.08566221594810486, -0.026306835934519768, -0.03628373146057129, 0.12683404982089996, 0.04674492031335831, 0.05258215218782425, -0.033743418753147125, 0.08629295229911804, -0.11509954929351807, -0.03953181579709053, -0.05018749088048935, -0.08335812389850616, -0.017199501395225525, 0.010707796551287174, -0.027208814397454262, -0.07853132486343384, -0.058898523449897766, 0.11857394874095917, 0.03788358345627785, 0.031849347054958344, 0.000522959977388382, 0.03979019448161125, 0.07144934684038162, 0.09577134251594543, -0.03918386623263359, 0.04469849169254303, 0.03580252081155777, -0.0226936936378479, 0.05751499533653259, -0.050473522394895554, -0.10158559679985046, 0.07642148435115814, -0.0024563614279031754, 0.03909572958946228, 0.023060332983732224, 0.03675990179181099, -0.009283242747187614, -0.07087936997413635, 0.166936457157135, -0.07921887934207916, -0.008595611900091171, -0.016853198409080505, 0.011009102687239647, 0.04699617624282837, 0.030885281041264534, -0.008366279304027557, -0.047131843864917755, -0.008165011182427406, -0.055741894990205765, -0.02870386838912964, -0.05535077303647995, -0.12019887566566467, -0.0026724962517619133, -0.034170106053352356, -0.03157441318035126, -0.14078083634376526, -0.2131069004535675, -0.02042701095342636, 0.06371772289276123, -0.0021401955746114254, -0.00982598401606083, 0.02612447738647461, 0.015267277136445045, -0.02080974355340004, 0.010876696556806564, -0.044704169034957886, -0.000997096300125122, -0.007025960832834244, -0.030379604548215866, 0.05884234607219696, -0.04103326424956322, 0.022713657468557358, -0.06733488291501999, 0.021035917103290558, -0.2142162173986435, 0.08740062266588211, -0.03261988237500191, 0.0031502917408943176, -0.03581298515200615, -0.0472479909658432, 
0.011828035116195679, 0.04684071987867355, -0.01102694682776928, 0.1164742112159729, -0.13627062737941742, -0.05074396729469299, 0.17777647078037262, -0.15762344002723694, -0.001608237624168396, 0.10056987404823303, -0.04874754697084427, 0.05574055016040802, 0.1352536678314209, 0.10005976259708405, 0.08365555852651596, -0.07226838916540146, 0.010501605458557606, 0.060848530381917953, -0.06888416409492493, 0.05637345463037491, 0.08764159679412842, -0.023368684574961662, -0.13841886818408966, 0.031283799558877945, -0.06947237253189087, -0.010656410828232765, -0.02488587610423565, -0.020962515845894814, 0.008009558543562889, -0.03755434975028038, 0.02941220998764038, 0.005089502781629562, 0.01840844191610813, -0.042503006756305695, -0.0824478268623352, 0.024188540875911713, 0.07450228929519653, -0.07179753482341766, 0.04014766216278076, -0.0711536556482315, 0.06289678066968918, -0.0765371322631836, -0.0035804579965770245, -0.16443833708763123, -0.027473168447613716, 0.04418692737817764, -0.04623212665319443, 0.05036497861146927, 0.09258151799440384, 0.0045840200036764145, 0.12421879172325134, -0.04152725636959076, 0.0017407054547220469, -0.00398065522313118, -0.010258120484650135, -0.05222072824835777, -0.12166383117437363, -0.0839119404554367, -0.06613979488611221, 0.10099200904369354, -0.07700926810503006, 0.028892286121845245, -0.06847257912158966, -0.022052744403481483, -0.009364504367113113, -0.05656983703374863, -0.005105017684400082, 0.010859900154173374, -0.025835296139121056, -0.04624546691775322, 0.04868132621049881, 0.049232132732868195, -0.058503881096839905, 0.0724562406539917, -0.1044962927699089, -0.056229572743177414, 0.056864358484745026, 0.014047805219888687, -0.08123341202735901, 0.09215967357158661, -0.019018178805708885, -0.01363450288772583, -0.05465347692370415, -0.045063529163599014, 0.19649533927440643, -0.026094894856214523, 0.10067032277584076, -0.08952432870864868, 0.002400417346507311, 0.027721058577299118, -0.04375625401735306, -0.013699275441467762, 0.058212436735630035, 0.04252928867936134, -0.1821763962507248, 0.01361304521560669, 0.05120128393173218, 0.07917872071266174, 0.11095629632472992, 0.026382680982351303, -0.0222991444170475, -0.04623691737651825, -0.013587947934865952, 0.004304327070713043, 0.0585230328142643, -0.022281289100646973, -0.009730102494359016, 0.03245212510228157, 0.056650370359420776, 0.017265524715185165, -0.08061864972114563, 0.03625732660293579, 0.06451524794101715, -0.015876956284046173, -0.04382523149251938, -0.024723639711737633, -0.05956495180726051, 0.06226377934217453, 0.054285015910863876, 0.03557862713932991, 0.027196470648050308, -0.015193624421954155, -0.13441574573516846, 0.18730351328849792, -0.11601023375988007, -0.25630053877830505, -0.10913670808076859, -0.057640790939331055, -0.02580258436501026, 0.04056604579091072, 0.05785946547985077, -0.033346615731716156, -0.0437869057059288, -0.11354289948940277, 0.061281561851501465, -0.06384371221065521, -0.030184026807546616, -0.012243032455444336, -0.05327159911394119, -0.018917350098490715, -0.12656250596046448, -0.0115430299192667, -0.030217308551073074, -0.07535591721534729, 0.008153673261404037, -0.033506982028484344, 0.029390156269073486, 0.13424178957939148, 0.03550606966018677, -0.019427046179771423, -0.017451120540499687, 0.19551801681518555, 0.00863368809223175, 0.06359579414129257, 0.11256800591945648, -0.027184154838323593, 0.05247391015291214, 0.04684947058558464, 0.02707667648792267, -0.04873029142618179, 0.010310893878340721, -0.016922980546951294, 
-0.11932262033224106, -0.17462566494941711, -0.0696849673986435, -0.003916964400559664, 0.005726734176278114, 0.018967118114233017, 0.03527321666479111, 0.02145504020154476, 0.04098634421825409, -0.02850981242954731, 0.03252018243074417, -0.010802127420902252, 0.08019103109836578, 0.0318748839199543, -0.07486889511346817, 0.09413538873195648, -0.06237905099987984, 0.015349550172686577, 0.10878436267375946, -0.05911251902580261, 0.1889025717973709, 0.02536422200500965, 0.05873670428991318, 0.09994377195835114, 0.024645060300827026, 0.05598863959312439, 0.09026232361793518, -0.045623529702425, 0.006092512514442205, -0.05972260236740112, -0.0518675334751606, -0.03374950587749481, 0.04942196235060692, 0.028829151764512062, 0.01781035214662552, -0.1207435354590416, 0.015426833182573318, -0.001906246761791408, 0.1359836310148239, 0.04791413992643356, -0.12344563007354736, -0.12463785707950592, 0.03529974818229675, -0.04422939568758011, -0.06314695626497269, 0.031113870441913605, 0.059331074357032776, -0.1545221507549286, 0.046404823660850525, -0.005982286296784878, 0.0655444785952568, -0.09464170038700104, 0.01736694574356079, -0.04171096161007881, 0.0011438354849815369, 0.004321036860346794, 0.06967626512050629, -0.13064107298851013, 0.10096916556358337, 0.022339588031172752, 0.05069027468562126, -0.08031778782606125, 0.015942173078656197, -0.008881671354174614, 0.11093201488256454, 0.11440865695476532, 0.04235738515853882, -0.056370459496974945, -0.020826464518904686, -0.044734954833984375, 0.023576151579618454, 0.05948789045214653, -0.07655617594718933, 0.06208760663866997, 0.009850945323705673, 0.00570414774119854, -0.024299586191773415, 0.012793529778718948, -0.13371196389198303, -0.12299855053424835, 0.06110166758298874, -0.07616906613111496, -0.09288112819194794, -0.05597037822008133, -0.06337128579616547, -0.053404949605464935, 0.21276599168777466, -0.11261516809463501, -0.09217341244220734, -0.09844344854354858, -0.0178479366004467, 0.047486625611782074, -0.06422078609466553, 0.04591548815369606, -0.0352482832968235, 0.0909452736377716, -0.043831344693899155, -0.11118465662002563, 0.034644439816474915, -0.11226044595241547, -0.11380785703659058, -0.04308054596185684, 0.10702385008335114, 0.11281845718622208, 0.03735259920358658, 0.013059430755674839, 0.012741302140057087, -0.002149812877178192, -0.11782527714967728, 0.01704350672662258, 0.12635143101215363, 0.0022711101919412613, 0.07197993993759155, -0.06378620117902756, 0.028898246586322784, -0.015521535649895668, 0.0020708907395601273, 0.12855538725852966, 0.1814541220664978, -0.06382492184638977, 0.17324620485305786, 0.19968444108963013, -0.10445777326822281, -0.18965467810630798, -0.05430426076054573, -0.003261053003370762, 0.04704565554857254, 0.04698058217763901, -0.18725451827049255, 0.0923665463924408, 0.03280790150165558, -0.030227094888687134, 0.02628648653626442, -0.2279706746339798, -0.11076866090297699, 0.08761689066886902, 0.055447451770305634, 0.18929752707481384, -0.08158884942531586, -0.03963106870651245, -0.01737481728196144, -0.03993261605501175, 0.04007154703140259, -0.04273023456335068, 0.09070957452058792, 0.005517611280083656, -0.025357499718666077, 0.0010888297110795975, -0.029866548255085945, 0.09716030210256577, 0.04286833107471466, 0.024073127657175064, -0.07051534950733185, -0.005549512803554535, 0.10861934721469879, -0.038066230714321136, 0.09766983985900879, 0.041703540831804276, 0.07336868345737457, -0.09855929017066956, -0.05823056027293205, -0.07617315649986267, 0.04248074069619179, 
-0.04195915907621384, -0.053395822644233704, -0.06363876909017563, 0.0567975752055645, 0.03524617478251457, 0.010373120196163654, -0.0018625110387802124, -0.03691306337714195, 0.04697223752737045, 0.0860256552696228, 0.08270256966352463, -0.03309528902173042, -0.07699684798717499, -0.04943099245429039, -0.04952406883239746, 0.06753231585025787, -0.09146678447723389, 0.017419233918190002, 0.029270794242620468, 0.008809493854641914, 0.09031382203102112, 0.035014137625694275, -0.13505440950393677, 0.011427867226302624, 0.034857552498579025, -0.12429136037826538, -0.10612305253744125, -0.018877966329455376, 0.018909160047769547, -0.03651924803853035, 0.05575602501630783, 0.14736266434192657, -0.035172879695892334, -0.03244917839765549, -0.04858741909265518, 0.03734813258051872, -0.021477343514561653, 0.05119815096259117, 0.0651521384716034, 0.03057260252535343, -0.07475696504116058, 0.07293014973402023, 0.038852252066135406, -0.036247771233320236, 0.04074797406792641, 0.04318179935216904, -0.09738766402006149, -0.07941155880689621, -0.0601741299033165, 0.09302667528390884, -0.01901846192777157, -0.04777512699365616, -0.0019320286810398102, -0.08517144620418549, 0.06643570959568024, 0.0680522471666336, 0.04655833914875984, 0.03558233007788658, -0.08662495762109756, 0.01539711095392704, -0.05435258150100708, 0.03533465787768364, -0.03112833760678768, -0.004879072308540344, -0.05609915405511856, 0.06607089936733246, 0.06295670568943024, 0.09748616069555283, -0.03474743664264679, -0.07465384155511856, -0.08480669558048248, -0.014208638109266758, -0.06562834978103638, -0.0334063321352005, -0.07491887360811234, -0.00532696396112442, 0.00031709717586636543, -0.0013079065829515457, 0.022256681695580482, 0.03615707904100418, -0.044062335044145584, -0.01853887550532818, -0.032238151878118515, 0.038902658969163895, -0.06003160774707794, 0.006129572167992592, 0.013982841745018959, -0.037790488451719284, 0.09256543219089508, 0.03775133937597275, -0.01136070303618908, 0.047587163746356964, -0.015860699117183685, 0.0339791476726532, -0.021099165081977844, 0.0005384464748203754, -0.0227070152759552, -0.10911652445793152, -0.005506441928446293, 0.0036573614925146103, -0.02536327950656414, 0.011446895077824593, 0.058830372989177704, -0.07198172807693481, 0.0866062343120575, 0.04587745666503906, -0.028106890618801117, -0.07221493870019913, 0.04123713821172714, -0.014741016551852226, 0.027702998369932175, 0.06865561008453369, -0.03529691323637962, 0.05238281190395355, -0.09996594488620758, -0.029544241726398468, 0.0025250029284507036, -0.008351031690835953, -0.009545698761940002, -0.05427183955907822, -0.0009706690907478333, 0.005745528265833855, 0.17526905238628387, -0.023424863815307617, 0.03493715450167656, 0.012322952970862389, 0.009369063191115856, 0.0454220287501812, -0.012202175334095955, 0.07155483961105347, -0.006550309248268604, -0.026661867275834084, -0.013960364274680614, 0.03923696279525757, 0.002447589300572872, 0.004078015685081482, 0.13681522011756897, 0.04278702288866043, 0.09279972314834595, 0.07510384172201157, 0.016428925096988678, 0.014752790331840515, -0.13144387304782867, -0.08901945501565933, 0.0076896389946341515, 0.060683585703372955, -0.018886147066950798, 0.009127074852585793, 0.08954113721847534, -0.08707546442747116, 0.07105688750743866, 0.046317487955093384, -0.04955188184976578, -0.1234125941991806, -0.18742606043815613, -0.024480989202857018, -0.030201805755496025, -0.011391924694180489, -0.0914597138762474, 0.014888671226799488, 0.09275229275226593, 0.0239263866096735, 
-0.010761676356196404, 0.09584173560142517, -0.10526996850967407, -0.029916128143668175, 0.04407418519258499, -0.026830973103642464, 0.013673069886863232, 0.051244817674160004, 0.024490918964147568, -0.006978582590818405, 0.039740413427352905, 0.038971804082393646, 0.04260542616248131, 0.024851053953170776, 0.05125991255044937, -0.02294362150132656, -0.0732906386256218, -0.03329397365450859, -0.0009421207942068577, 0.05521538853645325, 0.13611911237239838, 0.022427581250667572, -0.07149011641740799, 0.006104435306042433, 0.10701156407594681, -0.03204980120062828, -0.047509852796792984, -0.10849395394325256, 0.241519033908844, 0.02005375549197197, 0.00005809729918837547, -0.005127097479999065, -0.04555564001202583, 0.0065897442400455475, 0.21437226235866547, 0.22886285185813904, 0.003936001565307379, -0.010758880525827408, 0.009417694993317127, -0.01184053160250187, 0.03756767883896828, 0.14728818833827972, 0.005275856703519821, 0.2508196234703064, -0.04767867922782898, 0.04078686982393265, -0.04222361743450165, -0.03912081941962242, -0.10254502296447754, 0.07298549264669418, -0.010235097259283066, 0.0065025147050619125, -0.030251888558268547, 0.07172742486000061, -0.03672569617629051, -0.17529433965682983, 0.005973366089165211, -0.0007218027021735907, -0.06284800916910172, 0.010213839821517467, -0.0016617793589830399, 0.020449567586183548, 0.08398157358169556, -0.018460333347320557, -0.007547267712652683, 0.1372377723455429, 0.017459098249673843, -0.0962943509221077, -0.05973678082227707, 0.11635366827249527, 0.012644628994166851, 0.13755980134010315, 0.011262325569987297, 0.08271822333335876, 0.0872679203748703, 0.021278467029333115, -0.09457607567310333, 0.042057350277900696, -0.020297158509492874, -0.03205469623208046, 0.00786507222801447, 0.10867719352245331, -0.007795331068336964, 0.058261122554540634, 0.025402214378118515, -0.08650943636894226, 0.06078099459409714, 0.012447521090507507, -0.035378847271203995, -0.0795273631811142, 0.08325491845607758, -0.09127587080001831, 0.15668068826198578, 0.12491119652986526, -0.014349843375384808, -0.04502381384372711, -0.027511943131685257, 0.018877174705266953, 0.0025207283906638622, 0.05162965878844261, -0.02652554027736187, -0.13339674472808838, 0.018528036773204803, -0.08509944379329681, 0.026227358728647232, -0.24960869550704956, -0.08892431855201721, 0.0281064510345459, -0.017879342660307884, -0.019402991980314255, 0.05143285542726517, 0.04673692211508751, 0.026001231744885445, -0.036094844341278076, 0.023262646049261093, -0.03799369931221008, 0.05863197520375252, -0.11093153059482574, -0.09451770782470703 ]
null
null
transformers
# MultiBERTs Seed 4 Checkpoint 40k (uncased) Seed 4 intermediate checkpoint 40k MultiBERTs (pretrained BERT) model on the English language using a masked language modeling (MLM) objective. It was introduced in [this paper](https://arxiv.org/pdf/2106.16163.pdf) and first released in [this repository](https://github.com/google-research/language/tree/master/language/multiberts). This is an intermediate checkpoint. The final checkpoint can be found at [multiberts-seed-4](https://hf.co/multiberts-seed-4). This model is uncased: it does not make a difference between english and English. Disclaimer: The team releasing MultiBERTs did not write a model card for this model, so this model card has been written by [gchhablani](https://hf.co/gchhablani). ## Model description MultiBERTs models are transformer models pretrained on a large corpus of English data in a self-supervised fashion. This means they were pretrained on the raw texts only, with no humans labelling them in any way (which is why they can use lots of publicly available data), with an automatic process to generate inputs and labels from those texts. More precisely, they were pretrained with two objectives: - Masked language modeling (MLM): taking a sentence, the model randomly masks 15% of the words in the input, then runs the entire masked sentence through the model and has to predict the masked words. This is different from traditional recurrent neural networks (RNNs) that usually see the words one after the other, or from autoregressive models like GPT which internally mask the future tokens. It allows the model to learn a bidirectional representation of the sentence. - Next sentence prediction (NSP): the model concatenates two masked sentences as inputs during pretraining. Sometimes they correspond to sentences that were next to each other in the original text, sometimes not. The model then has to predict if the two sentences were following each other or not. This way, the model learns an inner representation of the English language that can then be used to extract features useful for downstream tasks: if you have a dataset of labeled sentences, for instance, you can train a standard classifier using the features produced by the MultiBERTs model as inputs. ## Intended uses & limitations You can use the raw model for either masked language modeling or next sentence prediction, but it's mostly intended to be fine-tuned on a downstream task. See the [model hub](https://huggingface.co/models?filter=multiberts) to look for fine-tuned versions on a task that interests you. Note that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked) to make decisions, such as sequence classification, token classification or question answering. For tasks such as text generation you should look at models like GPT2. ### How to use Here is how to use this model to get the features of a given text in PyTorch: ```python from transformers import BertTokenizer, BertModel tokenizer = BertTokenizer.from_pretrained('MultiBertGunjanPatrick/multiberts-seed-4-40k') model = BertModel.from_pretrained("MultiBertGunjanPatrick/multiberts-seed-4-40k") text = "Replace me by any text you'd like." encoded_input = tokenizer(text, return_tensors='pt') output = model(**encoded_input) ``` ### Limitations and bias Even if the training data used for this model could be characterized as fairly neutral, this model can have biased predictions. This bias will also affect all fine-tuned versions of this model.
For an understanding of bias of this particular checkpoint, please try out this checkpoint with the snippet present in the [Limitation and bias section](https://huggingface.co/bert-base-uncased#limitations-and-bias) of the [bert-base-uncased](https://huggingface.co/bert-base-uncased) checkpoint. ## Training data The MultiBERTs models were pretrained on [BookCorpus](https://yknzhu.wixsite.com/mbweb), a dataset consisting of 11,038 unpublished books and [English Wikipedia](https://en.wikipedia.org/wiki/English_Wikipedia) (excluding lists, tables and headers). ## Training procedure ### Preprocessing The texts are lowercased and tokenized using WordPiece and a vocabulary size of 30,000. The inputs of the model are then of the form: ``` [CLS] Sentence A [SEP] Sentence B [SEP] ``` With probability 0.5, sentence A and sentence B correspond to two consecutive sentences in the original corpus, and in the other cases, it's another random sentence in the corpus. Note that what is considered a sentence here is a consecutive span of text usually longer than a single sentence. The only constraint is that the result with the two "sentences" has a combined length of less than 512 tokens. The details of the masking procedure for each sentence are the following: - 15% of the tokens are masked. - In 80% of the cases, the masked tokens are replaced by `[MASK]`. - In 10% of the cases, the masked tokens are replaced by a random token (different from the one they replace). - In the 10% remaining cases, the masked tokens are left as is. ### Pretraining The full model was trained on 16 Cloud TPU v2 chips for two million steps with a batch size of 256. The sequence length was set to 512 throughout. The optimizer used is Adam with a learning rate of 1e-4, \\(\beta_{1} = 0.9\\) and \\(\beta_{2} = 0.999\\), a weight decay of 0.01, learning rate warmup for 10,000 steps and linear decay of the learning rate after. ### BibTeX entry and citation info ```bibtex @article{DBLP:journals/corr/abs-2106-16163, author = {Thibault Sellam and Steve Yadlowsky and Jason Wei and Naomi Saphra and Alexander D'Amour and Tal Linzen and Jasmijn Bastings and Iulia Turc and Jacob Eisenstein and Dipanjan Das and Ian Tenney and Ellie Pavlick}, title = {The MultiBERTs: {BERT} Reproductions for Robustness Analysis}, journal = {CoRR}, volume = {abs/2106.16163}, year = {2021}, url = {https://arxiv.org/abs/2106.16163}, eprinttype = {arXiv}, eprint = {2106.16163}, timestamp = {Mon, 05 Jul 2021 15:15:50 +0200}, biburl = {https://dblp.org/rec/journals/corr/abs-2106-16163.bib}, bibsource = {dblp computer science bibliography, https://dblp.org} } ``` <a href="https://huggingface.co/exbert/?model=multiberts"> <img width="300px" src="https://cdn-media.huggingface.co/exbert/button.png"> </a>
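The card above notes that the raw checkpoint can also be used directly for masked language modeling. The snippet below is an illustrative sketch of that use, not part of the original card: it assumes the checkpoint is loaded by its full Hub id (`MultiBertGunjanPatrick/multiberts-seed-4-40k`, as listed in this entry's metadata) and that the MLM head weights are present in the checkpoint; an intermediate checkpoint at 40k steps will typically give noisier predictions than the final one.

```python
import torch
from transformers import BertTokenizer, BertForMaskedLM

model_id = "MultiBertGunjanPatrick/multiberts-seed-4-40k"  # assumed full Hub id for this checkpoint
tokenizer = BertTokenizer.from_pretrained(model_id)
model = BertForMaskedLM.from_pretrained(model_id)

text = "The capital of France is [MASK]."
inputs = tokenizer(text, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

# Locate the [MASK] position and decode the highest-scoring prediction.
mask_positions = (inputs["input_ids"] == tokenizer.mask_token_id).nonzero(as_tuple=True)[1]
top_ids = logits[0, mask_positions].argmax(dim=-1)
print(tokenizer.decode(top_ids))
```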
{"language": "en", "license": "apache-2.0", "tags": ["exbert", "multiberts", "multiberts-seed-4"], "datasets": ["bookcorpus", "wikipedia"]}
null
MultiBertGunjanPatrick/multiberts-seed-4-40k
[ "transformers", "pytorch", "bert", "pretraining", "exbert", "multiberts", "multiberts-seed-4", "en", "dataset:bookcorpus", "dataset:wikipedia", "arxiv:2106.16163", "license:apache-2.0", "endpoints_compatible", "region:us" ]
2022-03-02T23:29:04+00:00
[ "2106.16163" ]
[ "en" ]
TAGS #transformers #pytorch #bert #pretraining #exbert #multiberts #multiberts-seed-4 #en #dataset-bookcorpus #dataset-wikipedia #arxiv-2106.16163 #license-apache-2.0 #endpoints_compatible #region-us
# MultiBERTs Seed 4 Checkpoint 40k (uncased) Seed 4 intermediate checkpoint 40k MultiBERTs (pretrained BERT) model on English language using a masked language modeling (MLM) objective. It was introduced in this paper and first released in this repository. This is an intermediate checkpoint. The final checkpoint can be found at multiberts-seed-4. This model is uncased: it does not make a difference between english and English. Disclaimer: The team releasing MultiBERTs did not write a model card for this model so this model card has been written by gchhablani. ## Model description MultiBERTs models are transformers model pretrained on a large corpus of English data in a self-supervised fashion. This means it was pretrained on the raw texts only, with no humans labelling them in any way (which is why it can use lots of publicly available data) with an automatic process to generate inputs and labels from those texts. More precisely, it was pretrained with two objectives: - Masked language modeling (MLM): taking a sentence, the model randomly masks 15% of the words in the input then run the entire masked sentence through the model and has to predict the masked words. This is different from traditional recurrent neural networks (RNNs) that usually see the words one after the other, or from autoregressive models like GPT which internally mask the future tokens. It allows the model to learn a bidirectional representation of the sentence. - Next sentence prediction (NSP): the models concatenates two masked sentences as inputs during pretraining. Sometimes they correspond to sentences that were next to each other in the original text, sometimes not. The model then has to predict if the two sentences were following each other or not. This way, the model learns an inner representation of the English language that can then be used to extract features useful for downstream tasks: if you have a dataset of labeled sentences for instance, you can train a standard classifier using the features produced by the MultiBERTs model as inputs. ## Intended uses & limitations You can use the raw model for either masked language modeling or next sentence prediction, but it's mostly intended to be fine-tuned on a downstream task. See the model hub to look for fine-tuned versions on a task that interests you. Note that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked) to make decisions, such as sequence classification, token classification or question answering. For tasks such as text generation you should look at model like GPT2. ### How to use Here is how to use this model to get the features of a given text in PyTorch: ### Limitations and bias Even if the training data used for this model could be characterized as fairly neutral, this model can have biased predictions. This bias will also affect all fine-tuned versions of this model. For an understanding of bias of this particular checkpoint, please try out this checkpoint with the snippet present in the Limitation and bias section of the bert-base-uncased checkpoint. ## Training data The MultiBERTs models were pretrained on BookCorpus, a dataset consisting of 11,038 unpublished books and English Wikipedia (excluding lists, tables and headers). ## Training procedure ### Preprocessing The texts are lowercased and tokenized using WordPiece and a vocabulary size of 30,000. 
The inputs of the model are then of the form: With probability 0.5, sentence A and sentence B correspond to two consecutive sentences in the original corpus and in the other cases, it's another random sentence in the corpus. Note that what is considered a sentence here is a consecutive span of text usually longer than a single sentence. The only constrain is that the result with the two "sentences" has a combined length of less than 512 tokens. The details of the masking procedure for each sentence are the following: - 15% of the tokens are masked. - In 80% of the cases, the masked tokens are replaced by '[MASK]'. - In 10% of the cases, the masked tokens are replaced by a random token (different) from the one they replace. - In the 10% remaining cases, the masked tokens are left as is. ### Pretraining The full model was trained on 16 Cloud TPU v2 chips for two million steps with a batch size of 256. The sequence length was set to 512 throughout. The optimizer used is Adam with a learning rate of 1e-4, \\(\beta_{1} = 0.9\\) and \\(\beta_{2} = 0.999\\), a weight decay of 0.01, learning rate warmup for 10,000 steps and linear decay of the learning rate after. ### BibTeX entry and citation info <a href="URL <img width="300px" src="URL </a>
[ "# MultiBERTs Seed 4 Checkpoint 40k (uncased)\nSeed 4 intermediate checkpoint 40k MultiBERTs (pretrained BERT) model on English language using a masked language modeling (MLM) objective. It was introduced in\nthis paper and first released in\nthis repository. This is an intermediate checkpoint.\nThe final checkpoint can be found at multiberts-seed-4. This model is uncased: it does not make a difference\nbetween english and English.\n\nDisclaimer: The team releasing MultiBERTs did not write a model card for this model so this model card has been written by gchhablani.", "## Model description\nMultiBERTs models are transformers model pretrained on a large corpus of English data in a self-supervised fashion. This means it\nwas pretrained on the raw texts only, with no humans labelling them in any way (which is why it can use lots of\npublicly available data) with an automatic process to generate inputs and labels from those texts. More precisely, it\nwas pretrained with two objectives:\n- Masked language modeling (MLM): taking a sentence, the model randomly masks 15% of the words in the input then run\n the entire masked sentence through the model and has to predict the masked words. This is different from traditional\n recurrent neural networks (RNNs) that usually see the words one after the other, or from autoregressive models like\n GPT which internally mask the future tokens. It allows the model to learn a bidirectional representation of the\n sentence.\n- Next sentence prediction (NSP): the models concatenates two masked sentences as inputs during pretraining. Sometimes\n they correspond to sentences that were next to each other in the original text, sometimes not. The model then has to\n predict if the two sentences were following each other or not.\nThis way, the model learns an inner representation of the English language that can then be used to extract features\nuseful for downstream tasks: if you have a dataset of labeled sentences for instance, you can train a standard\nclassifier using the features produced by the MultiBERTs model as inputs.", "## Intended uses & limitations\nYou can use the raw model for either masked language modeling or next sentence prediction, but it's mostly intended to\nbe fine-tuned on a downstream task. See the model hub to look for\nfine-tuned versions on a task that interests you.\nNote that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked)\nto make decisions, such as sequence classification, token classification or question answering. For tasks such as text\ngeneration you should look at model like GPT2.", "### How to use\nHere is how to use this model to get the features of a given text in PyTorch:", "### Limitations and bias\nEven if the training data used for this model could be characterized as fairly neutral, this model can have biased\npredictions. This bias will also affect all fine-tuned versions of this model. For an understanding of bias of this particular\ncheckpoint, please try out this checkpoint with the snippet present in the Limitation and bias section of the bert-base-uncased checkpoint.", "## Training data\nThe MultiBERTs models were pretrained on BookCorpus, a dataset consisting of 11,038\nunpublished books and English Wikipedia (excluding lists, tables and\nheaders).", "## Training procedure", "### Preprocessing\nThe texts are lowercased and tokenized using WordPiece and a vocabulary size of 30,000. 
The inputs of the model are\nthen of the form:\n\nWith probability 0.5, sentence A and sentence B correspond to two consecutive sentences in the original corpus and in\nthe other cases, it's another random sentence in the corpus. Note that what is considered a sentence here is a\nconsecutive span of text usually longer than a single sentence. The only constrain is that the result with the two\n\"sentences\" has a combined length of less than 512 tokens.\nThe details of the masking procedure for each sentence are the following:\n- 15% of the tokens are masked.\n- In 80% of the cases, the masked tokens are replaced by '[MASK]'.\n- In 10% of the cases, the masked tokens are replaced by a random token (different) from the one they replace.\n- In the 10% remaining cases, the masked tokens are left as is.", "### Pretraining\nThe full model was trained on 16 Cloud TPU v2 chips for two million steps with a batch size\nof 256. The sequence length was set to 512 throughout. The optimizer\nused is Adam with a learning rate of 1e-4, \\\\(\\beta_{1} = 0.9\\\\) and \\\\(\\beta_{2} = 0.999\\\\), a weight decay of 0.01,\nlearning rate warmup for 10,000 steps and linear decay of the learning rate after.", "### BibTeX entry and citation info\n\n<a href=\"URL\n\t<img width=\"300px\" src=\"URL\n</a>" ]
[ "TAGS\n#transformers #pytorch #bert #pretraining #exbert #multiberts #multiberts-seed-4 #en #dataset-bookcorpus #dataset-wikipedia #arxiv-2106.16163 #license-apache-2.0 #endpoints_compatible #region-us \n", "# MultiBERTs Seed 4 Checkpoint 40k (uncased)\nSeed 4 intermediate checkpoint 40k MultiBERTs (pretrained BERT) model on English language using a masked language modeling (MLM) objective. It was introduced in\nthis paper and first released in\nthis repository. This is an intermediate checkpoint.\nThe final checkpoint can be found at multiberts-seed-4. This model is uncased: it does not make a difference\nbetween english and English.\n\nDisclaimer: The team releasing MultiBERTs did not write a model card for this model so this model card has been written by gchhablani.", "## Model description\nMultiBERTs models are transformers model pretrained on a large corpus of English data in a self-supervised fashion. This means it\nwas pretrained on the raw texts only, with no humans labelling them in any way (which is why it can use lots of\npublicly available data) with an automatic process to generate inputs and labels from those texts. More precisely, it\nwas pretrained with two objectives:\n- Masked language modeling (MLM): taking a sentence, the model randomly masks 15% of the words in the input then run\n the entire masked sentence through the model and has to predict the masked words. This is different from traditional\n recurrent neural networks (RNNs) that usually see the words one after the other, or from autoregressive models like\n GPT which internally mask the future tokens. It allows the model to learn a bidirectional representation of the\n sentence.\n- Next sentence prediction (NSP): the models concatenates two masked sentences as inputs during pretraining. Sometimes\n they correspond to sentences that were next to each other in the original text, sometimes not. The model then has to\n predict if the two sentences were following each other or not.\nThis way, the model learns an inner representation of the English language that can then be used to extract features\nuseful for downstream tasks: if you have a dataset of labeled sentences for instance, you can train a standard\nclassifier using the features produced by the MultiBERTs model as inputs.", "## Intended uses & limitations\nYou can use the raw model for either masked language modeling or next sentence prediction, but it's mostly intended to\nbe fine-tuned on a downstream task. See the model hub to look for\nfine-tuned versions on a task that interests you.\nNote that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked)\nto make decisions, such as sequence classification, token classification or question answering. For tasks such as text\ngeneration you should look at model like GPT2.", "### How to use\nHere is how to use this model to get the features of a given text in PyTorch:", "### Limitations and bias\nEven if the training data used for this model could be characterized as fairly neutral, this model can have biased\npredictions. This bias will also affect all fine-tuned versions of this model. 
For an understanding of bias of this particular\ncheckpoint, please try out this checkpoint with the snippet present in the Limitation and bias section of the bert-base-uncased checkpoint.", "## Training data\nThe MultiBERTs models were pretrained on BookCorpus, a dataset consisting of 11,038\nunpublished books and English Wikipedia (excluding lists, tables and\nheaders).", "## Training procedure", "### Preprocessing\nThe texts are lowercased and tokenized using WordPiece and a vocabulary size of 30,000. The inputs of the model are\nthen of the form:\n\nWith probability 0.5, sentence A and sentence B correspond to two consecutive sentences in the original corpus and in\nthe other cases, it's another random sentence in the corpus. Note that what is considered a sentence here is a\nconsecutive span of text usually longer than a single sentence. The only constrain is that the result with the two\n\"sentences\" has a combined length of less than 512 tokens.\nThe details of the masking procedure for each sentence are the following:\n- 15% of the tokens are masked.\n- In 80% of the cases, the masked tokens are replaced by '[MASK]'.\n- In 10% of the cases, the masked tokens are replaced by a random token (different) from the one they replace.\n- In the 10% remaining cases, the masked tokens are left as is.", "### Pretraining\nThe full model was trained on 16 Cloud TPU v2 chips for two million steps with a batch size\nof 256. The sequence length was set to 512 throughout. The optimizer\nused is Adam with a learning rate of 1e-4, \\\\(\\beta_{1} = 0.9\\\\) and \\\\(\\beta_{2} = 0.999\\\\), a weight decay of 0.01,\nlearning rate warmup for 10,000 steps and linear decay of the learning rate after.", "### BibTeX entry and citation info\n\n<a href=\"URL\n\t<img width=\"300px\" src=\"URL\n</a>" ]
[ 71, 145, 335, 134, 25, 95, 48, 3, 222, 111, 34 ]
[ "passage: TAGS\n#transformers #pytorch #bert #pretraining #exbert #multiberts #multiberts-seed-4 #en #dataset-bookcorpus #dataset-wikipedia #arxiv-2106.16163 #license-apache-2.0 #endpoints_compatible #region-us \n# MultiBERTs Seed 4 Checkpoint 40k (uncased)\nSeed 4 intermediate checkpoint 40k MultiBERTs (pretrained BERT) model on English language using a masked language modeling (MLM) objective. It was introduced in\nthis paper and first released in\nthis repository. This is an intermediate checkpoint.\nThe final checkpoint can be found at multiberts-seed-4. This model is uncased: it does not make a difference\nbetween english and English.\n\nDisclaimer: The team releasing MultiBERTs did not write a model card for this model so this model card has been written by gchhablani.", "passage: ## Model description\nMultiBERTs models are transformers model pretrained on a large corpus of English data in a self-supervised fashion. This means it\nwas pretrained on the raw texts only, with no humans labelling them in any way (which is why it can use lots of\npublicly available data) with an automatic process to generate inputs and labels from those texts. More precisely, it\nwas pretrained with two objectives:\n- Masked language modeling (MLM): taking a sentence, the model randomly masks 15% of the words in the input then run\n the entire masked sentence through the model and has to predict the masked words. This is different from traditional\n recurrent neural networks (RNNs) that usually see the words one after the other, or from autoregressive models like\n GPT which internally mask the future tokens. It allows the model to learn a bidirectional representation of the\n sentence.\n- Next sentence prediction (NSP): the models concatenates two masked sentences as inputs during pretraining. Sometimes\n they correspond to sentences that were next to each other in the original text, sometimes not. The model then has to\n predict if the two sentences were following each other or not.\nThis way, the model learns an inner representation of the English language that can then be used to extract features\nuseful for downstream tasks: if you have a dataset of labeled sentences for instance, you can train a standard\nclassifier using the features produced by the MultiBERTs model as inputs.## Intended uses & limitations\nYou can use the raw model for either masked language modeling or next sentence prediction, but it's mostly intended to\nbe fine-tuned on a downstream task. See the model hub to look for\nfine-tuned versions on a task that interests you.\nNote that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked)\nto make decisions, such as sequence classification, token classification or question answering. For tasks such as text\ngeneration you should look at model like GPT2.### How to use\nHere is how to use this model to get the features of a given text in PyTorch:### Limitations and bias\nEven if the training data used for this model could be characterized as fairly neutral, this model can have biased\npredictions. This bias will also affect all fine-tuned versions of this model. 
For an understanding of bias of this particular\ncheckpoint, please try out this checkpoint with the snippet present in the Limitation and bias section of the bert-base-uncased checkpoint.## Training data\nThe MultiBERTs models were pretrained on BookCorpus, a dataset consisting of 11,038\nunpublished books and English Wikipedia (excluding lists, tables and\nheaders).## Training procedure" ]
[ -0.08608052134513855, 0.0016040928894653916, -0.0021604441571980715, 0.0689769983291626, 0.08567361533641815, 0.005027686711400747, 0.11849229037761688, 0.04927641525864601, -0.028305137529969215, 0.025412781164050102, 0.0911107212305069, 0.03005199134349823, 0.042534008622169495, 0.06278058886528015, 0.0962706208229065, -0.25913286209106445, 0.047641560435295105, -0.06197819858789444, 0.05564621090888977, 0.07485873997211456, 0.09797365963459015, -0.07319507002830505, 0.06412448734045029, 0.03560912236571312, -0.08444024622440338, -0.017054885625839233, -0.016536615788936615, -0.03430679813027382, 0.10135965049266815, 0.07050292938947678, 0.06119208782911301, 0.0014145653694868088, 0.05881902575492859, -0.08659103512763977, 0.01610853150486946, 0.044118523597717285, -0.0036101213190704584, 0.026237016543745995, -0.005988839082419872, 0.01758522354066372, 0.10680358111858368, 0.035693101584911346, 0.07561661303043365, 0.034450713545084, -0.09493879228830338, -0.12172386050224304, -0.08091284334659576, 0.09984886646270752, 0.05220358818769455, 0.04169108718633652, -0.007220235653221607, 0.07404810190200806, -0.03094612807035446, 0.07447104901075363, 0.10134536027908325, -0.2645636200904846, -0.007971223443746567, 0.06730135530233383, 0.047186873853206635, 0.046120911836624146, 0.009243656881153584, 0.0276747215539217, 0.006692677736282349, 0.044961098581552505, 0.02685851976275444, -0.024444997310638428, 0.12866652011871338, -0.04270268604159355, -0.15014788508415222, -0.04077039286494255, 0.12323629856109619, -0.006251584738492966, -0.12477099150419235, -0.10270277410745621, -0.030131051316857338, 0.11742925643920898, -0.0050101871602237225, -0.015917792916297913, -0.002389043103903532, 0.010618673637509346, 0.024067837744951248, -0.08984047174453735, -0.08589334785938263, -0.026031307876110077, -0.03505824878811836, 0.1241675317287445, 0.0476982519030571, 0.051408763974905014, -0.034649528563022614, 0.08587595820426941, -0.11887749284505844, -0.039719026535749435, -0.050447650253772736, -0.08173976093530655, -0.01643911376595497, 0.010601461865007877, -0.028899598866701126, -0.0796375572681427, -0.05809764564037323, 0.1191522628068924, 0.035092543810606, 0.03172474354505539, -0.0013758987188339233, 0.04027876257896423, 0.07257023453712463, 0.09586426615715027, -0.03969796746969223, 0.04466434568166733, 0.034323256462812424, -0.02347976714372635, 0.05775517225265503, -0.050997525453567505, -0.10210005939006805, 0.07619474828243256, -0.0016413507983088493, 0.038263387978076935, 0.023800084367394447, 0.03665589168667793, -0.009072445333003998, -0.07131633162498474, 0.16800007224082947, -0.07853050529956818, -0.008454576134681702, -0.015447762794792652, 0.011205360293388367, 0.04693490266799927, 0.03150097280740738, -0.0075223492458462715, -0.04829704761505127, -0.009794658981263638, -0.0561644621193409, -0.02834322303533554, -0.055045124143362045, -0.1203145980834961, -0.0019812919199466705, -0.037323959171772, -0.0312257781624794, -0.13941994309425354, -0.21401247382164001, -0.021184490993618965, 0.06210518255829811, -0.002397677395492792, -0.009413645602762699, 0.025322463363409042, 0.016468403860926628, -0.020527923479676247, 0.011413440108299255, -0.0465296246111393, -0.0012185676023364067, -0.007013432681560516, -0.029883461073040962, 0.05831509828567505, -0.04031127318739891, 0.02315536141395569, -0.06740108877420425, 0.021218841895461082, -0.21261557936668396, 0.08808204531669617, -0.03223848715424538, 0.0031844694167375565, -0.03694599121809006, -0.04694732278585434, 
0.015155211091041565, 0.047401152551174164, -0.012315943837165833, 0.11638855934143066, -0.13567855954170227, -0.052210092544555664, 0.1817702054977417, -0.15748369693756104, -0.00018056854605674744, 0.10087321698665619, -0.04863956943154335, 0.05368560925126076, 0.1352679580450058, 0.1002928614616394, 0.08401155471801758, -0.07294336706399918, 0.010648762807250023, 0.060936734080314636, -0.06670054793357849, 0.05652588605880737, 0.08881105482578278, -0.02406269498169422, -0.13777415454387665, 0.03021327406167984, -0.07280735671520233, -0.010646707378327847, -0.0245790034532547, -0.02033381164073944, 0.007896877825260162, -0.03669900447130203, 0.030792098492383957, 0.006593937985599041, 0.018280360847711563, -0.04184306412935257, -0.08354409039020538, 0.025040946900844574, 0.0757162794470787, -0.07264548540115356, 0.0398765429854393, -0.0725373700261116, 0.06314355134963989, -0.07650395482778549, -0.004056628327816725, -0.16574744880199432, -0.026941146701574326, 0.04459376633167267, -0.04629313573241234, 0.05027211457490921, 0.09582597017288208, 0.005204574204981327, 0.12498034536838531, -0.04034467041492462, 0.0011605601757764816, -0.004312040284276009, -0.011094349436461926, -0.052097029983997345, -0.1238265261054039, -0.08313465118408203, -0.06714113801717758, 0.10074957460165024, -0.07959668338298798, 0.028890881687402725, -0.06726241111755371, -0.02102428674697876, -0.009366000071167946, -0.05668005347251892, -0.00420304574072361, 0.010154382325708866, -0.02700198069214821, -0.04602030664682388, 0.04910116642713547, 0.0497075617313385, -0.05928029119968414, 0.07487694919109344, -0.1069851890206337, -0.05833077058196068, 0.05651017278432846, 0.01306411437690258, -0.0794081836938858, 0.09049750864505768, -0.01969713158905506, -0.01402878388762474, -0.05337457358837128, -0.04383388161659241, 0.1961532086133957, -0.026349004358053207, 0.10254737734794617, -0.08925881236791611, 0.0008640146697871387, 0.028115613386034966, -0.044382836669683456, -0.014332020655274391, 0.058782268315553665, 0.04274362698197365, -0.1811775416135788, 0.014634259045124054, 0.05130402743816376, 0.0789003074169159, 0.11260971426963806, 0.025831280276179314, -0.023737335577607155, -0.046668149530887604, -0.012596075423061848, 0.0044380975887179375, 0.057451099157333374, -0.026092875748872757, -0.01050047017633915, 0.03273351490497589, 0.05565810203552246, 0.018020084127783775, -0.08070334047079086, 0.036599963903427124, 0.06475043296813965, -0.01546463929116726, -0.045004598796367645, -0.024894975125789642, -0.060076236724853516, 0.061752140522003174, 0.05339670181274414, 0.03574332222342491, 0.02635972388088703, -0.01565818302333355, -0.13506504893302917, 0.18776176869869232, -0.11538718640804291, -0.2554384768009186, -0.10944207012653351, -0.06122872978448868, -0.02573186345398426, 0.04088427871465683, 0.05743441730737686, -0.03452476114034653, -0.04370729252696037, -0.113043412566185, 0.06141124293208122, -0.06513752788305283, -0.029862865805625916, -0.01306094042956829, -0.05291282385587692, -0.018307995051145554, -0.12619656324386597, -0.011253872886300087, -0.029568636789917946, -0.07520570605993271, 0.008481011725962162, -0.033623307943344116, 0.028159189969301224, 0.13484497368335724, 0.03693559020757675, -0.019659999758005142, -0.01711464859545231, 0.19506123661994934, 0.008057577535510063, 0.06208591163158417, 0.11327850818634033, -0.028077226132154465, 0.05317322909832001, 0.045320652425289154, 0.026059366762638092, -0.04854559525847435, 0.009360160678625107, -0.016435513272881508, 
-0.11799022555351257, -0.1758701354265213, -0.07033392786979675, -0.0033607524819672108, 0.00657610222697258, 0.020265433937311172, 0.03522830829024315, 0.026284389197826385, 0.040057044476270676, -0.02986910007894039, 0.03462132066488266, -0.012709204107522964, 0.08025571703910828, 0.03140263259410858, -0.0755014419555664, 0.0932319313287735, -0.06227262690663338, 0.015726130455732346, 0.10959193855524063, -0.05809031054377556, 0.18795892596244812, 0.026110585778951645, 0.06401442736387253, 0.10013172775506973, 0.02373863011598587, 0.05607059225440025, 0.08803004771471024, -0.04648652672767639, 0.0054336972534656525, -0.06013263761997223, -0.05228534713387489, -0.03307300806045532, 0.050324179232120514, 0.030895017087459564, 0.015664134174585342, -0.12049560993909836, 0.015262354165315628, -0.0021146456710994244, 0.13650116324424744, 0.04816994071006775, -0.12458249926567078, -0.12481629103422165, 0.03501829504966736, -0.044390350580215454, -0.06306874752044678, 0.030758436769247055, 0.059721872210502625, -0.1532314419746399, 0.047405704855918884, -0.00668282900005579, 0.06549721211194992, -0.09524235129356384, 0.017589891329407692, -0.042719751596450806, 0.0009031929075717926, 0.00508350832387805, 0.07027953863143921, -0.1319558024406433, 0.10034141689538956, 0.022772785276174545, 0.04874394088983536, -0.08068715035915375, 0.01661461777985096, -0.009287453256547451, 0.11339806765317917, 0.11538724601268768, 0.042331792414188385, -0.056926194578409195, -0.0185280479490757, -0.045285895466804504, 0.022520508617162704, 0.05954491347074509, -0.0778689756989479, 0.06304527819156647, 0.008637746796011925, 0.005653976462781429, -0.023793688043951988, 0.009542737156152725, -0.13115963339805603, -0.12257131189107895, 0.06239220127463341, -0.07595832645893097, -0.0968048945069313, -0.05664970725774765, -0.06332971155643463, -0.053740523755550385, 0.2131805419921875, -0.1165011078119278, -0.09106254577636719, -0.09901601821184158, -0.01548488438129425, 0.045927613973617554, -0.0648447573184967, 0.045542068779468536, -0.03581735119223595, 0.09412802010774612, -0.04422617703676224, -0.11051514744758606, 0.03556713089346886, -0.11344953626394272, -0.11474952101707458, -0.043680764734745026, 0.10832370072603226, 0.11329854279756546, 0.03770392760634422, 0.012473136186599731, 0.012571923434734344, -0.0037708375602960587, -0.11644388735294342, 0.017810722813010216, 0.12854187190532684, 0.003919478505849838, 0.07026715576648712, -0.06146387755870819, 0.029877223074436188, -0.014636103063821793, 0.0018873680382966995, 0.1300000548362732, 0.18135541677474976, -0.06360755860805511, 0.17405614256858826, 0.2001674324274063, -0.10433938354253769, -0.18996992707252502, -0.054418742656707764, -0.0034703416749835014, 0.04678048938512802, 0.0473899282515049, -0.18902111053466797, 0.09071145951747894, 0.031123606488108635, -0.030149899423122406, 0.02539684996008873, -0.23008307814598083, -0.11011485755443573, 0.08876544237136841, 0.054512910544872284, 0.18844500184059143, -0.0822678729891777, -0.041294485330581665, -0.01710718497633934, -0.037811361253261566, 0.040115099400281906, -0.045200224965810776, 0.08992713689804077, 0.006360244005918503, -0.02509389817714691, 0.0018595587462186813, -0.030650850385427475, 0.0973348617553711, 0.042967312037944794, 0.022992024198174477, -0.07084456086158752, -0.004997622221708298, 0.10937650501728058, -0.03838459774851799, 0.0966162383556366, 0.04004374518990517, 0.07326651364564896, -0.09986632317304611, -0.05880707874894142, -0.0759136974811554, 0.04258585721254349, 
-0.041787393391132355, -0.053151097148656845, -0.06303803622722626, 0.05663931742310524, 0.03588973358273506, 0.011023269034922123, -0.0006021745502948761, -0.03736889362335205, 0.04602440446615219, 0.08756556361913681, 0.08322621881961823, -0.029807955026626587, -0.07915478944778442, -0.050640497356653214, -0.048304516822099686, 0.06805653870105743, -0.09495459496974945, 0.017518825829029083, 0.02833610214293003, 0.009825028479099274, 0.0907607227563858, 0.033558864146471024, -0.13626396656036377, 0.01117366086691618, 0.03419037163257599, -0.12476072460412979, -0.11084001511335373, -0.018858354538679123, 0.021635958924889565, -0.03550277277827263, 0.0566667877137661, 0.14808732271194458, -0.03465321660041809, -0.03163886070251465, -0.048338375985622406, 0.037095703184604645, -0.020679425448179245, 0.05058766156435013, 0.06553329527378082, 0.030975153669714928, -0.07522326707839966, 0.07309640944004059, 0.03919392451643944, -0.03670164942741394, 0.041431933641433716, 0.042818889021873474, -0.09706783294677734, -0.08009730279445648, -0.060643214732408524, 0.0947008728981018, -0.01948881335556507, -0.04952263459563255, -0.0032862741500139236, -0.0822732001543045, 0.06738736480474472, 0.06826965510845184, 0.046655647456645966, 0.03615595027804375, -0.08704493939876556, 0.01534250658005476, -0.05503236502408981, 0.034722939133644104, -0.0331214964389801, -0.004803229123353958, -0.056729160249233246, 0.06847900152206421, 0.06320635229349136, 0.09817575663328171, -0.03438758850097656, -0.07469120621681213, -0.08358415216207504, -0.014368134550750256, -0.06576716899871826, -0.032290682196617126, -0.07548029720783234, -0.005489458329975605, 0.000015387777239084244, -0.0010533221065998077, 0.022867469117045403, 0.03648185729980469, -0.0437823086977005, -0.018305223435163498, -0.032629914581775665, 0.039419014006853104, -0.06108604371547699, 0.007637098431587219, 0.01392461359500885, -0.037440042942762375, 0.09422285109758377, 0.03712370991706848, -0.01149383932352066, 0.0476863794028759, -0.01569511368870735, 0.034377310425043106, -0.020317522808909416, -0.000597749836742878, -0.02303236350417137, -0.10842043161392212, -0.0055937133729457855, 0.0025550127029418945, -0.026181451976299286, 0.010555406101047993, 0.058642759919166565, -0.07182052731513977, 0.08781910687685013, 0.04691849276423454, -0.028547413647174835, -0.07278981059789658, 0.041343290358781815, -0.017149509862065315, 0.027645200490951538, 0.06822355091571808, -0.03489725664258003, 0.053191252052783966, -0.09942643344402313, -0.029440987855196, 0.0028966418467462063, -0.007350325584411621, -0.009337181225419044, -0.054351918399333954, -0.0017126351594924927, 0.004725507460534573, 0.1762935072183609, -0.024417690932750702, 0.036507926881313324, 0.012174809351563454, 0.010016430169343948, 0.046233437955379486, -0.012611865997314453, 0.07189556956291199, -0.006815996021032333, -0.025965560227632523, -0.014476604759693146, 0.03918471187353134, 0.002717355266213417, 0.0038447361439466476, 0.13511522114276886, 0.043501321226358414, 0.09378331899642944, 0.07545867562294006, 0.01696539670228958, 0.014476068317890167, -0.13579091429710388, -0.08486174792051315, 0.0072116609662771225, 0.06155203655362129, -0.01865313947200775, 0.013974074274301529, 0.09147003293037415, -0.08754481375217438, 0.0711631029844284, 0.04699109494686127, -0.04865781217813492, -0.12332155555486679, -0.18840987980365753, -0.0244077630341053, -0.03155665844678879, -0.012049207463860512, -0.0909210741519928, 0.015308776870369911, 0.0909813940525055, 0.024369800463318825, 
-0.011946041136980057, 0.09579404443502426, -0.10480502992868423, -0.03077244758605957, 0.04403766989707947, -0.02666488103568554, 0.013543354347348213, 0.050197213888168335, 0.02506835013628006, -0.005061600357294083, 0.04024577885866165, 0.03930310904979706, 0.04200921952724457, 0.02736486680805683, 0.053268782794475555, -0.023883365094661713, -0.07417580485343933, -0.03285491093993187, -0.0009189401753246784, 0.054092101752758026, 0.1339501291513443, 0.02237698808312416, -0.070827417075634, 0.0057915374636650085, 0.10525301098823547, -0.03177514299750328, -0.045670345425605774, -0.10891112685203552, 0.2395002245903015, 0.019871478900313377, 0.00031161471270024776, -0.00474177859723568, -0.04637816548347473, 0.007121218368411064, 0.21250700950622559, 0.22715945541858673, 0.0039044576697051525, -0.010387731716036797, 0.010818147100508213, -0.01163535937666893, 0.0376257449388504, 0.14563214778900146, 0.004620915278792381, 0.2513873875141144, -0.047136351466178894, 0.03766748309135437, -0.041663024574518204, -0.03995705395936966, -0.10063685476779938, 0.07370798289775848, -0.009130066260695457, 0.006252732127904892, -0.030083222314715385, 0.07193329930305481, -0.03563834726810455, -0.176083505153656, 0.005653667263686657, -0.000624743988737464, -0.06268122792243958, 0.010444842278957367, 0.00016695261001586914, 0.021325621753931046, 0.0837232694029808, -0.017939820885658264, -0.008841615170240402, 0.1384747177362442, 0.017402729019522667, -0.09814126044511795, -0.05750592052936554, 0.11565172672271729, 0.014686668291687965, 0.13863080739974976, 0.010548810474574566, 0.08183738589286804, 0.0877213180065155, 0.020893456414341927, -0.09437872469425201, 0.040967173874378204, -0.019263533875346184, -0.03223675861954689, 0.006850650068372488, 0.10947257280349731, -0.007884258404374123, 0.05702633038163185, 0.026454370468854904, -0.08746550977230072, 0.06137772276997566, 0.011573486030101776, -0.03693418949842453, -0.08008653670549393, 0.08406398445367813, -0.09109038859605789, 0.15616323053836823, 0.12480390071868896, -0.014826091937720776, -0.045937471091747284, -0.02796894870698452, 0.019560813903808594, 0.001888297963887453, 0.053249672055244446, -0.0262147169560194, -0.13221848011016846, 0.01958894543349743, -0.08417214453220367, 0.028125515207648277, -0.2503982484340668, -0.08812908828258514, 0.02976115420460701, -0.017294179648160934, -0.020324349403381348, 0.05114201828837395, 0.0458756685256958, 0.025702496990561485, -0.03608638048171997, 0.023031022399663925, -0.03769385814666748, 0.05811132863163948, -0.11048159748315811, -0.09502490609884262 ]
null
null
transformers
# MultiBERTs Seed 4 Checkpoint 500k (uncased) Seed 4 intermediate checkpoint 500k MultiBERTs (pretrained BERT) model on the English language using a masked language modeling (MLM) objective. It was introduced in [this paper](https://arxiv.org/pdf/2106.16163.pdf) and first released in [this repository](https://github.com/google-research/language/tree/master/language/multiberts). This is an intermediate checkpoint. The final checkpoint can be found at [multiberts-seed-4](https://hf.co/multiberts-seed-4). This model is uncased: it does not make a difference between english and English. Disclaimer: The team releasing MultiBERTs did not write a model card for this model, so this model card has been written by [gchhablani](https://hf.co/gchhablani). ## Model description MultiBERTs models are transformer models pretrained on a large corpus of English data in a self-supervised fashion. This means they were pretrained on the raw texts only, with no humans labelling them in any way (which is why they can use lots of publicly available data), with an automatic process to generate inputs and labels from those texts. More precisely, they were pretrained with two objectives: - Masked language modeling (MLM): taking a sentence, the model randomly masks 15% of the words in the input, then runs the entire masked sentence through the model and has to predict the masked words. This is different from traditional recurrent neural networks (RNNs) that usually see the words one after the other, or from autoregressive models like GPT which internally mask the future tokens. It allows the model to learn a bidirectional representation of the sentence. - Next sentence prediction (NSP): the model concatenates two masked sentences as inputs during pretraining. Sometimes they correspond to sentences that were next to each other in the original text, sometimes not. The model then has to predict if the two sentences were following each other or not. This way, the model learns an inner representation of the English language that can then be used to extract features useful for downstream tasks: if you have a dataset of labeled sentences, for instance, you can train a standard classifier using the features produced by the MultiBERTs model as inputs. ## Intended uses & limitations You can use the raw model for either masked language modeling or next sentence prediction, but it's mostly intended to be fine-tuned on a downstream task. See the [model hub](https://huggingface.co/models?filter=multiberts) to look for fine-tuned versions on a task that interests you. Note that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked) to make decisions, such as sequence classification, token classification or question answering. For tasks such as text generation you should look at models like GPT2. ### How to use Here is how to use this model to get the features of a given text in PyTorch: ```python from transformers import BertTokenizer, BertModel tokenizer = BertTokenizer.from_pretrained('MultiBertGunjanPatrick/multiberts-seed-4-500k') model = BertModel.from_pretrained("MultiBertGunjanPatrick/multiberts-seed-4-500k") text = "Replace me by any text you'd like." encoded_input = tokenizer(text, return_tensors='pt') output = model(**encoded_input) ``` ### Limitations and bias Even if the training data used for this model could be characterized as fairly neutral, this model can have biased predictions. This bias will also affect all fine-tuned versions of this model.
For an understanding of bias of this particular checkpoint, please try out this checkpoint with the snippet present in the [Limitation and bias section](https://huggingface.co/bert-base-uncased#limitations-and-bias) of the [bert-base-uncased](https://huggingface.co/bert-base-uncased) checkpoint. ## Training data The MultiBERTs models were pretrained on [BookCorpus](https://yknzhu.wixsite.com/mbweb), a dataset consisting of 11,038 unpublished books and [English Wikipedia](https://en.wikipedia.org/wiki/English_Wikipedia) (excluding lists, tables and headers). ## Training procedure ### Preprocessing The texts are lowercased and tokenized using WordPiece and a vocabulary size of 30,000. The inputs of the model are then of the form: ``` [CLS] Sentence A [SEP] Sentence B [SEP] ``` With probability 0.5, sentence A and sentence B correspond to two consecutive sentences in the original corpus, and in the other cases, it's another random sentence in the corpus. Note that what is considered a sentence here is a consecutive span of text usually longer than a single sentence. The only constraint is that the result with the two "sentences" has a combined length of less than 512 tokens. The details of the masking procedure for each sentence are the following: - 15% of the tokens are masked. - In 80% of the cases, the masked tokens are replaced by `[MASK]`. - In 10% of the cases, the masked tokens are replaced by a random token (different from the one they replace). - In the 10% remaining cases, the masked tokens are left as is. ### Pretraining The full model was trained on 16 Cloud TPU v2 chips for two million steps with a batch size of 256. The sequence length was set to 512 throughout. The optimizer used is Adam with a learning rate of 1e-4, \\(\beta_{1} = 0.9\\) and \\(\beta_{2} = 0.999\\), a weight decay of 0.01, learning rate warmup for 10,000 steps and linear decay of the learning rate after. ### BibTeX entry and citation info ```bibtex @article{DBLP:journals/corr/abs-2106-16163, author = {Thibault Sellam and Steve Yadlowsky and Jason Wei and Naomi Saphra and Alexander D'Amour and Tal Linzen and Jasmijn Bastings and Iulia Turc and Jacob Eisenstein and Dipanjan Das and Ian Tenney and Ellie Pavlick}, title = {The MultiBERTs: {BERT} Reproductions for Robustness Analysis}, journal = {CoRR}, volume = {abs/2106.16163}, year = {2021}, url = {https://arxiv.org/abs/2106.16163}, eprinttype = {arXiv}, eprint = {2106.16163}, timestamp = {Mon, 05 Jul 2021 15:15:50 +0200}, biburl = {https://dblp.org/rec/journals/corr/abs-2106-16163.bib}, bibsource = {dblp computer science bibliography, https://dblp.org} } ``` <a href="https://huggingface.co/exbert/?model=multiberts"> <img width="300px" src="https://cdn-media.huggingface.co/exbert/button.png"> </a>
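The preprocessing section above describes the masking rule in prose (15% of tokens selected; of those, 80% become `[MASK]`, 10% become a random token, 10% are left unchanged). The sketch below is a simplified, illustrative re-implementation of that rule, not the original preprocessing code: it ignores special-token handling and uses `-100` as the conventional ignore index for the MLM loss.

```python
import random

def mask_tokens(token_ids, mask_token_id, vocab_size, mlm_probability=0.15):
    """Apply the BERT-style 80/10/10 masking rule to a list of token ids."""
    input_ids = list(token_ids)
    labels = [-100] * len(input_ids)  # positions not selected are ignored by the loss
    for i, token_id in enumerate(token_ids):
        if random.random() < mlm_probability:
            labels[i] = token_id              # the model must predict the original token here
            roll = random.random()
            if roll < 0.8:
                input_ids[i] = mask_token_id  # 80%: replace with [MASK]
            elif roll < 0.9:
                input_ids[i] = random.randrange(vocab_size)  # 10%: replace with a random token
            # remaining 10%: keep the original token unchanged
    return input_ids, labels
```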
{"language": "en", "license": "apache-2.0", "tags": ["exbert", "multiberts", "multiberts-seed-4"], "datasets": ["bookcorpus", "wikipedia"]}
null
MultiBertGunjanPatrick/multiberts-seed-4-500k
[ "transformers", "pytorch", "bert", "pretraining", "exbert", "multiberts", "multiberts-seed-4", "en", "dataset:bookcorpus", "dataset:wikipedia", "arxiv:2106.16163", "license:apache-2.0", "endpoints_compatible", "region:us" ]
2022-03-02T23:29:04+00:00
[ "2106.16163" ]
[ "en" ]
TAGS #transformers #pytorch #bert #pretraining #exbert #multiberts #multiberts-seed-4 #en #dataset-bookcorpus #dataset-wikipedia #arxiv-2106.16163 #license-apache-2.0 #endpoints_compatible #region-us
# MultiBERTs Seed 4 Checkpoint 500k (uncased) Seed 4 intermediate checkpoint 500k MultiBERTs (pretrained BERT) model on English language using a masked language modeling (MLM) objective. It was introduced in this paper and first released in this repository. This is an intermediate checkpoint. The final checkpoint can be found at multiberts-seed-4. This model is uncased: it does not make a difference between english and English. Disclaimer: The team releasing MultiBERTs did not write a model card for this model so this model card has been written by gchhablani. ## Model description MultiBERTs models are transformers model pretrained on a large corpus of English data in a self-supervised fashion. This means it was pretrained on the raw texts only, with no humans labelling them in any way (which is why it can use lots of publicly available data) with an automatic process to generate inputs and labels from those texts. More precisely, it was pretrained with two objectives: - Masked language modeling (MLM): taking a sentence, the model randomly masks 15% of the words in the input then run the entire masked sentence through the model and has to predict the masked words. This is different from traditional recurrent neural networks (RNNs) that usually see the words one after the other, or from autoregressive models like GPT which internally mask the future tokens. It allows the model to learn a bidirectional representation of the sentence. - Next sentence prediction (NSP): the models concatenates two masked sentences as inputs during pretraining. Sometimes they correspond to sentences that were next to each other in the original text, sometimes not. The model then has to predict if the two sentences were following each other or not. This way, the model learns an inner representation of the English language that can then be used to extract features useful for downstream tasks: if you have a dataset of labeled sentences for instance, you can train a standard classifier using the features produced by the MultiBERTs model as inputs. ## Intended uses & limitations You can use the raw model for either masked language modeling or next sentence prediction, but it's mostly intended to be fine-tuned on a downstream task. See the model hub to look for fine-tuned versions on a task that interests you. Note that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked) to make decisions, such as sequence classification, token classification or question answering. For tasks such as text generation you should look at model like GPT2. ### How to use Here is how to use this model to get the features of a given text in PyTorch: ### Limitations and bias Even if the training data used for this model could be characterized as fairly neutral, this model can have biased predictions. This bias will also affect all fine-tuned versions of this model. For an understanding of bias of this particular checkpoint, please try out this checkpoint with the snippet present in the Limitation and bias section of the bert-base-uncased checkpoint. ## Training data The MultiBERTs models were pretrained on BookCorpus, a dataset consisting of 11,038 unpublished books and English Wikipedia (excluding lists, tables and headers). ## Training procedure ### Preprocessing The texts are lowercased and tokenized using WordPiece and a vocabulary size of 30,000. 
The inputs of the model are then of the form: With probability 0.5, sentence A and sentence B correspond to two consecutive sentences in the original corpus and in the other cases, it's another random sentence in the corpus. Note that what is considered a sentence here is a consecutive span of text usually longer than a single sentence. The only constrain is that the result with the two "sentences" has a combined length of less than 512 tokens. The details of the masking procedure for each sentence are the following: - 15% of the tokens are masked. - In 80% of the cases, the masked tokens are replaced by '[MASK]'. - In 10% of the cases, the masked tokens are replaced by a random token (different) from the one they replace. - In the 10% remaining cases, the masked tokens are left as is. ### Pretraining The full model was trained on 16 Cloud TPU v2 chips for two million steps with a batch size of 256. The sequence length was set to 512 throughout. The optimizer used is Adam with a learning rate of 1e-4, \\(\beta_{1} = 0.9\\) and \\(\beta_{2} = 0.999\\), a weight decay of 0.01, learning rate warmup for 10,000 steps and linear decay of the learning rate after. ### BibTeX entry and citation info <a href="URL <img width="300px" src="URL </a>
[ "# MultiBERTs Seed 4 Checkpoint 500k (uncased)\nSeed 4 intermediate checkpoint 500k MultiBERTs (pretrained BERT) model on English language using a masked language modeling (MLM) objective. It was introduced in\nthis paper and first released in\nthis repository. This is an intermediate checkpoint.\nThe final checkpoint can be found at multiberts-seed-4. This model is uncased: it does not make a difference\nbetween english and English.\n\nDisclaimer: The team releasing MultiBERTs did not write a model card for this model so this model card has been written by gchhablani.", "## Model description\nMultiBERTs models are transformers model pretrained on a large corpus of English data in a self-supervised fashion. This means it\nwas pretrained on the raw texts only, with no humans labelling them in any way (which is why it can use lots of\npublicly available data) with an automatic process to generate inputs and labels from those texts. More precisely, it\nwas pretrained with two objectives:\n- Masked language modeling (MLM): taking a sentence, the model randomly masks 15% of the words in the input then run\n the entire masked sentence through the model and has to predict the masked words. This is different from traditional\n recurrent neural networks (RNNs) that usually see the words one after the other, or from autoregressive models like\n GPT which internally mask the future tokens. It allows the model to learn a bidirectional representation of the\n sentence.\n- Next sentence prediction (NSP): the models concatenates two masked sentences as inputs during pretraining. Sometimes\n they correspond to sentences that were next to each other in the original text, sometimes not. The model then has to\n predict if the two sentences were following each other or not.\nThis way, the model learns an inner representation of the English language that can then be used to extract features\nuseful for downstream tasks: if you have a dataset of labeled sentences for instance, you can train a standard\nclassifier using the features produced by the MultiBERTs model as inputs.", "## Intended uses & limitations\nYou can use the raw model for either masked language modeling or next sentence prediction, but it's mostly intended to\nbe fine-tuned on a downstream task. See the model hub to look for\nfine-tuned versions on a task that interests you.\nNote that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked)\nto make decisions, such as sequence classification, token classification or question answering. For tasks such as text\ngeneration you should look at model like GPT2.", "### How to use\nHere is how to use this model to get the features of a given text in PyTorch:", "### Limitations and bias\nEven if the training data used for this model could be characterized as fairly neutral, this model can have biased\npredictions. This bias will also affect all fine-tuned versions of this model. For an understanding of bias of this particular\ncheckpoint, please try out this checkpoint with the snippet present in the Limitation and bias section of the bert-base-uncased checkpoint.", "## Training data\nThe MultiBERTs models were pretrained on BookCorpus, a dataset consisting of 11,038\nunpublished books and English Wikipedia (excluding lists, tables and\nheaders).", "## Training procedure", "### Preprocessing\nThe texts are lowercased and tokenized using WordPiece and a vocabulary size of 30,000. 
The inputs of the model are\nthen of the form:\n\nWith probability 0.5, sentence A and sentence B correspond to two consecutive sentences in the original corpus and in\nthe other cases, it's another random sentence in the corpus. Note that what is considered a sentence here is a\nconsecutive span of text usually longer than a single sentence. The only constrain is that the result with the two\n\"sentences\" has a combined length of less than 512 tokens.\nThe details of the masking procedure for each sentence are the following:\n- 15% of the tokens are masked.\n- In 80% of the cases, the masked tokens are replaced by '[MASK]'.\n- In 10% of the cases, the masked tokens are replaced by a random token (different) from the one they replace.\n- In the 10% remaining cases, the masked tokens are left as is.", "### Pretraining\nThe full model was trained on 16 Cloud TPU v2 chips for two million steps with a batch size\nof 256. The sequence length was set to 512 throughout. The optimizer\nused is Adam with a learning rate of 1e-4, \\\\(\\beta_{1} = 0.9\\\\) and \\\\(\\beta_{2} = 0.999\\\\), a weight decay of 0.01,\nlearning rate warmup for 10,000 steps and linear decay of the learning rate after.", "### BibTeX entry and citation info\n\n<a href=\"URL\n\t<img width=\"300px\" src=\"URL\n</a>" ]
[ "TAGS\n#transformers #pytorch #bert #pretraining #exbert #multiberts #multiberts-seed-4 #en #dataset-bookcorpus #dataset-wikipedia #arxiv-2106.16163 #license-apache-2.0 #endpoints_compatible #region-us \n", "# MultiBERTs Seed 4 Checkpoint 500k (uncased)\nSeed 4 intermediate checkpoint 500k MultiBERTs (pretrained BERT) model on English language using a masked language modeling (MLM) objective. It was introduced in\nthis paper and first released in\nthis repository. This is an intermediate checkpoint.\nThe final checkpoint can be found at multiberts-seed-4. This model is uncased: it does not make a difference\nbetween english and English.\n\nDisclaimer: The team releasing MultiBERTs did not write a model card for this model so this model card has been written by gchhablani.", "## Model description\nMultiBERTs models are transformers model pretrained on a large corpus of English data in a self-supervised fashion. This means it\nwas pretrained on the raw texts only, with no humans labelling them in any way (which is why it can use lots of\npublicly available data) with an automatic process to generate inputs and labels from those texts. More precisely, it\nwas pretrained with two objectives:\n- Masked language modeling (MLM): taking a sentence, the model randomly masks 15% of the words in the input then run\n the entire masked sentence through the model and has to predict the masked words. This is different from traditional\n recurrent neural networks (RNNs) that usually see the words one after the other, or from autoregressive models like\n GPT which internally mask the future tokens. It allows the model to learn a bidirectional representation of the\n sentence.\n- Next sentence prediction (NSP): the models concatenates two masked sentences as inputs during pretraining. Sometimes\n they correspond to sentences that were next to each other in the original text, sometimes not. The model then has to\n predict if the two sentences were following each other or not.\nThis way, the model learns an inner representation of the English language that can then be used to extract features\nuseful for downstream tasks: if you have a dataset of labeled sentences for instance, you can train a standard\nclassifier using the features produced by the MultiBERTs model as inputs.", "## Intended uses & limitations\nYou can use the raw model for either masked language modeling or next sentence prediction, but it's mostly intended to\nbe fine-tuned on a downstream task. See the model hub to look for\nfine-tuned versions on a task that interests you.\nNote that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked)\nto make decisions, such as sequence classification, token classification or question answering. For tasks such as text\ngeneration you should look at model like GPT2.", "### How to use\nHere is how to use this model to get the features of a given text in PyTorch:", "### Limitations and bias\nEven if the training data used for this model could be characterized as fairly neutral, this model can have biased\npredictions. This bias will also affect all fine-tuned versions of this model. 
For an understanding of bias of this particular\ncheckpoint, please try out this checkpoint with the snippet present in the Limitation and bias section of the bert-base-uncased checkpoint.", "## Training data\nThe MultiBERTs models were pretrained on BookCorpus, a dataset consisting of 11,038\nunpublished books and English Wikipedia (excluding lists, tables and\nheaders).", "## Training procedure", "### Preprocessing\nThe texts are lowercased and tokenized using WordPiece and a vocabulary size of 30,000. The inputs of the model are\nthen of the form:\n\nWith probability 0.5, sentence A and sentence B correspond to two consecutive sentences in the original corpus and in\nthe other cases, it's another random sentence in the corpus. Note that what is considered a sentence here is a\nconsecutive span of text usually longer than a single sentence. The only constrain is that the result with the two\n\"sentences\" has a combined length of less than 512 tokens.\nThe details of the masking procedure for each sentence are the following:\n- 15% of the tokens are masked.\n- In 80% of the cases, the masked tokens are replaced by '[MASK]'.\n- In 10% of the cases, the masked tokens are replaced by a random token (different) from the one they replace.\n- In the 10% remaining cases, the masked tokens are left as is.", "### Pretraining\nThe full model was trained on 16 Cloud TPU v2 chips for two million steps with a batch size\nof 256. The sequence length was set to 512 throughout. The optimizer\nused is Adam with a learning rate of 1e-4, \\\\(\\beta_{1} = 0.9\\\\) and \\\\(\\beta_{2} = 0.999\\\\), a weight decay of 0.01,\nlearning rate warmup for 10,000 steps and linear decay of the learning rate after.", "### BibTeX entry and citation info\n\n<a href=\"URL\n\t<img width=\"300px\" src=\"URL\n</a>" ]
[ 71, 145, 335, 134, 25, 95, 48, 3, 222, 111, 34 ]
[ "passage: TAGS\n#transformers #pytorch #bert #pretraining #exbert #multiberts #multiberts-seed-4 #en #dataset-bookcorpus #dataset-wikipedia #arxiv-2106.16163 #license-apache-2.0 #endpoints_compatible #region-us \n# MultiBERTs Seed 4 Checkpoint 500k (uncased)\nSeed 4 intermediate checkpoint 500k MultiBERTs (pretrained BERT) model on English language using a masked language modeling (MLM) objective. It was introduced in\nthis paper and first released in\nthis repository. This is an intermediate checkpoint.\nThe final checkpoint can be found at multiberts-seed-4. This model is uncased: it does not make a difference\nbetween english and English.\n\nDisclaimer: The team releasing MultiBERTs did not write a model card for this model so this model card has been written by gchhablani.", "passage: ## Model description\nMultiBERTs models are transformers model pretrained on a large corpus of English data in a self-supervised fashion. This means it\nwas pretrained on the raw texts only, with no humans labelling them in any way (which is why it can use lots of\npublicly available data) with an automatic process to generate inputs and labels from those texts. More precisely, it\nwas pretrained with two objectives:\n- Masked language modeling (MLM): taking a sentence, the model randomly masks 15% of the words in the input then run\n the entire masked sentence through the model and has to predict the masked words. This is different from traditional\n recurrent neural networks (RNNs) that usually see the words one after the other, or from autoregressive models like\n GPT which internally mask the future tokens. It allows the model to learn a bidirectional representation of the\n sentence.\n- Next sentence prediction (NSP): the models concatenates two masked sentences as inputs during pretraining. Sometimes\n they correspond to sentences that were next to each other in the original text, sometimes not. The model then has to\n predict if the two sentences were following each other or not.\nThis way, the model learns an inner representation of the English language that can then be used to extract features\nuseful for downstream tasks: if you have a dataset of labeled sentences for instance, you can train a standard\nclassifier using the features produced by the MultiBERTs model as inputs.## Intended uses & limitations\nYou can use the raw model for either masked language modeling or next sentence prediction, but it's mostly intended to\nbe fine-tuned on a downstream task. See the model hub to look for\nfine-tuned versions on a task that interests you.\nNote that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked)\nto make decisions, such as sequence classification, token classification or question answering. For tasks such as text\ngeneration you should look at model like GPT2.### How to use\nHere is how to use this model to get the features of a given text in PyTorch:### Limitations and bias\nEven if the training data used for this model could be characterized as fairly neutral, this model can have biased\npredictions. This bias will also affect all fine-tuned versions of this model. 
For an understanding of bias of this particular\ncheckpoint, please try out this checkpoint with the snippet present in the Limitation and bias section of the bert-base-uncased checkpoint.## Training data\nThe MultiBERTs models were pretrained on BookCorpus, a dataset consisting of 11,038\nunpublished books and English Wikipedia (excluding lists, tables and\nheaders).## Training procedure" ]
[ -0.08519776165485382, 0.0013392041437327862, -0.0021534310653805733, 0.0711088478565216, 0.08764298260211945, 0.004448596388101578, 0.11846395581960678, 0.04881592094898224, -0.029277905821800232, 0.02437160350382328, 0.09242044389247894, 0.028681911528110504, 0.044129762798547745, 0.06633585691452026, 0.09588983654975891, -0.2573888897895813, 0.04814276844263077, -0.06250468641519547, 0.05510035157203674, 0.07460945844650269, 0.09738174080848694, -0.0736205130815506, 0.06403172016143799, 0.034411974251270294, -0.0854039341211319, -0.014926614239811897, -0.016620881855487823, -0.034508682787418365, 0.09923708438873291, 0.06945987045764923, 0.061246566474437714, 0.0007606148719787598, 0.06054691970348358, -0.08797265589237213, 0.0161739569157362, 0.04444548487663269, -0.002221727278083563, 0.02536643296480179, -0.007518773898482323, 0.016962070018053055, 0.10436880588531494, 0.036298491060733795, 0.07596096396446228, 0.03437165543437004, -0.09497267007827759, -0.11732163280248642, -0.07935480028390884, 0.10178293287754059, 0.05435929074883461, 0.041663434356451035, -0.006629188545048237, 0.07008355855941772, -0.031030913814902306, 0.07342036813497543, 0.10246987640857697, -0.2615816593170166, -0.009169882163405418, 0.06602007895708084, 0.043224871158599854, 0.04544702172279358, 0.009367828257381916, 0.027760230004787445, 0.008014746010303497, 0.04427950829267502, 0.029919225722551346, -0.024038605391979218, 0.1193263903260231, -0.044046372175216675, -0.15010268986225128, -0.041502632200717926, 0.12418535351753235, -0.006351368501782417, -0.12462250143289566, -0.10020612180233002, -0.029133806005120277, 0.1132555603981018, -0.0027841078117489815, -0.01777094043791294, -0.0024758009240031242, 0.009962140582501888, 0.02498568966984749, -0.09139706194400787, -0.08591802418231964, -0.028070896863937378, -0.03658894822001457, 0.12605994939804077, 0.04669459909200668, 0.05274362489581108, -0.035459473729133606, 0.08694541454315186, -0.11611774563789368, -0.0390857569873333, -0.051668837666511536, -0.08175559341907501, -0.01868532970547676, 0.011110230349004269, -0.026064828038215637, -0.07973877340555191, -0.05877361819148064, 0.11963161826133728, 0.04004698991775513, 0.030089866369962692, -0.00020974176004529, 0.03985407575964928, 0.07251395285129547, 0.09543174505233765, -0.03927913308143616, 0.04785437136888504, 0.03486175090074539, -0.021356981247663498, 0.05628469958901405, -0.05069411173462868, -0.10143855214118958, 0.07697880268096924, -0.0023003239184617996, 0.039925023913383484, 0.024259289726614952, 0.03481052443385124, -0.01143687218427658, -0.07074855268001556, 0.16106106340885162, -0.07896117866039276, -0.008245950564742088, -0.015911728143692017, 0.010899750515818596, 0.046682991087436676, 0.03039015457034111, -0.008480090647935867, -0.04885583370923996, -0.005334925837814808, -0.05527283251285553, -0.027897464111447334, -0.05756843835115433, -0.12065185606479645, -0.0026394929736852646, -0.040798626840114594, -0.03194984793663025, -0.14059904217720032, -0.2131842076778412, -0.02055605687201023, 0.06277678906917572, -0.0023133945651352406, -0.010604972019791603, 0.021763233467936516, 0.01530483178794384, -0.021397830918431282, 0.010381761938333511, -0.04293929785490036, -0.0005059223622083664, -0.007434903644025326, -0.02959892898797989, 0.05987945944070816, -0.03983822092413902, 0.02290058508515358, -0.06871329247951508, 0.02062397636473179, -0.20405757427215576, 0.0880051851272583, -0.03373293578624725, 0.0024171192198991776, -0.03665225952863693, -0.045397207140922546, 
0.010268909856677055, 0.04614875465631485, -0.009575487114489079, 0.11829677224159241, -0.13608905673027039, -0.05139445886015892, 0.17980459332466125, -0.1579800546169281, -0.0015220008790493011, 0.09986644983291626, -0.04792078956961632, 0.054932042956352234, 0.13446876406669617, 0.09766100347042084, 0.08581389486789703, -0.07168745249509811, 0.009157394990324974, 0.061615537852048874, -0.06714052706956863, 0.05325840041041374, 0.08745720982551575, -0.02389979548752308, -0.13837182521820068, 0.03139326721429825, -0.07199380546808243, -0.010656594298779964, -0.0260307714343071, -0.02129836566746235, 0.0073674339801073074, -0.038479577749967575, 0.03006010502576828, 0.005301288329064846, 0.01884540542960167, -0.0410209596157074, -0.08124391734600067, 0.023717062547802925, 0.07352311909198761, -0.0709095299243927, 0.03966676443815231, -0.07094986736774445, 0.06338109821081161, -0.07766035199165344, -0.0032113357447087765, -0.16403067111968994, -0.026034429669380188, 0.04537544399499893, -0.047277241945266724, 0.0495285764336586, 0.09241065382957458, 0.004411154426634312, 0.12303389608860016, -0.03968050330877304, 0.001789945992641151, -0.006428088992834091, -0.010735612362623215, -0.051234375685453415, -0.12070610374212265, -0.0834389477968216, -0.06730516254901886, 0.09757589548826218, -0.07375627756118774, 0.029412340372800827, -0.06676271557807922, -0.022257456555962563, -0.008507033810019493, -0.05730212479829788, -0.00433672871440649, 0.01134242583066225, -0.02616921439766884, -0.04603151977062225, 0.04849331080913544, 0.04857679083943367, -0.05850274860858917, 0.07215724885463715, -0.10321731120347977, -0.056811828166246414, 0.05608518421649933, 0.01607869751751423, -0.07930495589971542, 0.0928940549492836, -0.01945032924413681, -0.012578294612467289, -0.05807870626449585, -0.0432705283164978, 0.19551941752433777, -0.025103922933340073, 0.10026705265045166, -0.08955440670251846, 0.0023815417662262917, 0.028735384345054626, -0.04351554065942764, -0.015014318749308586, 0.05928448587656021, 0.05097527429461479, -0.1789625883102417, 0.015118855983018875, 0.04954413324594498, 0.07868214696645737, 0.1126713827252388, 0.026804979890584946, -0.023540055379271507, -0.046087298542261124, -0.012449168600142002, 0.004942847415804863, 0.05805279314517975, -0.02457476779818535, -0.00959400087594986, 0.0315423458814621, 0.055974140763282776, 0.01749684475362301, -0.08218767493963242, 0.03648517280817032, 0.0655611902475357, -0.01657159999012947, -0.044348444789648056, -0.0255097895860672, -0.05984955281019211, 0.06200196593999863, 0.05276653915643692, 0.03667145222425461, 0.026096461340785027, -0.01500359084457159, -0.13515958189964294, 0.18782417476177216, -0.1143338531255722, -0.25618284940719604, -0.10843901336193085, -0.054312847554683685, -0.025382710620760918, 0.04027191922068596, 0.05788610875606537, -0.03316068649291992, -0.04379758611321449, -0.11460889130830765, 0.06175938993692398, -0.06391609460115433, -0.028621630743145943, -0.011216983199119568, -0.05231267958879471, -0.02053021267056465, -0.12677747011184692, -0.010994687676429749, -0.03042355552315712, -0.07689063251018524, 0.007348303683102131, -0.03215469419956207, 0.029980383813381195, 0.13339371979236603, 0.03518620878458023, -0.018551265820860863, -0.01807694137096405, 0.19454842805862427, 0.010030360892415047, 0.060889147222042084, 0.11012840270996094, -0.025820516049861908, 0.051997262984514236, 0.04720716178417206, 0.025852538645267487, -0.04970436543226242, 0.011329039931297302, -0.015203115530312061, -0.11948925256729126, 
-0.17413395643234253, -0.0690155178308487, -0.0049890982918441296, 0.006665518973022699, 0.02111613005399704, 0.03535362333059311, 0.018547724932432175, 0.04129822552204132, -0.029164763167500496, 0.03164733946323395, -0.014249037951231003, 0.07892593741416931, 0.02789253368973732, -0.07459400594234467, 0.09369419515132904, -0.061919983476400375, 0.01556925568729639, 0.10854919999837875, -0.06001764535903931, 0.19144612550735474, 0.02549978345632553, 0.05899576097726822, 0.10006637871265411, 0.024554606527090073, 0.0555737130343914, 0.09165385365486145, -0.04618927091360092, 0.006460259668529034, -0.058758605271577835, -0.051040880382061005, -0.034218642860651016, 0.04903893172740936, 0.030930209904909134, 0.019372839480638504, -0.12238241732120514, 0.016675997525453568, -0.002371731447055936, 0.13907036185264587, 0.04879678413271904, -0.12147180736064911, -0.12204609811306, 0.0351317934691906, -0.04365581274032593, -0.06218508630990982, 0.030756089836359024, 0.06126640737056732, -0.15400825440883636, 0.04561909660696983, -0.006296292878687382, 0.06609275192022324, -0.0945446789264679, 0.01686587557196617, -0.043105870485305786, 0.0010175378993153572, 0.004318380728363991, 0.06952126324176788, -0.13629545271396637, 0.10237913578748703, 0.02172994241118431, 0.05182243883609772, -0.08120958507061005, 0.01607229746878147, -0.011282193474471569, 0.10832948237657547, 0.11524562537670135, 0.0431598499417305, -0.05794611945748329, -0.020823506638407707, -0.04589475318789482, 0.023473020642995834, 0.06039993837475777, -0.07897090911865234, 0.06199977546930313, 0.0094823082908988, 0.006039053201675415, -0.024326004087924957, 0.01169745996594429, -0.13416555523872375, -0.12416303902864456, 0.06184478849172592, -0.07518173009157181, -0.09499292075634003, -0.05587872490286827, -0.06328006088733673, -0.04886382818222046, 0.20893731713294983, -0.12122506648302078, -0.09082654118537903, -0.09822037816047668, -0.014380250126123428, 0.04527166113257408, -0.06463514268398285, 0.04661976546049118, -0.035852108150720596, 0.0902419239282608, -0.04518257826566696, -0.1111001968383789, 0.03445837274193764, -0.11293217539787292, -0.11450895667076111, -0.042664479464292526, 0.10604770481586456, 0.11247634887695312, 0.037414077669382095, 0.012073550373315811, 0.012687060050666332, -0.0004291422665119171, -0.11839504539966583, 0.01852240227162838, 0.12998515367507935, -0.003359965980052948, 0.07326468825340271, -0.06324118375778198, 0.025751575827598572, -0.015343790873885155, 0.001391565427184105, 0.12941065430641174, 0.18333566188812256, -0.06382210552692413, 0.1737823486328125, 0.19940580427646637, -0.10600410401821136, -0.19103839993476868, -0.05355342477560043, -0.003042147494852543, 0.04874207824468613, 0.04623499885201454, -0.18789801001548767, 0.09078283607959747, 0.03082294389605522, -0.030472593382000923, 0.026921134442090988, -0.23446324467658997, -0.11114263534545898, 0.08707600086927414, 0.056270111352205276, 0.19261592626571655, -0.08031733334064484, -0.039567410945892334, -0.01618783362209797, -0.037904947996139526, 0.044513896107673645, -0.039226219058036804, 0.09121744334697723, 0.004735901951789856, -0.028687734156847, 0.0017716316506266594, -0.030307147651910782, 0.09538780897855759, 0.04150528460741043, 0.024295207113027573, -0.06982412189245224, -0.005259782075881958, 0.10998603701591492, -0.037219204008579254, 0.09824502468109131, 0.0393182635307312, 0.07325799763202667, -0.0979602187871933, -0.05927024409174919, -0.07503887265920639, 0.04337441921234131, -0.041858501732349396, 
-0.053646232932806015, -0.06245572119951248, 0.05583980306982994, 0.036593906581401825, 0.010675903409719467, -0.005333768203854561, -0.03611622378230095, 0.045608047395944595, 0.08868089318275452, 0.08348055183887482, -0.03365654498338699, -0.0758780986070633, -0.04997898265719414, -0.048251453787088394, 0.06904011964797974, -0.09070086479187012, 0.01715288683772087, 0.02977392077445984, 0.010907797142863274, 0.08979371190071106, 0.0350293405354023, -0.135498508810997, 0.011822851374745369, 0.03487385809421539, -0.12446998059749603, -0.11279343068599701, -0.018782328814268112, 0.020581526681780815, -0.0381079763174057, 0.055515106767416, 0.14622916281223297, -0.03731255978345871, -0.03224068880081177, -0.04920212924480438, 0.037910621613264084, -0.020949263125658035, 0.05322382226586342, 0.06431539356708527, 0.030627328902482986, -0.07393859326839447, 0.07479400932788849, 0.03845583647489548, -0.0356891006231308, 0.04192206636071205, 0.04245749115943909, -0.09555128216743469, -0.07916779816150665, -0.05752098187804222, 0.09275686740875244, -0.0166765209287405, -0.046769700944423676, -0.0029300320893526077, -0.08380335569381714, 0.06774244457483292, 0.0687522143125534, 0.046071458607912064, 0.03651367127895355, -0.0872047021985054, 0.015405508689582348, -0.05494049936532974, 0.03567477688193321, -0.030361047014594078, -0.004770897328853607, -0.05525542050600052, 0.06627596169710159, 0.0630437359213829, 0.09663309156894684, -0.03457803279161453, -0.07418897747993469, -0.0850500836968422, -0.013321897014975548, -0.0598275288939476, -0.03350052610039711, -0.07286535948514938, -0.005607372149825096, 0.0012349584139883518, -0.0017982609570026398, 0.021514033898711205, 0.037034302949905396, -0.04385395720601082, -0.01874629221856594, -0.03368669003248215, 0.03831174224615097, -0.061633333563804626, 0.0066443439573049545, 0.015503056347370148, -0.03720145300030708, 0.09220590442419052, 0.03582662343978882, -0.01254985574632883, 0.047760069370269775, -0.02290036529302597, 0.034028999507427216, -0.020832771435379982, 0.0006192820146679878, -0.02333272621035576, -0.10829924046993256, -0.003718636929988861, 0.004890304058790207, -0.023726027458906174, 0.011141781695187092, 0.05807030200958252, -0.07284612953662872, 0.08471940457820892, 0.046280764043331146, -0.027587145566940308, -0.07129097729921341, 0.04185556620359421, -0.015597747638821602, 0.02741779200732708, 0.07063212990760803, -0.03502965718507767, 0.05278732627630234, -0.09869842976331711, -0.028569083660840988, 0.002554789185523987, -0.007458064705133438, -0.012304229661822319, -0.054082147777080536, -0.0010038511827588081, 0.005929321050643921, 0.17390507459640503, -0.02212332934141159, 0.03513883799314499, 0.012609073892235756, 0.009042358957231045, 0.04594413936138153, -0.011317213997244835, 0.07213406264781952, -0.006095999851822853, -0.02709142118692398, -0.01512275543063879, 0.03825380653142929, 0.00341815035790205, 0.005603756755590439, 0.13975943624973297, 0.04300921410322189, 0.09005454927682877, 0.07582303881645203, 0.01693316362798214, 0.015380886383354664, -0.1306750476360321, -0.08650174736976624, 0.0060151321813464165, 0.06220134720206261, -0.019504375755786896, 0.0108014065772295, 0.08927930891513824, -0.08568877726793289, 0.07124029099941254, 0.04920339211821556, -0.04860744625329971, -0.12533454596996307, -0.19079279899597168, -0.025218434631824493, -0.030859949067234993, -0.010334868915379047, -0.09084399044513702, 0.015489528886973858, 0.09171492606401443, 0.023921819403767586, -0.010939886793494225, 0.09483174979686737, 
-0.10107650607824326, -0.028903279453516006, 0.046606265008449554, -0.028113102540373802, 0.012855289503932, 0.04829847067594528, 0.023828109726309776, -0.006425952538847923, 0.0401068776845932, 0.04007287696003914, 0.042965117841959, 0.022754011675715446, 0.051071688532829285, -0.02329152449965477, -0.07302037626504898, -0.03273123502731323, -0.0034778257831931114, 0.05308396741747856, 0.1363518238067627, 0.02319846861064434, -0.07129482924938202, 0.006381077691912651, 0.10757409036159515, -0.03239893913269043, -0.04792662337422371, -0.10871408134698868, 0.23708927631378174, 0.0221894271671772, 0.0023158760741353035, -0.004644692409783602, -0.045772284269332886, 0.00503120943903923, 0.21138505637645721, 0.22645241022109985, 0.005221942905336618, -0.01026017963886261, 0.00829467922449112, -0.011348572559654713, 0.038024142384529114, 0.1475246548652649, 0.004202218726277351, 0.2505375146865845, -0.04798739403486252, 0.042045023292303085, -0.04056331515312195, -0.03995387256145477, -0.10027158260345459, 0.0720331221818924, -0.009454306215047836, 0.007244262844324112, -0.031776610761880875, 0.07046768814325333, -0.03767171502113342, -0.16959816217422485, 0.005321440286934376, -0.001176866702735424, -0.06324978172779083, 0.009636279195547104, -0.0011979229748249054, 0.021240632981061935, 0.0841769278049469, -0.018538694828748703, -0.006945419125258923, 0.13604287803173065, 0.01751478761434555, -0.09862760454416275, -0.06222093850374222, 0.11422234028577805, 0.012025820091366768, 0.13943055272102356, 0.01171130407601595, 0.0801210105419159, 0.0867210179567337, 0.02070384845137596, -0.09547843039035797, 0.041014473885297775, -0.020886952057480812, -0.0298920851200819, 0.008046850562095642, 0.10929075628519058, -0.008466752246022224, 0.05975707992911339, 0.024465229362249374, -0.08590447157621384, 0.06147823855280876, 0.010558828711509705, -0.03442082554101944, -0.08088456094264984, 0.08465328067541122, -0.09076563268899918, 0.1554049700498581, 0.12433449923992157, -0.014892649836838245, -0.045344315469264984, -0.02785280905663967, 0.019796578213572502, -0.0003726063296198845, 0.05622171610593796, -0.02568572387099266, -0.13271400332450867, 0.01821908913552761, -0.08310192823410034, 0.02705969661474228, -0.2498999834060669, -0.0880594551563263, 0.026747403666377068, -0.017651230096817017, -0.019713841378688812, 0.051680244505405426, 0.04896588996052742, 0.02713124081492424, -0.03703821077942848, 0.0139185581356287, -0.03731010854244232, 0.058878812938928604, -0.11013485491275787, -0.0943807065486908 ]
null
null
transformers
# MultiBERTs Seed 4 Checkpoint 600k (uncased) Seed 4 intermediate checkpoint (600k steps) of the MultiBERTs (pretrained BERT) model, pretrained on English text using a masked language modeling (MLM) objective. It was introduced in [this paper](https://arxiv.org/pdf/2106.16163.pdf) and first released in [this repository](https://github.com/google-research/language/tree/master/language/multiberts). This is an intermediate checkpoint. The final checkpoint can be found at [multiberts-seed-4](https://hf.co/multiberts-seed-4). This model is uncased: it does not make a difference between english and English. Disclaimer: The team releasing MultiBERTs did not write a model card for this model, so this model card has been written by [gchhablani](https://hf.co/gchhablani). ## Model description MultiBERTs models are transformer models pretrained on a large corpus of English data in a self-supervised fashion. This means they were pretrained on the raw texts only, with no humans labelling them in any way (which is why they can use lots of publicly available data), with an automatic process to generate inputs and labels from those texts. More precisely, they were pretrained with two objectives: - Masked language modeling (MLM): taking a sentence, the model randomly masks 15% of the words in the input, then runs the entire masked sentence through the model and has to predict the masked words. This is different from traditional recurrent neural networks (RNNs) that usually see the words one after the other, or from autoregressive models like GPT which internally mask the future tokens. It allows the model to learn a bidirectional representation of the sentence. - Next sentence prediction (NSP): the model concatenates two masked sentences as inputs during pretraining. Sometimes they correspond to sentences that were next to each other in the original text, sometimes not. The model then has to predict whether the two sentences were following each other or not. This way, the model learns an inner representation of the English language that can then be used to extract features useful for downstream tasks: if you have a dataset of labeled sentences, for instance, you can train a standard classifier using the features produced by the MultiBERTs model as inputs. ## Intended uses & limitations You can use the raw model for either masked language modeling or next sentence prediction, but it's mostly intended to be fine-tuned on a downstream task. See the [model hub](https://huggingface.co/models?filter=multiberts) to look for fine-tuned versions on a task that interests you. Note that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked) to make decisions, such as sequence classification, token classification or question answering. For tasks such as text generation you should look at models like GPT2. ### How to use Here is how to use this model to get the features of a given text in PyTorch: ```python from transformers import BertTokenizer, BertModel tokenizer = BertTokenizer.from_pretrained('multiberts-seed-4-600k') model = BertModel.from_pretrained("multiberts-seed-4-600k") text = "Replace me by any text you'd like." encoded_input = tokenizer(text, return_tensors='pt') output = model(**encoded_input) ``` ### Limitations and bias Even if the training data used for this model could be characterized as fairly neutral, this model can still make biased predictions. This bias will also affect all fine-tuned versions of this model.
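Because the card stresses fine-tuning on whole-sentence tasks such as sequence classification, a minimal sketch of how a fine-tuned version would typically be initialized from this checkpoint is given below. It is an illustration under assumptions, not part of the original card: the short identifier may need the full hub path (e.g. `MultiBertGunjanPatrick/multiberts-seed-4-600k`), and the classification head shown is randomly initialized until it is actually trained.

```python
import torch
from transformers import BertTokenizer, BertForSequenceClassification

# Hypothetical identifier; replace with the full hub path if needed.
tokenizer = BertTokenizer.from_pretrained('multiberts-seed-4-600k')
model = BertForSequenceClassification.from_pretrained('multiberts-seed-4-600k', num_labels=2)

inputs = tokenizer("Replace me by any text you'd like.", return_tensors='pt')
with torch.no_grad():
    logits = model(**inputs).logits  # meaningless until the classification head has been fine-tuned
```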
For an understanding of the bias of this particular checkpoint, please try out this checkpoint with the snippet present in the [Limitations and bias section](https://huggingface.co/bert-base-uncased#limitations-and-bias) of the [bert-base-uncased](https://huggingface.co/bert-base-uncased) checkpoint. ## Training data The MultiBERTs models were pretrained on [BookCorpus](https://yknzhu.wixsite.com/mbweb), a dataset consisting of 11,038 unpublished books, and [English Wikipedia](https://en.wikipedia.org/wiki/English_Wikipedia) (excluding lists, tables and headers). ## Training procedure ### Preprocessing The texts are lowercased and tokenized using WordPiece with a vocabulary size of 30,000. The inputs of the model are then of the form: ``` [CLS] Sentence A [SEP] Sentence B [SEP] ``` With probability 0.5, sentence A and sentence B correspond to two consecutive sentences in the original corpus; in the other cases, sentence B is another random sentence from the corpus. Note that what is considered a sentence here is a consecutive span of text usually longer than a single sentence. The only constraint is that the result with the two "sentences" has a combined length of less than 512 tokens. The details of the masking procedure for each sentence are the following: - 15% of the tokens are masked. - In 80% of the cases, the masked tokens are replaced by `[MASK]`. - In 10% of the cases, the masked tokens are replaced by a random token (different from the one they replace). - In the remaining 10% of cases, the masked tokens are left as is. ### Pretraining The full model was trained on 16 Cloud TPU v2 chips for two million steps with a batch size of 256. The sequence length was set to 512 throughout. The optimizer used is Adam with a learning rate of 1e-4, \\(\beta_{1} = 0.9\\) and \\(\beta_{2} = 0.999\\), a weight decay of 0.01, learning rate warmup for 10,000 steps, and linear decay of the learning rate after. ### BibTeX entry and citation info ```bibtex @article{DBLP:journals/corr/abs-2106-16163, author = {Thibault Sellam and Steve Yadlowsky and Jason Wei and Naomi Saphra and Alexander D'Amour and Tal Linzen and Jasmijn Bastings and Iulia Turc and Jacob Eisenstein and Dipanjan Das and Ian Tenney and Ellie Pavlick}, title = {The MultiBERTs: {BERT} Reproductions for Robustness Analysis}, journal = {CoRR}, volume = {abs/2106.16163}, year = {2021}, url = {https://arxiv.org/abs/2106.16163}, eprinttype = {arXiv}, eprint = {2106.16163}, timestamp = {Mon, 05 Jul 2021 15:15:50 +0200}, biburl = {https://dblp.org/rec/journals/corr/abs-2106-16163.bib}, bibsource = {dblp computer science bibliography, https://dblp.org} } ``` <a href="https://huggingface.co/exbert/?model=multiberts"> <img width="300px" src="https://cdn-media.huggingface.co/exbert/button.png"> </a>
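To make the `[CLS] Sentence A [SEP] Sentence B [SEP]` input format from the preprocessing description above concrete, here is a small hedged example. The tokenizer identifier is an assumption; any uncased BERT WordPiece tokenizer would show the same sentence-pair layout.

```python
from transformers import BertTokenizer

# Hypothetical identifier; any uncased BERT tokenizer illustrates the same layout.
tokenizer = BertTokenizer.from_pretrained('multiberts-seed-4-600k')
encoded = tokenizer("Sentence A", "Sentence B")
print(tokenizer.decode(encoded["input_ids"]))
# Expected: [CLS] sentence a [SEP] sentence b [SEP]  (lowercased by the uncased tokenizer)
```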
{"language": "en", "license": "apache-2.0", "tags": ["exbert", "multiberts", "multiberts-seed-4"], "datasets": ["bookcorpus", "wikipedia"]}
null
MultiBertGunjanPatrick/multiberts-seed-4-600k
[ "transformers", "pytorch", "bert", "pretraining", "exbert", "multiberts", "multiberts-seed-4", "en", "dataset:bookcorpus", "dataset:wikipedia", "arxiv:2106.16163", "license:apache-2.0", "endpoints_compatible", "region:us" ]
2022-03-02T23:29:04+00:00
[ "2106.16163" ]
[ "en" ]
TAGS #transformers #pytorch #bert #pretraining #exbert #multiberts #multiberts-seed-4 #en #dataset-bookcorpus #dataset-wikipedia #arxiv-2106.16163 #license-apache-2.0 #endpoints_compatible #region-us
# MultiBERTs Seed 4 Checkpoint 600k (uncased) Seed 4 intermediate checkpoint 600k MultiBERTs (pretrained BERT) model on English language using a masked language modeling (MLM) objective. It was introduced in this paper and first released in this repository. This is an intermediate checkpoint. The final checkpoint can be found at multiberts-seed-4. This model is uncased: it does not make a difference between english and English. Disclaimer: The team releasing MultiBERTs did not write a model card for this model so this model card has been written by gchhablani. ## Model description MultiBERTs models are transformers model pretrained on a large corpus of English data in a self-supervised fashion. This means it was pretrained on the raw texts only, with no humans labelling them in any way (which is why it can use lots of publicly available data) with an automatic process to generate inputs and labels from those texts. More precisely, it was pretrained with two objectives: - Masked language modeling (MLM): taking a sentence, the model randomly masks 15% of the words in the input then run the entire masked sentence through the model and has to predict the masked words. This is different from traditional recurrent neural networks (RNNs) that usually see the words one after the other, or from autoregressive models like GPT which internally mask the future tokens. It allows the model to learn a bidirectional representation of the sentence. - Next sentence prediction (NSP): the models concatenates two masked sentences as inputs during pretraining. Sometimes they correspond to sentences that were next to each other in the original text, sometimes not. The model then has to predict if the two sentences were following each other or not. This way, the model learns an inner representation of the English language that can then be used to extract features useful for downstream tasks: if you have a dataset of labeled sentences for instance, you can train a standard classifier using the features produced by the MultiBERTs model as inputs. ## Intended uses & limitations You can use the raw model for either masked language modeling or next sentence prediction, but it's mostly intended to be fine-tuned on a downstream task. See the model hub to look for fine-tuned versions on a task that interests you. Note that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked) to make decisions, such as sequence classification, token classification or question answering. For tasks such as text generation you should look at model like GPT2. ### How to use Here is how to use this model to get the features of a given text in PyTorch: ### Limitations and bias Even if the training data used for this model could be characterized as fairly neutral, this model can have biased predictions. This bias will also affect all fine-tuned versions of this model. For an understanding of bias of this particular checkpoint, please try out this checkpoint with the snippet present in the Limitation and bias section of the bert-base-uncased checkpoint. ## Training data The MultiBERTs models were pretrained on BookCorpus, a dataset consisting of 11,038 unpublished books and English Wikipedia (excluding lists, tables and headers). ## Training procedure ### Preprocessing The texts are lowercased and tokenized using WordPiece and a vocabulary size of 30,000. 
The inputs of the model are then of the form: With probability 0.5, sentence A and sentence B correspond to two consecutive sentences in the original corpus and in the other cases, it's another random sentence in the corpus. Note that what is considered a sentence here is a consecutive span of text usually longer than a single sentence. The only constrain is that the result with the two "sentences" has a combined length of less than 512 tokens. The details of the masking procedure for each sentence are the following: - 15% of the tokens are masked. - In 80% of the cases, the masked tokens are replaced by '[MASK]'. - In 10% of the cases, the masked tokens are replaced by a random token (different) from the one they replace. - In the 10% remaining cases, the masked tokens are left as is. ### Pretraining The full model was trained on 16 Cloud TPU v2 chips for two million steps with a batch size of 256. The sequence length was set to 512 throughout. The optimizer used is Adam with a learning rate of 1e-4, \\(\beta_{1} = 0.9\\) and \\(\beta_{2} = 0.999\\), a weight decay of 0.01, learning rate warmup for 10,000 steps and linear decay of the learning rate after. ### BibTeX entry and citation info <a href="URL <img width="300px" src="URL </a>
[ "# MultiBERTs Seed 4 Checkpoint 600k (uncased)\nSeed 4 intermediate checkpoint 600k MultiBERTs (pretrained BERT) model on English language using a masked language modeling (MLM) objective. It was introduced in\nthis paper and first released in\nthis repository. This is an intermediate checkpoint.\nThe final checkpoint can be found at multiberts-seed-4. This model is uncased: it does not make a difference\nbetween english and English.\n\nDisclaimer: The team releasing MultiBERTs did not write a model card for this model so this model card has been written by gchhablani.", "## Model description\nMultiBERTs models are transformers model pretrained on a large corpus of English data in a self-supervised fashion. This means it\nwas pretrained on the raw texts only, with no humans labelling them in any way (which is why it can use lots of\npublicly available data) with an automatic process to generate inputs and labels from those texts. More precisely, it\nwas pretrained with two objectives:\n- Masked language modeling (MLM): taking a sentence, the model randomly masks 15% of the words in the input then run\n the entire masked sentence through the model and has to predict the masked words. This is different from traditional\n recurrent neural networks (RNNs) that usually see the words one after the other, or from autoregressive models like\n GPT which internally mask the future tokens. It allows the model to learn a bidirectional representation of the\n sentence.\n- Next sentence prediction (NSP): the models concatenates two masked sentences as inputs during pretraining. Sometimes\n they correspond to sentences that were next to each other in the original text, sometimes not. The model then has to\n predict if the two sentences were following each other or not.\nThis way, the model learns an inner representation of the English language that can then be used to extract features\nuseful for downstream tasks: if you have a dataset of labeled sentences for instance, you can train a standard\nclassifier using the features produced by the MultiBERTs model as inputs.", "## Intended uses & limitations\nYou can use the raw model for either masked language modeling or next sentence prediction, but it's mostly intended to\nbe fine-tuned on a downstream task. See the model hub to look for\nfine-tuned versions on a task that interests you.\nNote that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked)\nto make decisions, such as sequence classification, token classification or question answering. For tasks such as text\ngeneration you should look at model like GPT2.", "### How to use\nHere is how to use this model to get the features of a given text in PyTorch:", "### Limitations and bias\nEven if the training data used for this model could be characterized as fairly neutral, this model can have biased\npredictions. This bias will also affect all fine-tuned versions of this model. For an understanding of bias of this particular\ncheckpoint, please try out this checkpoint with the snippet present in the Limitation and bias section of the bert-base-uncased checkpoint.", "## Training data\nThe MultiBERTs models were pretrained on BookCorpus, a dataset consisting of 11,038\nunpublished books and English Wikipedia (excluding lists, tables and\nheaders).", "## Training procedure", "### Preprocessing\nThe texts are lowercased and tokenized using WordPiece and a vocabulary size of 30,000. 
The inputs of the model are\nthen of the form:\n\nWith probability 0.5, sentence A and sentence B correspond to two consecutive sentences in the original corpus and in\nthe other cases, it's another random sentence in the corpus. Note that what is considered a sentence here is a\nconsecutive span of text usually longer than a single sentence. The only constrain is that the result with the two\n\"sentences\" has a combined length of less than 512 tokens.\nThe details of the masking procedure for each sentence are the following:\n- 15% of the tokens are masked.\n- In 80% of the cases, the masked tokens are replaced by '[MASK]'.\n- In 10% of the cases, the masked tokens are replaced by a random token (different) from the one they replace.\n- In the 10% remaining cases, the masked tokens are left as is.", "### Pretraining\nThe full model was trained on 16 Cloud TPU v2 chips for two million steps with a batch size\nof 256. The sequence length was set to 512 throughout. The optimizer\nused is Adam with a learning rate of 1e-4, \\\\(\\beta_{1} = 0.9\\\\) and \\\\(\\beta_{2} = 0.999\\\\), a weight decay of 0.01,\nlearning rate warmup for 10,000 steps and linear decay of the learning rate after.", "### BibTeX entry and citation info\n\n<a href=\"URL\n\t<img width=\"300px\" src=\"URL\n</a>" ]
[ "TAGS\n#transformers #pytorch #bert #pretraining #exbert #multiberts #multiberts-seed-4 #en #dataset-bookcorpus #dataset-wikipedia #arxiv-2106.16163 #license-apache-2.0 #endpoints_compatible #region-us \n", "# MultiBERTs Seed 4 Checkpoint 600k (uncased)\nSeed 4 intermediate checkpoint 600k MultiBERTs (pretrained BERT) model on English language using a masked language modeling (MLM) objective. It was introduced in\nthis paper and first released in\nthis repository. This is an intermediate checkpoint.\nThe final checkpoint can be found at multiberts-seed-4. This model is uncased: it does not make a difference\nbetween english and English.\n\nDisclaimer: The team releasing MultiBERTs did not write a model card for this model so this model card has been written by gchhablani.", "## Model description\nMultiBERTs models are transformers model pretrained on a large corpus of English data in a self-supervised fashion. This means it\nwas pretrained on the raw texts only, with no humans labelling them in any way (which is why it can use lots of\npublicly available data) with an automatic process to generate inputs and labels from those texts. More precisely, it\nwas pretrained with two objectives:\n- Masked language modeling (MLM): taking a sentence, the model randomly masks 15% of the words in the input then run\n the entire masked sentence through the model and has to predict the masked words. This is different from traditional\n recurrent neural networks (RNNs) that usually see the words one after the other, or from autoregressive models like\n GPT which internally mask the future tokens. It allows the model to learn a bidirectional representation of the\n sentence.\n- Next sentence prediction (NSP): the models concatenates two masked sentences as inputs during pretraining. Sometimes\n they correspond to sentences that were next to each other in the original text, sometimes not. The model then has to\n predict if the two sentences were following each other or not.\nThis way, the model learns an inner representation of the English language that can then be used to extract features\nuseful for downstream tasks: if you have a dataset of labeled sentences for instance, you can train a standard\nclassifier using the features produced by the MultiBERTs model as inputs.", "## Intended uses & limitations\nYou can use the raw model for either masked language modeling or next sentence prediction, but it's mostly intended to\nbe fine-tuned on a downstream task. See the model hub to look for\nfine-tuned versions on a task that interests you.\nNote that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked)\nto make decisions, such as sequence classification, token classification or question answering. For tasks such as text\ngeneration you should look at model like GPT2.", "### How to use\nHere is how to use this model to get the features of a given text in PyTorch:", "### Limitations and bias\nEven if the training data used for this model could be characterized as fairly neutral, this model can have biased\npredictions. This bias will also affect all fine-tuned versions of this model. 
For an understanding of bias of this particular\ncheckpoint, please try out this checkpoint with the snippet present in the Limitation and bias section of the bert-base-uncased checkpoint.", "## Training data\nThe MultiBERTs models were pretrained on BookCorpus, a dataset consisting of 11,038\nunpublished books and English Wikipedia (excluding lists, tables and\nheaders).", "## Training procedure", "### Preprocessing\nThe texts are lowercased and tokenized using WordPiece and a vocabulary size of 30,000. The inputs of the model are\nthen of the form:\n\nWith probability 0.5, sentence A and sentence B correspond to two consecutive sentences in the original corpus and in\nthe other cases, it's another random sentence in the corpus. Note that what is considered a sentence here is a\nconsecutive span of text usually longer than a single sentence. The only constrain is that the result with the two\n\"sentences\" has a combined length of less than 512 tokens.\nThe details of the masking procedure for each sentence are the following:\n- 15% of the tokens are masked.\n- In 80% of the cases, the masked tokens are replaced by '[MASK]'.\n- In 10% of the cases, the masked tokens are replaced by a random token (different) from the one they replace.\n- In the 10% remaining cases, the masked tokens are left as is.", "### Pretraining\nThe full model was trained on 16 Cloud TPU v2 chips for two million steps with a batch size\nof 256. The sequence length was set to 512 throughout. The optimizer\nused is Adam with a learning rate of 1e-4, \\\\(\\beta_{1} = 0.9\\\\) and \\\\(\\beta_{2} = 0.999\\\\), a weight decay of 0.01,\nlearning rate warmup for 10,000 steps and linear decay of the learning rate after.", "### BibTeX entry and citation info\n\n<a href=\"URL\n\t<img width=\"300px\" src=\"URL\n</a>" ]
[ 71, 145, 335, 134, 25, 95, 48, 3, 222, 111, 34 ]
[ "passage: TAGS\n#transformers #pytorch #bert #pretraining #exbert #multiberts #multiberts-seed-4 #en #dataset-bookcorpus #dataset-wikipedia #arxiv-2106.16163 #license-apache-2.0 #endpoints_compatible #region-us \n# MultiBERTs Seed 4 Checkpoint 600k (uncased)\nSeed 4 intermediate checkpoint 600k MultiBERTs (pretrained BERT) model on English language using a masked language modeling (MLM) objective. It was introduced in\nthis paper and first released in\nthis repository. This is an intermediate checkpoint.\nThe final checkpoint can be found at multiberts-seed-4. This model is uncased: it does not make a difference\nbetween english and English.\n\nDisclaimer: The team releasing MultiBERTs did not write a model card for this model so this model card has been written by gchhablani.", "passage: ## Model description\nMultiBERTs models are transformers model pretrained on a large corpus of English data in a self-supervised fashion. This means it\nwas pretrained on the raw texts only, with no humans labelling them in any way (which is why it can use lots of\npublicly available data) with an automatic process to generate inputs and labels from those texts. More precisely, it\nwas pretrained with two objectives:\n- Masked language modeling (MLM): taking a sentence, the model randomly masks 15% of the words in the input then run\n the entire masked sentence through the model and has to predict the masked words. This is different from traditional\n recurrent neural networks (RNNs) that usually see the words one after the other, or from autoregressive models like\n GPT which internally mask the future tokens. It allows the model to learn a bidirectional representation of the\n sentence.\n- Next sentence prediction (NSP): the models concatenates two masked sentences as inputs during pretraining. Sometimes\n they correspond to sentences that were next to each other in the original text, sometimes not. The model then has to\n predict if the two sentences were following each other or not.\nThis way, the model learns an inner representation of the English language that can then be used to extract features\nuseful for downstream tasks: if you have a dataset of labeled sentences for instance, you can train a standard\nclassifier using the features produced by the MultiBERTs model as inputs.## Intended uses & limitations\nYou can use the raw model for either masked language modeling or next sentence prediction, but it's mostly intended to\nbe fine-tuned on a downstream task. See the model hub to look for\nfine-tuned versions on a task that interests you.\nNote that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked)\nto make decisions, such as sequence classification, token classification or question answering. For tasks such as text\ngeneration you should look at model like GPT2.### How to use\nHere is how to use this model to get the features of a given text in PyTorch:### Limitations and bias\nEven if the training data used for this model could be characterized as fairly neutral, this model can have biased\npredictions. This bias will also affect all fine-tuned versions of this model. 
For an understanding of bias of this particular\ncheckpoint, please try out this checkpoint with the snippet present in the Limitation and bias section of the bert-base-uncased checkpoint.## Training data\nThe MultiBERTs models were pretrained on BookCorpus, a dataset consisting of 11,038\nunpublished books and English Wikipedia (excluding lists, tables and\nheaders).## Training procedure" ]
[ -0.0859561562538147, -0.0015778813976794481, -0.0020914538763463497, 0.07197798788547516, 0.08874841034412384, 0.004129327367991209, 0.11537706851959229, 0.049531422555446625, -0.025374533608555794, 0.023788250982761383, 0.09183210134506226, 0.029575400054454803, 0.04273979365825653, 0.060730740427970886, 0.09552530944347382, -0.25714364647865295, 0.04750411957502365, -0.0632784515619278, 0.0539194718003273, 0.07502973079681396, 0.09642618894577026, -0.07243530452251434, 0.06526632606983185, 0.03329335153102875, -0.08632360398769379, -0.014031019993126392, -0.015874281525611877, -0.034595876932144165, 0.10031615197658539, 0.06864140182733536, 0.062188610434532166, 0.002299666404724121, 0.06058192998170853, -0.08335708826780319, 0.0163017176091671, 0.04339681565761566, -0.0017595118843019009, 0.024606987833976746, -0.007816987112164497, 0.01737927831709385, 0.10424356907606125, 0.03820790722966194, 0.07580456882715225, 0.034515753388404846, -0.09540070593357086, -0.1139502078294754, -0.08063532412052155, 0.09647084772586823, 0.054341480135917664, 0.0440237894654274, -0.006829947233200073, 0.07007695734500885, -0.0330631323158741, 0.07267536222934723, 0.09871013462543488, -0.25746288895606995, -0.008748183026909828, 0.06789980083703995, 0.042219195514917374, 0.04526650160551071, 0.00918615609407425, 0.026500726118683815, 0.006836459040641785, 0.04439007490873337, 0.026238366961479187, -0.02356875315308571, 0.11917571723461151, -0.043932683765888214, -0.1491095870733261, -0.04046126455068588, 0.12258610129356384, -0.005860455334186554, -0.12511318922042847, -0.0969037115573883, -0.030277015641331673, 0.11803948134183884, -0.003226958215236664, -0.01612006314098835, -0.00245272321626544, 0.010341696441173553, 0.022236421704292297, -0.09079059958457947, -0.08573047816753387, -0.0284055657684803, -0.038924701511859894, 0.12347909808158875, 0.04620923101902008, 0.052782073616981506, -0.03498956561088562, 0.08695493638515472, -0.11679168045520782, -0.03945431113243103, -0.05201595276594162, -0.08169156312942505, -0.018604140728712082, 0.011138025671243668, -0.026241138577461243, -0.07746803760528564, -0.05975053086876869, 0.11508218199014664, 0.03733748570084572, 0.03246647119522095, -0.0002122577279806137, 0.039555326104164124, 0.06943536549806595, 0.09297619760036469, -0.03793312609195709, 0.04838385060429573, 0.03305007889866829, -0.022177323698997498, 0.057182785123586655, -0.05058819055557251, -0.1013764962553978, 0.0775071382522583, -0.004312090575695038, 0.037528760731220245, 0.023276325315237045, 0.0358450748026371, -0.010922130197286606, -0.07019752264022827, 0.16603106260299683, -0.07965646684169769, -0.008557237684726715, -0.01755908690392971, 0.010992784053087234, 0.04631757736206055, 0.030993567779660225, -0.008448823355138302, -0.047467708587646484, -0.005513436160981655, -0.05652298033237457, -0.028966739773750305, -0.05815795809030533, -0.11872252821922302, -0.002617485821247101, -0.036577098071575165, -0.03212472051382065, -0.13836023211479187, -0.21607941389083862, -0.01968117244541645, 0.06428232789039612, -0.0036302083171904087, -0.009856455959379673, 0.023568743839859962, 0.013170870020985603, -0.020344063639640808, 0.011561919003725052, -0.04100359231233597, -0.0006095282733440399, -0.007070023566484451, -0.027988234534859657, 0.05964002013206482, -0.041216813027858734, 0.023657916113734245, -0.06821742653846741, 0.02156246453523636, -0.21128889918327332, 0.08847276866436005, -0.034560512751340866, 0.0021919161081314087, -0.03621149808168411, -0.04611894488334656, 
0.010077068582177162, 0.04639885947108269, -0.009939948096871376, 0.11821967363357544, -0.13571183383464813, -0.05029170960187912, 0.17874208092689514, -0.15818464756011963, -0.0005037561058998108, 0.09837644547224045, -0.048610854893922806, 0.055048223584890366, 0.13327516615390778, 0.10024915635585785, 0.08390368521213531, -0.07237463444471359, 0.007819046266376972, 0.06106899678707123, -0.06702303141355515, 0.05395419895648956, 0.08671766519546509, -0.024053530767560005, -0.13828922808170319, 0.030877243727445602, -0.07214109599590302, -0.011374684050679207, -0.025004616007208824, -0.020628944039344788, 0.007030559703707695, -0.03810237720608711, 0.029256191104650497, 0.0072492556646466255, 0.017757898196578026, -0.04149295762181282, -0.08123058825731277, 0.025495314970612526, 0.07452495396137238, -0.07082388550043106, 0.03963103145360947, -0.07129199802875519, 0.0618051141500473, -0.07745592296123505, -0.004289612174034119, -0.16637670993804932, -0.028378337621688843, 0.04467117041349411, -0.04680836200714111, 0.04979803040623665, 0.09034047275781631, 0.0044386666268110275, 0.12325669825077057, -0.04120738059282303, 0.0021315263584256172, -0.005891161039471626, -0.010890713892877102, -0.05048464983701706, -0.12101293355226517, -0.08196474611759186, -0.06753738224506378, 0.1012488603591919, -0.07497690618038177, 0.028737325221300125, -0.06457360833883286, -0.022156909108161926, -0.009252425283193588, -0.05718396231532097, -0.005455144681036472, 0.009688571095466614, -0.02685564197599888, -0.046804279088974, 0.04834771901369095, 0.048858366906642914, -0.05720151960849762, 0.07284918427467346, -0.1027744933962822, -0.06092561408877373, 0.05715123564004898, 0.017477191984653473, -0.08072192966938019, 0.09430000185966492, -0.019369015470147133, -0.013360121287405491, -0.05761926993727684, -0.0446784645318985, 0.19570308923721313, -0.025736887007951736, 0.09969419240951538, -0.09020891785621643, 0.002802303060889244, 0.02909376285970211, -0.042496927082538605, -0.014617152512073517, 0.059506528079509735, 0.051343731582164764, -0.18410632014274597, 0.01564168930053711, 0.05231468379497528, 0.07835446298122406, 0.11277944594621658, 0.02631646767258644, -0.023020975291728973, -0.04464677721261978, -0.01212994009256363, 0.004795161075890064, 0.058867380023002625, -0.02375001832842827, -0.008189576677978039, 0.030888989567756653, 0.057156410068273544, 0.017067907378077507, -0.08192764222621918, 0.03636026382446289, 0.06408903002738953, -0.0160357765853405, -0.045321546494960785, -0.02498375065624714, -0.060328688472509384, 0.06247750669717789, 0.053283482789993286, 0.03534199297428131, 0.02696782536804676, -0.015092432498931885, -0.13337905704975128, 0.18936339020729065, -0.11448708176612854, -0.25313228368759155, -0.10791341960430145, -0.0587717667222023, -0.025535976514220238, 0.04048328101634979, 0.05678355693817139, -0.03632539510726929, -0.04342763498425484, -0.11425525695085526, 0.059736646711826324, -0.06340662389993668, -0.029957054182887077, -0.01189405657351017, -0.05298972874879837, -0.019819727167487144, -0.1262054741382599, -0.011976705864071846, -0.03161598742008209, -0.07807949930429459, 0.008102325722575188, -0.034117069095373154, 0.029991313815116882, 0.1339142918586731, 0.03488900512456894, -0.019730595871806145, -0.016828568652272224, 0.1935170292854309, 0.008207837119698524, 0.06270457059144974, 0.11171400547027588, -0.02552792988717556, 0.05370446294546127, 0.048025984317064285, 0.02618565410375595, -0.04969531297683716, 0.012316694483160973, -0.014605493284761906, 
-0.12027588486671448, -0.17550456523895264, -0.06983143836259842, -0.0053448802791535854, 0.005205641966313124, 0.0187987070530653, 0.03515457734465599, 0.016765518113970757, 0.04108387604355812, -0.028433185070753098, 0.029780689626932144, -0.015384111553430557, 0.07924969494342804, 0.02977728843688965, -0.07430338114500046, 0.0939011499285698, -0.06232456490397453, 0.015421900898218155, 0.10892011970281601, -0.05827365815639496, 0.19449101388454437, 0.02498467266559601, 0.058166421949863434, 0.09971508383750916, 0.024297989904880524, 0.05529187619686127, 0.09058509021997452, -0.046137839555740356, 0.005290254019200802, -0.05989404022693634, -0.050972066819667816, -0.031552501022815704, 0.047897059470415115, 0.027210550382733345, 0.01826677843928337, -0.1205284520983696, 0.019140783697366714, -0.0030877122189849615, 0.13580290973186493, 0.048589956015348434, -0.12321606278419495, -0.12303684651851654, 0.03467020019888878, -0.04421517997980118, -0.062005460262298584, 0.030271384865045547, 0.06066365912556648, -0.1528562605381012, 0.047760650515556335, -0.00630541518330574, 0.06541333347558975, -0.0918864980340004, 0.017257723957300186, -0.04194576293230057, 0.0004881173372268677, 0.004284809343516827, 0.06983539462089539, -0.13196969032287598, 0.10502004623413086, 0.02184262126684189, 0.05071678385138512, -0.08074013888835907, 0.015641186386346817, -0.008919952437281609, 0.10806814581155777, 0.11649659276008606, 0.04313488304615021, -0.05771883949637413, -0.01577071100473404, -0.04492142051458359, 0.023192232474684715, 0.06036157160997391, -0.07705981284379959, 0.06243008375167847, 0.009777127765119076, 0.0062769996002316475, -0.02381090074777603, 0.012516431510448456, -0.13289116322994232, -0.12486208975315094, 0.060626015067100525, -0.07691457867622375, -0.09956441074609756, -0.05461086705327034, -0.0630185529589653, -0.0465845912694931, 0.21097305417060852, -0.11717881262302399, -0.09174293279647827, -0.09884088486433029, -0.015281789004802704, 0.04596306383609772, -0.06489014625549316, 0.04570560157299042, -0.035628654062747955, 0.09073690325021744, -0.04353250563144684, -0.11107294261455536, 0.03525041788816452, -0.11286903917789459, -0.11396848410367966, -0.04365003854036331, 0.10473577678203583, 0.11220117658376694, 0.0378594845533371, 0.011530302464962006, 0.013377712108194828, -0.001681014895439148, -0.11811665445566177, 0.0187919232994318, 0.1259440779685974, 0.0011940225958824158, 0.07273607701063156, -0.0625525712966919, 0.028554555028676987, -0.015067026019096375, 0.0006936099380254745, 0.12762665748596191, 0.18238335847854614, -0.06268566846847534, 0.17347748577594757, 0.20016466081142426, -0.10512395948171616, -0.19317063689231873, -0.05124184861779213, -0.0017583509907126427, 0.047059375792741776, 0.047251731157302856, -0.1885678768157959, 0.09324269741773605, 0.03200412914156914, -0.031157314777374268, 0.02485625445842743, -0.22762465476989746, -0.11043775081634521, 0.0848335474729538, 0.056976739317178726, 0.1927676796913147, -0.07999811321496964, -0.03936149552464485, -0.016428261995315552, -0.03961567580699921, 0.04298095777630806, -0.041495222598314285, 0.09070366621017456, 0.00442560575902462, -0.02283729612827301, 0.0013273870572447777, -0.030564341694116592, 0.09551648050546646, 0.042175404727458954, 0.024028146639466286, -0.07142315804958344, -0.00677064061164856, 0.11076973378658295, -0.037280887365341187, 0.09711961448192596, 0.038804277777671814, 0.07454784214496613, -0.0949782207608223, -0.058466359972953796, -0.07676981389522552, 0.04190995916724205, 
-0.04235316440463066, -0.0536453053355217, -0.06298048049211502, 0.05705936998128891, 0.03606995940208435, 0.010533129796385765, -0.003794504329562187, -0.03610538691282272, 0.04492193087935448, 0.08813710510730743, 0.08504749089479446, -0.03273173049092293, -0.07566359639167786, -0.0487307608127594, -0.048423439264297485, 0.06771598756313324, -0.09482666850090027, 0.016193319112062454, 0.02914869785308838, 0.011214588768780231, 0.08831736445426941, 0.036251384764909744, -0.13592180609703064, 0.011693641543388367, 0.03562410920858383, -0.12359756231307983, -0.11256563663482666, -0.01815902628004551, 0.02504584565758705, -0.03820106387138367, 0.05469292774796486, 0.14495785534381866, -0.035655029118061066, -0.031129207462072372, -0.04806441068649292, 0.03839125111699104, -0.01996593549847603, 0.052385129034519196, 0.06409672647714615, 0.030594710260629654, -0.07496849447488785, 0.07464071363210678, 0.03923802077770233, -0.040907274931669235, 0.041143715381622314, 0.04361877962946892, -0.09748373925685883, -0.07959024608135223, -0.059549566358327866, 0.09396618604660034, -0.02074371464550495, -0.04600539803504944, -0.0021719373762607574, -0.08357104659080505, 0.06826265901327133, 0.06743777543306351, 0.04650713875889778, 0.034607693552970886, -0.08641812950372696, 0.014946352690458298, -0.05557745322585106, 0.03498619794845581, -0.0328914076089859, -0.0033259354531764984, -0.05559246242046356, 0.058112554252147675, 0.06243850290775299, 0.09787417948246002, -0.034419022500514984, -0.07348421961069107, -0.08342987298965454, -0.013949171639978886, -0.06164424866437912, -0.03307081758975983, -0.07184410095214844, -0.006839532405138016, 0.0005638431757688522, -0.002389082685112953, 0.0212332122027874, 0.03763896971940994, -0.04503536969423294, -0.018934965133666992, -0.033032625913619995, 0.037898823618888855, -0.05986917391419411, 0.006985025480389595, 0.016839031130075455, -0.03700396418571472, 0.09196248650550842, 0.03700459003448486, -0.01222767774015665, 0.04808279126882553, -0.017191525548696518, 0.03193579241633415, -0.021210264414548874, 0.0015291464515030384, -0.02162722311913967, -0.1065700352191925, -0.005278495606034994, 0.003968404605984688, -0.022942394018173218, 0.010502705350518227, 0.059871383011341095, -0.07214201986789703, 0.08912616223096848, 0.045560285449028015, -0.02890361100435257, -0.07278197258710861, 0.042849257588386536, -0.014839038252830505, 0.028930921107530594, 0.07032907009124756, -0.03474714607000351, 0.05140186473727226, -0.09956104308366776, -0.029165104031562805, 0.003288185689598322, -0.007338043302297592, -0.009666364639997482, -0.05360446125268936, -0.0009837280958890915, 0.005454552359879017, 0.17203055322170258, -0.0236947163939476, 0.03532586991786957, 0.012929416261613369, 0.009611585177481174, 0.045861292630434036, -0.013468535616993904, 0.0725741982460022, -0.006111171096563339, -0.02693989686667919, -0.011961116455495358, 0.038186147809028625, 0.0023105479776859283, 0.007658947259187698, 0.139803946018219, 0.04462353140115738, 0.08993168920278549, 0.07474417984485626, 0.017883354797959328, 0.0158652625977993, -0.13293857872486115, -0.08859790116548538, 0.005395845510065556, 0.06099459156394005, -0.0187159962952137, 0.009098773822188377, 0.0901753157377243, -0.08577350527048111, 0.07216666638851166, 0.045535389333963394, -0.04921844229102135, -0.12444993108510971, -0.187366783618927, -0.024669703096151352, -0.031165948137640953, -0.009835095144808292, -0.09123414009809494, 0.016240248456597328, 0.09418095648288727, 0.02426489070057869, 
-0.010052639059722424, 0.09863805770874023, -0.10266290605068207, -0.03004133328795433, 0.04452969878911972, -0.02788766473531723, 0.01225319504737854, 0.05150590091943741, 0.025109048932790756, -0.006229519844055176, 0.04002910107374191, 0.039261411875486374, 0.04403398931026459, 0.022602256387472153, 0.05076715350151062, -0.02213631011545658, -0.0718059092760086, -0.033485401421785355, -0.0017102211713790894, 0.053216882050037384, 0.13630717992782593, 0.021496571600437164, -0.06991109251976013, 0.006086311768740416, 0.10847848653793335, -0.03381801396608353, -0.0466199554502964, -0.1065228059887886, 0.24363824725151062, 0.021611515432596207, 0.0013609230518341064, -0.004874262027442455, -0.04689636453986168, 0.005062613636255264, 0.21271614730358124, 0.22554552555084229, 0.004181613679975271, -0.010774790309369564, 0.009075271897017956, -0.011380620300769806, 0.038040194660425186, 0.1481425166130066, 0.00514691136777401, 0.25232046842575073, -0.04854375869035721, 0.0390867218375206, -0.04071890562772751, -0.03965326398611069, -0.10063283145427704, 0.07428474724292755, -0.00946778990328312, 0.006450098007917404, -0.0319579541683197, 0.07087874412536621, -0.03623513877391815, -0.17384302616119385, 0.004999152384698391, -0.0027471347711980343, -0.06367042660713196, 0.00962278712540865, -0.0009821197018027306, 0.02060307376086712, 0.08488886803388596, -0.01924091950058937, -0.006516195833683014, 0.13143444061279297, 0.017911147326231003, -0.09928236156702042, -0.05914602428674698, 0.11480783671140671, 0.010827858932316303, 0.13882611691951752, 0.01149564515799284, 0.08078904449939728, 0.08777961134910583, 0.02083776518702507, -0.09610334038734436, 0.04227955639362335, -0.019730491563677788, -0.03160626441240311, 0.007668616250157356, 0.11116310954093933, -0.009420796297490597, 0.055237118154764175, 0.02453746274113655, -0.08863019943237305, 0.060306333005428314, 0.014323890209197998, -0.034412406384944916, -0.08165319263935089, 0.08452698588371277, -0.09156570583581924, 0.15670286118984222, 0.12538018822669983, -0.01385580562055111, -0.04421215504407883, -0.02856442704796791, 0.019150059670209885, 0.000683861318975687, 0.05404819920659065, -0.026549823582172394, -0.13393530249595642, 0.01778123341500759, -0.08474285155534744, 0.026366300880908966, -0.252658873796463, -0.08762554824352264, 0.028421951457858086, -0.01798136904835701, -0.01972775161266327, 0.05173984169960022, 0.04845089092850685, 0.026540331542491913, -0.036172837018966675, 0.020293310284614563, -0.03831905126571655, 0.05952685698866844, -0.11039261519908905, -0.09423039108514786 ]
null
null
transformers
# MultiBERTs Seed 4 Checkpoint 60k (uncased)
Seed 4 intermediate checkpoint 60k MultiBERTs (pretrained BERT) model on the English language using a masked language modeling (MLM) objective. It was introduced in
[this paper](https://arxiv.org/pdf/2106.16163.pdf) and first released in
[this repository](https://github.com/google-research/language/tree/master/language/multiberts). This is an intermediate checkpoint.
The final checkpoint can be found at [multiberts-seed-4](https://hf.co/multiberts-seed-4). This model is uncased: it does not make a difference
between english and English.

Disclaimer: The team releasing MultiBERTs did not write a model card for this model so this model card has been written by [gchhablani](https://hf.co/gchhablani).

## Model description
MultiBERTs models are transformer models pretrained on a large corpus of English data in a self-supervised fashion. This means they
were pretrained on the raw texts only, with no humans labelling them in any way (which is why they can use lots of
publicly available data) with an automatic process to generate inputs and labels from those texts. More precisely, they
were pretrained with two objectives:
- Masked language modeling (MLM): taking a sentence, the model randomly masks 15% of the words in the input, then runs
  the entire masked sentence through the model and has to predict the masked words. This is different from traditional
  recurrent neural networks (RNNs) that usually see the words one after the other, or from autoregressive models like
  GPT, which internally mask the future tokens. It allows the model to learn a bidirectional representation of the
  sentence.
- Next sentence prediction (NSP): the model concatenates two masked sentences as inputs during pretraining. Sometimes
  they correspond to sentences that were next to each other in the original text, sometimes not. The model then has to
  predict whether the two sentences were following each other or not.
This way, the model learns an inner representation of the English language that can then be used to extract features
useful for downstream tasks: if you have a dataset of labeled sentences, for instance, you can train a standard
classifier using the features produced by the MultiBERTs model as inputs.

## Intended uses & limitations
You can use the raw model for either masked language modeling or next sentence prediction, but it's mostly intended to
be fine-tuned on a downstream task. See the [model hub](https://huggingface.co/models?filter=multiberts) to look for
fine-tuned versions on a task that interests you.
Note that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked)
to make decisions, such as sequence classification, token classification or question answering. For tasks such as text
generation you should look at models like GPT2.

### How to use
Here is how to use this model to get the features of a given text in PyTorch:
```python
from transformers import BertTokenizer, BertModel
tokenizer = BertTokenizer.from_pretrained('multiberts-seed-4-60k')
model = BertModel.from_pretrained("multiberts-seed-4-60k")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='pt')
output = model(**encoded_input)
```

### Limitations and bias
Even if the training data used for this model could be characterized as fairly neutral, this model can have biased
predictions. This bias will also affect all fine-tuned versions of this model.
For an understanding of the bias of this particular checkpoint, please try out this checkpoint with the snippet present in the [Limitations and bias section](https://huggingface.co/bert-base-uncased#limitations-and-bias) of the [bert-base-uncased](https://huggingface.co/bert-base-uncased) checkpoint.

## Training data
The MultiBERTs models were pretrained on [BookCorpus](https://yknzhu.wixsite.com/mbweb), a dataset consisting of 11,038
unpublished books and [English Wikipedia](https://en.wikipedia.org/wiki/English_Wikipedia) (excluding lists, tables and
headers).

## Training procedure

### Preprocessing
The texts are lowercased and tokenized using WordPiece and a vocabulary size of 30,000. The inputs of the model are
then of the form:
```
[CLS] Sentence A [SEP] Sentence B [SEP]
```
With probability 0.5, sentence A and sentence B correspond to two consecutive sentences in the original corpus, and in
the other cases, it's another random sentence in the corpus. Note that what is considered a sentence here is a
consecutive span of text usually longer than a single sentence. The only constraint is that the result with the two
"sentences" has a combined length of less than 512 tokens.
The details of the masking procedure for each sentence are the following:
- 15% of the tokens are masked.
- In 80% of the cases, the masked tokens are replaced by `[MASK]`.
- In 10% of the cases, the masked tokens are replaced by a random token (different) from the one they replace.
- In the 10% remaining cases, the masked tokens are left as is.

### Pretraining
The full model was trained on 16 Cloud TPU v2 chips for two million steps with a batch size
of 256. The sequence length was set to 512 throughout. The optimizer
used is Adam with a learning rate of 1e-4, \\(\beta_{1} = 0.9\\) and \\(\beta_{2} = 0.999\\), a weight decay of 0.01,
learning rate warmup for 10,000 steps and linear decay of the learning rate after.

### BibTeX entry and citation info
```bibtex
@article{DBLP:journals/corr/abs-2106-16163,
  author    = {Thibault Sellam and
               Steve Yadlowsky and
               Jason Wei and
               Naomi Saphra and
               Alexander D'Amour and
               Tal Linzen and
               Jasmijn Bastings and
               Iulia Turc and
               Jacob Eisenstein and
               Dipanjan Das and
               Ian Tenney and
               Ellie Pavlick},
  title     = {The MultiBERTs: {BERT} Reproductions for Robustness Analysis},
  journal   = {CoRR},
  volume    = {abs/2106.16163},
  year      = {2021},
  url       = {https://arxiv.org/abs/2106.16163},
  eprinttype = {arXiv},
  eprint    = {2106.16163},
  timestamp = {Mon, 05 Jul 2021 15:15:50 +0200},
  biburl    = {https://dblp.org/rec/journals/corr/abs-2106-16163.bib},
  bibsource = {dblp computer science bibliography, https://dblp.org}
}
```
<a href="https://huggingface.co/exbert/?model=multiberts">
	<img width="300px" src="https://cdn-media.huggingface.co/exbert/button.png">
</a>
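The 80/10/10 masking scheme described in the preprocessing section can be made concrete with a short sketch. This is only an illustration of the procedure described above, not the original MultiBERTs data pipeline; the helper name `mask_tokens` and the loading of the `multiberts-seed-4-60k` tokenizer are assumptions made for the example.

```python
import torch
from transformers import BertTokenizer

def mask_tokens(input_ids: torch.Tensor, tokenizer: BertTokenizer, mlm_probability: float = 0.15):
    """Illustrative sketch of the 80/10/10 masking scheme described above.

    Not the original MultiBERTs pipeline: 15% of (non-special) tokens are selected;
    of those, 80% become [MASK], 10% become a random token, 10% are left as is.
    """
    labels = input_ids.clone()

    # Select 15% of the non-special tokens for masking.
    probability_matrix = torch.full(labels.shape, mlm_probability)
    special_tokens_mask = torch.tensor(
        tokenizer.get_special_tokens_mask(labels.tolist(), already_has_special_tokens=True),
        dtype=torch.bool,
    )
    probability_matrix.masked_fill_(special_tokens_mask, value=0.0)
    masked_indices = torch.bernoulli(probability_matrix).bool()
    labels[~masked_indices] = -100  # loss is only computed on the selected positions

    # 80% of the selected tokens are replaced by [MASK].
    indices_replaced = torch.bernoulli(torch.full(labels.shape, 0.8)).bool() & masked_indices
    input_ids[indices_replaced] = tokenizer.mask_token_id

    # 10% of the selected tokens are replaced by a random token.
    indices_random = (
        torch.bernoulli(torch.full(labels.shape, 0.5)).bool() & masked_indices & ~indices_replaced
    )
    random_words = torch.randint(len(tokenizer), labels.shape, dtype=torch.long)
    input_ids[indices_random] = random_words[indices_random]

    # The remaining 10% of the selected tokens are left unchanged.
    return input_ids, labels

tokenizer = BertTokenizer.from_pretrained('multiberts-seed-4-60k')
encoded_input = tokenizer("Replace me by any text you'd like.", return_tensors='pt')
masked_ids, labels = mask_tokens(encoded_input['input_ids'][0], tokenizer)
```

The returned `labels` tensor keeps the original ids only at masked positions and uses `-100` elsewhere, which most PyTorch MLM loss setups treat as "ignore".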
{"language": "en", "license": "apache-2.0", "tags": ["exbert", "multiberts", "multiberts-seed-4"], "datasets": ["bookcorpus", "wikipedia"]}
null
MultiBertGunjanPatrick/multiberts-seed-4-60k
[ "transformers", "pytorch", "bert", "pretraining", "exbert", "multiberts", "multiberts-seed-4", "en", "dataset:bookcorpus", "dataset:wikipedia", "arxiv:2106.16163", "license:apache-2.0", "endpoints_compatible", "region:us" ]
2022-03-02T23:29:04+00:00
[ "2106.16163" ]
[ "en" ]
TAGS #transformers #pytorch #bert #pretraining #exbert #multiberts #multiberts-seed-4 #en #dataset-bookcorpus #dataset-wikipedia #arxiv-2106.16163 #license-apache-2.0 #endpoints_compatible #region-us
# MultiBERTs Seed 4 Checkpoint 60k (uncased) Seed 4 intermediate checkpoint 60k MultiBERTs (pretrained BERT) model on English language using a masked language modeling (MLM) objective. It was introduced in this paper and first released in this repository. This is an intermediate checkpoint. The final checkpoint can be found at multiberts-seed-4. This model is uncased: it does not make a difference between english and English. Disclaimer: The team releasing MultiBERTs did not write a model card for this model so this model card has been written by gchhablani. ## Model description MultiBERTs models are transformers model pretrained on a large corpus of English data in a self-supervised fashion. This means it was pretrained on the raw texts only, with no humans labelling them in any way (which is why it can use lots of publicly available data) with an automatic process to generate inputs and labels from those texts. More precisely, it was pretrained with two objectives: - Masked language modeling (MLM): taking a sentence, the model randomly masks 15% of the words in the input then run the entire masked sentence through the model and has to predict the masked words. This is different from traditional recurrent neural networks (RNNs) that usually see the words one after the other, or from autoregressive models like GPT which internally mask the future tokens. It allows the model to learn a bidirectional representation of the sentence. - Next sentence prediction (NSP): the models concatenates two masked sentences as inputs during pretraining. Sometimes they correspond to sentences that were next to each other in the original text, sometimes not. The model then has to predict if the two sentences were following each other or not. This way, the model learns an inner representation of the English language that can then be used to extract features useful for downstream tasks: if you have a dataset of labeled sentences for instance, you can train a standard classifier using the features produced by the MultiBERTs model as inputs. ## Intended uses & limitations You can use the raw model for either masked language modeling or next sentence prediction, but it's mostly intended to be fine-tuned on a downstream task. See the model hub to look for fine-tuned versions on a task that interests you. Note that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked) to make decisions, such as sequence classification, token classification or question answering. For tasks such as text generation you should look at model like GPT2. ### How to use Here is how to use this model to get the features of a given text in PyTorch: ### Limitations and bias Even if the training data used for this model could be characterized as fairly neutral, this model can have biased predictions. This bias will also affect all fine-tuned versions of this model. For an understanding of bias of this particular checkpoint, please try out this checkpoint with the snippet present in the Limitation and bias section of the bert-base-uncased checkpoint. ## Training data The MultiBERTs models were pretrained on BookCorpus, a dataset consisting of 11,038 unpublished books and English Wikipedia (excluding lists, tables and headers). ## Training procedure ### Preprocessing The texts are lowercased and tokenized using WordPiece and a vocabulary size of 30,000. 
The inputs of the model are then of the form: With probability 0.5, sentence A and sentence B correspond to two consecutive sentences in the original corpus and in the other cases, it's another random sentence in the corpus. Note that what is considered a sentence here is a consecutive span of text usually longer than a single sentence. The only constrain is that the result with the two "sentences" has a combined length of less than 512 tokens. The details of the masking procedure for each sentence are the following: - 15% of the tokens are masked. - In 80% of the cases, the masked tokens are replaced by '[MASK]'. - In 10% of the cases, the masked tokens are replaced by a random token (different) from the one they replace. - In the 10% remaining cases, the masked tokens are left as is. ### Pretraining The full model was trained on 16 Cloud TPU v2 chips for two million steps with a batch size of 256. The sequence length was set to 512 throughout. The optimizer used is Adam with a learning rate of 1e-4, \\(\beta_{1} = 0.9\\) and \\(\beta_{2} = 0.999\\), a weight decay of 0.01, learning rate warmup for 10,000 steps and linear decay of the learning rate after. ### BibTeX entry and citation info <a href="URL <img width="300px" src="URL </a>
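The pretraining recipe described above (Adam, learning rate 1e-4, \\(\beta_{1} = 0.9\\), \\(\beta_{2} = 0.999\\), weight decay 0.01, 10,000 warmup steps, linear decay over two million steps) can be written down as a short configuration sketch. This is a hypothetical reconstruction for illustration only, not the actual MultiBERTs training code; in particular, reading the weight decay as decoupled (AdamW), the schedule helper `lr_lambda`, and the use of this checkpoint name are assumptions.

```python
import torch
from transformers import BertModel

total_steps = 2_000_000   # two million training steps
warmup_steps = 10_000     # linear learning-rate warmup

model = BertModel.from_pretrained("multiberts-seed-4-60k")

# "Adam ... weight decay of 0.01" is read here as decoupled weight decay
# (AdamW); swap in torch.optim.Adam if plain L2 regularization is intended.
optimizer = torch.optim.AdamW(
    model.parameters(), lr=1e-4, betas=(0.9, 0.999), weight_decay=0.01
)

def lr_lambda(step: int) -> float:
    """Linear warmup for 10,000 steps, then linear decay of the rate to zero."""
    if step < warmup_steps:
        return step / max(1, warmup_steps)
    return max(0.0, (total_steps - step) / max(1, total_steps - warmup_steps))

scheduler = torch.optim.lr_scheduler.LambdaLR(optimizer, lr_lambda)

# In a training loop, optimizer.step() would be followed by scheduler.step().
```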
[ "# MultiBERTs Seed 4 Checkpoint 60k (uncased)\nSeed 4 intermediate checkpoint 60k MultiBERTs (pretrained BERT) model on English language using a masked language modeling (MLM) objective. It was introduced in\nthis paper and first released in\nthis repository. This is an intermediate checkpoint.\nThe final checkpoint can be found at multiberts-seed-4. This model is uncased: it does not make a difference\nbetween english and English.\n\nDisclaimer: The team releasing MultiBERTs did not write a model card for this model so this model card has been written by gchhablani.", "## Model description\nMultiBERTs models are transformers model pretrained on a large corpus of English data in a self-supervised fashion. This means it\nwas pretrained on the raw texts only, with no humans labelling them in any way (which is why it can use lots of\npublicly available data) with an automatic process to generate inputs and labels from those texts. More precisely, it\nwas pretrained with two objectives:\n- Masked language modeling (MLM): taking a sentence, the model randomly masks 15% of the words in the input then run\n the entire masked sentence through the model and has to predict the masked words. This is different from traditional\n recurrent neural networks (RNNs) that usually see the words one after the other, or from autoregressive models like\n GPT which internally mask the future tokens. It allows the model to learn a bidirectional representation of the\n sentence.\n- Next sentence prediction (NSP): the models concatenates two masked sentences as inputs during pretraining. Sometimes\n they correspond to sentences that were next to each other in the original text, sometimes not. The model then has to\n predict if the two sentences were following each other or not.\nThis way, the model learns an inner representation of the English language that can then be used to extract features\nuseful for downstream tasks: if you have a dataset of labeled sentences for instance, you can train a standard\nclassifier using the features produced by the MultiBERTs model as inputs.", "## Intended uses & limitations\nYou can use the raw model for either masked language modeling or next sentence prediction, but it's mostly intended to\nbe fine-tuned on a downstream task. See the model hub to look for\nfine-tuned versions on a task that interests you.\nNote that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked)\nto make decisions, such as sequence classification, token classification or question answering. For tasks such as text\ngeneration you should look at model like GPT2.", "### How to use\nHere is how to use this model to get the features of a given text in PyTorch:", "### Limitations and bias\nEven if the training data used for this model could be characterized as fairly neutral, this model can have biased\npredictions. This bias will also affect all fine-tuned versions of this model. For an understanding of bias of this particular\ncheckpoint, please try out this checkpoint with the snippet present in the Limitation and bias section of the bert-base-uncased checkpoint.", "## Training data\nThe MultiBERTs models were pretrained on BookCorpus, a dataset consisting of 11,038\nunpublished books and English Wikipedia (excluding lists, tables and\nheaders).", "## Training procedure", "### Preprocessing\nThe texts are lowercased and tokenized using WordPiece and a vocabulary size of 30,000. 
The inputs of the model are\nthen of the form:\n\nWith probability 0.5, sentence A and sentence B correspond to two consecutive sentences in the original corpus and in\nthe other cases, it's another random sentence in the corpus. Note that what is considered a sentence here is a\nconsecutive span of text usually longer than a single sentence. The only constrain is that the result with the two\n\"sentences\" has a combined length of less than 512 tokens.\nThe details of the masking procedure for each sentence are the following:\n- 15% of the tokens are masked.\n- In 80% of the cases, the masked tokens are replaced by '[MASK]'.\n- In 10% of the cases, the masked tokens are replaced by a random token (different) from the one they replace.\n- In the 10% remaining cases, the masked tokens are left as is.", "### Pretraining\nThe full model was trained on 16 Cloud TPU v2 chips for two million steps with a batch size\nof 256. The sequence length was set to 512 throughout. The optimizer\nused is Adam with a learning rate of 1e-4, \\\\(\\beta_{1} = 0.9\\\\) and \\\\(\\beta_{2} = 0.999\\\\), a weight decay of 0.01,\nlearning rate warmup for 10,000 steps and linear decay of the learning rate after.", "### BibTeX entry and citation info\n\n<a href=\"URL\n\t<img width=\"300px\" src=\"URL\n</a>" ]
[ "TAGS\n#transformers #pytorch #bert #pretraining #exbert #multiberts #multiberts-seed-4 #en #dataset-bookcorpus #dataset-wikipedia #arxiv-2106.16163 #license-apache-2.0 #endpoints_compatible #region-us \n", "# MultiBERTs Seed 4 Checkpoint 60k (uncased)\nSeed 4 intermediate checkpoint 60k MultiBERTs (pretrained BERT) model on English language using a masked language modeling (MLM) objective. It was introduced in\nthis paper and first released in\nthis repository. This is an intermediate checkpoint.\nThe final checkpoint can be found at multiberts-seed-4. This model is uncased: it does not make a difference\nbetween english and English.\n\nDisclaimer: The team releasing MultiBERTs did not write a model card for this model so this model card has been written by gchhablani.", "## Model description\nMultiBERTs models are transformers model pretrained on a large corpus of English data in a self-supervised fashion. This means it\nwas pretrained on the raw texts only, with no humans labelling them in any way (which is why it can use lots of\npublicly available data) with an automatic process to generate inputs and labels from those texts. More precisely, it\nwas pretrained with two objectives:\n- Masked language modeling (MLM): taking a sentence, the model randomly masks 15% of the words in the input then run\n the entire masked sentence through the model and has to predict the masked words. This is different from traditional\n recurrent neural networks (RNNs) that usually see the words one after the other, or from autoregressive models like\n GPT which internally mask the future tokens. It allows the model to learn a bidirectional representation of the\n sentence.\n- Next sentence prediction (NSP): the models concatenates two masked sentences as inputs during pretraining. Sometimes\n they correspond to sentences that were next to each other in the original text, sometimes not. The model then has to\n predict if the two sentences were following each other or not.\nThis way, the model learns an inner representation of the English language that can then be used to extract features\nuseful for downstream tasks: if you have a dataset of labeled sentences for instance, you can train a standard\nclassifier using the features produced by the MultiBERTs model as inputs.", "## Intended uses & limitations\nYou can use the raw model for either masked language modeling or next sentence prediction, but it's mostly intended to\nbe fine-tuned on a downstream task. See the model hub to look for\nfine-tuned versions on a task that interests you.\nNote that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked)\nto make decisions, such as sequence classification, token classification or question answering. For tasks such as text\ngeneration you should look at model like GPT2.", "### How to use\nHere is how to use this model to get the features of a given text in PyTorch:", "### Limitations and bias\nEven if the training data used for this model could be characterized as fairly neutral, this model can have biased\npredictions. This bias will also affect all fine-tuned versions of this model. 
For an understanding of bias of this particular\ncheckpoint, please try out this checkpoint with the snippet present in the Limitation and bias section of the bert-base-uncased checkpoint.", "## Training data\nThe MultiBERTs models were pretrained on BookCorpus, a dataset consisting of 11,038\nunpublished books and English Wikipedia (excluding lists, tables and\nheaders).", "## Training procedure", "### Preprocessing\nThe texts are lowercased and tokenized using WordPiece and a vocabulary size of 30,000. The inputs of the model are\nthen of the form:\n\nWith probability 0.5, sentence A and sentence B correspond to two consecutive sentences in the original corpus and in\nthe other cases, it's another random sentence in the corpus. Note that what is considered a sentence here is a\nconsecutive span of text usually longer than a single sentence. The only constrain is that the result with the two\n\"sentences\" has a combined length of less than 512 tokens.\nThe details of the masking procedure for each sentence are the following:\n- 15% of the tokens are masked.\n- In 80% of the cases, the masked tokens are replaced by '[MASK]'.\n- In 10% of the cases, the masked tokens are replaced by a random token (different) from the one they replace.\n- In the 10% remaining cases, the masked tokens are left as is.", "### Pretraining\nThe full model was trained on 16 Cloud TPU v2 chips for two million steps with a batch size\nof 256. The sequence length was set to 512 throughout. The optimizer\nused is Adam with a learning rate of 1e-4, \\\\(\\beta_{1} = 0.9\\\\) and \\\\(\\beta_{2} = 0.999\\\\), a weight decay of 0.01,\nlearning rate warmup for 10,000 steps and linear decay of the learning rate after.", "### BibTeX entry and citation info\n\n<a href=\"URL\n\t<img width=\"300px\" src=\"URL\n</a>" ]
[ 71, 145, 335, 134, 25, 95, 48, 3, 222, 111, 34 ]
[ "passage: TAGS\n#transformers #pytorch #bert #pretraining #exbert #multiberts #multiberts-seed-4 #en #dataset-bookcorpus #dataset-wikipedia #arxiv-2106.16163 #license-apache-2.0 #endpoints_compatible #region-us \n# MultiBERTs Seed 4 Checkpoint 60k (uncased)\nSeed 4 intermediate checkpoint 60k MultiBERTs (pretrained BERT) model on English language using a masked language modeling (MLM) objective. It was introduced in\nthis paper and first released in\nthis repository. This is an intermediate checkpoint.\nThe final checkpoint can be found at multiberts-seed-4. This model is uncased: it does not make a difference\nbetween english and English.\n\nDisclaimer: The team releasing MultiBERTs did not write a model card for this model so this model card has been written by gchhablani.", "passage: ## Model description\nMultiBERTs models are transformers model pretrained on a large corpus of English data in a self-supervised fashion. This means it\nwas pretrained on the raw texts only, with no humans labelling them in any way (which is why it can use lots of\npublicly available data) with an automatic process to generate inputs and labels from those texts. More precisely, it\nwas pretrained with two objectives:\n- Masked language modeling (MLM): taking a sentence, the model randomly masks 15% of the words in the input then run\n the entire masked sentence through the model and has to predict the masked words. This is different from traditional\n recurrent neural networks (RNNs) that usually see the words one after the other, or from autoregressive models like\n GPT which internally mask the future tokens. It allows the model to learn a bidirectional representation of the\n sentence.\n- Next sentence prediction (NSP): the models concatenates two masked sentences as inputs during pretraining. Sometimes\n they correspond to sentences that were next to each other in the original text, sometimes not. The model then has to\n predict if the two sentences were following each other or not.\nThis way, the model learns an inner representation of the English language that can then be used to extract features\nuseful for downstream tasks: if you have a dataset of labeled sentences for instance, you can train a standard\nclassifier using the features produced by the MultiBERTs model as inputs.## Intended uses & limitations\nYou can use the raw model for either masked language modeling or next sentence prediction, but it's mostly intended to\nbe fine-tuned on a downstream task. See the model hub to look for\nfine-tuned versions on a task that interests you.\nNote that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked)\nto make decisions, such as sequence classification, token classification or question answering. For tasks such as text\ngeneration you should look at model like GPT2.### How to use\nHere is how to use this model to get the features of a given text in PyTorch:### Limitations and bias\nEven if the training data used for this model could be characterized as fairly neutral, this model can have biased\npredictions. This bias will also affect all fine-tuned versions of this model. 
For an understanding of bias of this particular\ncheckpoint, please try out this checkpoint with the snippet present in the Limitation and bias section of the bert-base-uncased checkpoint.## Training data\nThe MultiBERTs models were pretrained on BookCorpus, a dataset consisting of 11,038\nunpublished books and English Wikipedia (excluding lists, tables and\nheaders).## Training procedure" ]
[ -0.08631248772144318, 0.0015674976166337729, -0.0021324949339032173, 0.0696682259440422, 0.0868043452501297, 0.002936083357781172, 0.11933396756649017, 0.04975765570998192, -0.02810121700167656, 0.02535100467503071, 0.09182178974151611, 0.030829764902591705, 0.041656866669654846, 0.06482625007629395, 0.09567749500274658, -0.258633017539978, 0.04722068831324577, -0.06312045454978943, 0.055147863924503326, 0.07555094361305237, 0.09733632206916809, -0.07286273688077927, 0.06448864936828613, 0.03430434316396713, -0.08457264304161072, -0.014563903212547302, -0.016597788780927658, -0.034149929881095886, 0.10145896673202515, 0.06842312216758728, 0.061378248035907745, 0.0012605320662260056, 0.058896638453006744, -0.08674798160791397, 0.016495339572429657, 0.044225603342056274, -0.0027931942604482174, 0.025441192090511322, -0.007430952042341232, 0.015879154205322266, 0.10651139914989471, 0.033890340477228165, 0.07509887963533401, 0.03464236482977867, -0.0959647074341774, -0.11857002228498459, -0.08033096790313721, 0.0972818061709404, 0.05153815448284149, 0.0436752550303936, -0.007341103628277779, 0.07354672253131866, -0.03247715160250664, 0.07380564510822296, 0.10413379967212677, -0.2601756155490875, -0.007842497900128365, 0.06716232746839523, 0.04442364722490311, 0.04439815878868103, 0.008989788591861725, 0.027033433318138123, 0.006279516965150833, 0.04374939948320389, 0.027222927659749985, -0.02395164966583252, 0.1257902979850769, -0.042441897094249725, -0.14954309165477753, -0.04132650047540665, 0.12380324304103851, -0.0046750400215387344, -0.12483013421297073, -0.09991629421710968, -0.03131368011236191, 0.1182415783405304, -0.004233207553625107, -0.0157350804656744, -0.002464995253831148, 0.010442951694130898, 0.02189536578953266, -0.08985826373100281, -0.08557511866092682, -0.02714741975069046, -0.03677588328719139, 0.1239374503493309, 0.04671521484851837, 0.051388464868068695, -0.035267770290374756, 0.08600295335054398, -0.11933551728725433, -0.03994797170162201, -0.05061899125576019, -0.08173027634620667, -0.01960655115544796, 0.011018701829016209, -0.02798338420689106, -0.0773349404335022, -0.05927664786577225, 0.12246185541152954, 0.03189082443714142, 0.03236069157719612, -0.0007823226042091846, 0.039409488439559937, 0.07176077365875244, 0.0927557498216629, -0.0388234406709671, 0.04653540253639221, 0.03220144659280777, -0.024594146758317947, 0.05787528678774834, -0.051171984523534775, -0.10188768804073334, 0.077230304479599, -0.0028633326292037964, 0.036983516067266464, 0.022132735699415207, 0.036505844444036484, -0.010159611701965332, -0.07117244601249695, 0.16553394496440887, -0.0796603411436081, -0.009167050942778587, -0.015319225378334522, 0.010718634352087975, 0.046468257904052734, 0.030179115012288094, -0.008145727217197418, -0.0481257401406765, -0.007531759329140186, -0.05613182857632637, -0.028879590332508087, -0.05755048990249634, -0.12012477219104767, -0.0014086607843637466, -0.040493957698345184, -0.03180639445781708, -0.1376458704471588, -0.21658575534820557, -0.020795829594135284, 0.06353329122066498, -0.0029442934319376945, -0.009491088800132275, 0.023538274690508842, 0.013793861493468285, -0.019899755716323853, 0.011294127441942692, -0.04193732142448425, -0.0013106223195791245, -0.007387441582977772, -0.027401328086853027, 0.059109508991241455, -0.04087525233626366, 0.023649146780371666, -0.06786900758743286, 0.02151503972709179, -0.21056193113327026, 0.08841319382190704, -0.03414366766810417, 0.001397930085659027, -0.03728348761796951, -0.04496930167078972, 
0.010325701907277107, 0.04653318226337433, -0.009964710101485252, 0.11706043779850006, -0.13640646636486053, -0.051982421427965164, 0.18192331492900848, -0.15738794207572937, 0.00020421668887138367, 0.09959553182125092, -0.04980569705367088, 0.054152972996234894, 0.13323408365249634, 0.10064556449651718, 0.08134455978870392, -0.07323004305362701, 0.00854122918099165, 0.06121346354484558, -0.06629291921854019, 0.05591713637113571, 0.08863794803619385, -0.023682283237576485, -0.1343463659286499, 0.030194584280252457, -0.0727614015340805, -0.011171839199960232, -0.024929916486144066, -0.020270977169275284, 0.007609045132994652, -0.03680960834026337, 0.029498472809791565, 0.007499456871300936, 0.018442269414663315, -0.04142797738313675, -0.08352167904376984, 0.02750195749104023, 0.07468199729919434, -0.07223720103502274, 0.03966249153017998, -0.07267113029956818, 0.062316928058862686, -0.07524188607931137, -0.004266931209713221, -0.16767725348472595, -0.0274057500064373, 0.04420001804828644, -0.04506783187389374, 0.05032355710864067, 0.09384405612945557, 0.005532332696020603, 0.12374739348888397, -0.039498038589954376, 0.002501121023669839, -0.007057694718241692, -0.011401452124118805, -0.05070561170578003, -0.12427522242069244, -0.08131399750709534, -0.06825052201747894, 0.10328810662031174, -0.0767001360654831, 0.028903063386678696, -0.0663096010684967, -0.020708328112959862, -0.009226225316524506, -0.05753595381975174, -0.004320894367992878, 0.009384771808981895, -0.027758080512285233, -0.046660833060741425, 0.049394674599170685, 0.04893600940704346, -0.057491451501846313, 0.07582756131887436, -0.10803733766078949, -0.06146496906876564, 0.056924812495708466, 0.013234853744506836, -0.08019181340932846, 0.0913098156452179, -0.020207781344652176, -0.013464093208312988, -0.056236281991004944, -0.04304397106170654, 0.1939285397529602, -0.025554262101650238, 0.1019793450832367, -0.08993586897850037, 0.001007532118819654, 0.02841254137456417, -0.043402813374996185, -0.013749276287853718, 0.061073463410139084, 0.04961889609694481, -0.1826881617307663, 0.016808945685625076, 0.0524149015545845, 0.07795464992523193, 0.11439409852027893, 0.025132007896900177, -0.025285247713327408, -0.04628206416964531, -0.01201433315873146, 0.004714501090347767, 0.05832291394472122, -0.02636035531759262, -0.01019767764955759, 0.03219098597764969, 0.05618952214717865, 0.01714426651597023, -0.0813930332660675, 0.036146681755781174, 0.06407959759235382, -0.016016211360692978, -0.04371681809425354, -0.024905391037464142, -0.06028643995523453, 0.06153378635644913, 0.052901819348335266, 0.03465630114078522, 0.026567166671156883, -0.015377122908830643, -0.13417616486549377, 0.18911708891391754, -0.1140388697385788, -0.25327566266059875, -0.10759091377258301, -0.06133881211280823, -0.02469170279800892, 0.04037116840481758, 0.05696962773799896, -0.03435079753398895, -0.04367414116859436, -0.11403727531433105, 0.058351047337055206, -0.06546328961849213, -0.030556881800293922, -0.01186596229672432, -0.05224951356649399, -0.019294431433081627, -0.1260262131690979, -0.012055972591042519, -0.03141012042760849, -0.07557742297649384, 0.009186811745166779, -0.03438493609428406, 0.028423409909009933, 0.13380560278892517, 0.03563285246491432, -0.018659453839063644, -0.016958609223365784, 0.19413061439990997, 0.007798878476023674, 0.06160653382539749, 0.11303402483463287, -0.027043845504522324, 0.053647518157958984, 0.04546882212162018, 0.025837330147624016, -0.049544673413038254, 0.011569204740226269, -0.012662469409406185, 
-0.11954580992460251, -0.176514133810997, -0.06945237517356873, -0.005105546675622463, 0.006846823263913393, 0.01964850164949894, 0.03528345003724098, 0.021755602210760117, 0.03969471529126167, -0.029741402715444565, 0.03144245222210884, -0.01496908813714981, 0.0804060772061348, 0.032485418021678925, -0.07538862526416779, 0.09380413591861725, -0.06187057122588158, 0.014511790126562119, 0.10880333930253983, -0.0581439733505249, 0.19042813777923584, 0.027149958536028862, 0.06460174918174744, 0.09981866925954819, 0.02224213257431984, 0.05474090948700905, 0.088096484541893, -0.04590413719415665, 0.0049938028678298, -0.06138946861028671, -0.052265964448451996, -0.03219335526227951, 0.04918062686920166, 0.027138706296682358, 0.015307217836380005, -0.11877185106277466, 0.01829967275261879, -0.0024789210874587297, 0.13499779999256134, 0.04925836622714996, -0.1238866001367569, -0.12393887341022491, 0.03381472826004028, -0.04430248588323593, -0.06139960139989853, 0.029641076922416687, 0.05920206755399704, -0.15253230929374695, 0.04734114557504654, -0.006554892286658287, 0.06653361022472382, -0.09326007962226868, 0.017537176609039307, -0.045559171587228775, 0.0013777781277894974, 0.003999684937298298, 0.07050533592700958, -0.13406571745872498, 0.10269523411989212, 0.02215799316763878, 0.049558721482753754, -0.0804966539144516, 0.015870351344347, -0.008657676167786121, 0.10913708060979843, 0.11564429104328156, 0.04295855760574341, -0.06069912388920784, -0.01461637020111084, -0.045726824551820755, 0.02245941571891308, 0.06129191070795059, -0.07678179442882538, 0.06329601258039474, 0.008830061182379723, 0.006120369769632816, -0.023038122802972794, 0.01355266384780407, -0.13305404782295227, -0.12333187460899353, 0.06114208325743675, -0.07625576853752136, -0.09924900531768799, -0.05485682934522629, -0.06323473900556564, -0.05146902799606323, 0.21298563480377197, -0.1174616664648056, -0.0898255854845047, -0.09946434944868088, -0.013334661722183228, 0.04479281231760979, -0.06488187611103058, 0.04458140209317207, -0.03658684343099594, 0.09311018884181976, -0.04559590294957161, -0.11061777174472809, 0.035873278975486755, -0.11326104402542114, -0.11466740071773529, -0.04401407018303871, 0.1058952659368515, 0.11332192271947861, 0.038117486983537674, 0.01090253610163927, 0.0130967628210783, -0.0038241222500801086, -0.1171148419380188, 0.019708706066012383, 0.1275523453950882, 0.00227566622197628, 0.07070371508598328, -0.06169244647026062, 0.029082849621772766, -0.01483510434627533, 0.0015174932777881622, 0.12991932034492493, 0.1829589456319809, -0.06265643239021301, 0.1736564040184021, 0.20015546679496765, -0.10405661165714264, -0.19167959690093994, -0.05255119875073433, -0.0022593149915337563, 0.04710879921913147, 0.04768830165266991, -0.1900281310081482, 0.09191825985908508, 0.030919933691620827, -0.030958032235503197, 0.02314191684126854, -0.2279021292924881, -0.11002561450004578, 0.08735635876655579, 0.05516984686255455, 0.1918359100818634, -0.08081123977899551, -0.040076546370983124, -0.01660483330488205, -0.037414662539958954, 0.04593700170516968, -0.04392527416348457, 0.09122022986412048, 0.005486341193318367, -0.02407076396048069, 0.0013003693893551826, -0.030913710594177246, 0.0947660505771637, 0.042876534163951874, 0.024205269291996956, -0.07101483643054962, -0.005058974027633667, 0.11339817196130753, -0.03798761963844299, 0.09739810973405838, 0.0383647158741951, 0.07404471933841705, -0.09815524518489838, -0.058761224150657654, -0.07634249329566956, 0.04443661868572235, -0.04211476817727089, 
-0.054058995097875595, -0.06279997527599335, 0.05711888149380684, 0.03513475880026817, 0.011739281937479973, -0.00039978884160518646, -0.03709705173969269, 0.0446656197309494, 0.08922593295574188, 0.0838320180773735, -0.030811000615358353, -0.076041080057621, -0.04993868246674538, -0.04757332056760788, 0.0682152584195137, -0.09710778295993805, 0.015895318239927292, 0.028267715126276016, 0.011915960349142551, 0.08993440866470337, 0.03481752797961235, -0.1381617784500122, 0.01093512773513794, 0.0339929461479187, -0.1251211166381836, -0.11148035526275635, -0.017921406775712967, 0.023426707834005356, -0.03592775762081146, 0.05636187270283699, 0.14679783582687378, -0.03461488336324692, -0.03071000799536705, -0.047909438610076904, 0.03685148432850838, -0.019046297296881676, 0.051219165325164795, 0.06556276977062225, 0.03052932396531105, -0.07494130730628967, 0.0736202672123909, 0.03831972926855087, -0.03936842828989029, 0.04296034574508667, 0.044672153890132904, -0.0977102667093277, -0.07917945832014084, -0.06048344448208809, 0.09510323405265808, -0.021553652361035347, -0.048760123550891876, -0.0038989242166280746, -0.08079823851585388, 0.06980018317699432, 0.07001259922981262, 0.04671357572078705, 0.035375215113162994, -0.08666054904460907, 0.015059459023177624, -0.05478431284427643, 0.03542611002922058, -0.03401072323322296, -0.003856128081679344, -0.05466524511575699, 0.06363622099161148, 0.06364691257476807, 0.09805667400360107, -0.034324388951063156, -0.07464190572500229, -0.08362935483455658, -0.01392260193824768, -0.06327681988477707, -0.03324832022190094, -0.07382141053676605, -0.006751439534127712, 0.00018577929586172104, -0.0016925856471061707, 0.02163914404809475, 0.03759922832250595, -0.04428255185484886, -0.01848273165524006, -0.03296256810426712, 0.038109391927719116, -0.06250453740358353, 0.007799362763762474, 0.015543419867753983, -0.037381116300821304, 0.09328078478574753, 0.03830939531326294, -0.011964316479861736, 0.04740063473582268, -0.021998193114995956, 0.033115968108177185, -0.021193018183112144, 0.0002431715838611126, -0.02212645299732685, -0.10830426961183548, -0.005157874897122383, 0.0029176082462072372, -0.023598402738571167, 0.009601305238902569, 0.05827115476131439, -0.07205431163311005, 0.08921962976455688, 0.04680310934782028, -0.029827192425727844, -0.07310432940721512, 0.04256005585193634, -0.017739485949277878, 0.028401613235473633, 0.06956735998392105, -0.033229876309633255, 0.05162738636136055, -0.09957800060510635, -0.029130425304174423, 0.0038198495749384165, -0.007322467863559723, -0.010248199105262756, -0.054305434226989746, -0.001389569602906704, 0.004754827357828617, 0.1750844419002533, -0.024061493575572968, 0.03652449697256088, 0.012243947014212608, 0.008264091797173023, 0.04580056294798851, -0.013831015676259995, 0.07296930253505707, -0.006104453466832638, -0.026256896555423737, -0.013036516495049, 0.03856578841805458, 0.003031262196600437, 0.007686518132686615, 0.13797074556350708, 0.04482467472553253, 0.08917315304279327, 0.07454993575811386, 0.01833551749587059, 0.015434990637004375, -0.1359722912311554, -0.08264216035604477, 0.006090887822210789, 0.060980990529060364, -0.018761998042464256, 0.01353687047958374, 0.09390649944543839, -0.0858256071805954, 0.07250860333442688, 0.04731178656220436, -0.048561204224824905, -0.12517166137695312, -0.18950042128562927, -0.024991389364004135, -0.03140278905630112, -0.010110016912221909, -0.09165048599243164, 0.016130736097693443, 0.0942397192120552, 0.025303177535533905, -0.01040346547961235, 
0.09651228785514832, -0.10127604007720947, -0.031516749411821365, 0.043489791452884674, -0.02802438847720623, 0.01262790895998478, 0.0499773770570755, 0.02459169551730156, -0.004828089848160744, 0.03920358419418335, 0.04003473371267319, 0.04293510317802429, 0.027320483699440956, 0.05176863074302673, -0.023104671388864517, -0.07244183123111725, -0.03319968655705452, -0.0010115602053701878, 0.0530250109732151, 0.13538137078285217, 0.022188859060406685, -0.07045146822929382, 0.006099995691329241, 0.10777446627616882, -0.03248471021652222, -0.04815016686916351, -0.10777239501476288, 0.24255579710006714, 0.021258926019072533, 0.0012605844531208277, -0.005711760371923447, -0.04652460291981697, 0.007340257987380028, 0.21081940829753876, 0.22594046592712402, 0.004980344325304031, -0.010137195698916912, 0.010461507365107536, -0.011345998384058475, 0.03782486170530319, 0.14479395747184753, 0.004606358706951141, 0.2537362575531006, -0.04807882755994797, 0.0375593900680542, -0.04106130450963974, -0.03999343141913414, -0.10093753039836884, 0.07379098236560822, -0.007844345644116402, 0.005771385505795479, -0.030902761965990067, 0.07085371017456055, -0.03674197942018509, -0.17611578106880188, 0.00504682119935751, -0.0018499400466680527, -0.06273291260004044, 0.010128729045391083, -0.001344531774520874, 0.02180567756295204, 0.08411242067813873, -0.01889175921678543, -0.007226225454360247, 0.1316644251346588, 0.018112381920218468, -0.09943918883800507, -0.05675092339515686, 0.11471157521009445, 0.011746319010853767, 0.13706012070178986, 0.010511825792491436, 0.08180949091911316, 0.0888611227273941, 0.020333778113126755, -0.09389682114124298, 0.0425756573677063, -0.019138487055897713, -0.030328277498483658, 0.006593131460249424, 0.11066356301307678, -0.008805638179183006, 0.058644600212574005, 0.025837033987045288, -0.08789646625518799, 0.061767272651195526, 0.015660524368286133, -0.036003902554512024, -0.08162681758403778, 0.08492429554462433, -0.09136299788951874, 0.15667615830898285, 0.12514443695545197, -0.013876248151063919, -0.044947899878025055, -0.030357519164681435, 0.01925663836300373, 0.0012346147559583187, 0.05127299576997757, -0.026956018060445786, -0.132729634642601, 0.019720327109098434, -0.07933509349822998, 0.027311908081173897, -0.2522454559803009, -0.08703907579183578, 0.02893625572323799, -0.017080556601285934, -0.019096963107585907, 0.04969785362482071, 0.0473061241209507, 0.026937002316117287, -0.03665930777788162, 0.021003538742661476, -0.0373905710875988, 0.058959636837244034, -0.11086466908454895, -0.0938388854265213 ]
null
null
transformers
# MultiBERTs Seed 4 Checkpoint 700k (uncased) Seed 4 intermediate checkpoint 700k MultiBERTs (pretrained BERT) model on the English language using a masked language modeling (MLM) objective. It was introduced in [this paper](https://arxiv.org/pdf/2106.16163.pdf) and first released in [this repository](https://github.com/google-research/language/tree/master/language/multiberts). This is an intermediate checkpoint. The final checkpoint can be found at [multiberts-seed-4](https://hf.co/multberts-seed-4). This model is uncased: it does not make a difference between english and English. Disclaimer: The team releasing MultiBERTs did not write a model card for this model, so this model card has been written by [gchhablani](https://hf.co/gchhablani). ## Model description MultiBERTs models are transformer models pretrained on a large corpus of English data in a self-supervised fashion. This means they were pretrained on the raw texts only, with no humans labelling them in any way (which is why they can use lots of publicly available data), with an automatic process to generate inputs and labels from those texts. More precisely, they were pretrained with two objectives: - Masked language modeling (MLM): taking a sentence, the model randomly masks 15% of the words in the input, then runs the entire masked sentence through the model and has to predict the masked words. This is different from traditional recurrent neural networks (RNNs), which usually see the words one after the other, and from autoregressive models like GPT, which internally mask the future tokens. It allows the model to learn a bidirectional representation of the sentence. - Next sentence prediction (NSP): the model concatenates two masked sentences as inputs during pretraining. Sometimes they correspond to sentences that were next to each other in the original text, sometimes not. The model then has to predict whether the two sentences followed each other or not. This way, the model learns an inner representation of the English language that can then be used to extract features useful for downstream tasks: if you have a dataset of labeled sentences, for instance, you can train a standard classifier using the features produced by the MultiBERTs model as inputs. ## Intended uses & limitations You can use the raw model for either masked language modeling or next sentence prediction, but it's mostly intended to be fine-tuned on a downstream task. See the [model hub](https://huggingface.co/models?filter=multiberts) to look for fine-tuned versions on a task that interests you. Note that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked) to make decisions, such as sequence classification, token classification or question answering. For tasks such as text generation you should look at models like GPT2. ### How to use Here is how to use this model to get the features of a given text in PyTorch: ```python from transformers import BertTokenizer, BertModel tokenizer = BertTokenizer.from_pretrained('multiberts-seed-4-700k') model = BertModel.from_pretrained("multiberts-seed-4-700k") text = "Replace me by any text you'd like." encoded_input = tokenizer(text, return_tensors='pt') output = model(**encoded_input) ``` ### Limitations and bias Even if the training data used for this model could be characterized as fairly neutral, this model can have biased predictions. This bias will also affect all fine-tuned versions of this model. 
For an understanding of bias of this particular checkpoint, please try out this checkpoint with the snippet present in the [Limitation and bias section](https://huggingface.co/bert-base-uncased#limitations-and-bias) of the [bert-base-uncased](https://huggingface.co/bert-base-uncased) checkpoint. ## Training data The MultiBERTs models were pretrained on [BookCorpus](https://yknzhu.wixsite.com/mbweb), a dataset consisting of 11,038 unpublished books, and [English Wikipedia](https://en.wikipedia.org/wiki/English_Wikipedia) (excluding lists, tables and headers). ## Training procedure ### Preprocessing The texts are lowercased and tokenized using WordPiece and a vocabulary size of 30,000. The inputs of the model are then of the form: ``` [CLS] Sentence A [SEP] Sentence B [SEP] ``` With probability 0.5, sentence A and sentence B correspond to two consecutive sentences in the original corpus; in the other cases, sentence B is another random sentence from the corpus. Note that what is considered a sentence here is a consecutive span of text usually longer than a single sentence. The only constraint is that the two "sentences" have a combined length of less than 512 tokens. The details of the masking procedure for each sentence are the following: - 15% of the tokens are masked. - In 80% of the cases, the masked tokens are replaced by `[MASK]`. - In 10% of the cases, the masked tokens are replaced by a random token (different from the one they replace). - In the remaining 10% of cases, the masked tokens are left as is. ### Pretraining The full model was trained on 16 Cloud TPU v2 chips for two million steps with a batch size of 256. The sequence length was set to 512 throughout. The optimizer used is Adam with a learning rate of 1e-4, \\(\beta_{1} = 0.9\\) and \\(\beta_{2} = 0.999\\), a weight decay of 0.01, learning rate warmup for 10,000 steps and linear decay of the learning rate after. ### BibTeX entry and citation info ```bibtex @article{DBLP:journals/corr/abs-2106-16163, author = {Thibault Sellam and Steve Yadlowsky and Jason Wei and Naomi Saphra and Alexander D'Amour and Tal Linzen and Jasmijn Bastings and Iulia Turc and Jacob Eisenstein and Dipanjan Das and Ian Tenney and Ellie Pavlick}, title = {The MultiBERTs: {BERT} Reproductions for Robustness Analysis}, journal = {CoRR}, volume = {abs/2106.16163}, year = {2021}, url = {https://arxiv.org/abs/2106.16163}, eprinttype = {arXiv}, eprint = {2106.16163}, timestamp = {Mon, 05 Jul 2021 15:15:50 +0200}, biburl = {https://dblp.org/rec/journals/corr/abs-2106-16163.bib}, bibsource = {dblp computer science bibliography, https://dblp.org} } ``` <a href="https://huggingface.co/exbert/?model=multiberts"> <img width="300px" src="https://cdn-media.huggingface.co/exbert/button.png"> </a>
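To make the masking rule described above concrete, here is a minimal, illustrative sketch of the 15% token selection with the 80/10/10 replacement split. It is not the original MultiBERTs pretraining code; the `bert-base-uncased` tokenizer and the `mask_tokens` helper are stand-ins chosen only for the example.

```python
import random
from transformers import BertTokenizer

# Illustrative sketch of the MLM masking rule from the card (not the original
# MultiBERTs pretraining code); the tokenizer below is a stand-in.
tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")

def mask_tokens(token_ids, mask_prob=0.15):
    """Return (input_ids, labels) following the 15% / 80-10-10 masking scheme."""
    input_ids = list(token_ids)
    labels = [-100] * len(token_ids)              # positions with -100 are ignored by the MLM loss
    for i, tok in enumerate(token_ids):
        if tok in tokenizer.all_special_ids:      # never mask [CLS], [SEP], [PAD], ...
            continue
        if random.random() < mask_prob:           # 15% of the tokens are selected
            labels[i] = tok                       # the model must predict the original token
            r = random.random()
            if r < 0.8:                           # 80% of selected tokens -> [MASK]
                input_ids[i] = tokenizer.mask_token_id
            elif r < 0.9:                         # 10% -> a random token
                input_ids[i] = random.randrange(tokenizer.vocab_size)
            # remaining 10%: the token is left unchanged
    return input_ids, labels

ids = tokenizer("Replace me by any text you'd like.")["input_ids"]
masked_ids, labels = mask_tokens(ids)
```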
{"language": "en", "license": "apache-2.0", "tags": ["exbert", "multiberts", "multiberts-seed-4"], "datasets": ["bookcorpus", "wikipedia"]}
null
MultiBertGunjanPatrick/multiberts-seed-4-700k
[ "transformers", "pytorch", "bert", "pretraining", "exbert", "multiberts", "multiberts-seed-4", "en", "dataset:bookcorpus", "dataset:wikipedia", "arxiv:2106.16163", "license:apache-2.0", "endpoints_compatible", "region:us" ]
2022-03-02T23:29:04+00:00
[ "2106.16163" ]
[ "en" ]
TAGS #transformers #pytorch #bert #pretraining #exbert #multiberts #multiberts-seed-4 #en #dataset-bookcorpus #dataset-wikipedia #arxiv-2106.16163 #license-apache-2.0 #endpoints_compatible #region-us
# MultiBERTs Seed 4 Checkpoint 700k (uncased) Seed 4 intermediate checkpoint 700k MultiBERTs (pretrained BERT) model on English language using a masked language modeling (MLM) objective. It was introduced in this paper and first released in this repository. This is an intermediate checkpoint. The final checkpoint can be found at multiberts-seed-4. This model is uncased: it does not make a difference between english and English. Disclaimer: The team releasing MultiBERTs did not write a model card for this model so this model card has been written by gchhablani. ## Model description MultiBERTs models are transformers model pretrained on a large corpus of English data in a self-supervised fashion. This means it was pretrained on the raw texts only, with no humans labelling them in any way (which is why it can use lots of publicly available data) with an automatic process to generate inputs and labels from those texts. More precisely, it was pretrained with two objectives: - Masked language modeling (MLM): taking a sentence, the model randomly masks 15% of the words in the input then run the entire masked sentence through the model and has to predict the masked words. This is different from traditional recurrent neural networks (RNNs) that usually see the words one after the other, or from autoregressive models like GPT which internally mask the future tokens. It allows the model to learn a bidirectional representation of the sentence. - Next sentence prediction (NSP): the models concatenates two masked sentences as inputs during pretraining. Sometimes they correspond to sentences that were next to each other in the original text, sometimes not. The model then has to predict if the two sentences were following each other or not. This way, the model learns an inner representation of the English language that can then be used to extract features useful for downstream tasks: if you have a dataset of labeled sentences for instance, you can train a standard classifier using the features produced by the MultiBERTs model as inputs. ## Intended uses & limitations You can use the raw model for either masked language modeling or next sentence prediction, but it's mostly intended to be fine-tuned on a downstream task. See the model hub to look for fine-tuned versions on a task that interests you. Note that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked) to make decisions, such as sequence classification, token classification or question answering. For tasks such as text generation you should look at model like GPT2. ### How to use Here is how to use this model to get the features of a given text in PyTorch: ### Limitations and bias Even if the training data used for this model could be characterized as fairly neutral, this model can have biased predictions. This bias will also affect all fine-tuned versions of this model. For an understanding of bias of this particular checkpoint, please try out this checkpoint with the snippet present in the Limitation and bias section of the bert-base-uncased checkpoint. ## Training data The MultiBERTs models were pretrained on BookCorpus, a dataset consisting of 11,038 unpublished books and English Wikipedia (excluding lists, tables and headers). ## Training procedure ### Preprocessing The texts are lowercased and tokenized using WordPiece and a vocabulary size of 30,000. 
The inputs of the model are then of the form: With probability 0.5, sentence A and sentence B correspond to two consecutive sentences in the original corpus and in the other cases, it's another random sentence in the corpus. Note that what is considered a sentence here is a consecutive span of text usually longer than a single sentence. The only constrain is that the result with the two "sentences" has a combined length of less than 512 tokens. The details of the masking procedure for each sentence are the following: - 15% of the tokens are masked. - In 80% of the cases, the masked tokens are replaced by '[MASK]'. - In 10% of the cases, the masked tokens are replaced by a random token (different) from the one they replace. - In the 10% remaining cases, the masked tokens are left as is. ### Pretraining The full model was trained on 16 Cloud TPU v2 chips for two million steps with a batch size of 256. The sequence length was set to 512 throughout. The optimizer used is Adam with a learning rate of 1e-4, \\(\beta_{1} = 0.9\\) and \\(\beta_{2} = 0.999\\), a weight decay of 0.01, learning rate warmup for 10,000 steps and linear decay of the learning rate after. ### BibTeX entry and citation info <a href="URL <img width="300px" src="URL </a>
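The optimizer described in the Pretraining section above (Adam with learning rate 1e-4, beta1 = 0.9, beta2 = 0.999, weight decay 0.01, 10,000 warmup steps, linear decay over two million steps) can be approximated with standard PyTorch and transformers utilities. The sketch below is an assumption-laden re-creation, not the original TPU training code: it uses AdamW as the decoupled-weight-decay variant of Adam and a generic BERT checkpoint as a placeholder.

```python
import torch
from transformers import BertForPreTraining, get_linear_schedule_with_warmup

# Placeholder checkpoint; the original runs trained from scratch on TPUs.
model = BertForPreTraining.from_pretrained("bert-base-uncased")

optimizer = torch.optim.AdamW(
    model.parameters(),
    lr=1e-4,              # learning rate from the card
    betas=(0.9, 0.999),   # beta_1 and beta_2 from the card
    weight_decay=0.01,
)
scheduler = get_linear_schedule_with_warmup(
    optimizer,
    num_warmup_steps=10_000,       # learning-rate warmup
    num_training_steps=2_000_000,  # two million steps in total
)
```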
[ "# MultiBERTs Seed 4 Checkpoint 700k (uncased)\nSeed 4 intermediate checkpoint 700k MultiBERTs (pretrained BERT) model on English language using a masked language modeling (MLM) objective. It was introduced in\nthis paper and first released in\nthis repository. This is an intermediate checkpoint.\nThe final checkpoint can be found at multiberts-seed-4. This model is uncased: it does not make a difference\nbetween english and English.\n\nDisclaimer: The team releasing MultiBERTs did not write a model card for this model so this model card has been written by gchhablani.", "## Model description\nMultiBERTs models are transformers model pretrained on a large corpus of English data in a self-supervised fashion. This means it\nwas pretrained on the raw texts only, with no humans labelling them in any way (which is why it can use lots of\npublicly available data) with an automatic process to generate inputs and labels from those texts. More precisely, it\nwas pretrained with two objectives:\n- Masked language modeling (MLM): taking a sentence, the model randomly masks 15% of the words in the input then run\n the entire masked sentence through the model and has to predict the masked words. This is different from traditional\n recurrent neural networks (RNNs) that usually see the words one after the other, or from autoregressive models like\n GPT which internally mask the future tokens. It allows the model to learn a bidirectional representation of the\n sentence.\n- Next sentence prediction (NSP): the models concatenates two masked sentences as inputs during pretraining. Sometimes\n they correspond to sentences that were next to each other in the original text, sometimes not. The model then has to\n predict if the two sentences were following each other or not.\nThis way, the model learns an inner representation of the English language that can then be used to extract features\nuseful for downstream tasks: if you have a dataset of labeled sentences for instance, you can train a standard\nclassifier using the features produced by the MultiBERTs model as inputs.", "## Intended uses & limitations\nYou can use the raw model for either masked language modeling or next sentence prediction, but it's mostly intended to\nbe fine-tuned on a downstream task. See the model hub to look for\nfine-tuned versions on a task that interests you.\nNote that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked)\nto make decisions, such as sequence classification, token classification or question answering. For tasks such as text\ngeneration you should look at model like GPT2.", "### How to use\nHere is how to use this model to get the features of a given text in PyTorch:", "### Limitations and bias\nEven if the training data used for this model could be characterized as fairly neutral, this model can have biased\npredictions. This bias will also affect all fine-tuned versions of this model. For an understanding of bias of this particular\ncheckpoint, please try out this checkpoint with the snippet present in the Limitation and bias section of the bert-base-uncased checkpoint.", "## Training data\nThe MultiBERTs models were pretrained on BookCorpus, a dataset consisting of 11,038\nunpublished books and English Wikipedia (excluding lists, tables and\nheaders).", "## Training procedure", "### Preprocessing\nThe texts are lowercased and tokenized using WordPiece and a vocabulary size of 30,000. 
The inputs of the model are\nthen of the form:\n\nWith probability 0.5, sentence A and sentence B correspond to two consecutive sentences in the original corpus and in\nthe other cases, it's another random sentence in the corpus. Note that what is considered a sentence here is a\nconsecutive span of text usually longer than a single sentence. The only constrain is that the result with the two\n\"sentences\" has a combined length of less than 512 tokens.\nThe details of the masking procedure for each sentence are the following:\n- 15% of the tokens are masked.\n- In 80% of the cases, the masked tokens are replaced by '[MASK]'.\n- In 10% of the cases, the masked tokens are replaced by a random token (different) from the one they replace.\n- In the 10% remaining cases, the masked tokens are left as is.", "### Pretraining\nThe full model was trained on 16 Cloud TPU v2 chips for two million steps with a batch size\nof 256. The sequence length was set to 512 throughout. The optimizer\nused is Adam with a learning rate of 1e-4, \\\\(\\beta_{1} = 0.9\\\\) and \\\\(\\beta_{2} = 0.999\\\\), a weight decay of 0.01,\nlearning rate warmup for 10,000 steps and linear decay of the learning rate after.", "### BibTeX entry and citation info\n\n<a href=\"URL\n\t<img width=\"300px\" src=\"URL\n</a>" ]
[ "TAGS\n#transformers #pytorch #bert #pretraining #exbert #multiberts #multiberts-seed-4 #en #dataset-bookcorpus #dataset-wikipedia #arxiv-2106.16163 #license-apache-2.0 #endpoints_compatible #region-us \n", "# MultiBERTs Seed 4 Checkpoint 700k (uncased)\nSeed 4 intermediate checkpoint 700k MultiBERTs (pretrained BERT) model on English language using a masked language modeling (MLM) objective. It was introduced in\nthis paper and first released in\nthis repository. This is an intermediate checkpoint.\nThe final checkpoint can be found at multiberts-seed-4. This model is uncased: it does not make a difference\nbetween english and English.\n\nDisclaimer: The team releasing MultiBERTs did not write a model card for this model so this model card has been written by gchhablani.", "## Model description\nMultiBERTs models are transformers model pretrained on a large corpus of English data in a self-supervised fashion. This means it\nwas pretrained on the raw texts only, with no humans labelling them in any way (which is why it can use lots of\npublicly available data) with an automatic process to generate inputs and labels from those texts. More precisely, it\nwas pretrained with two objectives:\n- Masked language modeling (MLM): taking a sentence, the model randomly masks 15% of the words in the input then run\n the entire masked sentence through the model and has to predict the masked words. This is different from traditional\n recurrent neural networks (RNNs) that usually see the words one after the other, or from autoregressive models like\n GPT which internally mask the future tokens. It allows the model to learn a bidirectional representation of the\n sentence.\n- Next sentence prediction (NSP): the models concatenates two masked sentences as inputs during pretraining. Sometimes\n they correspond to sentences that were next to each other in the original text, sometimes not. The model then has to\n predict if the two sentences were following each other or not.\nThis way, the model learns an inner representation of the English language that can then be used to extract features\nuseful for downstream tasks: if you have a dataset of labeled sentences for instance, you can train a standard\nclassifier using the features produced by the MultiBERTs model as inputs.", "## Intended uses & limitations\nYou can use the raw model for either masked language modeling or next sentence prediction, but it's mostly intended to\nbe fine-tuned on a downstream task. See the model hub to look for\nfine-tuned versions on a task that interests you.\nNote that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked)\nto make decisions, such as sequence classification, token classification or question answering. For tasks such as text\ngeneration you should look at model like GPT2.", "### How to use\nHere is how to use this model to get the features of a given text in PyTorch:", "### Limitations and bias\nEven if the training data used for this model could be characterized as fairly neutral, this model can have biased\npredictions. This bias will also affect all fine-tuned versions of this model. 
For an understanding of bias of this particular\ncheckpoint, please try out this checkpoint with the snippet present in the Limitation and bias section of the bert-base-uncased checkpoint.", "## Training data\nThe MultiBERTs models were pretrained on BookCorpus, a dataset consisting of 11,038\nunpublished books and English Wikipedia (excluding lists, tables and\nheaders).", "## Training procedure", "### Preprocessing\nThe texts are lowercased and tokenized using WordPiece and a vocabulary size of 30,000. The inputs of the model are\nthen of the form:\n\nWith probability 0.5, sentence A and sentence B correspond to two consecutive sentences in the original corpus and in\nthe other cases, it's another random sentence in the corpus. Note that what is considered a sentence here is a\nconsecutive span of text usually longer than a single sentence. The only constrain is that the result with the two\n\"sentences\" has a combined length of less than 512 tokens.\nThe details of the masking procedure for each sentence are the following:\n- 15% of the tokens are masked.\n- In 80% of the cases, the masked tokens are replaced by '[MASK]'.\n- In 10% of the cases, the masked tokens are replaced by a random token (different) from the one they replace.\n- In the 10% remaining cases, the masked tokens are left as is.", "### Pretraining\nThe full model was trained on 16 Cloud TPU v2 chips for two million steps with a batch size\nof 256. The sequence length was set to 512 throughout. The optimizer\nused is Adam with a learning rate of 1e-4, \\\\(\\beta_{1} = 0.9\\\\) and \\\\(\\beta_{2} = 0.999\\\\), a weight decay of 0.01,\nlearning rate warmup for 10,000 steps and linear decay of the learning rate after.", "### BibTeX entry and citation info\n\n<a href=\"URL\n\t<img width=\"300px\" src=\"URL\n</a>" ]
[ 71, 145, 335, 134, 25, 95, 48, 3, 222, 111, 34 ]
[ "passage: TAGS\n#transformers #pytorch #bert #pretraining #exbert #multiberts #multiberts-seed-4 #en #dataset-bookcorpus #dataset-wikipedia #arxiv-2106.16163 #license-apache-2.0 #endpoints_compatible #region-us \n# MultiBERTs Seed 4 Checkpoint 700k (uncased)\nSeed 4 intermediate checkpoint 700k MultiBERTs (pretrained BERT) model on English language using a masked language modeling (MLM) objective. It was introduced in\nthis paper and first released in\nthis repository. This is an intermediate checkpoint.\nThe final checkpoint can be found at multiberts-seed-4. This model is uncased: it does not make a difference\nbetween english and English.\n\nDisclaimer: The team releasing MultiBERTs did not write a model card for this model so this model card has been written by gchhablani.", "passage: ## Model description\nMultiBERTs models are transformers model pretrained on a large corpus of English data in a self-supervised fashion. This means it\nwas pretrained on the raw texts only, with no humans labelling them in any way (which is why it can use lots of\npublicly available data) with an automatic process to generate inputs and labels from those texts. More precisely, it\nwas pretrained with two objectives:\n- Masked language modeling (MLM): taking a sentence, the model randomly masks 15% of the words in the input then run\n the entire masked sentence through the model and has to predict the masked words. This is different from traditional\n recurrent neural networks (RNNs) that usually see the words one after the other, or from autoregressive models like\n GPT which internally mask the future tokens. It allows the model to learn a bidirectional representation of the\n sentence.\n- Next sentence prediction (NSP): the models concatenates two masked sentences as inputs during pretraining. Sometimes\n they correspond to sentences that were next to each other in the original text, sometimes not. The model then has to\n predict if the two sentences were following each other or not.\nThis way, the model learns an inner representation of the English language that can then be used to extract features\nuseful for downstream tasks: if you have a dataset of labeled sentences for instance, you can train a standard\nclassifier using the features produced by the MultiBERTs model as inputs.## Intended uses & limitations\nYou can use the raw model for either masked language modeling or next sentence prediction, but it's mostly intended to\nbe fine-tuned on a downstream task. See the model hub to look for\nfine-tuned versions on a task that interests you.\nNote that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked)\nto make decisions, such as sequence classification, token classification or question answering. For tasks such as text\ngeneration you should look at model like GPT2.### How to use\nHere is how to use this model to get the features of a given text in PyTorch:### Limitations and bias\nEven if the training data used for this model could be characterized as fairly neutral, this model can have biased\npredictions. This bias will also affect all fine-tuned versions of this model. 
For an understanding of bias of this particular\ncheckpoint, please try out this checkpoint with the snippet present in the Limitation and bias section of the bert-base-uncased checkpoint.## Training data\nThe MultiBERTs models were pretrained on BookCorpus, a dataset consisting of 11,038\nunpublished books and English Wikipedia (excluding lists, tables and\nheaders).## Training procedure" ]
[ -0.08623085170984268, -0.004582236520946026, -0.002083549741655588, 0.07002277672290802, 0.08512591570615768, 0.0035795429721474648, 0.1146736592054367, 0.049671340733766556, -0.029065493494272232, 0.022976385429501534, 0.0935865044593811, 0.027751687914133072, 0.042999714612960815, 0.06593070924282074, 0.09646844863891602, -0.25964951515197754, 0.04848600551486015, -0.062270745635032654, 0.05772826820611954, 0.0753648430109024, 0.09839603304862976, -0.07346414029598236, 0.06307800114154816, 0.03490890562534332, -0.08332550525665283, -0.016215693205595016, -0.016973968595266342, -0.03586771339178085, 0.10029025375843048, 0.06994892656803131, 0.05954067409038544, 0.0022895634174346924, 0.05956713855266571, -0.08550022542476654, 0.015958528965711594, 0.044294897466897964, -0.0026338105089962482, 0.02453855238854885, -0.007960101589560509, 0.019295185804367065, 0.1058848574757576, 0.03791425749659538, 0.0760059803724289, 0.03468817099928856, -0.09653840959072113, -0.12349609285593033, -0.08003382384777069, 0.10354074835777283, 0.055008403956890106, 0.04210328683257103, -0.006573283113539219, 0.07211801409721375, -0.031425897032022476, 0.07493548095226288, 0.09655638039112091, -0.25942519307136536, -0.009464869275689125, 0.06933840364217758, 0.044974423944950104, 0.047024182975292206, 0.00949643924832344, 0.028273209929466248, 0.0073634907603263855, 0.04255528002977371, 0.0299270860850811, -0.023275114595890045, 0.12320932000875473, -0.04337412118911743, -0.14998295903205872, -0.04236849397420883, 0.12262512743473053, -0.003508247435092926, -0.12534953653812408, -0.10024408996105194, -0.030109180137515068, 0.11285252869129181, -0.004041249863803387, -0.018447844311594963, -0.003968677017837763, 0.010926198214292526, 0.022094031795859337, -0.09222203493118286, -0.08568558096885681, -0.027728727087378502, -0.03537607938051224, 0.12268917262554169, 0.045848995447158813, 0.05273003876209259, -0.033081844449043274, 0.08635982871055603, -0.12053076922893524, -0.03982514515519142, -0.05191672593355179, -0.08316854387521744, -0.019296659156680107, 0.010153330862522125, -0.025872372090816498, -0.07713913917541504, -0.0596446618437767, 0.11928625404834747, 0.03981809318065643, 0.03241179138422012, -0.0015018950216472149, 0.04023687541484833, 0.07084335386753082, 0.09607452899217606, -0.03895929828286171, 0.044143788516521454, 0.03515127673745155, -0.021316442638635635, 0.05795709043741226, -0.05106831714510918, -0.1004476323723793, 0.07556933909654617, 0.00013516657054424286, 0.037901200354099274, 0.025169070810079575, 0.03531155735254288, -0.00883133802562952, -0.07178263366222382, 0.17069661617279053, -0.07715930789709091, -0.007478159852325916, -0.01748613640666008, 0.010335361585021019, 0.04474375396966934, 0.02964341640472412, -0.00728260213509202, -0.04732251167297363, -0.007128794677555561, -0.05628052353858948, -0.028457188978791237, -0.0564909428358078, -0.11869202554225922, -0.0024460861459374428, -0.03597947210073471, -0.031909022480249405, -0.13949711620807648, -0.21817350387573242, -0.019449302926659584, 0.06297342479228973, -0.003740087617188692, -0.010023863054811954, 0.025798140093684196, 0.016090108081698418, -0.02076965942978859, 0.010065455920994282, -0.044786736369132996, -0.0008164867758750916, -0.006885502487421036, -0.02781553380191326, 0.059386834502220154, -0.041466664522886276, 0.021530115976929665, -0.06839965283870697, 0.022709639742970467, -0.21245479583740234, 0.08795066177845001, -0.03479240834712982, 0.003938641399145126, -0.037017423659563065, -0.04741457849740982, 
0.009592441841959953, 0.04514526203274727, -0.009205329231917858, 0.1156112477183342, -0.13471859693527222, -0.049497127532958984, 0.17937782406806946, -0.15828731656074524, -0.002704765647649765, 0.10288950055837631, -0.047607846558094025, 0.05358247086405754, 0.13324926793575287, 0.10096125304698944, 0.08558790385723114, -0.07237983494997025, 0.011399012990295887, 0.0608069971203804, -0.06809861212968826, 0.05326002836227417, 0.08744192123413086, -0.023568134754896164, -0.13902640342712402, 0.031093839555978775, -0.06861820816993713, -0.010791836306452751, -0.026050670072436333, -0.022356757894158363, 0.006907889619469643, -0.037666771560907364, 0.02774367481470108, 0.006855294574052095, 0.018801676109433174, -0.042233292013406754, -0.08158240467309952, 0.021912699565291405, 0.07473141700029373, -0.07111572474241257, 0.040905799716711044, -0.07130882143974304, 0.06294076889753342, -0.07592835277318954, -0.0036655785515904427, -0.16288834810256958, -0.02546836994588375, 0.04549876227974892, -0.042916491627693176, 0.04859912022948265, 0.08983396738767624, 0.004441616125404835, 0.12395815551280975, -0.0425836518406868, 0.00085608777590096, -0.004927622154355049, -0.00985182449221611, -0.05160428211092949, -0.12055234611034393, -0.08414450287818909, -0.0664161741733551, 0.10044467449188232, -0.0790458619594574, 0.02878652513027191, -0.06846805661916733, -0.021387454122304916, -0.009657369926571846, -0.05746059864759445, -0.005731785669922829, 0.010596112348139286, -0.026384267956018448, -0.04626186937093735, 0.048857398331165314, 0.04892724007368088, -0.05945481359958649, 0.07180313766002655, -0.10481804609298706, -0.05554461479187012, 0.05608533322811127, 0.012944435700774193, -0.08073142915964127, 0.0892568975687027, -0.0198369137942791, -0.014795606955885887, -0.055570486932992935, -0.04644354432821274, 0.19007393717765808, -0.027232252061367035, 0.1013430505990982, -0.09081950038671494, 0.003353384556248784, 0.029385535046458244, -0.043605320155620575, -0.01493767648935318, 0.05756306275725365, 0.046492088586091995, -0.1872556209564209, 0.014070317149162292, 0.051177605986595154, 0.07821087539196014, 0.1126435175538063, 0.026755429804325104, -0.02229338139295578, -0.04584082216024399, -0.012398525141179562, 0.005039830692112446, 0.0586920902132988, -0.02620595693588257, -0.007752035278826952, 0.031450916081666946, 0.05786208063364029, 0.0180205125361681, -0.08102777600288391, 0.036264751106500626, 0.06522670388221741, -0.0162687748670578, -0.042789824306964874, -0.023525118827819824, -0.06101038679480553, 0.06112206354737282, 0.05259816348552704, 0.034073684364557266, 0.027634242549538612, -0.014708466827869415, -0.13401471078395844, 0.1885187029838562, -0.11388494074344635, -0.2543412744998932, -0.10918499529361725, -0.05848558992147446, -0.02645154483616352, 0.040486231446266174, 0.05773618072271347, -0.031645819544792175, -0.043398816138505936, -0.11562025547027588, 0.05832322686910629, -0.06660901755094528, -0.028965117409825325, -0.011412650346755981, -0.05257032811641693, -0.01792393997311592, -0.12664499878883362, -0.01142941415309906, -0.03003610670566559, -0.07553933560848236, 0.007666156627237797, -0.034412022680044174, 0.029355302453041077, 0.13338857889175415, 0.03523726761341095, -0.018589278683066368, -0.018086286261677742, 0.19771355390548706, 0.008854065090417862, 0.06288475543260574, 0.11199289560317993, -0.025715798139572144, 0.053306158632040024, 0.04672970622777939, 0.025739233940839767, -0.049811020493507385, 0.01163402944803238, -0.01665557734668255, 
-0.12095904350280762, -0.1724855750799179, -0.07095927745103836, -0.00520002655684948, 0.009114464744925499, 0.020214874297380447, 0.03590458258986473, 0.02684929594397545, 0.0406801737844944, -0.029820529744029045, 0.031351566314697266, -0.01334650069475174, 0.08081182837486267, 0.03295503556728363, -0.07445994764566422, 0.0945628434419632, -0.0611879825592041, 0.016569171100854874, 0.10780511051416397, -0.0591687336564064, 0.19157274067401886, 0.026044165715575218, 0.05919809266924858, 0.09945054352283478, 0.026378151029348373, 0.05611006170511246, 0.08931209146976471, -0.04701784998178482, 0.004125447012484074, -0.05939815565943718, -0.051250435411930084, -0.033128589391708374, 0.04869421198964119, 0.030570559203624725, 0.017243601381778717, -0.11980181187391281, 0.015786312520503998, -0.0027848421595990658, 0.13325899839401245, 0.04725421965122223, -0.12349836528301239, -0.12294557690620422, 0.03408905863761902, -0.04474872350692749, -0.0627724826335907, 0.029777992516756058, 0.05835312604904175, -0.1533007025718689, 0.04700459539890289, -0.004937407560646534, 0.06691397726535797, -0.09197790920734406, 0.01725011318922043, -0.04399774223566055, -0.0005084974691271782, 0.003943379037082195, 0.07028341293334961, -0.12835350632667542, 0.10495703667402267, 0.021626651287078857, 0.051240697503089905, -0.08103203773498535, 0.01463240198791027, -0.01161773968487978, 0.11095022410154343, 0.1146666407585144, 0.043029360473155975, -0.055721841752529144, -0.02034146524965763, -0.04577241092920303, 0.020626455545425415, 0.059761568903923035, -0.07421506196260452, 0.0607856884598732, 0.01009098719805479, 0.005999833345413208, -0.023767266422510147, 0.018357105553150177, -0.13490474224090576, -0.12237071990966797, 0.06051163375377655, -0.07433732599020004, -0.09509886801242828, -0.0558505654335022, -0.06244441866874695, -0.05184979736804962, 0.21214455366134644, -0.11252351850271225, -0.089572474360466, -0.09857039898633957, -0.012818392366170883, 0.04860074445605278, -0.06416073441505432, 0.047188594937324524, -0.035463567823171616, 0.08857449889183044, -0.043882742524147034, -0.10860137641429901, 0.03565268963575363, -0.11365638673305511, -0.11554770171642303, -0.043598681688308716, 0.10682837665081024, 0.11190895736217499, 0.037049002945423126, 0.013758109882473946, 0.01242935098707676, -0.0011363271623849869, -0.11909438669681549, 0.017872409895062447, 0.12567804753780365, -0.0017968490719795227, 0.06952772289514542, -0.05797436833381653, 0.03108019381761551, -0.01432504877448082, -0.0006821565330028534, 0.1282665878534317, 0.18601594865322113, -0.0636480301618576, 0.17308837175369263, 0.20182004570960999, -0.10424356162548065, -0.18910124897956848, -0.05443968251347542, -0.002573651261627674, 0.046859320253133774, 0.04574340209364891, -0.18394210934638977, 0.0912625789642334, 0.034454818814992905, -0.031951844692230225, 0.022790983319282532, -0.22958168387413025, -0.11224504560232162, 0.08703461289405823, 0.05520673841238022, 0.19278275966644287, -0.08080825954675674, -0.0394749790430069, -0.015314200893044472, -0.03753548860549927, 0.04314946383237839, -0.040060386061668396, 0.0908588171005249, 0.0046125128865242004, -0.032827701419591904, 0.0013418104499578476, -0.031346358358860016, 0.09777447581291199, 0.04144784063100815, 0.02377653680741787, -0.07079746574163437, -0.0014616567641496658, 0.11381380259990692, -0.03710198402404785, 0.09682399779558182, 0.043045610189437866, 0.074639230966568, -0.10051000863313675, -0.05963153764605522, -0.07619176805019379, 0.04173225909471512, 
-0.042343009263277054, -0.05448996648192406, -0.0636574774980545, 0.057598359882831573, 0.036445602774620056, 0.010374928824603558, -0.000886889174580574, -0.03638756275177002, 0.046244990080595016, 0.09432508796453476, 0.08164988458156586, -0.035979047417640686, -0.07397399842739105, -0.04915289953351021, -0.049279917031526566, 0.06799356639385223, -0.09009921550750732, 0.017537331208586693, 0.02853940613567829, 0.01013144850730896, 0.08872444927692413, 0.03445110097527504, -0.13562363386154175, 0.011722924187779427, 0.03558770567178726, -0.12394576519727707, -0.11278130859136581, -0.019292686134576797, 0.022735819220542908, -0.03628716245293617, 0.05605759844183922, 0.14473789930343628, -0.036647722125053406, -0.03062683716416359, -0.04838050156831741, 0.036725159734487534, -0.020729370415210724, 0.05330205708742142, 0.06498263776302338, 0.03126565366983414, -0.07389076799154282, 0.07249486446380615, 0.040660783648490906, -0.03996137157082558, 0.04011746868491173, 0.04463702812790871, -0.097430020570755, -0.0788244903087616, -0.06359155476093292, 0.08893375098705292, -0.02063019387423992, -0.04771474748849869, -0.00040744245052337646, -0.08159157633781433, 0.06785082817077637, 0.06773754954338074, 0.04726685583591461, 0.03514792397618294, -0.08657500147819519, 0.015128841623663902, -0.054838862270116806, 0.034880198538303375, -0.031014451757073402, -0.0037680361419916153, -0.056001774966716766, 0.06191343814134598, 0.06267505139112473, 0.09708718955516815, -0.0349191315472126, -0.07585657387971878, -0.0843706950545311, -0.01327522099018097, -0.0626024529337883, -0.034369535744190216, -0.07485513389110565, -0.005892803892493248, 0.0005466034635901451, -0.0026614200323820114, 0.02198363095521927, 0.036720335483551025, -0.04347296059131622, -0.018712272867560387, -0.03253353014588356, 0.03713409602642059, -0.05963270366191864, 0.006899680942296982, 0.015042307786643505, -0.036138199269771576, 0.09305791556835175, 0.036214470863342285, -0.011224042624235153, 0.045589640736579895, -0.016821404919028282, 0.03521757945418358, -0.020851386711001396, 0.0007081085350364447, -0.022132426500320435, -0.10727417469024658, -0.0047572399489581585, 0.0052468907088041306, -0.02533845603466034, 0.011359757743775845, 0.057252198457717896, -0.07381477952003479, 0.08492116630077362, 0.0452762171626091, -0.030458033084869385, -0.07204881310462952, 0.041693344712257385, -0.01721148006618023, 0.027972549200057983, 0.06854930520057678, -0.03473770618438721, 0.05110659450292587, -0.10096697509288788, -0.028885386884212494, 0.00293007493019104, -0.007370028644800186, -0.006543915718793869, -0.052586816251277924, -0.0008724229410290718, 0.006557706743478775, 0.17500580847263336, -0.02535320818424225, 0.03670791536569595, 0.012618400156497955, 0.005115468055009842, 0.046687062829732895, -0.013767357915639877, 0.07726536691188812, -0.005821536295115948, -0.02678579092025757, -0.011993182823061943, 0.038497455418109894, 0.003788139671087265, 0.00939248502254486, 0.13553696870803833, 0.04329459369182587, 0.08946467936038971, 0.07425782084465027, 0.015440339222550392, 0.01536946278065443, -0.133838951587677, -0.08867142349481583, 0.006876960396766663, 0.06139889359474182, -0.01959342695772648, 0.0061054229736328125, 0.09320039302110672, -0.0865960419178009, 0.07051213830709457, 0.04773053899407387, -0.05039176344871521, -0.1237933486700058, -0.19169196486473083, -0.024931544438004494, -0.03470573574304581, -0.00976647064089775, -0.09149095416069031, 0.01607407256960869, 0.08866703510284424, 0.02547687292098999, 
-0.01054160576313734, 0.09650667011737823, -0.1003396213054657, -0.0286237969994545, 0.0437280647456646, -0.027394898235797882, 0.01393224112689495, 0.05102420598268509, 0.024109311401844025, -0.007527021691203117, 0.03846017271280289, 0.03877691552042961, 0.04271563142538071, 0.024146758019924164, 0.051466718316078186, -0.02133616805076599, -0.07266466319561005, -0.03349142149090767, -0.00225400784984231, 0.055380385369062424, 0.13714665174484253, 0.02200471982359886, -0.07016561925411224, 0.005424519069492817, 0.10884605348110199, -0.031540051102638245, -0.04946916550397873, -0.1072990745306015, 0.24294310808181763, 0.02305251732468605, 0.0010941540822386742, -0.00452947523444891, -0.04570143669843674, 0.006334822624921799, 0.21243150532245636, 0.2267753928899765, 0.005806811153888702, -0.009989752434194088, 0.00984986498951912, -0.012490134686231613, 0.03758305311203003, 0.14787113666534424, 0.0043629128485918045, 0.25138604640960693, -0.047537773847579956, 0.04103642702102661, -0.04200832545757294, -0.04030191898345947, -0.10161115229129791, 0.07351141422986984, -0.007759257685393095, 0.007172888144850731, -0.03201891854405403, 0.07213377207517624, -0.038382649421691895, -0.16947290301322937, 0.004798741079866886, -0.0007358747534453869, -0.06269723176956177, 0.010251111350953579, -0.0012643272057175636, 0.02091636136174202, 0.08299411833286285, -0.01854175701737404, -0.005578261334449053, 0.1293063908815384, 0.017681054770946503, -0.09657395631074905, -0.06057108938694, 0.11504665017127991, 0.016846736893057823, 0.13915787637233734, 0.010444125160574913, 0.08235202729701996, 0.08878961205482483, 0.02032634988427162, -0.09663823246955872, 0.04281165823340416, -0.01984218880534172, -0.02919534407556057, 0.0076830023899674416, 0.11087889969348907, -0.007052565924823284, 0.058847710490226746, 0.024555295705795288, -0.08889089524745941, 0.060016922652721405, 0.012277118861675262, -0.0323469340801239, -0.08013871312141418, 0.08396953344345093, -0.08998335897922516, 0.1574842631816864, 0.12296053767204285, -0.016447339206933975, -0.04570876061916351, -0.02765434980392456, 0.019782787188887596, 0.0009778612293303013, 0.0533011294901371, -0.02662729285657406, -0.13526976108551025, 0.01868741400539875, -0.08258560299873352, 0.025584900751709938, -0.24616089463233948, -0.0895581915974617, 0.028159551322460175, -0.01834619604051113, -0.01898413524031639, 0.052594758570194244, 0.04692066088318825, 0.025118913501501083, -0.03592423349618912, 0.020054813474416733, -0.03950759023427963, 0.058133259415626526, -0.11176881939172745, -0.09462124854326248 ]
null
null
transformers
# MultiBERTs Seed 4 Checkpoint 800k (uncased) Seed 4 intermediate checkpoint 800k MultiBERTs (pretrained BERT) model on the English language using a masked language modeling (MLM) objective. It was introduced in [this paper](https://arxiv.org/pdf/2106.16163.pdf) and first released in [this repository](https://github.com/google-research/language/tree/master/language/multiberts). This is an intermediate checkpoint. The final checkpoint can be found at [multiberts-seed-4](https://hf.co/multberts-seed-4). This model is uncased: it does not make a difference between english and English. Disclaimer: The team releasing MultiBERTs did not write a model card for this model, so this model card has been written by [gchhablani](https://hf.co/gchhablani). ## Model description MultiBERTs models are transformer models pretrained on a large corpus of English data in a self-supervised fashion. This means they were pretrained on the raw texts only, with no humans labelling them in any way (which is why they can use lots of publicly available data), with an automatic process to generate inputs and labels from those texts. More precisely, they were pretrained with two objectives: - Masked language modeling (MLM): taking a sentence, the model randomly masks 15% of the words in the input, then runs the entire masked sentence through the model and has to predict the masked words. This is different from traditional recurrent neural networks (RNNs), which usually see the words one after the other, and from autoregressive models like GPT, which internally mask the future tokens. It allows the model to learn a bidirectional representation of the sentence. - Next sentence prediction (NSP): the model concatenates two masked sentences as inputs during pretraining. Sometimes they correspond to sentences that were next to each other in the original text, sometimes not. The model then has to predict whether the two sentences followed each other or not. This way, the model learns an inner representation of the English language that can then be used to extract features useful for downstream tasks: if you have a dataset of labeled sentences, for instance, you can train a standard classifier using the features produced by the MultiBERTs model as inputs. ## Intended uses & limitations You can use the raw model for either masked language modeling or next sentence prediction, but it's mostly intended to be fine-tuned on a downstream task. See the [model hub](https://huggingface.co/models?filter=multiberts) to look for fine-tuned versions on a task that interests you. Note that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked) to make decisions, such as sequence classification, token classification or question answering. For tasks such as text generation you should look at models like GPT2. ### How to use Here is how to use this model to get the features of a given text in PyTorch: ```python from transformers import BertTokenizer, BertModel tokenizer = BertTokenizer.from_pretrained('multiberts-seed-4-800k') model = BertModel.from_pretrained("multiberts-seed-4-800k") text = "Replace me by any text you'd like." encoded_input = tokenizer(text, return_tensors='pt') output = model(**encoded_input) ``` ### Limitations and bias Even if the training data used for this model could be characterized as fairly neutral, this model can have biased predictions. This bias will also affect all fine-tuned versions of this model. 
For an understanding of bias of this particular checkpoint, please try out this checkpoint with the snippet present in the [Limitation and bias section](https://huggingface.co/bert-base-uncased#limitations-and-bias) of the [bert-base-uncased](https://huggingface.co/bert-base-uncased) checkpoint. ## Training data The MultiBERTs models were pretrained on [BookCorpus](https://yknzhu.wixsite.com/mbweb), a dataset consisting of 11,038 unpublished books, and [English Wikipedia](https://en.wikipedia.org/wiki/English_Wikipedia) (excluding lists, tables and headers). ## Training procedure ### Preprocessing The texts are lowercased and tokenized using WordPiece and a vocabulary size of 30,000. The inputs of the model are then of the form: ``` [CLS] Sentence A [SEP] Sentence B [SEP] ``` With probability 0.5, sentence A and sentence B correspond to two consecutive sentences in the original corpus; in the other cases, sentence B is another random sentence from the corpus. Note that what is considered a sentence here is a consecutive span of text usually longer than a single sentence. The only constraint is that the two "sentences" have a combined length of less than 512 tokens. The details of the masking procedure for each sentence are the following: - 15% of the tokens are masked. - In 80% of the cases, the masked tokens are replaced by `[MASK]`. - In 10% of the cases, the masked tokens are replaced by a random token (different from the one they replace). - In the remaining 10% of cases, the masked tokens are left as is. ### Pretraining The full model was trained on 16 Cloud TPU v2 chips for two million steps with a batch size of 256. The sequence length was set to 512 throughout. The optimizer used is Adam with a learning rate of 1e-4, \\(\beta_{1} = 0.9\\) and \\(\beta_{2} = 0.999\\), a weight decay of 0.01, learning rate warmup for 10,000 steps and linear decay of the learning rate after. ### BibTeX entry and citation info ```bibtex @article{DBLP:journals/corr/abs-2106-16163, author = {Thibault Sellam and Steve Yadlowsky and Jason Wei and Naomi Saphra and Alexander D'Amour and Tal Linzen and Jasmijn Bastings and Iulia Turc and Jacob Eisenstein and Dipanjan Das and Ian Tenney and Ellie Pavlick}, title = {The MultiBERTs: {BERT} Reproductions for Robustness Analysis}, journal = {CoRR}, volume = {abs/2106.16163}, year = {2021}, url = {https://arxiv.org/abs/2106.16163}, eprinttype = {arXiv}, eprint = {2106.16163}, timestamp = {Mon, 05 Jul 2021 15:15:50 +0200}, biburl = {https://dblp.org/rec/journals/corr/abs-2106-16163.bib}, bibsource = {dblp computer science bibliography, https://dblp.org} } ``` <a href="https://huggingface.co/exbert/?model=multiberts"> <img width="300px" src="https://cdn-media.huggingface.co/exbert/button.png"> </a>
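As a quick way to probe the masked language modeling head of this checkpoint, the `fill-mask` pipeline can be used. This is a usage sketch rather than part of the official card: the hub id below is assumed from this repository's name and may need adjusting, and loading a pretraining checkpoint through the MLM pipeline may print warnings about unused next-sentence-prediction weights.

```python
from transformers import pipeline

# Hub id assumed from the repository name; adjust if it differs.
unmasker = pipeline("fill-mask", model="MultiBertGunjanPatrick/multiberts-seed-4-800k")
print(unmasker("The capital of France is [MASK]."))
```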
{"language": "en", "license": "apache-2.0", "tags": ["exbert", "multiberts", "multiberts-seed-4"], "datasets": ["bookcorpus", "wikipedia"]}
null
MultiBertGunjanPatrick/multiberts-seed-4-800k
[ "transformers", "pytorch", "bert", "pretraining", "exbert", "multiberts", "multiberts-seed-4", "en", "dataset:bookcorpus", "dataset:wikipedia", "arxiv:2106.16163", "license:apache-2.0", "endpoints_compatible", "region:us" ]
2022-03-02T23:29:04+00:00
[ "2106.16163" ]
[ "en" ]
TAGS #transformers #pytorch #bert #pretraining #exbert #multiberts #multiberts-seed-4 #en #dataset-bookcorpus #dataset-wikipedia #arxiv-2106.16163 #license-apache-2.0 #endpoints_compatible #region-us
# MultiBERTs Seed 4 Checkpoint 800k (uncased) Seed 4 intermediate checkpoint 800k MultiBERTs (pretrained BERT) model on English language using a masked language modeling (MLM) objective. It was introduced in this paper and first released in this repository. This is an intermediate checkpoint. The final checkpoint can be found at multiberts-seed-4. This model is uncased: it does not make a difference between english and English. Disclaimer: The team releasing MultiBERTs did not write a model card for this model so this model card has been written by gchhablani. ## Model description MultiBERTs models are transformers model pretrained on a large corpus of English data in a self-supervised fashion. This means it was pretrained on the raw texts only, with no humans labelling them in any way (which is why it can use lots of publicly available data) with an automatic process to generate inputs and labels from those texts. More precisely, it was pretrained with two objectives: - Masked language modeling (MLM): taking a sentence, the model randomly masks 15% of the words in the input then run the entire masked sentence through the model and has to predict the masked words. This is different from traditional recurrent neural networks (RNNs) that usually see the words one after the other, or from autoregressive models like GPT which internally mask the future tokens. It allows the model to learn a bidirectional representation of the sentence. - Next sentence prediction (NSP): the models concatenates two masked sentences as inputs during pretraining. Sometimes they correspond to sentences that were next to each other in the original text, sometimes not. The model then has to predict if the two sentences were following each other or not. This way, the model learns an inner representation of the English language that can then be used to extract features useful for downstream tasks: if you have a dataset of labeled sentences for instance, you can train a standard classifier using the features produced by the MultiBERTs model as inputs. ## Intended uses & limitations You can use the raw model for either masked language modeling or next sentence prediction, but it's mostly intended to be fine-tuned on a downstream task. See the model hub to look for fine-tuned versions on a task that interests you. Note that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked) to make decisions, such as sequence classification, token classification or question answering. For tasks such as text generation you should look at model like GPT2. ### How to use Here is how to use this model to get the features of a given text in PyTorch: ### Limitations and bias Even if the training data used for this model could be characterized as fairly neutral, this model can have biased predictions. This bias will also affect all fine-tuned versions of this model. For an understanding of bias of this particular checkpoint, please try out this checkpoint with the snippet present in the Limitation and bias section of the bert-base-uncased checkpoint. ## Training data The MultiBERTs models were pretrained on BookCorpus, a dataset consisting of 11,038 unpublished books and English Wikipedia (excluding lists, tables and headers). ## Training procedure ### Preprocessing The texts are lowercased and tokenized using WordPiece and a vocabulary size of 30,000. 
The inputs of the model are then of the form: With probability 0.5, sentence A and sentence B correspond to two consecutive sentences in the original corpus and in the other cases, it's another random sentence in the corpus. Note that what is considered a sentence here is a consecutive span of text usually longer than a single sentence. The only constrain is that the result with the two "sentences" has a combined length of less than 512 tokens. The details of the masking procedure for each sentence are the following: - 15% of the tokens are masked. - In 80% of the cases, the masked tokens are replaced by '[MASK]'. - In 10% of the cases, the masked tokens are replaced by a random token (different) from the one they replace. - In the 10% remaining cases, the masked tokens are left as is. ### Pretraining The full model was trained on 16 Cloud TPU v2 chips for two million steps with a batch size of 256. The sequence length was set to 512 throughout. The optimizer used is Adam with a learning rate of 1e-4, \\(\beta_{1} = 0.9\\) and \\(\beta_{2} = 0.999\\), a weight decay of 0.01, learning rate warmup for 10,000 steps and linear decay of the learning rate after. ### BibTeX entry and citation info <a href="URL <img width="300px" src="URL </a>
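The `[CLS] Sentence A [SEP] Sentence B [SEP]` input format described in the Preprocessing section can be reproduced with a BERT tokenizer. The sketch below uses `bert-base-uncased` as a stand-in tokenizer and only illustrates the sentence-pair packing, the 512-token cap and the segment ids; it is not taken from the original preprocessing code.

```python
from transformers import BertTokenizer

# Stand-in tokenizer for illustration; the card describes a 30,000-entry
# uncased WordPiece vocabulary.
tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")

encoded = tokenizer(
    "Sentence A goes here.",   # segment A
    "Sentence B goes here.",   # segment B (the true next sentence or a random one)
    truncation=True,
    max_length=512,            # the two "sentences" are capped at 512 tokens combined
    return_tensors="pt",
)
print(tokenizer.decode(encoded["input_ids"][0]))  # [CLS] ... [SEP] ... [SEP]
print(encoded["token_type_ids"][0])               # 0s for segment A, 1s for segment B
```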
[ "# MultiBERTs Seed 4 Checkpoint 800k (uncased)\nSeed 4 intermediate checkpoint 800k MultiBERTs (pretrained BERT) model on English language using a masked language modeling (MLM) objective. It was introduced in\nthis paper and first released in\nthis repository. This is an intermediate checkpoint.\nThe final checkpoint can be found at multiberts-seed-4. This model is uncased: it does not make a difference\nbetween english and English.\n\nDisclaimer: The team releasing MultiBERTs did not write a model card for this model so this model card has been written by gchhablani.", "## Model description\nMultiBERTs models are transformers model pretrained on a large corpus of English data in a self-supervised fashion. This means it\nwas pretrained on the raw texts only, with no humans labelling them in any way (which is why it can use lots of\npublicly available data) with an automatic process to generate inputs and labels from those texts. More precisely, it\nwas pretrained with two objectives:\n- Masked language modeling (MLM): taking a sentence, the model randomly masks 15% of the words in the input then run\n the entire masked sentence through the model and has to predict the masked words. This is different from traditional\n recurrent neural networks (RNNs) that usually see the words one after the other, or from autoregressive models like\n GPT which internally mask the future tokens. It allows the model to learn a bidirectional representation of the\n sentence.\n- Next sentence prediction (NSP): the models concatenates two masked sentences as inputs during pretraining. Sometimes\n they correspond to sentences that were next to each other in the original text, sometimes not. The model then has to\n predict if the two sentences were following each other or not.\nThis way, the model learns an inner representation of the English language that can then be used to extract features\nuseful for downstream tasks: if you have a dataset of labeled sentences for instance, you can train a standard\nclassifier using the features produced by the MultiBERTs model as inputs.", "## Intended uses & limitations\nYou can use the raw model for either masked language modeling or next sentence prediction, but it's mostly intended to\nbe fine-tuned on a downstream task. See the model hub to look for\nfine-tuned versions on a task that interests you.\nNote that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked)\nto make decisions, such as sequence classification, token classification or question answering. For tasks such as text\ngeneration you should look at model like GPT2.", "### How to use\nHere is how to use this model to get the features of a given text in PyTorch:", "### Limitations and bias\nEven if the training data used for this model could be characterized as fairly neutral, this model can have biased\npredictions. This bias will also affect all fine-tuned versions of this model. For an understanding of bias of this particular\ncheckpoint, please try out this checkpoint with the snippet present in the Limitation and bias section of the bert-base-uncased checkpoint.", "## Training data\nThe MultiBERTs models were pretrained on BookCorpus, a dataset consisting of 11,038\nunpublished books and English Wikipedia (excluding lists, tables and\nheaders).", "## Training procedure", "### Preprocessing\nThe texts are lowercased and tokenized using WordPiece and a vocabulary size of 30,000. 
The inputs of the model are\nthen of the form:\n\nWith probability 0.5, sentence A and sentence B correspond to two consecutive sentences in the original corpus and in\nthe other cases, it's another random sentence in the corpus. Note that what is considered a sentence here is a\nconsecutive span of text usually longer than a single sentence. The only constrain is that the result with the two\n\"sentences\" has a combined length of less than 512 tokens.\nThe details of the masking procedure for each sentence are the following:\n- 15% of the tokens are masked.\n- In 80% of the cases, the masked tokens are replaced by '[MASK]'.\n- In 10% of the cases, the masked tokens are replaced by a random token (different) from the one they replace.\n- In the 10% remaining cases, the masked tokens are left as is.", "### Pretraining\nThe full model was trained on 16 Cloud TPU v2 chips for two million steps with a batch size\nof 256. The sequence length was set to 512 throughout. The optimizer\nused is Adam with a learning rate of 1e-4, \\\\(\\beta_{1} = 0.9\\\\) and \\\\(\\beta_{2} = 0.999\\\\), a weight decay of 0.01,\nlearning rate warmup for 10,000 steps and linear decay of the learning rate after.", "### BibTeX entry and citation info\n\n<a href=\"URL\n\t<img width=\"300px\" src=\"URL\n</a>" ]
[ "TAGS\n#transformers #pytorch #bert #pretraining #exbert #multiberts #multiberts-seed-4 #en #dataset-bookcorpus #dataset-wikipedia #arxiv-2106.16163 #license-apache-2.0 #endpoints_compatible #region-us \n", "# MultiBERTs Seed 4 Checkpoint 800k (uncased)\nSeed 4 intermediate checkpoint 800k MultiBERTs (pretrained BERT) model on English language using a masked language modeling (MLM) objective. It was introduced in\nthis paper and first released in\nthis repository. This is an intermediate checkpoint.\nThe final checkpoint can be found at multiberts-seed-4. This model is uncased: it does not make a difference\nbetween english and English.\n\nDisclaimer: The team releasing MultiBERTs did not write a model card for this model so this model card has been written by gchhablani.", "## Model description\nMultiBERTs models are transformers model pretrained on a large corpus of English data in a self-supervised fashion. This means it\nwas pretrained on the raw texts only, with no humans labelling them in any way (which is why it can use lots of\npublicly available data) with an automatic process to generate inputs and labels from those texts. More precisely, it\nwas pretrained with two objectives:\n- Masked language modeling (MLM): taking a sentence, the model randomly masks 15% of the words in the input then run\n the entire masked sentence through the model and has to predict the masked words. This is different from traditional\n recurrent neural networks (RNNs) that usually see the words one after the other, or from autoregressive models like\n GPT which internally mask the future tokens. It allows the model to learn a bidirectional representation of the\n sentence.\n- Next sentence prediction (NSP): the models concatenates two masked sentences as inputs during pretraining. Sometimes\n they correspond to sentences that were next to each other in the original text, sometimes not. The model then has to\n predict if the two sentences were following each other or not.\nThis way, the model learns an inner representation of the English language that can then be used to extract features\nuseful for downstream tasks: if you have a dataset of labeled sentences for instance, you can train a standard\nclassifier using the features produced by the MultiBERTs model as inputs.", "## Intended uses & limitations\nYou can use the raw model for either masked language modeling or next sentence prediction, but it's mostly intended to\nbe fine-tuned on a downstream task. See the model hub to look for\nfine-tuned versions on a task that interests you.\nNote that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked)\nto make decisions, such as sequence classification, token classification or question answering. For tasks such as text\ngeneration you should look at model like GPT2.", "### How to use\nHere is how to use this model to get the features of a given text in PyTorch:", "### Limitations and bias\nEven if the training data used for this model could be characterized as fairly neutral, this model can have biased\npredictions. This bias will also affect all fine-tuned versions of this model. 
For an understanding of bias of this particular\ncheckpoint, please try out this checkpoint with the snippet present in the Limitation and bias section of the bert-base-uncased checkpoint.", "## Training data\nThe MultiBERTs models were pretrained on BookCorpus, a dataset consisting of 11,038\nunpublished books and English Wikipedia (excluding lists, tables and\nheaders).", "## Training procedure", "### Preprocessing\nThe texts are lowercased and tokenized using WordPiece and a vocabulary size of 30,000. The inputs of the model are\nthen of the form:\n\nWith probability 0.5, sentence A and sentence B correspond to two consecutive sentences in the original corpus and in\nthe other cases, it's another random sentence in the corpus. Note that what is considered a sentence here is a\nconsecutive span of text usually longer than a single sentence. The only constrain is that the result with the two\n\"sentences\" has a combined length of less than 512 tokens.\nThe details of the masking procedure for each sentence are the following:\n- 15% of the tokens are masked.\n- In 80% of the cases, the masked tokens are replaced by '[MASK]'.\n- In 10% of the cases, the masked tokens are replaced by a random token (different) from the one they replace.\n- In the 10% remaining cases, the masked tokens are left as is.", "### Pretraining\nThe full model was trained on 16 Cloud TPU v2 chips for two million steps with a batch size\nof 256. The sequence length was set to 512 throughout. The optimizer\nused is Adam with a learning rate of 1e-4, \\\\(\\beta_{1} = 0.9\\\\) and \\\\(\\beta_{2} = 0.999\\\\), a weight decay of 0.01,\nlearning rate warmup for 10,000 steps and linear decay of the learning rate after.", "### BibTeX entry and citation info\n\n<a href=\"URL\n\t<img width=\"300px\" src=\"URL\n</a>" ]
[ 71, 145, 335, 134, 25, 95, 48, 3, 222, 111, 34 ]
[ "passage: TAGS\n#transformers #pytorch #bert #pretraining #exbert #multiberts #multiberts-seed-4 #en #dataset-bookcorpus #dataset-wikipedia #arxiv-2106.16163 #license-apache-2.0 #endpoints_compatible #region-us \n# MultiBERTs Seed 4 Checkpoint 800k (uncased)\nSeed 4 intermediate checkpoint 800k MultiBERTs (pretrained BERT) model on English language using a masked language modeling (MLM) objective. It was introduced in\nthis paper and first released in\nthis repository. This is an intermediate checkpoint.\nThe final checkpoint can be found at multiberts-seed-4. This model is uncased: it does not make a difference\nbetween english and English.\n\nDisclaimer: The team releasing MultiBERTs did not write a model card for this model so this model card has been written by gchhablani.", "passage: ## Model description\nMultiBERTs models are transformers model pretrained on a large corpus of English data in a self-supervised fashion. This means it\nwas pretrained on the raw texts only, with no humans labelling them in any way (which is why it can use lots of\npublicly available data) with an automatic process to generate inputs and labels from those texts. More precisely, it\nwas pretrained with two objectives:\n- Masked language modeling (MLM): taking a sentence, the model randomly masks 15% of the words in the input then run\n the entire masked sentence through the model and has to predict the masked words. This is different from traditional\n recurrent neural networks (RNNs) that usually see the words one after the other, or from autoregressive models like\n GPT which internally mask the future tokens. It allows the model to learn a bidirectional representation of the\n sentence.\n- Next sentence prediction (NSP): the models concatenates two masked sentences as inputs during pretraining. Sometimes\n they correspond to sentences that were next to each other in the original text, sometimes not. The model then has to\n predict if the two sentences were following each other or not.\nThis way, the model learns an inner representation of the English language that can then be used to extract features\nuseful for downstream tasks: if you have a dataset of labeled sentences for instance, you can train a standard\nclassifier using the features produced by the MultiBERTs model as inputs.## Intended uses & limitations\nYou can use the raw model for either masked language modeling or next sentence prediction, but it's mostly intended to\nbe fine-tuned on a downstream task. See the model hub to look for\nfine-tuned versions on a task that interests you.\nNote that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked)\nto make decisions, such as sequence classification, token classification or question answering. For tasks such as text\ngeneration you should look at model like GPT2.### How to use\nHere is how to use this model to get the features of a given text in PyTorch:### Limitations and bias\nEven if the training data used for this model could be characterized as fairly neutral, this model can have biased\npredictions. This bias will also affect all fine-tuned versions of this model. 
For an understanding of bias of this particular\ncheckpoint, please try out this checkpoint with the snippet present in the Limitation and bias section of the bert-base-uncased checkpoint.## Training data\nThe MultiBERTs models were pretrained on BookCorpus, a dataset consisting of 11,038\nunpublished books and English Wikipedia (excluding lists, tables and\nheaders).## Training procedure" ]
[ -0.08596145361661911, 0.0024296236224472523, -0.002098378026857972, 0.07132163643836975, 0.08741873502731323, 0.0045572915114462376, 0.11717785149812698, 0.04888714849948883, -0.025879420340061188, 0.023596886545419693, 0.09151358902454376, 0.024705808609724045, 0.04255326837301254, 0.061828821897506714, 0.09702642261981964, -0.25778093934059143, 0.04697206988930702, -0.06176639720797539, 0.0605136901140213, 0.07387398183345795, 0.09755201637744904, -0.07278810441493988, 0.06366290152072906, 0.0350995808839798, -0.08467021584510803, -0.015257883816957474, -0.017892887815833092, -0.035800330340862274, 0.10097451508045197, 0.07033082097768784, 0.061850689351558685, 0.000983482226729393, 0.05928204953670502, -0.08673400431871414, 0.015444640070199966, 0.043121468275785446, -0.002184004057198763, 0.02423461340367794, -0.0062905969098210335, 0.017183950170874596, 0.1081172302365303, 0.04044950380921364, 0.07715678215026855, 0.033695757389068604, -0.09470067173242569, -0.11764699965715408, -0.07945946604013443, 0.10055431723594666, 0.05463980883359909, 0.043747447431087494, -0.006020974367856979, 0.07165560126304626, -0.0302585419267416, 0.07525669038295746, 0.09739014506340027, -0.2589305341243744, -0.007817584089934826, 0.06438597291707993, 0.04208061099052429, 0.046714842319488525, 0.010971219278872013, 0.026004306972026825, 0.006969373673200607, 0.04526381939649582, 0.025922168046236038, -0.023414261639118195, 0.11623360216617584, -0.043911322951316833, -0.14880302548408508, -0.04209870100021362, 0.1202576756477356, -0.005909040570259094, -0.12476425617933273, -0.09958366304636002, -0.02929992787539959, 0.1097235307097435, -0.002961697056889534, -0.016332725062966347, -0.0023794379085302353, 0.010568900033831596, 0.023736750707030296, -0.08943285048007965, -0.08576520532369614, -0.02596677653491497, -0.03704972565174103, 0.12133623659610748, 0.046595171093940735, 0.053047072142362595, -0.033101361244916916, 0.08661843836307526, -0.11949333548545837, -0.040744055062532425, -0.05058953911066055, -0.08064502477645874, -0.017554108053445816, 0.01169660221785307, -0.027189750224351883, -0.0778801217675209, -0.05850424990057945, 0.11966171860694885, 0.035873472690582275, 0.03209590166807175, -0.0019410233944654465, 0.04054088518023491, 0.07128606736660004, 0.09645704925060272, -0.03927939012646675, 0.04504812881350517, 0.034263726323843, -0.021786969155073166, 0.05690354108810425, -0.05171957612037659, -0.10181240737438202, 0.07800596207380295, 0.001373525708913803, 0.03718431293964386, 0.02476593479514122, 0.03354083001613617, -0.009387670084834099, -0.07120227813720703, 0.16605329513549805, -0.07934659719467163, -0.008173920214176178, -0.01808479055762291, 0.011219875887036324, 0.04698774218559265, 0.0331488735973835, -0.005732378922402859, -0.048522479832172394, -0.006724368780851364, -0.054953183978796005, -0.02860148623585701, -0.05666984245181084, -0.11777010560035706, -0.001389628741890192, -0.03821422904729843, -0.03030865639448166, -0.14089450240135193, -0.2139975130558014, -0.020381879061460495, 0.06412146240472794, -0.0023952429182827473, -0.009815112687647343, 0.02457309700548649, 0.016412535682320595, -0.020993933081626892, 0.010362358763813972, -0.04633181914687157, -0.0009997831657528877, -0.007667560130357742, -0.02887401543557644, 0.05766618996858597, -0.04258529469370842, 0.023552797734737396, -0.06783550977706909, 0.021098440513014793, -0.21059025824069977, 0.0873027890920639, -0.03296073153614998, 0.0051925890147686005, -0.034897707402706146, -0.04489155113697052, 
0.013509079813957214, 0.047790877521038055, -0.009503686800599098, 0.11584734916687012, -0.13185183703899384, -0.04985400661826134, 0.17718487977981567, -0.15826664865016937, -0.0017037801444530487, 0.1021345928311348, -0.04813779145479202, 0.050816651433706284, 0.1341114640235901, 0.10019779950380325, 0.08613692224025726, -0.0697532668709755, 0.010202143341302872, 0.059931378811597824, -0.06615744531154633, 0.05310860276222229, 0.08801165223121643, -0.02418270707130432, -0.1368606984615326, 0.03212965279817581, -0.06901994347572327, -0.00985402800142765, -0.025132175534963608, -0.021405348554253578, 0.007546324282884598, -0.038155876100063324, 0.024261556565761566, 0.006564735434949398, 0.017843401059508324, -0.041448403149843216, -0.08275414258241653, 0.02271774783730507, 0.07455702126026154, -0.07082034647464752, 0.04050029441714287, -0.07220347970724106, 0.06093383952975273, -0.07594446837902069, -0.005420112982392311, -0.16476057469844818, -0.02530214935541153, 0.043941039592027664, -0.04251587390899658, 0.049064598977565765, 0.09173258394002914, 0.003487735753878951, 0.12408094108104706, -0.040008023381233215, 0.0016425605863332748, -0.004661824554204941, -0.010554734617471695, -0.05219869315624237, -0.11988583207130432, -0.08262363076210022, -0.0673212930560112, 0.10113133490085602, -0.07833162695169449, 0.028516817837953568, -0.0665857344865799, -0.02170347422361374, -0.009769100695848465, -0.05743394047021866, -0.006390208378434181, 0.011272178031504154, -0.02777826227247715, -0.04660290479660034, 0.048174917697906494, 0.049880459904670715, -0.059126317501068115, 0.07230813056230545, -0.10567430406808853, -0.05490129813551903, 0.056130312383174896, 0.014414344914257526, -0.07939185202121735, 0.0915277972817421, -0.019633881747722626, -0.014917878434062004, -0.05440595746040344, -0.0452660471200943, 0.1982131004333496, -0.025741394609212875, 0.10216803848743439, -0.08916415274143219, 0.0035846487153321505, 0.02829102985560894, -0.04389248043298721, -0.015039161778986454, 0.05933329835534096, 0.046400029212236404, -0.17892314493656158, 0.01352456584572792, 0.04885747283697128, 0.07751759886741638, 0.11327089369297028, 0.026348117738962173, -0.023267101496458054, -0.045002371072769165, -0.013061963021755219, 0.00486664567142725, 0.0565669909119606, -0.025632955133914948, -0.009620379656553268, 0.03243553638458252, 0.05690978839993477, 0.01981242746114731, -0.08182702213525772, 0.03726284205913544, 0.06615431606769562, -0.014319396577775478, -0.042597584426403046, -0.026701945811510086, -0.06017342954874039, 0.06175777316093445, 0.05160557106137276, 0.03360728919506073, 0.02648637443780899, -0.015581805258989334, -0.13536161184310913, 0.18823127448558807, -0.11417566239833832, -0.2542809844017029, -0.11171603202819824, -0.060210198163986206, -0.025435736402869225, 0.04101477190852165, 0.05766092985868454, -0.03165750950574875, -0.043765485286712646, -0.11600157618522644, 0.05725479871034622, -0.06525255739688873, -0.029351288452744484, -0.009905079379677773, -0.05228739231824875, -0.018443625420331955, -0.12650686502456665, -0.012600870802998543, -0.029173677787184715, -0.07806519418954849, 0.009226998314261436, -0.034336578100919724, 0.030090738087892532, 0.13571900129318237, 0.03472364693880081, -0.019503237679600716, -0.0163341723382473, 0.19108611345291138, 0.008776966482400894, 0.06208786368370056, 0.11209383606910706, -0.028930475935339928, 0.05315912514925003, 0.04630181938409805, 0.025595886632800102, -0.048752591013908386, 0.011480830609798431, -0.016273699700832367, 
-0.12084203213453293, -0.17449286580085754, -0.07061725854873657, -0.0034927385859191418, 0.007312789559364319, 0.02110101655125618, 0.03602881729602814, 0.022352878004312515, 0.04042661190032959, -0.028370320796966553, 0.032033391296863556, -0.015515714883804321, 0.0791778415441513, 0.027617670595645905, -0.0742703527212143, 0.09373235702514648, -0.062038470059633255, 0.015733370557427406, 0.10931824147701263, -0.059979647397994995, 0.19118233025074005, 0.025145838037133217, 0.05737577751278877, 0.09983030706644058, 0.02284061536192894, 0.05588563159108162, 0.08964972198009491, -0.046986665576696396, 0.005254941992461681, -0.0596158504486084, -0.05103757977485657, -0.03464161977171898, 0.05011644586920738, 0.028329843655228615, 0.017901916056871414, -0.12098146229982376, 0.01931401714682579, -0.003406461328268051, 0.1357860565185547, 0.045996762812137604, -0.12495598196983337, -0.12182530760765076, 0.03448299318552017, -0.045327216386795044, -0.06336459517478943, 0.031286485493183136, 0.05907363444566727, -0.1529543101787567, 0.046538595110177994, -0.00477463286370039, 0.06650759279727936, -0.09574143588542938, 0.017141973599791527, -0.041585467755794525, -0.0011512329801917076, 0.005203761160373688, 0.07051412761211395, -0.13264517486095428, 0.10193853825330734, 0.021762998774647713, 0.05026717111468315, -0.08176402747631073, 0.016459539532661438, -0.010242769494652748, 0.10933328419923782, 0.11422725766897202, 0.04321267455816269, -0.05822376161813736, -0.01998935267329216, -0.046167388558387756, 0.023119451478123665, 0.05752222239971161, -0.07585141062736511, 0.06275635957717896, 0.010600253939628601, 0.007383115589618683, -0.02420596219599247, 0.012616034597158432, -0.13063395023345947, -0.1254594624042511, 0.06141127273440361, -0.07350530475378036, -0.09841332584619522, -0.056389596313238144, -0.0625409483909607, -0.05048235505819321, 0.209285169839859, -0.11681494116783142, -0.09193283319473267, -0.09855006635189056, -0.014670759439468384, 0.046350155025720596, -0.06529946625232697, 0.046842437237501144, -0.03526133671402931, 0.09056669473648071, -0.04546533152461052, -0.10866355895996094, 0.034812264144420624, -0.11333465576171875, -0.11434778571128845, -0.04244466871023178, 0.10632669180631638, 0.11300241947174072, 0.036687854677438736, 0.011532762087881565, 0.01313089020550251, -0.0025813132524490356, -0.11848638206720352, 0.01707443781197071, 0.1289183795452118, -0.0016479864716529846, 0.0708051323890686, -0.0614747628569603, 0.029550567269325256, -0.014660108834505081, -0.0003739725798368454, 0.1285010725259781, 0.18344463407993317, -0.06251192092895508, 0.1721460223197937, 0.20181310176849365, -0.1054467037320137, -0.1892571747303009, -0.054356031119823456, -0.002498554065823555, 0.04728681594133377, 0.048708122223615646, -0.1863541603088379, 0.09306301921606064, 0.03283095359802246, -0.030526982620358467, 0.027032382786273956, -0.22653639316558838, -0.11082383990287781, 0.090383380651474, 0.0542575940489769, 0.1908254623413086, -0.08190416544675827, -0.03954470530152321, -0.017311034724116325, -0.04248129576444626, 0.04057202488183975, -0.04153083264827728, 0.08995325863361359, 0.005143243819475174, -0.02665133960545063, 0.002269943244755268, -0.03112955018877983, 0.09593109041452408, 0.04392053931951523, 0.02378655970096588, -0.07129333913326263, -0.007667373865842819, 0.10886107385158539, -0.03775112330913544, 0.09777463972568512, 0.04077749699354172, 0.07344509661197662, -0.10396209359169006, -0.058949295431375504, -0.07444380968809128, 0.04239865019917488, 
-0.041838981211185455, -0.05356326699256897, -0.06207440048456192, 0.056115828454494476, 0.03657175227999687, 0.010754807852208614, -0.0023924894630908966, -0.036912284791469574, 0.043191246688365936, 0.0938517153263092, 0.07962407916784286, -0.03819889575242996, -0.0722326710820198, -0.05102900415658951, -0.04873925819993019, 0.06679922342300415, -0.09194058179855347, 0.017530731856822968, 0.029698027297854424, 0.010202566161751747, 0.08862937986850739, 0.03420907258987427, -0.1363726705312729, 0.011572686955332756, 0.035558559000492096, -0.1245059221982956, -0.10756994783878326, -0.020626317709684372, 0.02713126316666603, -0.03598996624350548, 0.05470217019319534, 0.14681851863861084, -0.03672990947961807, -0.030862603336572647, -0.04854211211204529, 0.03744758293032646, -0.020029990002512932, 0.05132623761892319, 0.06554426997900009, 0.030734987929463387, -0.07439391314983368, 0.07558512687683105, 0.039813145995140076, -0.0383334755897522, 0.041583266109228134, 0.04322882741689682, -0.09645320475101471, -0.07880528271198273, -0.06192605942487717, 0.09453818202018738, -0.021563401445746422, -0.04931414872407913, -0.0033808592706918716, -0.08219432830810547, 0.06745051592588425, 0.06370902061462402, 0.04722502827644348, 0.03519914299249649, -0.08654798567295074, 0.014903620816767216, -0.05535656213760376, 0.03503534570336342, -0.03121924214065075, -0.0036102700978517532, -0.05495733767747879, 0.06418606638908386, 0.0632108673453331, 0.09733723849058151, -0.03403091058135033, -0.07488537579774857, -0.0832139328122139, -0.013112863525748253, -0.061161212623119354, -0.03332527354359627, -0.07294750958681107, -0.0055033257231116295, 0.0013046185486018658, -0.0020190011709928513, 0.020882796496152878, 0.037209417670965195, -0.04352857917547226, -0.019245512783527374, -0.033908527344465256, 0.037494104355573654, -0.05880614370107651, 0.00611039437353611, 0.014937186613678932, -0.03640111908316612, 0.09240643680095673, 0.035088177770376205, -0.012249335646629333, 0.046330466866493225, -0.018481969833374023, 0.033465079963207245, -0.021465666592121124, 0.0005470635369420052, -0.02277836576104164, -0.10620667040348053, -0.005327316001057625, 0.004394419491291046, -0.024839522317051888, 0.010060071013867855, 0.057453133165836334, -0.0733698382973671, 0.08456461876630783, 0.04629035294055939, -0.0298372320830822, -0.07263310998678207, 0.04133099317550659, -0.015224670991301537, 0.02777022123336792, 0.06930022686719894, -0.034856509417295456, 0.05266977846622467, -0.09968438744544983, -0.028931867331266403, 0.003679902059957385, -0.005218088626861572, -0.006539739668369293, -0.05368383228778839, -0.001878052018582821, 0.005771729163825512, 0.17679022252559662, -0.024543460458517075, 0.036340776830911636, 0.012589083053171635, 0.008655475452542305, 0.04521871730685234, -0.013236965984106064, 0.07360208034515381, -0.006987236440181732, -0.02607710473239422, -0.014587492682039738, 0.03700120747089386, 0.0041886139661073685, 0.006037473678588867, 0.13549119234085083, 0.04543110728263855, 0.09202118217945099, 0.07519333064556122, 0.018465697765350342, 0.01664506271481514, -0.13231152296066284, -0.0889790803194046, 0.007850248366594315, 0.061582211405038834, -0.018513869494199753, 0.009686483070254326, 0.09169436991214752, -0.0862714946269989, 0.07108370959758759, 0.047827523201704025, -0.04817464202642441, -0.12323223054409027, -0.18671682476997375, -0.025035548955202103, -0.03164755553007126, -0.010694468393921852, -0.09072287380695343, 0.01623665913939476, 0.09559177607297897, 0.024362098425626755, 
-0.011487123556435108, 0.09629085659980774, -0.10510026663541794, -0.028918378055095673, 0.045460134744644165, -0.02734476886689663, 0.014941675588488579, 0.0511302724480629, 0.02406539022922516, -0.006159858778119087, 0.04227938875555992, 0.03961852565407753, 0.04233863949775696, 0.02556285820901394, 0.0520254410803318, -0.021634239703416824, -0.07331368327140808, -0.03301297873258591, -0.00385756092146039, 0.05443274974822998, 0.1322936862707138, 0.021405726671218872, -0.07024939358234406, 0.006162793841212988, 0.10795439779758453, -0.03299250081181526, -0.04854527860879898, -0.10707929730415344, 0.23724451661109924, 0.022660762071609497, 0.001362764509394765, -0.005908598192036152, -0.046795111149549484, 0.006713733077049255, 0.21206358075141907, 0.22430205345153809, 0.0030888053588569164, -0.010885030031204224, 0.009103907272219658, -0.011282606050372124, 0.03793987259268761, 0.1468333899974823, 0.004735616967082024, 0.2513120472431183, -0.04636404663324356, 0.038760099560022354, -0.04186202585697174, -0.039537884294986725, -0.10128490626811981, 0.07464320957660675, -0.008943041786551476, 0.007540718652307987, -0.03154875710606575, 0.0702376589179039, -0.03655785694718361, -0.17170175909996033, 0.00423117820173502, -0.0025157630443573, -0.06397600471973419, 0.010934151709079742, 0.0019554486498236656, 0.0213947631418705, 0.08370941132307053, -0.018708743155002594, -0.006479698698967695, 0.1321805864572525, 0.017584873363375664, -0.09879585355520248, -0.058147162199020386, 0.11257189512252808, 0.01787954941391945, 0.1403963267803192, 0.01170177198946476, 0.08091944456100464, 0.08789437264204025, 0.021620873361825943, -0.09544230997562408, 0.042851585894823074, -0.020889557898044586, -0.030938096344470978, 0.008277863264083862, 0.10871552675962448, -0.008173073641955853, 0.05511120334267616, 0.02628549002110958, -0.08775517344474792, 0.060046710073947906, 0.012058891355991364, -0.034560516476631165, -0.08102504163980484, 0.08251137286424637, -0.09074752777814865, 0.1568484902381897, 0.12355343997478485, -0.015044848434627056, -0.0461437851190567, -0.027064567431807518, 0.020166825503110886, -0.0002949959598481655, 0.05591478571295738, -0.025049882009625435, -0.13387437164783478, 0.019508130848407745, -0.08307869732379913, 0.026512980461120605, -0.250537633895874, -0.08741655945777893, 0.027926072478294373, -0.01925758644938469, -0.02020972967147827, 0.05279484763741493, 0.048771269619464874, 0.026216063648462296, -0.03678788244724274, 0.021106000989675522, -0.03752080351114273, 0.05886981636285782, -0.11156830936670303, -0.0951903760433197 ]
null
null
transformers
# MultiBERTs Seed 4 Checkpoint 80k (uncased)
Seed 4 intermediate checkpoint 80k MultiBERTs (pretrained BERT) model on English language using a masked language modeling (MLM) objective. It was introduced in
[this paper](https://arxiv.org/pdf/2106.16163.pdf) and first released in
[this repository](https://github.com/google-research/language/tree/master/language/multiberts). This is an intermediate checkpoint.
The final checkpoint can be found at [multiberts-seed-4](https://hf.co/multberts-seed-4). This model is uncased: it does not make a difference
between english and English.

Disclaimer: The team releasing MultiBERTs did not write a model card for this model so this model card has been written by
[gchhablani](https://hf.co/gchhablani).

## Model description
MultiBERTs models are transformer models pretrained on a large corpus of English data in a self-supervised fashion. This means they
were pretrained on the raw texts only, with no humans labelling them in any way (which is why they can use lots of
publicly available data) with an automatic process to generate inputs and labels from those texts. More precisely, they
were pretrained with two objectives:
- Masked language modeling (MLM): taking a sentence, the model randomly masks 15% of the words in the input then runs
  the entire masked sentence through the model and has to predict the masked words. This is different from traditional
  recurrent neural networks (RNNs) that usually see the words one after the other, or from autoregressive models like
  GPT which internally mask the future tokens. It allows the model to learn a bidirectional representation of the
  sentence.
- Next sentence prediction (NSP): the model concatenates two masked sentences as inputs during pretraining. Sometimes
  they correspond to sentences that were next to each other in the original text, sometimes not. The model then has to
  predict if the two sentences were following each other or not.
This way, the model learns an inner representation of the English language that can then be used to extract features
useful for downstream tasks: if you have a dataset of labeled sentences for instance, you can train a standard
classifier using the features produced by the MultiBERTs model as inputs.

## Intended uses & limitations
You can use the raw model for either masked language modeling or next sentence prediction, but it's mostly intended to
be fine-tuned on a downstream task. See the [model hub](https://huggingface.co/models?filter=multiberts) to look for
fine-tuned versions on a task that interests you.
Note that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked)
to make decisions, such as sequence classification, token classification or question answering. For tasks such as text
generation you should look at models like GPT2.

### How to use
Here is how to use this model to get the features of a given text in PyTorch:
```python
from transformers import BertTokenizer, BertModel
tokenizer = BertTokenizer.from_pretrained('multiberts-seed-4-80k')
model = BertModel.from_pretrained("multiberts-seed-4-80k")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='pt')
output = model(**encoded_input)
```

### Limitations and bias
Even if the training data used for this model could be characterized as fairly neutral, this model can have biased
predictions. This bias will also affect all fine-tuned versions of this model.
For an understanding of bias of this particular
checkpoint, please try out this checkpoint with the snippet present in the [Limitation and bias section](https://huggingface.co/bert-base-uncased#limitations-and-bias) of the [bert-base-uncased](https://huggingface.co/bert-base-uncased) checkpoint.

## Training data
The MultiBERTs models were pretrained on [BookCorpus](https://yknzhu.wixsite.com/mbweb), a dataset consisting of 11,038
unpublished books and [English Wikipedia](https://en.wikipedia.org/wiki/English_Wikipedia) (excluding lists, tables and
headers).

## Training procedure

### Preprocessing
The texts are lowercased and tokenized using WordPiece and a vocabulary size of 30,000. The inputs of the model are
then of the form:

```
[CLS] Sentence A [SEP] Sentence B [SEP]
```

With probability 0.5, sentence A and sentence B correspond to two consecutive sentences in the original corpus and in
the other cases, it's another random sentence in the corpus. Note that what is considered a sentence here is a
consecutive span of text usually longer than a single sentence. The only constraint is that the result with the two
"sentences" has a combined length of less than 512 tokens.
The details of the masking procedure for each sentence are the following:
- 15% of the tokens are masked.
- In 80% of the cases, the masked tokens are replaced by `[MASK]`.
- In 10% of the cases, the masked tokens are replaced by a random token (different) from the one they replace.
- In the 10% remaining cases, the masked tokens are left as is.

### Pretraining
The full model was trained on 16 Cloud TPU v2 chips for two million steps with a batch size
of 256. The sequence length was set to 512 throughout. The optimizer
used is Adam with a learning rate of 1e-4, \\(\beta_{1} = 0.9\\) and \\(\beta_{2} = 0.999\\), a weight decay of 0.01,
learning rate warmup for 10,000 steps and linear decay of the learning rate after.

### BibTeX entry and citation info

```bibtex
@article{DBLP:journals/corr/abs-2106-16163,
  author    = {Thibault Sellam and
               Steve Yadlowsky and
               Jason Wei and
               Naomi Saphra and
               Alexander D'Amour and
               Tal Linzen and
               Jasmijn Bastings and
               Iulia Turc and
               Jacob Eisenstein and
               Dipanjan Das and
               Ian Tenney and
               Ellie Pavlick},
  title     = {The MultiBERTs: {BERT} Reproductions for Robustness Analysis},
  journal   = {CoRR},
  volume    = {abs/2106.16163},
  year      = {2021},
  url       = {https://arxiv.org/abs/2106.16163},
  eprinttype = {arXiv},
  eprint    = {2106.16163},
  timestamp = {Mon, 05 Jul 2021 15:15:50 +0200},
  biburl    = {https://dblp.org/rec/journals/corr/abs-2106-16163.bib},
  bibsource = {dblp computer science bibliography, https://dblp.org}
}
```

<a href="https://huggingface.co/exbert/?model=multiberts">
	<img width="300px" src="https://cdn-media.huggingface.co/exbert/button.png">
</a>
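As a complement to the feature-extraction snippet above, the MLM head can be exercised directly with the fill-mask pipeline, which is also the kind of snippet the Limitations and bias note points to for bert-base-uncased. This is a hedged sketch: the checkpoint identifier is reused from the How-to-use section and may need to be replaced by the full hub id of this checkpoint.

```python
# A minimal fill-mask sketch for the MLM objective described above.
# Assumption: 'multiberts-seed-4-80k' resolves to this checkpoint on the hub;
# substitute the full repository id if it does not.
from transformers import pipeline

unmasker = pipeline("fill-mask", model="multiberts-seed-4-80k")
print(unmasker("The man worked as a [MASK]."))
print(unmasker("The woman worked as a [MASK]."))
```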
{"language": "en", "license": "apache-2.0", "tags": ["exbert", "multiberts", "multiberts-seed-4"], "datasets": ["bookcorpus", "wikipedia"]}
null
MultiBertGunjanPatrick/multiberts-seed-4-80k
[ "transformers", "pytorch", "bert", "pretraining", "exbert", "multiberts", "multiberts-seed-4", "en", "dataset:bookcorpus", "dataset:wikipedia", "arxiv:2106.16163", "license:apache-2.0", "endpoints_compatible", "region:us" ]
2022-03-02T23:29:04+00:00
[ "2106.16163" ]
[ "en" ]
TAGS #transformers #pytorch #bert #pretraining #exbert #multiberts #multiberts-seed-4 #en #dataset-bookcorpus #dataset-wikipedia #arxiv-2106.16163 #license-apache-2.0 #endpoints_compatible #region-us
# MultiBERTs Seed 4 Checkpoint 80k (uncased) Seed 4 intermediate checkpoint 80k MultiBERTs (pretrained BERT) model on English language using a masked language modeling (MLM) objective. It was introduced in this paper and first released in this repository. This is an intermediate checkpoint. The final checkpoint can be found at multiberts-seed-4. This model is uncased: it does not make a difference between english and English. Disclaimer: The team releasing MultiBERTs did not write a model card for this model so this model card has been written by gchhablani. ## Model description MultiBERTs models are transformers model pretrained on a large corpus of English data in a self-supervised fashion. This means it was pretrained on the raw texts only, with no humans labelling them in any way (which is why it can use lots of publicly available data) with an automatic process to generate inputs and labels from those texts. More precisely, it was pretrained with two objectives: - Masked language modeling (MLM): taking a sentence, the model randomly masks 15% of the words in the input then run the entire masked sentence through the model and has to predict the masked words. This is different from traditional recurrent neural networks (RNNs) that usually see the words one after the other, or from autoregressive models like GPT which internally mask the future tokens. It allows the model to learn a bidirectional representation of the sentence. - Next sentence prediction (NSP): the models concatenates two masked sentences as inputs during pretraining. Sometimes they correspond to sentences that were next to each other in the original text, sometimes not. The model then has to predict if the two sentences were following each other or not. This way, the model learns an inner representation of the English language that can then be used to extract features useful for downstream tasks: if you have a dataset of labeled sentences for instance, you can train a standard classifier using the features produced by the MultiBERTs model as inputs. ## Intended uses & limitations You can use the raw model for either masked language modeling or next sentence prediction, but it's mostly intended to be fine-tuned on a downstream task. See the model hub to look for fine-tuned versions on a task that interests you. Note that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked) to make decisions, such as sequence classification, token classification or question answering. For tasks such as text generation you should look at model like GPT2. ### How to use Here is how to use this model to get the features of a given text in PyTorch: ### Limitations and bias Even if the training data used for this model could be characterized as fairly neutral, this model can have biased predictions. This bias will also affect all fine-tuned versions of this model. For an understanding of bias of this particular checkpoint, please try out this checkpoint with the snippet present in the Limitation and bias section of the bert-base-uncased checkpoint. ## Training data The MultiBERTs models were pretrained on BookCorpus, a dataset consisting of 11,038 unpublished books and English Wikipedia (excluding lists, tables and headers). ## Training procedure ### Preprocessing The texts are lowercased and tokenized using WordPiece and a vocabulary size of 30,000. 
The inputs of the model are then of the form: With probability 0.5, sentence A and sentence B correspond to two consecutive sentences in the original corpus and in the other cases, it's another random sentence in the corpus. Note that what is considered a sentence here is a consecutive span of text usually longer than a single sentence. The only constrain is that the result with the two "sentences" has a combined length of less than 512 tokens. The details of the masking procedure for each sentence are the following: - 15% of the tokens are masked. - In 80% of the cases, the masked tokens are replaced by '[MASK]'. - In 10% of the cases, the masked tokens are replaced by a random token (different) from the one they replace. - In the 10% remaining cases, the masked tokens are left as is. ### Pretraining The full model was trained on 16 Cloud TPU v2 chips for two million steps with a batch size of 256. The sequence length was set to 512 throughout. The optimizer used is Adam with a learning rate of 1e-4, \\(\beta_{1} = 0.9\\) and \\(\beta_{2} = 0.999\\), a weight decay of 0.01, learning rate warmup for 10,000 steps and linear decay of the learning rate after. ### BibTeX entry and citation info <a href="URL <img width="300px" src="URL </a>
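The pretraining hyperparameters listed above (Adam, learning rate 1e-4, \\(\beta_{1} = 0.9\\), \\(\beta_{2} = 0.999\\), weight decay 0.01, 10,000 warmup steps, linear decay over the two million total steps) map onto standard Hugging Face utilities. The sketch below is an approximation under those stated values, not the original TPU training code; AdamW stands in for Adam with decoupled weight decay, and the `bert-base-uncased` weights are only a placeholder.

```python
# Hedged sketch of the optimizer / learning-rate schedule described above;
# not the original MultiBERTs TPU training code.
import torch
from transformers import BertForPreTraining, get_linear_schedule_with_warmup

model = BertForPreTraining.from_pretrained("bert-base-uncased")  # placeholder weights

optimizer = torch.optim.AdamW(
    model.parameters(), lr=1e-4, betas=(0.9, 0.999), weight_decay=0.01
)
scheduler = get_linear_schedule_with_warmup(
    optimizer,
    num_warmup_steps=10_000,       # linear warmup for 10,000 steps
    num_training_steps=2_000_000,  # then linear decay over two million total steps
)
# Inside a training loop one would call:
#   loss.backward(); optimizer.step(); scheduler.step(); optimizer.zero_grad()
```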
[ "# MultiBERTs Seed 4 Checkpoint 80k (uncased)\nSeed 4 intermediate checkpoint 80k MultiBERTs (pretrained BERT) model on English language using a masked language modeling (MLM) objective. It was introduced in\nthis paper and first released in\nthis repository. This is an intermediate checkpoint.\nThe final checkpoint can be found at multiberts-seed-4. This model is uncased: it does not make a difference\nbetween english and English.\n\nDisclaimer: The team releasing MultiBERTs did not write a model card for this model so this model card has been written by gchhablani.", "## Model description\nMultiBERTs models are transformers model pretrained on a large corpus of English data in a self-supervised fashion. This means it\nwas pretrained on the raw texts only, with no humans labelling them in any way (which is why it can use lots of\npublicly available data) with an automatic process to generate inputs and labels from those texts. More precisely, it\nwas pretrained with two objectives:\n- Masked language modeling (MLM): taking a sentence, the model randomly masks 15% of the words in the input then run\n the entire masked sentence through the model and has to predict the masked words. This is different from traditional\n recurrent neural networks (RNNs) that usually see the words one after the other, or from autoregressive models like\n GPT which internally mask the future tokens. It allows the model to learn a bidirectional representation of the\n sentence.\n- Next sentence prediction (NSP): the models concatenates two masked sentences as inputs during pretraining. Sometimes\n they correspond to sentences that were next to each other in the original text, sometimes not. The model then has to\n predict if the two sentences were following each other or not.\nThis way, the model learns an inner representation of the English language that can then be used to extract features\nuseful for downstream tasks: if you have a dataset of labeled sentences for instance, you can train a standard\nclassifier using the features produced by the MultiBERTs model as inputs.", "## Intended uses & limitations\nYou can use the raw model for either masked language modeling or next sentence prediction, but it's mostly intended to\nbe fine-tuned on a downstream task. See the model hub to look for\nfine-tuned versions on a task that interests you.\nNote that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked)\nto make decisions, such as sequence classification, token classification or question answering. For tasks such as text\ngeneration you should look at model like GPT2.", "### How to use\nHere is how to use this model to get the features of a given text in PyTorch:", "### Limitations and bias\nEven if the training data used for this model could be characterized as fairly neutral, this model can have biased\npredictions. This bias will also affect all fine-tuned versions of this model. For an understanding of bias of this particular\ncheckpoint, please try out this checkpoint with the snippet present in the Limitation and bias section of the bert-base-uncased checkpoint.", "## Training data\nThe MultiBERTs models were pretrained on BookCorpus, a dataset consisting of 11,038\nunpublished books and English Wikipedia (excluding lists, tables and\nheaders).", "## Training procedure", "### Preprocessing\nThe texts are lowercased and tokenized using WordPiece and a vocabulary size of 30,000. 
The inputs of the model are\nthen of the form:\n\nWith probability 0.5, sentence A and sentence B correspond to two consecutive sentences in the original corpus and in\nthe other cases, it's another random sentence in the corpus. Note that what is considered a sentence here is a\nconsecutive span of text usually longer than a single sentence. The only constrain is that the result with the two\n\"sentences\" has a combined length of less than 512 tokens.\nThe details of the masking procedure for each sentence are the following:\n- 15% of the tokens are masked.\n- In 80% of the cases, the masked tokens are replaced by '[MASK]'.\n- In 10% of the cases, the masked tokens are replaced by a random token (different) from the one they replace.\n- In the 10% remaining cases, the masked tokens are left as is.", "### Pretraining\nThe full model was trained on 16 Cloud TPU v2 chips for two million steps with a batch size\nof 256. The sequence length was set to 512 throughout. The optimizer\nused is Adam with a learning rate of 1e-4, \\\\(\\beta_{1} = 0.9\\\\) and \\\\(\\beta_{2} = 0.999\\\\), a weight decay of 0.01,\nlearning rate warmup for 10,000 steps and linear decay of the learning rate after.", "### BibTeX entry and citation info\n\n<a href=\"URL\n\t<img width=\"300px\" src=\"URL\n</a>" ]
[ "TAGS\n#transformers #pytorch #bert #pretraining #exbert #multiberts #multiberts-seed-4 #en #dataset-bookcorpus #dataset-wikipedia #arxiv-2106.16163 #license-apache-2.0 #endpoints_compatible #region-us \n", "# MultiBERTs Seed 4 Checkpoint 80k (uncased)\nSeed 4 intermediate checkpoint 80k MultiBERTs (pretrained BERT) model on English language using a masked language modeling (MLM) objective. It was introduced in\nthis paper and first released in\nthis repository. This is an intermediate checkpoint.\nThe final checkpoint can be found at multiberts-seed-4. This model is uncased: it does not make a difference\nbetween english and English.\n\nDisclaimer: The team releasing MultiBERTs did not write a model card for this model so this model card has been written by gchhablani.", "## Model description\nMultiBERTs models are transformers model pretrained on a large corpus of English data in a self-supervised fashion. This means it\nwas pretrained on the raw texts only, with no humans labelling them in any way (which is why it can use lots of\npublicly available data) with an automatic process to generate inputs and labels from those texts. More precisely, it\nwas pretrained with two objectives:\n- Masked language modeling (MLM): taking a sentence, the model randomly masks 15% of the words in the input then run\n the entire masked sentence through the model and has to predict the masked words. This is different from traditional\n recurrent neural networks (RNNs) that usually see the words one after the other, or from autoregressive models like\n GPT which internally mask the future tokens. It allows the model to learn a bidirectional representation of the\n sentence.\n- Next sentence prediction (NSP): the models concatenates two masked sentences as inputs during pretraining. Sometimes\n they correspond to sentences that were next to each other in the original text, sometimes not. The model then has to\n predict if the two sentences were following each other or not.\nThis way, the model learns an inner representation of the English language that can then be used to extract features\nuseful for downstream tasks: if you have a dataset of labeled sentences for instance, you can train a standard\nclassifier using the features produced by the MultiBERTs model as inputs.", "## Intended uses & limitations\nYou can use the raw model for either masked language modeling or next sentence prediction, but it's mostly intended to\nbe fine-tuned on a downstream task. See the model hub to look for\nfine-tuned versions on a task that interests you.\nNote that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked)\nto make decisions, such as sequence classification, token classification or question answering. For tasks such as text\ngeneration you should look at model like GPT2.", "### How to use\nHere is how to use this model to get the features of a given text in PyTorch:", "### Limitations and bias\nEven if the training data used for this model could be characterized as fairly neutral, this model can have biased\npredictions. This bias will also affect all fine-tuned versions of this model. 
For an understanding of bias of this particular\ncheckpoint, please try out this checkpoint with the snippet present in the Limitation and bias section of the bert-base-uncased checkpoint.", "## Training data\nThe MultiBERTs models were pretrained on BookCorpus, a dataset consisting of 11,038\nunpublished books and English Wikipedia (excluding lists, tables and\nheaders).", "## Training procedure", "### Preprocessing\nThe texts are lowercased and tokenized using WordPiece and a vocabulary size of 30,000. The inputs of the model are\nthen of the form:\n\nWith probability 0.5, sentence A and sentence B correspond to two consecutive sentences in the original corpus and in\nthe other cases, it's another random sentence in the corpus. Note that what is considered a sentence here is a\nconsecutive span of text usually longer than a single sentence. The only constrain is that the result with the two\n\"sentences\" has a combined length of less than 512 tokens.\nThe details of the masking procedure for each sentence are the following:\n- 15% of the tokens are masked.\n- In 80% of the cases, the masked tokens are replaced by '[MASK]'.\n- In 10% of the cases, the masked tokens are replaced by a random token (different) from the one they replace.\n- In the 10% remaining cases, the masked tokens are left as is.", "### Pretraining\nThe full model was trained on 16 Cloud TPU v2 chips for two million steps with a batch size\nof 256. The sequence length was set to 512 throughout. The optimizer\nused is Adam with a learning rate of 1e-4, \\\\(\\beta_{1} = 0.9\\\\) and \\\\(\\beta_{2} = 0.999\\\\), a weight decay of 0.01,\nlearning rate warmup for 10,000 steps and linear decay of the learning rate after.", "### BibTeX entry and citation info\n\n<a href=\"URL\n\t<img width=\"300px\" src=\"URL\n</a>" ]
[ 71, 145, 335, 134, 25, 95, 48, 3, 222, 111, 34 ]
[ "passage: TAGS\n#transformers #pytorch #bert #pretraining #exbert #multiberts #multiberts-seed-4 #en #dataset-bookcorpus #dataset-wikipedia #arxiv-2106.16163 #license-apache-2.0 #endpoints_compatible #region-us \n# MultiBERTs Seed 4 Checkpoint 80k (uncased)\nSeed 4 intermediate checkpoint 80k MultiBERTs (pretrained BERT) model on English language using a masked language modeling (MLM) objective. It was introduced in\nthis paper and first released in\nthis repository. This is an intermediate checkpoint.\nThe final checkpoint can be found at multiberts-seed-4. This model is uncased: it does not make a difference\nbetween english and English.\n\nDisclaimer: The team releasing MultiBERTs did not write a model card for this model so this model card has been written by gchhablani.", "passage: ## Model description\nMultiBERTs models are transformers model pretrained on a large corpus of English data in a self-supervised fashion. This means it\nwas pretrained on the raw texts only, with no humans labelling them in any way (which is why it can use lots of\npublicly available data) with an automatic process to generate inputs and labels from those texts. More precisely, it\nwas pretrained with two objectives:\n- Masked language modeling (MLM): taking a sentence, the model randomly masks 15% of the words in the input then run\n the entire masked sentence through the model and has to predict the masked words. This is different from traditional\n recurrent neural networks (RNNs) that usually see the words one after the other, or from autoregressive models like\n GPT which internally mask the future tokens. It allows the model to learn a bidirectional representation of the\n sentence.\n- Next sentence prediction (NSP): the models concatenates two masked sentences as inputs during pretraining. Sometimes\n they correspond to sentences that were next to each other in the original text, sometimes not. The model then has to\n predict if the two sentences were following each other or not.\nThis way, the model learns an inner representation of the English language that can then be used to extract features\nuseful for downstream tasks: if you have a dataset of labeled sentences for instance, you can train a standard\nclassifier using the features produced by the MultiBERTs model as inputs.## Intended uses & limitations\nYou can use the raw model for either masked language modeling or next sentence prediction, but it's mostly intended to\nbe fine-tuned on a downstream task. See the model hub to look for\nfine-tuned versions on a task that interests you.\nNote that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked)\nto make decisions, such as sequence classification, token classification or question answering. For tasks such as text\ngeneration you should look at model like GPT2.### How to use\nHere is how to use this model to get the features of a given text in PyTorch:### Limitations and bias\nEven if the training data used for this model could be characterized as fairly neutral, this model can have biased\npredictions. This bias will also affect all fine-tuned versions of this model. 
For an understanding of bias of this particular\ncheckpoint, please try out this checkpoint with the snippet present in the Limitation and bias section of the bert-base-uncased checkpoint.## Training data\nThe MultiBERTs models were pretrained on BookCorpus, a dataset consisting of 11,038\nunpublished books and English Wikipedia (excluding lists, tables and\nheaders).## Training procedure" ]
[ -0.08655740320682526, 0.005922957323491573, -0.0021359738893806934, 0.06919871270656586, 0.08611676096916199, 0.003958795685321093, 0.11927510052919388, 0.049434587359428406, -0.024786073714494705, 0.025257088243961334, 0.09112359583377838, 0.029679331928491592, 0.041977133601903915, 0.06460592150688171, 0.09515079110860825, -0.25796109437942505, 0.047959815710783005, -0.06098619103431702, 0.05718638747930527, 0.07436268031597137, 0.09771163761615753, -0.07297371327877045, 0.06316393613815308, 0.03477337956428528, -0.08481556177139282, -0.01544494554400444, -0.01689257100224495, -0.03568122163414955, 0.10114492475986481, 0.06935325264930725, 0.06207264959812164, 0.0008971653878688812, 0.0579833909869194, -0.08633363246917725, 0.016013458371162415, 0.04461382329463959, -0.002779441885650158, 0.025269420817494392, -0.005338199436664581, 0.01717490889132023, 0.11099902540445328, 0.03746035695075989, 0.07687176764011383, 0.03416290879249573, -0.09451921284198761, -0.12238511443138123, -0.07935234904289246, 0.10210921615362167, 0.05328159034252167, 0.04229161888360977, -0.007145747542381287, 0.07182073593139648, -0.029588118195533752, 0.07505522668361664, 0.10164329409599304, -0.2626507878303528, -0.007213934790343046, 0.06384462863206863, 0.045363061130046844, 0.04606679826974869, 0.010849074460566044, 0.027699222788214684, 0.00651964545249939, 0.04413742944598198, 0.027965739369392395, -0.02393878996372223, 0.12157022207975388, -0.04246474429965019, -0.14985164999961853, -0.04239353537559509, 0.12396091967821121, -0.004939870908856392, -0.1254006177186966, -0.09963437169790268, -0.029444655403494835, 0.11845114827156067, -0.004094311036169529, -0.016994399949908257, -0.002764618955552578, 0.010931030847132206, 0.024078283458948135, -0.0890045240521431, -0.08656342327594757, -0.02583330124616623, -0.03404423967003822, 0.12294386327266693, 0.04738662391901016, 0.05162713676691055, -0.03374624252319336, 0.0853196531534195, -0.11965402960777283, -0.04041410610079765, -0.049170300364494324, -0.08164939284324646, -0.018589524552226067, 0.011694244109094143, -0.028387416154146194, -0.07814076542854309, -0.05899747833609581, 0.11892950534820557, 0.03659965470433235, 0.031954310834407806, -0.002883339300751686, 0.04092143476009369, 0.07207819819450378, 0.09503977745771408, -0.0389837846159935, 0.04414299130439758, 0.03366905450820923, -0.022870581597089767, 0.056653253734111786, -0.0518883615732193, -0.10309077054262161, 0.07816184312105179, 0.001816106028854847, 0.03734756261110306, 0.023478321731090546, 0.0345836840569973, -0.009174497798085213, -0.07214320451021194, 0.16535398364067078, -0.07890303432941437, -0.007593358866870403, -0.016261622309684753, 0.010885825380682945, 0.04712986201047897, 0.032952480018138885, -0.006963655818253756, -0.04834587126970291, -0.007350345142185688, -0.05574464425444603, -0.027423962950706482, -0.05607935041189194, -0.12024786323308945, -0.0011384589597582817, -0.04181811213493347, -0.03091057762503624, -0.13946819305419922, -0.21301335096359253, -0.020885083824396133, 0.06358511745929718, -0.0010850043036043644, -0.009587458334863186, 0.02441837079823017, 0.016261177137494087, -0.020748497918248177, 0.01016068272292614, -0.04731830954551697, -0.0015573780983686447, -0.007584821432828903, -0.030289752408862114, 0.05786637216806412, -0.04067628085613251, 0.023957468569278717, -0.06680691242218018, 0.021290814504027367, -0.21238532662391663, 0.08628113567829132, -0.03322666138410568, 0.0026772543787956238, -0.03602616488933563, -0.04491453245282173, 
0.014955736696720123, 0.047858670353889465, -0.010145401582121849, 0.11539041250944138, -0.137625053524971, -0.05171949043869972, 0.18267881870269775, -0.15886114537715912, -0.0026356466114521027, 0.10198874771595001, -0.0487353652715683, 0.05243643373250961, 0.13483935594558716, 0.09998130798339844, 0.08087597042322159, -0.07228205353021622, 0.00948525033891201, 0.060415785759687424, -0.06572791934013367, 0.05526677891612053, 0.08960765600204468, -0.02377079799771309, -0.13590946793556213, 0.030814722180366516, -0.07054638862609863, -0.010331036522984505, -0.0246161837130785, -0.02015896514058113, 0.007104543969035149, -0.0363510325551033, 0.024788711220026016, 0.007310775574296713, 0.017421327531337738, -0.04218432679772377, -0.08479107916355133, 0.0283232182264328, 0.0752934068441391, -0.0731126219034195, 0.03995392471551895, -0.07336375117301941, 0.06243569776415825, -0.07704135775566101, -0.004489615559577942, -0.1660778820514679, -0.024905815720558167, 0.04399964213371277, -0.04381374269723892, 0.05074136704206467, 0.0954786017537117, 0.003925596829503775, 0.12399117648601532, -0.03912179917097092, 0.0016722368309274316, -0.005233012139797211, -0.010315610095858574, -0.05248670279979706, -0.12474516779184341, -0.08233241736888885, -0.06730128824710846, 0.10223958641290665, -0.07980011403560638, 0.02883942238986492, -0.06967156380414963, -0.020742638036608696, -0.01005440391600132, -0.057757675647735596, -0.005678932182490826, 0.010234316810965538, -0.028921134769916534, -0.046838127076625824, 0.04918508976697922, 0.049927785992622375, -0.059005603194236755, 0.07452546060085297, -0.10892822593450546, -0.05605684965848923, 0.056483954191207886, 0.012133700773119926, -0.07894619554281235, 0.09101098775863647, -0.0190664604306221, -0.01463012583553791, -0.05204803869128227, -0.04356300085783005, 0.19487658143043518, -0.02620270475745201, 0.10256876051425934, -0.0900058001279831, 0.0020998255349695683, 0.02809939719736576, -0.04380223900079727, -0.015295050106942654, 0.059437304735183716, 0.048374149948358536, -0.1850249320268631, 0.014485429972410202, 0.05222760885953903, 0.07756935805082321, 0.11228659749031067, 0.026322968304157257, -0.02505931816995144, -0.04578051343560219, -0.012294020503759384, 0.0048006195574998856, 0.05740197002887726, -0.03102845698595047, -0.010760466568171978, 0.033116307109594345, 0.0560336671769619, 0.018959660083055496, -0.0805085077881813, 0.036941587924957275, 0.0650746077299118, -0.01498557347804308, -0.043473102152347565, -0.026406068354845047, -0.060069479048252106, 0.06263017654418945, 0.052464962005615234, 0.032514557242393494, 0.02593158930540085, -0.015350781381130219, -0.13540220260620117, 0.1882738322019577, -0.11518412083387375, -0.2565619945526123, -0.10865619778633118, -0.05979074537754059, -0.024404162541031837, 0.0418238565325737, 0.058732420206069946, -0.031825825572013855, -0.04442483186721802, -0.11488263309001923, 0.056585848331451416, -0.06584829837083817, -0.030241796746850014, -0.012334993109107018, -0.05235343426465988, -0.019238878041505814, -0.1270277500152588, -0.012429585680365562, -0.03000045195221901, -0.07344183325767517, 0.008721619844436646, -0.03486210107803345, 0.028608791530132294, 0.1372174471616745, 0.03552117943763733, -0.019036507233977318, -0.01629984751343727, 0.19073987007141113, 0.009250983595848083, 0.06104981526732445, 0.11321060359477997, -0.030843785032629967, 0.05286773294210434, 0.04323267564177513, 0.024902092292904854, -0.04927124083042145, 0.010491620749235153, -0.014691752381622791, -0.11892127245664597, 
-0.17558249831199646, -0.06992407143115997, -0.0033549233339726925, 0.008261211216449738, 0.020618408918380737, 0.035577382892370224, 0.022572621703147888, 0.04020664468407631, -0.03018210083246231, 0.031054025515913963, -0.014448218047618866, 0.08039385825395584, 0.027943797409534454, -0.0760134682059288, 0.09414765238761902, -0.06138775497674942, 0.016487404704093933, 0.10953940451145172, -0.06109412759542465, 0.18883752822875977, 0.027031833305954933, 0.060827188193798065, 0.10096145421266556, 0.021410595625638962, 0.055231254547834396, 0.08871398866176605, -0.04690081998705864, 0.005277604795992374, -0.06150328740477562, -0.05237036198377609, -0.03405860811471939, 0.05133441090583801, 0.0274044256657362, 0.015378817915916443, -0.12143276631832123, 0.018061082810163498, -0.0033872155472636223, 0.13414987921714783, 0.04696948081254959, -0.12488618493080139, -0.12318544089794159, 0.035178229212760925, -0.0447450652718544, -0.06435267627239227, 0.02970264106988907, 0.06056281179189682, -0.15199437737464905, 0.04571579396724701, -0.005476353690028191, 0.06691089272499084, -0.09398043155670166, 0.017016829922795296, -0.04536218196153641, -0.00038795825093984604, 0.0046845776960253716, 0.07139809429645538, -0.13245058059692383, 0.09988614916801453, 0.021784473210573196, 0.05014732480049133, -0.08102333545684814, 0.016781704500317574, -0.009587204083800316, 0.11095782369375229, 0.11400184035301208, 0.04226262867450714, -0.05935755744576454, -0.01789393275976181, -0.044965535402297974, 0.02349724993109703, 0.059476468712091446, -0.0771629810333252, 0.06345659494400024, 0.00959730613976717, 0.0068338592536747456, -0.022094406187534332, 0.011954301968216896, -0.1316862851381302, -0.12425555288791656, 0.06165789067745209, -0.07529882341623306, -0.09618876129388809, -0.056301459670066833, -0.06392661482095718, -0.05529075115919113, 0.21645061671733856, -0.1152251809835434, -0.09102912247180939, -0.09873617440462112, -0.014868471771478653, 0.04628876596689224, -0.06476116925477982, 0.04590318351984024, -0.03599096089601517, 0.09312134981155396, -0.045443758368492126, -0.10887493193149567, 0.03661302104592323, -0.11386340856552124, -0.11402156949043274, -0.0435497984290123, 0.10691753029823303, 0.11401473730802536, 0.03673506900668144, 0.012628667056560516, 0.012407667934894562, -0.0032735317945480347, -0.11700507998466492, 0.017760830000042915, 0.1295892596244812, 0.003722887486219406, 0.07030995190143585, -0.062308572232723236, 0.03004302829504013, -0.014567427337169647, 0.0012726504355669022, 0.12885212898254395, 0.18389350175857544, -0.06272551417350769, 0.1726498007774353, 0.20024614036083221, -0.10571948438882828, -0.1874268352985382, -0.05358076095581055, -0.0034318258985877037, 0.04601171612739563, 0.04794648662209511, -0.18856267631053925, 0.09235745668411255, 0.032014694064855576, -0.030647864565253258, 0.026589542627334595, -0.2264707386493683, -0.10997427999973297, 0.0918494462966919, 0.05554487928748131, 0.19034433364868164, -0.08277814835309982, -0.03883678466081619, -0.017650512978434563, -0.03826863318681717, 0.04463457688689232, -0.042074054479599, 0.09007160365581512, 0.005883477628231049, -0.028584780171513557, 0.0025949757546186447, -0.030981361865997314, 0.09637640416622162, 0.04292568936944008, 0.024033723399043083, -0.07087595760822296, -0.004875466227531433, 0.10935661196708679, -0.03840721398591995, 0.0985971987247467, 0.038052115589380264, 0.07336419820785522, -0.10262539237737656, -0.059589046984910965, -0.07433173060417175, 0.04423094540834427, -0.041775427758693695, 
-0.053973328322172165, -0.06166927516460419, 0.055742837488651276, 0.034473106265068054, 0.01195918396115303, 0.0019690096378326416, -0.037929605692625046, 0.04552323743700981, 0.08705469965934753, 0.08260587602853775, -0.03545960411429405, -0.07542366534471512, -0.051698196679353714, -0.0483209528028965, 0.06786765903234482, -0.09453947842121124, 0.01771150901913643, 0.028479287400841713, 0.010473322123289108, 0.09123176336288452, 0.034105636179447174, -0.13785198330879211, 0.010719338431954384, 0.034276098012924194, -0.126060351729393, -0.10776679962873459, -0.019116364419460297, 0.023587878793478012, -0.03519592061638832, 0.055711571127176285, 0.14732058346271515, -0.03481052815914154, -0.03139509633183479, -0.048873208463191986, 0.03716209903359413, -0.018872123211622238, 0.05030099302530289, 0.06533654034137726, 0.030181782320141792, -0.07433848083019257, 0.0742129534482956, 0.03884228318929672, -0.041408561170101166, 0.04395485296845436, 0.04261932894587517, -0.09648610651493073, -0.07940028607845306, -0.06051602214574814, 0.0988425463438034, -0.020914772525429726, -0.05127444863319397, -0.004041934385895729, -0.08095141500234604, 0.06920512020587921, 0.06914443522691727, 0.04746924340724945, 0.036119040101766586, -0.08656302094459534, 0.015301104635000229, -0.055099401623010635, 0.03659716993570328, -0.0312009546905756, -0.004275195300579071, -0.0563962385058403, 0.06786944717168808, 0.06417183578014374, 0.09764838218688965, -0.03422671556472778, -0.07467266917228699, -0.08385886996984482, -0.014742680825293064, -0.06791427731513977, -0.03222331404685974, -0.0739845484495163, -0.006166825536638498, 0.0012188078835606575, -0.0014494974166154861, 0.0233138520270586, 0.036865077912807465, -0.0438813716173172, -0.018283195793628693, -0.033284954726696014, 0.03906754031777382, -0.06154930964112282, 0.007055832073092461, 0.01367461308836937, -0.037538520991802216, 0.09371328353881836, 0.03808483108878136, -0.011719949543476105, 0.0469234436750412, -0.020534779876470566, 0.0356643982231617, -0.019832707941532135, -0.0001535201445221901, -0.023110613226890564, -0.10806379467248917, -0.004334186669439077, 0.0037599951028823853, -0.025462709367275238, 0.008926941081881523, 0.05792359262704849, -0.07217366993427277, 0.08611002564430237, 0.048380304127931595, -0.03046105057001114, -0.07216010242700577, 0.04148492589592934, -0.01758505403995514, 0.028744740411639214, 0.06923571228981018, -0.03324286267161369, 0.052048034965991974, -0.09976010024547577, -0.029127027839422226, 0.0033766229171305895, -0.005658693611621857, -0.005960049107670784, -0.053223125636577606, -0.0021406253799796104, 0.004778726026415825, 0.17418675124645233, -0.023249071091413498, 0.03719467669725418, 0.011723782867193222, 0.006289045326411724, 0.04816756770014763, -0.013619033619761467, 0.07290259003639221, -0.007028787396848202, -0.02546987682580948, -0.014814550057053566, 0.038110118359327316, 0.004448492079973221, 0.004759952425956726, 0.13570229709148407, 0.04508352652192116, 0.08859815448522568, 0.07563149929046631, 0.017733134329319, 0.014343037270009518, -0.13857737183570862, -0.08635576069355011, 0.008673928678035736, 0.06156954914331436, -0.018616653978824615, 0.016002636402845383, 0.09359035640954971, -0.08611685037612915, 0.07031654566526413, 0.04805762693285942, -0.048087894916534424, -0.12421353161334991, -0.1889173537492752, -0.025348037481307983, -0.03039620630443096, -0.01083473302423954, -0.09141658246517181, 0.01653897389769554, 0.09329714626073837, 0.0244319885969162, -0.012100732885301113, 
0.09319974482059479, -0.10270954668521881, -0.030926430597901344, 0.04362073913216591, -0.026855647563934326, 0.012696291320025921, 0.05034458637237549, 0.02446882799267769, -0.004959166049957275, 0.0414399616420269, 0.03992466628551483, 0.042230717837810516, 0.02811366319656372, 0.05256810039281845, -0.023135151714086533, -0.07326924055814743, -0.03349195793271065, -0.0023856237530708313, 0.05515803396701813, 0.1340133547782898, 0.022480878978967667, -0.07103928923606873, 0.006363736931234598, 0.10786383599042892, -0.03267323225736618, -0.04785415530204773, -0.10661923140287399, 0.2420898973941803, 0.02024952694773674, 0.000854291720315814, -0.005359451286494732, -0.04580676555633545, 0.007576854899525642, 0.2104293555021286, 0.2218484878540039, 0.005535376723855734, -0.010415876284241676, 0.00977856945246458, -0.011710129678249359, 0.03604119271039963, 0.14612816274166107, 0.005474856123328209, 0.25498121976852417, -0.04654500633478165, 0.0383601114153862, -0.04224430397152901, -0.03840846195816994, -0.10189647972583771, 0.07296432554721832, -0.008571473881602287, 0.006697678007185459, -0.029081838205456734, 0.07093720883131027, -0.035805508494377136, -0.17581807076931, 0.002583860419690609, 0.00157924962695688, -0.06292295455932617, 0.011263553984463215, 0.0019918465986847878, 0.02043924666941166, 0.08364366739988327, -0.01872783899307251, -0.008679092861711979, 0.13466958701610565, 0.017685893923044205, -0.09866694360971451, -0.05719796568155289, 0.11341622471809387, 0.01220000721514225, 0.13802121579647064, 0.01072004809975624, 0.08257653564214706, 0.08789515495300293, 0.022160526365041733, -0.0939086303114891, 0.04177987948060036, -0.020273124799132347, -0.029663274064660072, 0.008562950417399406, 0.10936576873064041, -0.008531068451702595, 0.05826728045940399, 0.027464445680379868, -0.08665643632411957, 0.060099489986896515, 0.009943921118974686, -0.03744744881987572, -0.08074137568473816, 0.08375667780637741, -0.09170453250408173, 0.15654253959655762, 0.12321142852306366, -0.014945081435143948, -0.046614788472652435, -0.02936321496963501, 0.020939581096172333, 0.001247710082679987, 0.05432142689824104, -0.02556231990456581, -0.13302117586135864, 0.02027016319334507, -0.08042077720165253, 0.02706456556916237, -0.251056432723999, -0.08675757050514221, 0.028767408803105354, -0.01776324212551117, -0.019716225564479828, 0.049461450427770615, 0.047616925090551376, 0.026254817843437195, -0.03774744272232056, 0.021009720861911774, -0.037202388048172, 0.0595894455909729, -0.11164434254169464, -0.09460152685642242 ]
null
null
transformers
# MultiBERTs Seed 4 Checkpoint 900k (uncased) Seed 4 intermediate checkpoint (900k) of the MultiBERTs (pretrained BERT) model on the English language, using a masked language modeling (MLM) objective. It was introduced in [this paper](https://arxiv.org/pdf/2106.16163.pdf) and first released in [this repository](https://github.com/google-research/language/tree/master/language/multiberts). This is an intermediate checkpoint. The final checkpoint can be found at [multiberts-seed-4](https://hf.co/multiberts-seed-4). This model is uncased: it does not make a difference between english and English. Disclaimer: The team releasing MultiBERTs did not write a model card for this model, so this model card has been written by [gchhablani](https://hf.co/gchhablani). ## Model description MultiBERTs models are transformer models pretrained on a large corpus of English data in a self-supervised fashion. This means they were pretrained on the raw texts only, with no humans labelling them in any way (which is why they can use lots of publicly available data), with an automatic process to generate inputs and labels from those texts. More precisely, they were pretrained with two objectives: - Masked language modeling (MLM): taking a sentence, the model randomly masks 15% of the words in the input, then runs the entire masked sentence through the model and has to predict the masked words. This is different from traditional recurrent neural networks (RNNs), which usually see the words one after the other, and from autoregressive models like GPT, which internally mask the future tokens. It allows the model to learn a bidirectional representation of the sentence. - Next sentence prediction (NSP): the model concatenates two masked sentences as inputs during pretraining. Sometimes they correspond to sentences that were next to each other in the original text, sometimes not. The model then has to predict whether the two sentences were following each other or not. This way, the model learns an inner representation of the English language that can then be used to extract features useful for downstream tasks: if you have a dataset of labeled sentences, for instance, you can train a standard classifier using the features produced by the MultiBERTs model as inputs. ## Intended uses & limitations You can use the raw model for either masked language modeling or next sentence prediction, but it's mostly intended to be fine-tuned on a downstream task. See the [model hub](https://huggingface.co/models?filter=multiberts) to look for fine-tuned versions on a task that interests you. Note that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked) to make decisions, such as sequence classification, token classification or question answering. For tasks such as text generation, you should look at models like GPT2. ### How to use Here is how to use this model to get the features of a given text in PyTorch: ```python from transformers import BertTokenizer, BertModel tokenizer = BertTokenizer.from_pretrained('multiberts-seed-4-900k') model = BertModel.from_pretrained("multiberts-seed-4-900k") text = "Replace me by any text you'd like." encoded_input = tokenizer(text, return_tensors='pt') output = model(**encoded_input) ``` ### Limitations and bias Even if the training data used for this model could be characterized as fairly neutral, this model can have biased predictions. This bias will also affect all fine-tuned versions of this model.
For an understanding of the bias of this particular checkpoint, please try out this checkpoint with the snippet present in the [Limitations and bias section](https://huggingface.co/bert-base-uncased#limitations-and-bias) of the [bert-base-uncased](https://huggingface.co/bert-base-uncased) checkpoint. ## Training data The MultiBERTs models were pretrained on [BookCorpus](https://yknzhu.wixsite.com/mbweb), a dataset consisting of 11,038 unpublished books, and [English Wikipedia](https://en.wikipedia.org/wiki/English_Wikipedia) (excluding lists, tables and headers). ## Training procedure ### Preprocessing The texts are lowercased and tokenized using WordPiece and a vocabulary size of 30,000. The inputs of the model are then of the form: ``` [CLS] Sentence A [SEP] Sentence B [SEP] ``` With probability 0.5, sentence A and sentence B correspond to two consecutive sentences in the original corpus, and in the other cases, sentence B is another random sentence from the corpus. Note that what is considered a sentence here is a consecutive span of text usually longer than a single sentence. The only constraint is that the result with the two "sentences" has a combined length of less than 512 tokens. The details of the masking procedure for each sentence are the following: - 15% of the tokens are masked. - In 80% of the cases, the masked tokens are replaced by `[MASK]`. - In 10% of the cases, the masked tokens are replaced by a random token (different from the one they replace). - In the remaining 10% of cases, the masked tokens are left as is. ### Pretraining The full model was trained on 16 Cloud TPU v2 chips for two million steps with a batch size of 256. The sequence length was set to 512 throughout. The optimizer used is Adam with a learning rate of 1e-4, \\(\beta_{1} = 0.9\\) and \\(\beta_{2} = 0.999\\), a weight decay of 0.01, learning rate warmup for 10,000 steps and linear decay of the learning rate after. ### BibTeX entry and citation info ```bibtex @article{DBLP:journals/corr/abs-2106-16163, author = {Thibault Sellam and Steve Yadlowsky and Jason Wei and Naomi Saphra and Alexander D'Amour and Tal Linzen and Jasmijn Bastings and Iulia Turc and Jacob Eisenstein and Dipanjan Das and Ian Tenney and Ellie Pavlick}, title = {The MultiBERTs: {BERT} Reproductions for Robustness Analysis}, journal = {CoRR}, volume = {abs/2106.16163}, year = {2021}, url = {https://arxiv.org/abs/2106.16163}, eprinttype = {arXiv}, eprint = {2106.16163}, timestamp = {Mon, 05 Jul 2021 15:15:50 +0200}, biburl = {https://dblp.org/rec/journals/corr/abs-2106-16163.bib}, bibsource = {dblp computer science bibliography, https://dblp.org} } ``` <a href="https://huggingface.co/exbert/?model=multiberts"> <img width="300px" src="https://cdn-media.huggingface.co/exbert/button.png"> </a>
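The 80/10/10 masking rule described in the Preprocessing section above can be illustrated with a short, self-contained sketch. This is only an illustration of the rule as stated in the card, not the actual MultiBERTs pretraining code; the helper name and the use of the `bert-base-uncased` WordPiece tokenizer are assumptions made so the snippet runs on its own.

```python
import random
from transformers import BertTokenizer

# Illustrative sketch of the MLM masking rule described above (15% of tokens
# selected; 80% -> [MASK], 10% -> random token, 10% -> unchanged).
# NOT the original MultiBERTs pretraining code; names here are hypothetical.
tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")

def mask_tokens(token_ids, mask_prob=0.15):
    """Return (masked_input, labels); labels are -100 where no prediction is needed."""
    special = set(tokenizer.all_special_ids)
    masked, labels = [], []
    for tok in token_ids:
        if tok not in special and random.random() < mask_prob:
            labels.append(tok)                        # model must recover the original token
            r = random.random()
            if r < 0.8:                               # 80%: replace with [MASK]
                masked.append(tokenizer.mask_token_id)
            elif r < 0.9:                             # 10%: replace with a random vocab token
                masked.append(random.randrange(tokenizer.vocab_size))
            else:                                     # 10%: keep the token unchanged
                masked.append(tok)
        else:
            labels.append(-100)                       # ignored by the MLM loss
            masked.append(tok)
    return masked, labels

ids = tokenizer("Replace me by any text you'd like.")["input_ids"]
print(mask_tokens(ids))
```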
{"language": "en", "license": "apache-2.0", "tags": ["exbert", "multiberts", "multiberts-seed-4"], "datasets": ["bookcorpus", "wikipedia"]}
null
MultiBertGunjanPatrick/multiberts-seed-4-900k
[ "transformers", "pytorch", "bert", "pretraining", "exbert", "multiberts", "multiberts-seed-4", "en", "dataset:bookcorpus", "dataset:wikipedia", "arxiv:2106.16163", "license:apache-2.0", "endpoints_compatible", "region:us" ]
2022-03-02T23:29:04+00:00
[ "2106.16163" ]
[ "en" ]
TAGS #transformers #pytorch #bert #pretraining #exbert #multiberts #multiberts-seed-4 #en #dataset-bookcorpus #dataset-wikipedia #arxiv-2106.16163 #license-apache-2.0 #endpoints_compatible #region-us
# MultiBERTs Seed 4 Checkpoint 900k (uncased) Seed 4 intermediate checkpoint 900k MultiBERTs (pretrained BERT) model on English language using a masked language modeling (MLM) objective. It was introduced in this paper and first released in this repository. This is an intermediate checkpoint. The final checkpoint can be found at multiberts-seed-4. This model is uncased: it does not make a difference between english and English. Disclaimer: The team releasing MultiBERTs did not write a model card for this model so this model card has been written by gchhablani. ## Model description MultiBERTs models are transformers model pretrained on a large corpus of English data in a self-supervised fashion. This means it was pretrained on the raw texts only, with no humans labelling them in any way (which is why it can use lots of publicly available data) with an automatic process to generate inputs and labels from those texts. More precisely, it was pretrained with two objectives: - Masked language modeling (MLM): taking a sentence, the model randomly masks 15% of the words in the input then run the entire masked sentence through the model and has to predict the masked words. This is different from traditional recurrent neural networks (RNNs) that usually see the words one after the other, or from autoregressive models like GPT which internally mask the future tokens. It allows the model to learn a bidirectional representation of the sentence. - Next sentence prediction (NSP): the models concatenates two masked sentences as inputs during pretraining. Sometimes they correspond to sentences that were next to each other in the original text, sometimes not. The model then has to predict if the two sentences were following each other or not. This way, the model learns an inner representation of the English language that can then be used to extract features useful for downstream tasks: if you have a dataset of labeled sentences for instance, you can train a standard classifier using the features produced by the MultiBERTs model as inputs. ## Intended uses & limitations You can use the raw model for either masked language modeling or next sentence prediction, but it's mostly intended to be fine-tuned on a downstream task. See the model hub to look for fine-tuned versions on a task that interests you. Note that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked) to make decisions, such as sequence classification, token classification or question answering. For tasks such as text generation you should look at model like GPT2. ### How to use Here is how to use this model to get the features of a given text in PyTorch: ### Limitations and bias Even if the training data used for this model could be characterized as fairly neutral, this model can have biased predictions. This bias will also affect all fine-tuned versions of this model. For an understanding of bias of this particular checkpoint, please try out this checkpoint with the snippet present in the Limitation and bias section of the bert-base-uncased checkpoint. ## Training data The MultiBERTs models were pretrained on BookCorpus, a dataset consisting of 11,038 unpublished books and English Wikipedia (excluding lists, tables and headers). ## Training procedure ### Preprocessing The texts are lowercased and tokenized using WordPiece and a vocabulary size of 30,000. 
The inputs of the model are then of the form: With probability 0.5, sentence A and sentence B correspond to two consecutive sentences in the original corpus and in the other cases, it's another random sentence in the corpus. Note that what is considered a sentence here is a consecutive span of text usually longer than a single sentence. The only constrain is that the result with the two "sentences" has a combined length of less than 512 tokens. The details of the masking procedure for each sentence are the following: - 15% of the tokens are masked. - In 80% of the cases, the masked tokens are replaced by '[MASK]'. - In 10% of the cases, the masked tokens are replaced by a random token (different) from the one they replace. - In the 10% remaining cases, the masked tokens are left as is. ### Pretraining The full model was trained on 16 Cloud TPU v2 chips for two million steps with a batch size of 256. The sequence length was set to 512 throughout. The optimizer used is Adam with a learning rate of 1e-4, \\(\beta_{1} = 0.9\\) and \\(\beta_{2} = 0.999\\), a weight decay of 0.01, learning rate warmup for 10,000 steps and linear decay of the learning rate after. ### BibTeX entry and citation info <a href="URL <img width="300px" src="URL </a>
[ "# MultiBERTs Seed 4 Checkpoint 900k (uncased)\nSeed 4 intermediate checkpoint 900k MultiBERTs (pretrained BERT) model on English language using a masked language modeling (MLM) objective. It was introduced in\nthis paper and first released in\nthis repository. This is an intermediate checkpoint.\nThe final checkpoint can be found at multiberts-seed-4. This model is uncased: it does not make a difference\nbetween english and English.\n\nDisclaimer: The team releasing MultiBERTs did not write a model card for this model so this model card has been written by gchhablani.", "## Model description\nMultiBERTs models are transformers model pretrained on a large corpus of English data in a self-supervised fashion. This means it\nwas pretrained on the raw texts only, with no humans labelling them in any way (which is why it can use lots of\npublicly available data) with an automatic process to generate inputs and labels from those texts. More precisely, it\nwas pretrained with two objectives:\n- Masked language modeling (MLM): taking a sentence, the model randomly masks 15% of the words in the input then run\n the entire masked sentence through the model and has to predict the masked words. This is different from traditional\n recurrent neural networks (RNNs) that usually see the words one after the other, or from autoregressive models like\n GPT which internally mask the future tokens. It allows the model to learn a bidirectional representation of the\n sentence.\n- Next sentence prediction (NSP): the models concatenates two masked sentences as inputs during pretraining. Sometimes\n they correspond to sentences that were next to each other in the original text, sometimes not. The model then has to\n predict if the two sentences were following each other or not.\nThis way, the model learns an inner representation of the English language that can then be used to extract features\nuseful for downstream tasks: if you have a dataset of labeled sentences for instance, you can train a standard\nclassifier using the features produced by the MultiBERTs model as inputs.", "## Intended uses & limitations\nYou can use the raw model for either masked language modeling or next sentence prediction, but it's mostly intended to\nbe fine-tuned on a downstream task. See the model hub to look for\nfine-tuned versions on a task that interests you.\nNote that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked)\nto make decisions, such as sequence classification, token classification or question answering. For tasks such as text\ngeneration you should look at model like GPT2.", "### How to use\nHere is how to use this model to get the features of a given text in PyTorch:", "### Limitations and bias\nEven if the training data used for this model could be characterized as fairly neutral, this model can have biased\npredictions. This bias will also affect all fine-tuned versions of this model. For an understanding of bias of this particular\ncheckpoint, please try out this checkpoint with the snippet present in the Limitation and bias section of the bert-base-uncased checkpoint.", "## Training data\nThe MultiBERTs models were pretrained on BookCorpus, a dataset consisting of 11,038\nunpublished books and English Wikipedia (excluding lists, tables and\nheaders).", "## Training procedure", "### Preprocessing\nThe texts are lowercased and tokenized using WordPiece and a vocabulary size of 30,000. 
The inputs of the model are\nthen of the form:\n\nWith probability 0.5, sentence A and sentence B correspond to two consecutive sentences in the original corpus and in\nthe other cases, it's another random sentence in the corpus. Note that what is considered a sentence here is a\nconsecutive span of text usually longer than a single sentence. The only constrain is that the result with the two\n\"sentences\" has a combined length of less than 512 tokens.\nThe details of the masking procedure for each sentence are the following:\n- 15% of the tokens are masked.\n- In 80% of the cases, the masked tokens are replaced by '[MASK]'.\n- In 10% of the cases, the masked tokens are replaced by a random token (different) from the one they replace.\n- In the 10% remaining cases, the masked tokens are left as is.", "### Pretraining\nThe full model was trained on 16 Cloud TPU v2 chips for two million steps with a batch size\nof 256. The sequence length was set to 512 throughout. The optimizer\nused is Adam with a learning rate of 1e-4, \\\\(\\beta_{1} = 0.9\\\\) and \\\\(\\beta_{2} = 0.999\\\\), a weight decay of 0.01,\nlearning rate warmup for 10,000 steps and linear decay of the learning rate after.", "### BibTeX entry and citation info\n\n<a href=\"URL\n\t<img width=\"300px\" src=\"URL\n</a>" ]
[ "TAGS\n#transformers #pytorch #bert #pretraining #exbert #multiberts #multiberts-seed-4 #en #dataset-bookcorpus #dataset-wikipedia #arxiv-2106.16163 #license-apache-2.0 #endpoints_compatible #region-us \n", "# MultiBERTs Seed 4 Checkpoint 900k (uncased)\nSeed 4 intermediate checkpoint 900k MultiBERTs (pretrained BERT) model on English language using a masked language modeling (MLM) objective. It was introduced in\nthis paper and first released in\nthis repository. This is an intermediate checkpoint.\nThe final checkpoint can be found at multiberts-seed-4. This model is uncased: it does not make a difference\nbetween english and English.\n\nDisclaimer: The team releasing MultiBERTs did not write a model card for this model so this model card has been written by gchhablani.", "## Model description\nMultiBERTs models are transformers model pretrained on a large corpus of English data in a self-supervised fashion. This means it\nwas pretrained on the raw texts only, with no humans labelling them in any way (which is why it can use lots of\npublicly available data) with an automatic process to generate inputs and labels from those texts. More precisely, it\nwas pretrained with two objectives:\n- Masked language modeling (MLM): taking a sentence, the model randomly masks 15% of the words in the input then run\n the entire masked sentence through the model and has to predict the masked words. This is different from traditional\n recurrent neural networks (RNNs) that usually see the words one after the other, or from autoregressive models like\n GPT which internally mask the future tokens. It allows the model to learn a bidirectional representation of the\n sentence.\n- Next sentence prediction (NSP): the models concatenates two masked sentences as inputs during pretraining. Sometimes\n they correspond to sentences that were next to each other in the original text, sometimes not. The model then has to\n predict if the two sentences were following each other or not.\nThis way, the model learns an inner representation of the English language that can then be used to extract features\nuseful for downstream tasks: if you have a dataset of labeled sentences for instance, you can train a standard\nclassifier using the features produced by the MultiBERTs model as inputs.", "## Intended uses & limitations\nYou can use the raw model for either masked language modeling or next sentence prediction, but it's mostly intended to\nbe fine-tuned on a downstream task. See the model hub to look for\nfine-tuned versions on a task that interests you.\nNote that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked)\nto make decisions, such as sequence classification, token classification or question answering. For tasks such as text\ngeneration you should look at model like GPT2.", "### How to use\nHere is how to use this model to get the features of a given text in PyTorch:", "### Limitations and bias\nEven if the training data used for this model could be characterized as fairly neutral, this model can have biased\npredictions. This bias will also affect all fine-tuned versions of this model. 
For an understanding of bias of this particular\ncheckpoint, please try out this checkpoint with the snippet present in the Limitation and bias section of the bert-base-uncased checkpoint.", "## Training data\nThe MultiBERTs models were pretrained on BookCorpus, a dataset consisting of 11,038\nunpublished books and English Wikipedia (excluding lists, tables and\nheaders).", "## Training procedure", "### Preprocessing\nThe texts are lowercased and tokenized using WordPiece and a vocabulary size of 30,000. The inputs of the model are\nthen of the form:\n\nWith probability 0.5, sentence A and sentence B correspond to two consecutive sentences in the original corpus and in\nthe other cases, it's another random sentence in the corpus. Note that what is considered a sentence here is a\nconsecutive span of text usually longer than a single sentence. The only constrain is that the result with the two\n\"sentences\" has a combined length of less than 512 tokens.\nThe details of the masking procedure for each sentence are the following:\n- 15% of the tokens are masked.\n- In 80% of the cases, the masked tokens are replaced by '[MASK]'.\n- In 10% of the cases, the masked tokens are replaced by a random token (different) from the one they replace.\n- In the 10% remaining cases, the masked tokens are left as is.", "### Pretraining\nThe full model was trained on 16 Cloud TPU v2 chips for two million steps with a batch size\nof 256. The sequence length was set to 512 throughout. The optimizer\nused is Adam with a learning rate of 1e-4, \\\\(\\beta_{1} = 0.9\\\\) and \\\\(\\beta_{2} = 0.999\\\\), a weight decay of 0.01,\nlearning rate warmup for 10,000 steps and linear decay of the learning rate after.", "### BibTeX entry and citation info\n\n<a href=\"URL\n\t<img width=\"300px\" src=\"URL\n</a>" ]
[ 71, 145, 335, 134, 25, 95, 48, 3, 222, 111, 34 ]
[ "passage: TAGS\n#transformers #pytorch #bert #pretraining #exbert #multiberts #multiberts-seed-4 #en #dataset-bookcorpus #dataset-wikipedia #arxiv-2106.16163 #license-apache-2.0 #endpoints_compatible #region-us \n# MultiBERTs Seed 4 Checkpoint 900k (uncased)\nSeed 4 intermediate checkpoint 900k MultiBERTs (pretrained BERT) model on English language using a masked language modeling (MLM) objective. It was introduced in\nthis paper and first released in\nthis repository. This is an intermediate checkpoint.\nThe final checkpoint can be found at multiberts-seed-4. This model is uncased: it does not make a difference\nbetween english and English.\n\nDisclaimer: The team releasing MultiBERTs did not write a model card for this model so this model card has been written by gchhablani.", "passage: ## Model description\nMultiBERTs models are transformers model pretrained on a large corpus of English data in a self-supervised fashion. This means it\nwas pretrained on the raw texts only, with no humans labelling them in any way (which is why it can use lots of\npublicly available data) with an automatic process to generate inputs and labels from those texts. More precisely, it\nwas pretrained with two objectives:\n- Masked language modeling (MLM): taking a sentence, the model randomly masks 15% of the words in the input then run\n the entire masked sentence through the model and has to predict the masked words. This is different from traditional\n recurrent neural networks (RNNs) that usually see the words one after the other, or from autoregressive models like\n GPT which internally mask the future tokens. It allows the model to learn a bidirectional representation of the\n sentence.\n- Next sentence prediction (NSP): the models concatenates two masked sentences as inputs during pretraining. Sometimes\n they correspond to sentences that were next to each other in the original text, sometimes not. The model then has to\n predict if the two sentences were following each other or not.\nThis way, the model learns an inner representation of the English language that can then be used to extract features\nuseful for downstream tasks: if you have a dataset of labeled sentences for instance, you can train a standard\nclassifier using the features produced by the MultiBERTs model as inputs.## Intended uses & limitations\nYou can use the raw model for either masked language modeling or next sentence prediction, but it's mostly intended to\nbe fine-tuned on a downstream task. See the model hub to look for\nfine-tuned versions on a task that interests you.\nNote that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked)\nto make decisions, such as sequence classification, token classification or question answering. For tasks such as text\ngeneration you should look at model like GPT2.### How to use\nHere is how to use this model to get the features of a given text in PyTorch:### Limitations and bias\nEven if the training data used for this model could be characterized as fairly neutral, this model can have biased\npredictions. This bias will also affect all fine-tuned versions of this model. 
For an understanding of bias of this particular\ncheckpoint, please try out this checkpoint with the snippet present in the Limitation and bias section of the bert-base-uncased checkpoint.## Training data\nThe MultiBERTs models were pretrained on BookCorpus, a dataset consisting of 11,038\nunpublished books and English Wikipedia (excluding lists, tables and\nheaders).## Training procedure" ]
[ -0.08505567908287048, -0.001205584267154336, -0.0020602704025804996, 0.06921398639678955, 0.0859755203127861, 0.0038521550595760345, 0.11436206102371216, 0.04925033822655678, -0.027587007731199265, 0.02402920089662075, 0.09297038614749908, 0.02833392471075058, 0.0433780737221241, 0.06381916999816895, 0.09650133550167084, -0.2580201327800751, 0.049993809312582016, -0.06140027567744255, 0.058588698506355286, 0.07473009079694748, 0.09812179207801819, -0.07311306148767471, 0.0630326122045517, 0.03544574975967407, -0.08488641679286957, -0.01568211242556572, -0.018058741465210915, -0.03544013947248459, 0.09951987862586975, 0.06989295780658722, 0.06231647729873657, 0.0010611508041620255, 0.05877332016825676, -0.08674098551273346, 0.015889529138803482, 0.04374321550130844, -0.001793325413018465, 0.023972582072019577, -0.008419226855039597, 0.01694640703499317, 0.10634037852287292, 0.03886181861162186, 0.07786151766777039, 0.0331818051636219, -0.09403526782989502, -0.11897341161966324, -0.08030693233013153, 0.10663877427577972, 0.0552700012922287, 0.04105482995510101, -0.005466011352837086, 0.07116631418466568, -0.03261881321668625, 0.07398395240306854, 0.09895704686641693, -0.25742021203041077, -0.007992749102413654, 0.0651840940117836, 0.04535067081451416, 0.041983917355537415, 0.010528736747801304, 0.02593887224793434, 0.005885429680347443, 0.04234839975833893, 0.02934781089425087, -0.02362784370779991, 0.11655116081237793, -0.04624529555439949, -0.14981795847415924, -0.04175558686256409, 0.11995331197977066, -0.004680430516600609, -0.12437575310468674, -0.0996762365102768, -0.029448598623275757, 0.11516021192073822, -0.003834000788629055, -0.016964642331004143, -0.002943870145827532, 0.00975025910884142, 0.02518465369939804, -0.09212516993284225, -0.0854891762137413, -0.02740093320608139, -0.03607218340039253, 0.12903107702732086, 0.04752236604690552, 0.05194808542728424, -0.03331470489501953, 0.08760208636522293, -0.12013589590787888, -0.04047766327857971, -0.051850106567144394, -0.08289052546024323, -0.01743236370384693, 0.010567547753453255, -0.026235636323690414, -0.07807719707489014, -0.05884062498807907, 0.11826027929782867, 0.040694527328014374, 0.03153427690267563, 0.0023391535505652428, 0.03986214101314545, 0.07199302315711975, 0.09586110711097717, -0.03952566906809807, 0.046068448573350906, 0.03596225753426552, -0.019578415900468826, 0.05779971927404404, -0.05155506357550621, -0.10175411403179169, 0.0767153650522232, 0.000322478823363781, 0.03879638388752937, 0.02517726644873619, 0.035120751708745956, -0.008054728619754314, -0.07041661441326141, 0.16842035949230194, -0.07853558659553528, -0.008804813958704472, -0.017618771642446518, 0.010457133874297142, 0.04776522517204285, 0.03242623060941696, -0.007909782230854034, -0.048323359340429306, -0.005392891354858875, -0.056233763694763184, -0.028166841715574265, -0.057150281965732574, -0.11883972585201263, -0.0016485992819070816, -0.040157534182071686, -0.030380994081497192, -0.14119011163711548, -0.21659743785858154, -0.021230988204479218, 0.06286308914422989, -0.0010890462435781956, -0.009317588992416859, 0.02486308664083481, 0.014883464202284813, -0.02052398771047592, 0.008985841646790504, -0.0465080626308918, -0.0011305948719382286, -0.007303863763809204, -0.0307435505092144, 0.05816382169723511, -0.04297807067632675, 0.023227185010910034, -0.068571537733078, 0.022542912513017654, -0.21278899908065796, 0.08796471357345581, -0.03399629518389702, 0.005210468545556068, -0.03515101969242096, -0.04526343569159508, 0.011821284890174866, 
0.04702531918883324, -0.009293117560446262, 0.11630779504776001, -0.13670289516448975, -0.04956649988889694, 0.17927443981170654, -0.1591586470603943, -0.0033152326941490173, 0.09971435368061066, -0.04910626634955406, 0.05465858429670334, 0.13447773456573486, 0.1018185168504715, 0.08917379379272461, -0.07391169667243958, 0.008185590617358685, 0.061120834201574326, -0.06932808458805084, 0.05183253437280655, 0.08715870976448059, -0.02476089634001255, -0.1356559544801712, 0.030797749757766724, -0.072751984000206, -0.008273696526885033, -0.024318018928170204, -0.021145425736904144, 0.007477331906557083, -0.03893382474780083, 0.025084372609853745, 0.006168285850435495, 0.01826704479753971, -0.04094373434782028, -0.08105258643627167, 0.03057609125971794, 0.07610233128070831, -0.07219639420509338, 0.041832514107227325, -0.07064123451709747, 0.06573361158370972, -0.07609764486551285, -0.003535822033882141, -0.16482824087142944, -0.025009695440530777, 0.04472822695970535, -0.05051159858703613, 0.04931538552045822, 0.08825027942657471, 0.002995805349200964, 0.12256846576929092, -0.04029697924852371, 0.0006707771681249142, -0.005042757838964462, -0.008521574549376965, -0.0521172396838665, -0.1216030865907669, -0.0836363434791565, -0.06690798699855804, 0.09819292277097702, -0.07401914149522781, 0.029050156474113464, -0.06860925257205963, -0.019977858290076256, -0.010535908862948418, -0.057325102388858795, -0.005094323307275772, 0.010876086540520191, -0.026847874745726585, -0.04659714177250862, 0.0487435907125473, 0.04967132583260536, -0.060537226498126984, 0.07280794531106949, -0.10467051714658737, -0.057969093322753906, 0.0571899339556694, 0.01490901317447424, -0.08105669170618057, 0.09246967732906342, -0.018925318494439125, -0.01384047418832779, -0.05393368378281593, -0.045398272573947906, 0.1948411762714386, -0.025839943438768387, 0.10118238627910614, -0.09127722680568695, 0.0021573251578956842, 0.027710486203432083, -0.0446617528796196, -0.01612691581249237, 0.05723458528518677, 0.04597911238670349, -0.18323369324207306, 0.013590842485427856, 0.052082523703575134, 0.07631659507751465, 0.1141669750213623, 0.026428688317537308, -0.02380775287747383, -0.04653068631887436, -0.013406693935394287, 0.00327870761975646, 0.057898346334695816, -0.026775233447551727, -0.010042685084044933, 0.0312428530305624, 0.05678143352270126, 0.019257843494415283, -0.08122219145298004, 0.03705734759569168, 0.06474051624536514, -0.014137255027890205, -0.04390082508325577, -0.025157256051898003, -0.06077089160680771, 0.06131138652563095, 0.054854411631822586, 0.035097189247608185, 0.02605561912059784, -0.015721771866083145, -0.13385730981826782, 0.18878662586212158, -0.11393844336271286, -0.2552967667579651, -0.10969077050685883, -0.05726473033428192, -0.026018312200903893, 0.040412962436676025, 0.05754449963569641, -0.03170028328895569, -0.04347974434494972, -0.1151580661535263, 0.05833512172102928, -0.06339964270591736, -0.02880815416574478, -0.01182093657553196, -0.051962435245513916, -0.020056700333952904, -0.1266210973262787, -0.011803071945905685, -0.031760603189468384, -0.07209742069244385, 0.005194427445530891, -0.03314715996384621, 0.03094058483839035, 0.13627904653549194, 0.03473522141575813, -0.019831135869026184, -0.017378797754645348, 0.19333472847938538, 0.008709458634257317, 0.061966367065906525, 0.1136682778596878, -0.029486065730452538, 0.05367812514305115, 0.04656548798084259, 0.025606660172343254, -0.048338837921619415, 0.010228384286165237, -0.017134636640548706, -0.12015563994646072, 
-0.17123013734817505, -0.06982716917991638, -0.004029428120702505, 0.003962620161473751, 0.020254364237189293, 0.03443954885005951, 0.02117428556084633, 0.041164103895425797, -0.0286111943423748, 0.028492271900177002, -0.014617960900068283, 0.08119208365678787, 0.029646720737218857, -0.07492898404598236, 0.09322388470172882, -0.06287311017513275, 0.016403600573539734, 0.10795453190803528, -0.058675408363342285, 0.1894354373216629, 0.02356475405395031, 0.05286884307861328, 0.10150247812271118, 0.02332812175154686, 0.05593228340148926, 0.08955388516187668, -0.04752122610807419, 0.00528414361178875, -0.05918796733021736, -0.05113508552312851, -0.03456518054008484, 0.04937563091516495, 0.03074054792523384, 0.01800578460097313, -0.12136752903461456, 0.01655355468392372, -0.0023776746820658445, 0.1356414556503296, 0.04789021238684654, -0.12110234797000885, -0.12384229898452759, 0.03528454899787903, -0.04603519290685654, -0.06343156099319458, 0.03044172376394272, 0.05709436535835266, -0.15300559997558594, 0.048214998096227646, -0.002632593736052513, 0.06586717069149017, -0.09436336159706116, 0.015845317393541336, -0.04442992061376572, -0.0002036532387137413, 0.004778969567269087, 0.0693868026137352, -0.1291666477918625, 0.10298145562410355, 0.021804079413414, 0.052678920328617096, -0.08021779358386993, 0.016378484666347504, -0.009946522302925587, 0.10937038064002991, 0.1151203066110611, 0.04312766343355179, -0.052341245114803314, -0.01867780089378357, -0.04498335346579552, 0.02139459364116192, 0.05840951204299927, -0.07484942674636841, 0.06234270706772804, 0.0107974698767066, 0.006529646459966898, -0.02416214346885681, 0.01112234778702259, -0.13149334490299225, -0.12298090755939484, 0.06086187809705734, -0.07681160420179367, -0.09748504310846329, -0.05726299062371254, -0.06320575624704361, -0.0520237535238266, 0.2088547646999359, -0.11576715856790543, -0.09175913035869598, -0.09741192311048508, -0.017847977578639984, 0.0485260933637619, -0.06538886576890945, 0.048312269151210785, -0.03709262236952782, 0.09018044173717499, -0.045491062104701996, -0.10953560471534729, 0.03497521951794624, -0.11373208463191986, -0.11389051377773285, -0.042815931141376495, 0.10744784772396088, 0.11171828210353851, 0.03658871725201607, 0.012678530067205429, 0.012530703097581863, -0.0022605080157518387, -0.11942379176616669, 0.016009988263249397, 0.1280447542667389, -0.003289090469479561, 0.07349865138530731, -0.060450948774814606, 0.02451721951365471, -0.015425032004714012, 0.0012745372951030731, 0.1280844509601593, 0.1848565638065338, -0.06324046850204468, 0.17294849455356598, 0.20547521114349365, -0.10496805608272552, -0.19076916575431824, -0.0542098805308342, -0.002847735770046711, 0.04560844972729683, 0.043355561792850494, -0.1834494173526764, 0.09121713042259216, 0.033489689230918884, -0.03108840622007847, 0.021651770919561386, -0.22918701171875, -0.11294844001531601, 0.09198486804962158, 0.05584700405597687, 0.18909719586372375, -0.08077813684940338, -0.03992205113172531, -0.018389172852039337, -0.03929842263460159, 0.044582583010196686, -0.03980813920497894, 0.09025626629590988, 0.00458153523504734, -0.03166007250547409, 0.001477678306400776, -0.03186354786157608, 0.09608110785484314, 0.04295167699456215, 0.024272941052913666, -0.07154235243797302, -0.005453780293464661, 0.11212220788002014, -0.0391799733042717, 0.09911967813968658, 0.044462040066719055, 0.07406768202781677, -0.09979506582021713, -0.05886334180831909, -0.0764957070350647, 0.04347141832113266, -0.04159887507557869, -0.053583499044179916, 
-0.0644126757979393, 0.05727141350507736, 0.037378035485744476, 0.01063576154410839, -0.0016522426158189774, -0.037317052483558655, 0.04412860795855522, 0.08923918008804321, 0.08111102133989334, -0.03537595644593239, -0.07464481145143509, -0.04843514785170555, -0.05015857145190239, 0.06764592230319977, -0.09247414767742157, 0.015866417437791824, 0.028485571965575218, 0.011238300241529942, 0.08819909393787384, 0.034321144223213196, -0.1362207531929016, 0.011686062440276146, 0.03544268012046814, -0.12410908937454224, -0.10861028730869293, -0.018283609300851822, 0.02025909721851349, -0.03725660592317581, 0.05563509464263916, 0.14427287876605988, -0.03582295775413513, -0.03115032985806465, -0.049037158489227295, 0.03735310211777687, -0.020774178206920624, 0.05133701115846634, 0.06718370318412781, 0.030162522569298744, -0.0742429867386818, 0.07502631843090057, 0.038843199610710144, -0.04054580628871918, 0.041900891810655594, 0.04183569923043251, -0.09725350141525269, -0.07815279066562653, -0.06392934173345566, 0.09355223178863525, -0.022795630618929863, -0.049214791506528854, -0.002365829423069954, -0.08309057354927063, 0.06772523373365402, 0.07358752191066742, 0.0483127199113369, 0.03560270369052887, -0.08712175488471985, 0.015809882432222366, -0.05527956783771515, 0.035348936915397644, -0.0302989911288023, -0.004007816314697266, -0.05665362626314163, 0.06522634625434875, 0.0640813335776329, 0.09776204824447632, -0.03455092012882233, -0.07442646473646164, -0.08477117866277695, -0.013948885723948479, -0.0629710927605629, -0.03435415402054787, -0.07719846069812775, -0.003880516393110156, 0.0007486725226044655, -0.001699903979897499, 0.02135404385626316, 0.03678032383322716, -0.04325675964355469, -0.018526457250118256, -0.03279975801706314, 0.03806508332490921, -0.0601222850382328, 0.00636004563421011, 0.015050480142235756, -0.03663983196020126, 0.09307199716567993, 0.036482375115156174, -0.009802253916859627, 0.04673822596669197, -0.017138827592134476, 0.033596307039260864, -0.02039213851094246, 0.0012488162610679865, -0.02287480980157852, -0.10752905905246735, -0.0046237981878221035, 0.0044183190912008286, -0.024462468922138214, 0.010989237576723099, 0.058596931397914886, -0.0728110671043396, 0.08549962937831879, 0.047410208731889725, -0.028870102018117905, -0.07191303372383118, 0.042995307594537735, -0.012071976438164711, 0.028503086417913437, 0.07003754377365112, -0.03421623632311821, 0.05231703445315361, -0.09822249412536621, -0.028867173939943314, 0.0024756051134318113, -0.006938021630048752, -0.0063052065670490265, -0.052811019122600555, -0.0018691523000597954, 0.006208071485161781, 0.17768578231334686, -0.025941573083400726, 0.037046413868665695, 0.013153253123164177, 0.009502863511443138, 0.0466083325445652, -0.011897975578904152, 0.07451274991035461, -0.006699881516396999, -0.027315987274050713, -0.01380503736436367, 0.03852703794836998, 0.0037794578820466995, 0.00373784638941288, 0.13595546782016754, 0.044272784143686295, 0.09240172058343887, 0.07713104039430618, 0.01887824758887291, 0.01728435792028904, -0.13038161396980286, -0.09096716344356537, 0.007165148854255676, 0.06041945889592171, -0.01834031008183956, 0.009537002071738243, 0.08984839916229248, -0.08619411289691925, 0.07037381082773209, 0.04912102222442627, -0.04948164522647858, -0.1231185644865036, -0.18724960088729858, -0.023990878835320473, -0.030115654692053795, -0.010114479809999466, -0.09040163457393646, 0.015228653326630592, 0.08892837911844254, 0.023520369082689285, -0.011542481370270252, 0.0989101231098175, 
-0.1046777069568634, -0.029633518308401108, 0.04602609574794769, -0.02740100398659706, 0.013070502318441868, 0.05092095583677292, 0.02475370280444622, -0.0056807659566402435, 0.04029827564954758, 0.038130976259708405, 0.04283455014228821, 0.024851836264133453, 0.05172550678253174, -0.02145214006304741, -0.07232822477817535, -0.032326821237802505, -0.001968878321349621, 0.05707956477999687, 0.13648676872253418, 0.02255537547171116, -0.07034146040678024, 0.005460615269839764, 0.1066369041800499, -0.032780859619379044, -0.049102213233709335, -0.10690891742706299, 0.23829585313796997, 0.023110022768378258, 0.0008734944276511669, -0.0051732901483774185, -0.046837903559207916, 0.005250727757811546, 0.21328334510326385, 0.22387295961380005, 0.005094125866889954, -0.009742628782987595, 0.009762540459632874, -0.011432909406721592, 0.03704160079360008, 0.14664986729621887, 0.005214933305978775, 0.25217628479003906, -0.046004340052604675, 0.040390901267528534, -0.04142969846725464, -0.04008316621184349, -0.1006154865026474, 0.07333546876907349, -0.00865180417895317, 0.005933258216828108, -0.03190646320581436, 0.07119359076023102, -0.039089176803827286, -0.17420564591884613, 0.0043689412996172905, -0.0021888120099902153, -0.06330161541700363, 0.010372357442975044, 0.000029585324227809906, 0.02109898068010807, 0.08308807760477066, -0.01756679266691208, -0.006851932033896446, 0.13252054154872894, 0.01847449317574501, -0.09806208312511444, -0.06256639212369919, 0.11431374400854111, 0.01766560599207878, 0.13989703357219696, 0.011078246869146824, 0.0812944620847702, 0.08795097470283508, 0.021336214616894722, -0.09683096408843994, 0.04354366287589073, -0.01942935399711132, -0.03103063628077507, 0.008872016333043575, 0.10937816649675369, -0.0073706586845219135, 0.0572747066617012, 0.025706209242343903, -0.08905905485153198, 0.06066608428955078, 0.01082213968038559, -0.035592734813690186, -0.08259136974811554, 0.08378255367279053, -0.09149742126464844, 0.15724563598632812, 0.12328125536441803, -0.015271519310772419, -0.04660520702600479, -0.02730291709303856, 0.019144870340824127, 0.0016222414560616016, 0.054757729172706604, -0.02643078938126564, -0.13331188261508942, 0.020189719274640083, -0.08639708161354065, 0.026820503175258636, -0.24756133556365967, -0.08756332099437714, 0.028408119454979897, -0.018964501097798347, -0.018879566341638565, 0.0518358051776886, 0.04581106826663017, 0.027911799028515816, -0.03645937889814377, 0.0189067330211401, -0.03846888244152069, 0.05773686617612839, -0.1122979000210762, -0.09540548920631409 ]
null
null
transformers
# MultiBERTs Seed 0 (uncased) Seed 0 MultiBERTs (pretrained BERT) model on the English language, using a masked language modeling (MLM) objective. It was introduced in [this paper](https://arxiv.org/pdf/2106.16163.pdf) and first released in [this repository](https://github.com/google-research/language/tree/master/language/multiberts). This model is uncased: it does not make a difference between english and English. Disclaimer: The team releasing MultiBERTs did not write a model card for this model, so this model card has been written by [gchhablani](https://hf.co/gchhablani). ## Model description MultiBERTs models are transformer models pretrained on a large corpus of English data in a self-supervised fashion. This means they were pretrained on the raw texts only, with no humans labelling them in any way (which is why they can use lots of publicly available data), with an automatic process to generate inputs and labels from those texts. More precisely, they were pretrained with two objectives: - Masked language modeling (MLM): taking a sentence, the model randomly masks 15% of the words in the input, then runs the entire masked sentence through the model and has to predict the masked words. This is different from traditional recurrent neural networks (RNNs), which usually see the words one after the other, and from autoregressive models like GPT, which internally mask the future tokens. It allows the model to learn a bidirectional representation of the sentence. - Next sentence prediction (NSP): the model concatenates two masked sentences as inputs during pretraining. Sometimes they correspond to sentences that were next to each other in the original text, sometimes not. The model then has to predict whether the two sentences were following each other or not. This way, the model learns an inner representation of the English language that can then be used to extract features useful for downstream tasks: if you have a dataset of labeled sentences, for instance, you can train a standard classifier using the features produced by the MultiBERTs model as inputs. ## Intended uses & limitations You can use the raw model for either masked language modeling or next sentence prediction, but it's mostly intended to be fine-tuned on a downstream task. See the [model hub](https://huggingface.co/models?filter=multiberts) to look for fine-tuned versions on a task that interests you. Note that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked) to make decisions, such as sequence classification, token classification or question answering. For tasks such as text generation, you should look at models like GPT2. ### How to use Here is how to use this model to get the features of a given text in PyTorch: ```python from transformers import BertTokenizer, BertModel tokenizer = BertTokenizer.from_pretrained('multiberts-seed-0') model = BertModel.from_pretrained("multiberts-seed-0") text = "Replace me by any text you'd like." encoded_input = tokenizer(text, return_tensors='pt') output = model(**encoded_input) ``` ### Limitations and bias Even if the training data used for this model could be characterized as fairly neutral, this model can have biased predictions. This bias will also affect all fine-tuned versions of this model.
For an understanding of the bias of this particular checkpoint, please try out this checkpoint with the snippet present in the [Limitations and bias section](https://huggingface.co/bert-base-uncased#limitations-and-bias) of the [bert-base-uncased](https://huggingface.co/bert-base-uncased) checkpoint. ## Training data The MultiBERTs models were pretrained on [BookCorpus](https://yknzhu.wixsite.com/mbweb), a dataset consisting of 11,038 unpublished books, and [English Wikipedia](https://en.wikipedia.org/wiki/English_Wikipedia) (excluding lists, tables and headers). ## Training procedure ### Preprocessing The texts are lowercased and tokenized using WordPiece and a vocabulary size of 30,000. The inputs of the model are then of the form: ``` [CLS] Sentence A [SEP] Sentence B [SEP] ``` With probability 0.5, sentence A and sentence B correspond to two consecutive sentences in the original corpus, and in the other cases, sentence B is another random sentence from the corpus. Note that what is considered a sentence here is a consecutive span of text usually longer than a single sentence. The only constraint is that the result with the two "sentences" has a combined length of less than 512 tokens. The details of the masking procedure for each sentence are the following: - 15% of the tokens are masked. - In 80% of the cases, the masked tokens are replaced by `[MASK]`. - In 10% of the cases, the masked tokens are replaced by a random token (different from the one they replace). - In the remaining 10% of cases, the masked tokens are left as is. ### Pretraining The model was trained on 16 Cloud TPU v2 chips for two million steps with a batch size of 256. The sequence length was set to 512 throughout. The optimizer used is Adam with a learning rate of 1e-4, \\(\beta_{1} = 0.9\\) and \\(\beta_{2} = 0.999\\), a weight decay of 0.01, learning rate warmup for 10,000 steps and linear decay of the learning rate after. ### BibTeX entry and citation info ```bibtex @article{DBLP:journals/corr/abs-2106-16163, author = {Thibault Sellam and Steve Yadlowsky and Jason Wei and Naomi Saphra and Alexander D'Amour and Tal Linzen and Jasmijn Bastings and Iulia Turc and Jacob Eisenstein and Dipanjan Das and Ian Tenney and Ellie Pavlick}, title = {The MultiBERTs: {BERT} Reproductions for Robustness Analysis}, journal = {CoRR}, volume = {abs/2106.16163}, year = {2021}, url = {https://arxiv.org/abs/2106.16163}, eprinttype = {arXiv}, eprint = {2106.16163}, timestamp = {Mon, 05 Jul 2021 15:15:50 +0200}, biburl = {https://dblp.org/rec/journals/corr/abs-2106-16163.bib}, bibsource = {dblp computer science bibliography, https://dblp.org} } ``` <a href="https://huggingface.co/exbert/?model=multiberts"> <img width="300px" src="https://cdn-media.huggingface.co/exbert/button.png"> </a>
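As a rough illustration of the optimization schedule described in the Pretraining section above (linear warmup for 10,000 steps to a peak learning rate of 1e-4, then linear decay over the two-million-step run), here is a minimal sketch of the learning-rate curve. The decay-to-zero endpoint and the function and constant names are assumptions made for illustration; this is not the original training code.

```python
# Minimal sketch of the learning-rate schedule described above: linear warmup
# to a peak of 1e-4 over 10,000 steps, then linear decay for the rest of the
# two-million-step run. Decaying to exactly zero at the final step is an
# assumption made for illustration.
PEAK_LR = 1e-4
WARMUP_STEPS = 10_000
TOTAL_STEPS = 2_000_000

def learning_rate(step: int) -> float:
    if step < WARMUP_STEPS:
        return PEAK_LR * step / WARMUP_STEPS
    return PEAK_LR * (TOTAL_STEPS - step) / (TOTAL_STEPS - WARMUP_STEPS)

for s in (0, 5_000, 10_000, 1_000_000, 2_000_000):
    print(s, learning_rate(s))
```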
{"language": "en", "license": "apache-2.0", "tags": ["exbert", "multiberts"], "datasets": ["bookcorpus", "wikipedia"]}
null
MultiBertGunjanPatrick/multiberts-seed-4
[ "transformers", "pytorch", "bert", "pretraining", "exbert", "multiberts", "en", "dataset:bookcorpus", "dataset:wikipedia", "arxiv:2106.16163", "license:apache-2.0", "endpoints_compatible", "region:us" ]
2022-03-02T23:29:04+00:00
[ "2106.16163" ]
[ "en" ]
TAGS #transformers #pytorch #bert #pretraining #exbert #multiberts #en #dataset-bookcorpus #dataset-wikipedia #arxiv-2106.16163 #license-apache-2.0 #endpoints_compatible #region-us
# MultiBERTs Seed 0 (uncased) Seed 0 MultiBERTs (pretrained BERT) model on English language using a masked language modeling (MLM) objective. It was introduced in this paper and first released in this repository. This model is uncased: it does not make a difference between english and English. Disclaimer: The team releasing MultiBERTs did not write a model card for this model so this model card has been written by gchhablani. ## Model description MultiBERTs models are transformers model pretrained on a large corpus of English data in a self-supervised fashion. This means it was pretrained on the raw texts only, with no humans labelling them in any way (which is why it can use lots of publicly available data) with an automatic process to generate inputs and labels from those texts. More precisely, it was pretrained with two objectives: - Masked language modeling (MLM): taking a sentence, the model randomly masks 15% of the words in the input then run the entire masked sentence through the model and has to predict the masked words. This is different from traditional recurrent neural networks (RNNs) that usually see the words one after the other, or from autoregressive models like GPT which internally mask the future tokens. It allows the model to learn a bidirectional representation of the sentence. - Next sentence prediction (NSP): the models concatenates two masked sentences as inputs during pretraining. Sometimes they correspond to sentences that were next to each other in the original text, sometimes not. The model then has to predict if the two sentences were following each other or not. This way, the model learns an inner representation of the English language that can then be used to extract features useful for downstream tasks: if you have a dataset of labeled sentences for instance, you can train a standard classifier using the features produced by the MultiBERTs model as inputs. ## Intended uses & limitations You can use the raw model for either masked language modeling or next sentence prediction, but it's mostly intended to be fine-tuned on a downstream task. See the model hub to look for fine-tuned versions on a task that interests you. Note that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked) to make decisions, such as sequence classification, token classification or question answering. For tasks such as text generation you should look at model like GPT2. ### How to use Here is how to use this model to get the features of a given text in PyTorch: ### Limitations and bias Even if the training data used for this model could be characterized as fairly neutral, this model can have biased predictions. This bias will also affect all fine-tuned versions of this model. For an understanding of bias of this particular checkpoint, please try out this checkpoint with the snippet present in the Limitation and bias section of the bert-base-uncased checkpoint. ## Training data The MultiBERTs models were pretrained on BookCorpus, a dataset consisting of 11,038 unpublished books and English Wikipedia (excluding lists, tables and headers). ## Training procedure ### Preprocessing The texts are lowercased and tokenized using WordPiece and a vocabulary size of 30,000. The inputs of the model are then of the form: With probability 0.5, sentence A and sentence B correspond to two consecutive sentences in the original corpus and in the other cases, it's another random sentence in the corpus. 
Note that what is considered a sentence here is a consecutive span of text usually longer than a single sentence. The only constrain is that the result with the two "sentences" has a combined length of less than 512 tokens. The details of the masking procedure for each sentence are the following: - 15% of the tokens are masked. - In 80% of the cases, the masked tokens are replaced by '[MASK]'. - In 10% of the cases, the masked tokens are replaced by a random token (different) from the one they replace. - In the 10% remaining cases, the masked tokens are left as is. ### Pretraining The model was trained on 16 Cloud TPU v2 chips for two million steps with a batch size of 256. The sequence length was set to 512 throughout. The optimizer used is Adam with a learning rate of 1e-4, \\(\beta_{1} = 0.9\\) and \\(\beta_{2} = 0.999\\), a weight decay of 0.01, learning rate warmup for 10,000 steps and linear decay of the learning rate after. ### BibTeX entry and citation info <a href="URL <img width="300px" src="URL </a>
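The `[CLS] Sentence A [SEP] Sentence B [SEP]` input format described in the preprocessing section above can be reproduced with the tokenizer's sentence-pair encoding. A minimal sketch, again assuming the shared `bert-base-uncased` uncased WordPiece vocabulary and two made-up example sentences:

```python
from transformers import BertTokenizer

# Sentence-pair encoding in the [CLS] Sentence A [SEP] Sentence B [SEP] format
# used for next sentence prediction during pretraining. The checkpoint name is
# illustrative; MultiBERTs uses the same uncased WordPiece vocabulary.
tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")

encoded = tokenizer(
    "He went to the store.",        # Sentence A
    "He bought a gallon of milk.",  # Sentence B (the actual next sentence 50% of the time)
    return_tensors="pt",
)

print(tokenizer.convert_ids_to_tokens(encoded["input_ids"][0]))
print(encoded["token_type_ids"][0])  # 0 for Sentence A tokens, 1 for Sentence B tokens
```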
[ "# MultiBERTs Seed 0 (uncased)\n\nSeed 0 MultiBERTs (pretrained BERT) model on English language using a masked language modeling (MLM) objective. It was introduced in\nthis paper and first released in\nthis repository. This model is uncased: it does not make a difference\nbetween english and English.\n\nDisclaimer: The team releasing MultiBERTs did not write a model card for this model so this model card has been written by gchhablani.", "## Model description\n\nMultiBERTs models are transformers model pretrained on a large corpus of English data in a self-supervised fashion. This means it\nwas pretrained on the raw texts only, with no humans labelling them in any way (which is why it can use lots of\npublicly available data) with an automatic process to generate inputs and labels from those texts. More precisely, it\nwas pretrained with two objectives:\n\n- Masked language modeling (MLM): taking a sentence, the model randomly masks 15% of the words in the input then run\n the entire masked sentence through the model and has to predict the masked words. This is different from traditional\n recurrent neural networks (RNNs) that usually see the words one after the other, or from autoregressive models like\n GPT which internally mask the future tokens. It allows the model to learn a bidirectional representation of the\n sentence.\n- Next sentence prediction (NSP): the models concatenates two masked sentences as inputs during pretraining. Sometimes\n they correspond to sentences that were next to each other in the original text, sometimes not. The model then has to\n predict if the two sentences were following each other or not.\n This way, the model learns an inner representation of the English language that can then be used to extract features\n useful for downstream tasks: if you have a dataset of labeled sentences for instance, you can train a standard\n classifier using the features produced by the MultiBERTs model as inputs.", "## Intended uses & limitations\n\nYou can use the raw model for either masked language modeling or next sentence prediction, but it's mostly intended to\nbe fine-tuned on a downstream task. See the model hub to look for\nfine-tuned versions on a task that interests you.\nNote that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked)\nto make decisions, such as sequence classification, token classification or question answering. For tasks such as text\ngeneration you should look at model like GPT2.", "### How to use\n\nHere is how to use this model to get the features of a given text in PyTorch:", "### Limitations and bias\n\nEven if the training data used for this model could be characterized as fairly neutral, this model can have biased\npredictions. This bias will also affect all fine-tuned versions of this model. For an understanding of bias of this particular\ncheckpoint, please try out this checkpoint with the snippet present in the Limitation and bias section of the bert-base-uncased checkpoint.", "## Training data\n\nThe MultiBERTs models were pretrained on BookCorpus, a dataset consisting of 11,038\nunpublished books and English Wikipedia (excluding lists, tables and\nheaders).", "## Training procedure", "### Preprocessing\n\nThe texts are lowercased and tokenized using WordPiece and a vocabulary size of 30,000. 
The inputs of the model are\nthen of the form:\n\n\n\nWith probability 0.5, sentence A and sentence B correspond to two consecutive sentences in the original corpus and in\nthe other cases, it's another random sentence in the corpus. Note that what is considered a sentence here is a\nconsecutive span of text usually longer than a single sentence. The only constrain is that the result with the two\n\"sentences\" has a combined length of less than 512 tokens.\nThe details of the masking procedure for each sentence are the following:\n\n- 15% of the tokens are masked.\n- In 80% of the cases, the masked tokens are replaced by '[MASK]'.\n- In 10% of the cases, the masked tokens are replaced by a random token (different) from the one they replace.\n- In the 10% remaining cases, the masked tokens are left as is.", "### Pretraining\n\nThe model was trained on 16 Cloud TPU v2 chips for two million steps with a batch size\nof 256. The sequence length was set to 512 throughout. The optimizer\nused is Adam with a learning rate of 1e-4, \\\\(\\beta_{1} = 0.9\\\\) and \\\\(\\beta_{2} = 0.999\\\\), a weight decay of 0.01,\nlearning rate warmup for 10,000 steps and linear decay of the learning rate after.", "### BibTeX entry and citation info\n\n\n\n<a href=\"URL\n\t<img width=\"300px\" src=\"URL\n</a>" ]
[ "TAGS\n#transformers #pytorch #bert #pretraining #exbert #multiberts #en #dataset-bookcorpus #dataset-wikipedia #arxiv-2106.16163 #license-apache-2.0 #endpoints_compatible #region-us \n", "# MultiBERTs Seed 0 (uncased)\n\nSeed 0 MultiBERTs (pretrained BERT) model on English language using a masked language modeling (MLM) objective. It was introduced in\nthis paper and first released in\nthis repository. This model is uncased: it does not make a difference\nbetween english and English.\n\nDisclaimer: The team releasing MultiBERTs did not write a model card for this model so this model card has been written by gchhablani.", "## Model description\n\nMultiBERTs models are transformers model pretrained on a large corpus of English data in a self-supervised fashion. This means it\nwas pretrained on the raw texts only, with no humans labelling them in any way (which is why it can use lots of\npublicly available data) with an automatic process to generate inputs and labels from those texts. More precisely, it\nwas pretrained with two objectives:\n\n- Masked language modeling (MLM): taking a sentence, the model randomly masks 15% of the words in the input then run\n the entire masked sentence through the model and has to predict the masked words. This is different from traditional\n recurrent neural networks (RNNs) that usually see the words one after the other, or from autoregressive models like\n GPT which internally mask the future tokens. It allows the model to learn a bidirectional representation of the\n sentence.\n- Next sentence prediction (NSP): the models concatenates two masked sentences as inputs during pretraining. Sometimes\n they correspond to sentences that were next to each other in the original text, sometimes not. The model then has to\n predict if the two sentences were following each other or not.\n This way, the model learns an inner representation of the English language that can then be used to extract features\n useful for downstream tasks: if you have a dataset of labeled sentences for instance, you can train a standard\n classifier using the features produced by the MultiBERTs model as inputs.", "## Intended uses & limitations\n\nYou can use the raw model for either masked language modeling or next sentence prediction, but it's mostly intended to\nbe fine-tuned on a downstream task. See the model hub to look for\nfine-tuned versions on a task that interests you.\nNote that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked)\nto make decisions, such as sequence classification, token classification or question answering. For tasks such as text\ngeneration you should look at model like GPT2.", "### How to use\n\nHere is how to use this model to get the features of a given text in PyTorch:", "### Limitations and bias\n\nEven if the training data used for this model could be characterized as fairly neutral, this model can have biased\npredictions. This bias will also affect all fine-tuned versions of this model. For an understanding of bias of this particular\ncheckpoint, please try out this checkpoint with the snippet present in the Limitation and bias section of the bert-base-uncased checkpoint.", "## Training data\n\nThe MultiBERTs models were pretrained on BookCorpus, a dataset consisting of 11,038\nunpublished books and English Wikipedia (excluding lists, tables and\nheaders).", "## Training procedure", "### Preprocessing\n\nThe texts are lowercased and tokenized using WordPiece and a vocabulary size of 30,000. 
The inputs of the model are\nthen of the form:\n\n\n\nWith probability 0.5, sentence A and sentence B correspond to two consecutive sentences in the original corpus and in\nthe other cases, it's another random sentence in the corpus. Note that what is considered a sentence here is a\nconsecutive span of text usually longer than a single sentence. The only constrain is that the result with the two\n\"sentences\" has a combined length of less than 512 tokens.\nThe details of the masking procedure for each sentence are the following:\n\n- 15% of the tokens are masked.\n- In 80% of the cases, the masked tokens are replaced by '[MASK]'.\n- In 10% of the cases, the masked tokens are replaced by a random token (different) from the one they replace.\n- In the 10% remaining cases, the masked tokens are left as is.", "### Pretraining\n\nThe model was trained on 16 Cloud TPU v2 chips for two million steps with a batch size\nof 256. The sequence length was set to 512 throughout. The optimizer\nused is Adam with a learning rate of 1e-4, \\\\(\\beta_{1} = 0.9\\\\) and \\\\(\\beta_{2} = 0.999\\\\), a weight decay of 0.01,\nlearning rate warmup for 10,000 steps and linear decay of the learning rate after.", "### BibTeX entry and citation info\n\n\n\n<a href=\"URL\n\t<img width=\"300px\" src=\"URL\n</a>" ]
[ 63, 111, 335, 134, 25, 95, 48, 3, 222, 110, 34 ]
[ "passage: TAGS\n#transformers #pytorch #bert #pretraining #exbert #multiberts #en #dataset-bookcorpus #dataset-wikipedia #arxiv-2106.16163 #license-apache-2.0 #endpoints_compatible #region-us \n# MultiBERTs Seed 0 (uncased)\n\nSeed 0 MultiBERTs (pretrained BERT) model on English language using a masked language modeling (MLM) objective. It was introduced in\nthis paper and first released in\nthis repository. This model is uncased: it does not make a difference\nbetween english and English.\n\nDisclaimer: The team releasing MultiBERTs did not write a model card for this model so this model card has been written by gchhablani.", "passage: ## Model description\n\nMultiBERTs models are transformers model pretrained on a large corpus of English data in a self-supervised fashion. This means it\nwas pretrained on the raw texts only, with no humans labelling them in any way (which is why it can use lots of\npublicly available data) with an automatic process to generate inputs and labels from those texts. More precisely, it\nwas pretrained with two objectives:\n\n- Masked language modeling (MLM): taking a sentence, the model randomly masks 15% of the words in the input then run\n the entire masked sentence through the model and has to predict the masked words. This is different from traditional\n recurrent neural networks (RNNs) that usually see the words one after the other, or from autoregressive models like\n GPT which internally mask the future tokens. It allows the model to learn a bidirectional representation of the\n sentence.\n- Next sentence prediction (NSP): the models concatenates two masked sentences as inputs during pretraining. Sometimes\n they correspond to sentences that were next to each other in the original text, sometimes not. The model then has to\n predict if the two sentences were following each other or not.\n This way, the model learns an inner representation of the English language that can then be used to extract features\n useful for downstream tasks: if you have a dataset of labeled sentences for instance, you can train a standard\n classifier using the features produced by the MultiBERTs model as inputs.## Intended uses & limitations\n\nYou can use the raw model for either masked language modeling or next sentence prediction, but it's mostly intended to\nbe fine-tuned on a downstream task. See the model hub to look for\nfine-tuned versions on a task that interests you.\nNote that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked)\nto make decisions, such as sequence classification, token classification or question answering. For tasks such as text\ngeneration you should look at model like GPT2.### How to use\n\nHere is how to use this model to get the features of a given text in PyTorch:### Limitations and bias\n\nEven if the training data used for this model could be characterized as fairly neutral, this model can have biased\npredictions. This bias will also affect all fine-tuned versions of this model. For an understanding of bias of this particular\ncheckpoint, please try out this checkpoint with the snippet present in the Limitation and bias section of the bert-base-uncased checkpoint.## Training data\n\nThe MultiBERTs models were pretrained on BookCorpus, a dataset consisting of 11,038\nunpublished books and English Wikipedia (excluding lists, tables and\nheaders).## Training procedure" ]
[ -0.06825247406959534, 0.027449268847703934, -0.0021626802626997232, 0.09413602948188782, 0.07635393738746643, 0.026495488360524178, 0.15437674522399902, 0.029963307082653046, -0.03573239967226982, 0.021267801523208618, 0.10619504749774933, 0.03782356157898903, 0.03388210013508797, 0.035308390855789185, 0.066785529255867, -0.2578813433647156, 0.07567903399467468, -0.05793163925409317, 0.040864333510398865, 0.059090327471494675, 0.10602577030658722, -0.07069262117147446, 0.07895290851593018, 0.04403890669345856, -0.0756942480802536, -0.027663996443152428, -0.005503433756530285, -0.034674178808927536, 0.07060743123292923, 0.09438986331224442, 0.05877054110169411, -0.008264455944299698, 0.05975931137800217, -0.087635338306427, 0.019257638603448868, 0.024562222883105278, -0.007006383966654539, 0.036696210503578186, 0.025804642587900162, -0.009673221036791801, 0.11283443868160248, 0.02619457244873047, 0.08560121059417725, 0.04041407257318497, -0.08754345774650574, -0.09977805614471436, -0.0694802924990654, 0.09317219257354736, 0.02764834463596344, 0.04353900998830795, -0.0063711777329444885, 0.07313166558742523, -0.006663286592811346, 0.058924756944179535, 0.08212147653102875, -0.23674309253692627, -0.023082595318555832, 0.05118638277053833, 0.04846370965242386, 0.04278615117073059, 0.013536407612264156, 0.031959742307662964, 0.005570597946643829, 0.04724816232919693, 0.006345914676785469, -0.028150685131549835, 0.13924768567085266, -0.053803253918886185, -0.13665056228637695, -0.03023041971027851, 0.15811696648597717, 0.02479265071451664, -0.11351540684700012, -0.11277355998754501, 0.0016996730118989944, 0.1693311333656311, -0.0019645756110548973, -0.007584595121443272, -0.009904063306748867, -0.0030730916187167168, 0.024124154821038246, -0.1230793297290802, -0.08302900195121765, -0.02286745235323906, -0.06280194967985153, 0.15275688469409943, 0.047940537333488464, 0.07110750675201416, -0.06045709177851677, 0.04197261482477188, -0.14955590665340424, -0.036801956593990326, -0.04978496953845024, -0.09940676391124725, 0.017188318073749542, 0.02796531654894352, -0.044329117983579636, -0.11630523204803467, -0.03652356192469597, 0.0725361704826355, 0.038227953016757965, 0.03685189411044121, -0.005693042650818825, 0.029456961899995804, 0.10580474138259888, 0.10501816868782043, -0.0562795028090477, 0.07449519634246826, 0.020974641665816307, -0.020636841654777527, 0.03971032053232193, -0.05628065764904022, -0.12330584228038788, 0.0744452103972435, -0.034096408635377884, 0.018313465639948845, 0.023749854415655136, 0.04198585823178291, -0.012982374057173729, -0.0767536610364914, 0.14133483171463013, -0.09305756539106369, 0.0004417812451720238, -0.0035654937382787466, 0.016869794577360153, 0.08157093822956085, 0.02621583268046379, 0.0021266003604978323, -0.059168532490730286, -0.03080003336071968, -0.06315429508686066, -0.027340907603502274, -0.06021827086806297, -0.13162744045257568, 0.0013580089434981346, -0.020953699946403503, -0.014699130319058895, -0.10742536187171936, -0.17884144186973572, -0.01402769424021244, 0.07123412191867828, -0.014155296608805656, 0.011412929743528366, -0.0021266068797558546, 0.012132527306675911, -0.004981525242328644, 0.032173626124858856, -0.03745890408754349, 0.00908223818987608, -0.012201073579490185, -0.06731266528367996, 0.039806246757507324, -0.12071730941534042, 0.04209677502512932, -0.05578748881816864, 0.011489223688840866, -0.19638846814632416, 0.10738702118396759, -0.02783583477139473, -0.04278886318206787, -0.04810495674610138, -0.05834455043077469, 
0.0188974030315876, 0.045517146587371826, -0.015527524054050446, 0.10550028085708618, -0.12357760965824127, -0.0512409433722496, 0.15865573287010193, -0.1566506326198578, 0.016810515895485878, 0.10513904690742493, -0.06748288869857788, 0.042335763573646545, 0.14426475763320923, 0.07841357588768005, 0.07015632092952728, -0.04069618880748749, 0.017828572541475296, 0.060336943715810776, -0.0458533950150013, 0.0799841359257698, 0.10583654791116714, -0.015437023714184761, -0.13057377934455872, 0.030710875988006592, -0.06833602488040924, -0.03600694239139557, -0.022659340873360634, -0.024447504431009293, 0.014145502820611, -0.052795182913541794, 0.05715940147638321, -0.010484781116247177, 0.006331292912364006, -0.0232611745595932, -0.07422537356615067, 0.07731874287128448, 0.07671873271465302, -0.08619971573352814, 0.018436623737215996, -0.0909656435251236, 0.03130660206079483, -0.06597552448511124, -0.005088436417281628, -0.14390107989311218, -0.04274594411253929, 0.031965915113687515, -0.0805630162358284, 0.09851419925689697, 0.11271710693836212, 0.008409101516008377, 0.11310183256864548, -0.04617488384246826, 0.02628052979707718, -0.012368079274892807, -0.006386269349604845, -0.044110074639320374, -0.14293555915355682, -0.06652771681547165, -0.06382939964532852, 0.0834670290350914, -0.059091683477163315, 0.020797124132514, -0.08205804973840714, -0.041816260665655136, -0.0250774584710598, -0.04668354615569115, 0.005325498059391975, 0.00811565201729536, -0.013542650267481804, -0.030526084825396538, 0.04050645977258682, 0.027077049016952515, -0.0918835997581482, 0.08847370743751526, -0.1236613318324089, -0.0576145313680172, 0.06846176087856293, -0.0069316960871219635, -0.04083865508437157, 0.09554298222064972, 0.011831864714622498, -0.01123481709510088, -0.057707928121089935, -0.04657518118619919, 0.22045092284679413, -0.020844273269176483, 0.08364406228065491, -0.11240328848361969, 0.004931592382490635, 0.03506753221154213, -0.06102532893419266, -0.05918964743614197, 0.07589934766292572, 0.038565460592508316, -0.2161455750465393, 0.024600330740213394, 0.07306224852800369, 0.061481211334466934, 0.1421050727367401, 0.02417578175663948, -0.02878376469016075, -0.06042608246207237, -0.017261460423469543, -0.012187670916318893, 0.05919060483574867, -0.04688645899295807, 0.0030246214009821415, 0.0510857030749321, 0.05463946610689163, 0.018327711150050163, -0.06600221991539001, 0.02497151307761669, 0.05208776146173477, -0.017216674983501434, -0.06310763210058212, -0.05255124717950821, -0.03947900980710983, 0.0736318975687027, 0.041184503585100174, 0.0495072677731514, 0.0537080317735672, -0.019612858071923256, -0.1381978541612625, 0.16529735922813416, -0.13489660620689392, -0.2240476906299591, -0.12759706377983093, -0.07904494553804398, -0.07838001847267151, 0.039492446929216385, 0.0373598076403141, -0.03468242287635803, -0.05113789439201355, -0.10579567402601242, 0.06591805815696716, -0.11658145487308502, -0.057194799184799194, 0.014129210263490677, -0.056258611381053925, -0.005652858875691891, -0.1268719583749771, -0.010539324954152107, -0.026957646012306213, -0.07912764698266983, 0.004068336449563503, -0.04539388418197632, 0.010077799670398235, 0.13516394793987274, 0.008290649391710758, -0.009709829464554787, -0.015056753531098366, 0.19663433730602264, 0.0314871110022068, 0.04356053099036217, 0.12803813815116882, -0.06543856859207153, 0.05768699571490288, 0.02060154639184475, 0.037481535226106644, -0.04913286864757538, -0.0007067807018756866, -0.027622418478131294, -0.11730992794036865, 
-0.207548126578331, -0.06663559377193451, 0.007457428611814976, 0.008368045091629028, 0.01904660277068615, 0.015689538791775703, 0.024972863495349884, 0.05414750799536705, -0.031031470745801926, 0.03179151564836502, 0.033982276916503906, 0.05688050761818886, 0.06225617602467537, -0.06120002269744873, 0.09507381916046143, -0.07100313901901245, 0.027307022362947464, 0.10875560343265533, -0.07062242925167084, 0.16170385479927063, 0.04285769164562225, 0.05423576757311821, 0.09659373760223389, 0.0006577670574188232, 0.0585428811609745, 0.10273323953151703, -0.06317441910505295, 0.019947808235883713, -0.07513642311096191, -0.05752627179026604, -0.04452991858124733, 0.060025766491889954, 0.037611961364746094, -0.000131998211145401, -0.10182826220989227, 0.03220826014876366, -0.036235980689525604, 0.07729616016149521, 0.06343917548656464, -0.10670174658298492, -0.10046673566102982, 0.045665811747312546, -0.04038289934396744, -0.08793723583221436, 0.03426353633403778, 0.08077984303236008, -0.14119762182235718, 0.06124391779303551, 0.018283551558852196, 0.07126335799694061, -0.09752818942070007, 0.01132874470204115, -0.06905651092529297, 0.016318362206220627, 0.005033754277974367, 0.0913831889629364, -0.1432204693555832, 0.10583388805389404, 0.02708813175559044, 0.04597454518079758, -0.09043684601783752, 0.01613154262304306, -0.01261853240430355, 0.07669144868850708, 0.12108297646045685, 0.04203776270151138, -0.05836430937051773, -0.018112843856215477, -0.06768153607845306, 0.034427788108587265, 0.07278922200202942, -0.04098799079656601, 0.038899462670087814, 0.0012810318730771542, 0.016169004142284393, -0.008310851640999317, 0.020610321313142776, -0.13600048422813416, -0.14560562372207642, 0.0705970749258995, -0.06633393466472626, -0.08288760483264923, -0.03709196671843529, -0.06633897125720978, -0.0868702232837677, 0.15359032154083252, -0.0773216113448143, -0.1108812615275383, -0.10497688502073288, 0.004697326570749283, 0.06842926889657974, -0.06570008397102356, 0.05184205248951912, -0.05175790935754776, 0.09120817482471466, -0.03778978809714317, -0.10993549227714539, 0.017024382948875427, -0.09169412404298782, -0.11230003088712692, -0.030281051993370056, 0.09025070071220398, 0.15063974261283875, 0.05137326568365097, 0.024738965556025505, 0.016462495550513268, 0.0016304273158311844, -0.12906411290168762, 0.004929570481181145, 0.143439382314682, 0.01773710548877716, 0.0976557806134224, -0.06279069185256958, -0.02821265161037445, -0.012585094198584557, -0.0009578559547662735, 0.13525930047035217, 0.1579957902431488, -0.06031216308474541, 0.15296214818954468, 0.227834090590477, -0.10105094313621521, -0.19415637850761414, -0.07397069036960602, 0.0032560182735323906, 0.04487091302871704, 0.045912403613328934, -0.19948574900627136, 0.09972882270812988, 0.04975741356611252, -0.013423530384898186, -0.03354128822684288, -0.18906579911708832, -0.1023210883140564, 0.1062556803226471, 0.06369950622320175, 0.19807088375091553, -0.06803785264492035, -0.04169449210166931, -0.04189038649201393, -0.05597612261772156, 0.09557583183050156, -0.011712346225976944, 0.0822327509522438, 0.01643332466483116, 0.014923296868801117, -0.0019287541508674622, -0.008046919479966164, 0.11012726277112961, 0.04542766511440277, 0.018416037783026695, -0.07320156693458557, -0.0423104427754879, 0.10889390110969543, -0.03202357143163681, 0.12254303693771362, 0.03122953698039055, 0.05849093571305275, -0.0764583870768547, -0.06015930324792862, -0.08313038945198059, 0.012603376060724258, -0.04008830338716507, -0.05228453874588013, 
-0.051481351256370544, 0.03643445670604706, 0.02559221349656582, 0.013383354060351849, -0.010037007741630077, -0.0581706240773201, 0.009901179000735283, 0.0659501925110817, 0.15930500626564026, -0.013111893087625504, -0.06732219457626343, -0.07006201148033142, -0.060269180685281754, 0.04847850278019905, -0.10283331573009491, 0.0321035273373127, 0.020586064085364342, -0.0036565132904797792, 0.11348927021026611, 0.03316955640912056, -0.11396678537130356, 0.013628019951283932, 0.005912423133850098, -0.09849600493907928, -0.1485224962234497, -0.016377072781324387, 0.05456313490867615, -0.0583408921957016, 0.03962210938334465, 0.1586087942123413, -0.02749052457511425, -0.033682480454444885, -0.05674935132265091, 0.032430585473775864, -0.034874096512794495, 0.03596019372344017, 0.08030854165554047, 0.016163216903805733, -0.08148041367530823, 0.06100435554981232, 0.04497561603784561, -0.01565445587038994, 0.06611718982458115, 0.01751827821135521, -0.07064318656921387, -0.08515681326389313, -0.06657058000564575, 0.11521587520837784, -0.04193677753210068, -0.06614658236503601, 0.0494990199804306, -0.10936599224805832, 0.06512928009033203, 0.09400998800992966, 0.03727183863520622, 0.046071093529462814, -0.08464010059833527, 0.006473809480667114, -0.037655625492334366, 0.03303447365760803, -0.03967699408531189, -0.03299032896757126, -0.04207788407802582, 0.02865336276590824, 0.0594131164252758, 0.09625885635614395, -0.03653799742460251, -0.07748300582170486, -0.08829360455274582, -0.013138281181454659, -0.10569687932729721, -0.006850461475551128, -0.06914658099412918, 0.00014194706454873085, 0.007000140380114317, -0.02822837233543396, 0.030307123437523842, 0.033606212586164474, -0.0512661337852478, -0.008813504129648209, -0.02892981842160225, 0.05861987918615341, -0.07071447372436523, 0.012725180014967918, 0.015199657529592514, -0.01911322958767414, 0.09222348034381866, 0.047224029898643494, -0.03322954475879669, 0.05148611217737198, -0.03994745388627052, 0.03518182411789894, -0.04691552743315697, 0.007639196235686541, -0.02100628986954689, -0.11349901556968689, -0.021261068060994148, 0.010819608345627785, -0.023444410413503647, 0.01614448055624962, 0.07291702181100845, -0.051247432827949524, 0.0827048048377037, 0.06047651544213295, -0.049000177532434464, -0.055763885378837585, 0.04004162549972534, 0.0009079426527023315, 0.017973260954022408, 0.0793890655040741, 0.0011681190226227045, 0.053140703588724136, -0.08328671008348465, 0.0013423850759863853, 0.0043635861948132515, -0.016782283782958984, -0.019065728411078453, -0.07158057391643524, -0.000623882282525301, 0.009545178152620792, 0.17526990175247192, -0.004971030168235302, -0.019934196025133133, 0.005758095532655716, 0.06719693541526794, 0.033424317836761475, 0.004426124505698681, 0.08463965356349945, -0.018342992290854454, -0.01793844997882843, -0.017587680369615555, 0.026691239327192307, -0.01080797053873539, 0.016537122428417206, 0.1315390020608902, 0.04961226135492325, 0.11255703866481781, 0.07479852437973022, 0.05499632656574249, 0.052345164120197296, -0.10784098505973816, -0.06925129890441895, 0.03605833277106285, 0.05536176264286041, -0.034931864589452744, 0.02555268630385399, 0.05937255546450615, -0.09513229876756668, 0.0820266455411911, 0.046595025807619095, -0.05803784728050232, -0.1295481026172638, -0.2191641926765442, -0.042123790830373764, -0.010218853130936623, -0.020777955651283264, -0.10785381495952606, 0.027329251170158386, 0.0930030569434166, 0.03945063054561615, -0.02234741672873497, 0.0657259151339531, -0.15022647380828857, 
-0.03686198964715004, 0.03966449946165085, -0.014821960590779781, 0.022462747991085052, 0.048782214522361755, 0.01900356635451317, 0.014281739480793476, 0.0744381994009018, 0.051359422504901886, 0.043146438896656036, 0.054591625928878784, 0.02954341098666191, -0.04896369203925133, -0.08800899237394333, -0.04467042535543442, 0.0032379510812461376, 0.058675315231084824, 0.12987293303012848, 0.010792074725031853, -0.06998851895332336, 0.0024203723296523094, 0.06055322289466858, -0.01847190037369728, -0.08398778736591339, -0.11259135603904724, 0.21841737627983093, -0.022776726633310318, 0.011702751740813255, -0.0013669170439243317, -0.03545460104942322, 0.020076904445886612, 0.20618940889835358, 0.26152077317237854, -0.02222667820751667, -0.01586262136697769, 0.010568449273705482, 0.0001846584491431713, 0.03695659339427948, 0.12577201426029205, -0.02777884714305401, 0.22359472513198853, -0.046777449548244476, 0.06737222522497177, -0.05537553131580353, -0.014299402013421059, -0.07450424134731293, 0.061424657702445984, -0.001578204333782196, -0.01836337149143219, -0.014155775308609009, 0.06984956562519073, -0.04071302339434624, -0.12650424242019653, -0.029551919549703598, 0.005514103919267654, -0.058359190821647644, 0.011046874336898327, 0.0020564431324601173, 0.03376493230462074, 0.07748642563819885, -0.01588892936706543, -0.0020990539342164993, 0.13050198554992676, 0.01098928228020668, -0.10912102460861206, -0.037600722163915634, 0.12838557362556458, 0.018519911915063858, 0.1340782791376114, 0.04876743629574776, 0.08712469041347504, 0.07130827009677887, 0.015149479731917381, -0.06677284836769104, 0.03636588156223297, -0.028407320380210876, 0.019770564511418343, 0.004539488349109888, 0.10587862133979797, -0.010519773699343204, 0.07475674152374268, 0.016607699915766716, -0.0808752030134201, 0.05683104693889618, 0.008673112839460373, -0.07627810537815094, -0.03255736455321312, 0.1042289137840271, -0.11158230900764465, 0.14271792769432068, 0.13774631917476654, -0.005030146799981594, -0.07176224142313004, -0.012138426303863525, 0.027100618928670883, -0.008060954511165619, 0.04774492606520653, -0.029893167316913605, -0.13074781000614166, 0.00018004095181822777, -0.09478544443845749, 0.04576292634010315, -0.24173954129219055, -0.06664414703845978, 0.016213994473218918, -0.000884735956788063, -0.028645452111959457, 0.030585195869207382, 0.061639197170734406, -0.0040400829166173935, -0.03497268259525299, 0.029452037066221237, -0.028589975088834763, 0.03562405705451965, -0.07439378648996353, -0.0681467354297638 ]
null
null
transformers
# MultiBERTs Seed 0 (uncased) Seed 0 MultiBERTs (pretrained BERT) model on the English language using a masked language modeling (MLM) objective. It was introduced in [this paper](https://arxiv.org/pdf/2106.16163.pdf) and first released in [this repository](https://github.com/google-research/language/tree/master/language/multiberts). This model is uncased: it does not make a difference between english and English. Disclaimer: The team releasing MultiBERTs did not write a model card for this model, so this model card has been written by [gchhablani](https://hf.co/gchhablani). ## Model description MultiBERTs models are transformer models pretrained on a large corpus of English data in a self-supervised fashion. This means they were pretrained on the raw texts only, with no humans labelling them in any way (which is why they can use lots of publicly available data), with an automatic process to generate inputs and labels from those texts. More precisely, they were pretrained with two objectives: - Masked language modeling (MLM): taking a sentence, the model randomly masks 15% of the words in the input, then runs the entire masked sentence through the model and has to predict the masked words. This is different from traditional recurrent neural networks (RNNs) that usually see the words one after the other, or from autoregressive models like GPT which internally mask the future tokens. It allows the model to learn a bidirectional representation of the sentence. - Next sentence prediction (NSP): the model concatenates two masked sentences as inputs during pretraining. Sometimes they correspond to sentences that were next to each other in the original text, sometimes not. The model then has to predict whether the two sentences were following each other or not. This way, the model learns an inner representation of the English language that can then be used to extract features useful for downstream tasks: if you have a dataset of labeled sentences, for instance, you can train a standard classifier using the features produced by the MultiBERTs model as inputs. ## Intended uses & limitations You can use the raw model for either masked language modeling or next sentence prediction, but it's mostly intended to be fine-tuned on a downstream task. See the [model hub](https://huggingface.co/models?filter=multiberts) to look for fine-tuned versions on a task that interests you. Note that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked) to make decisions, such as sequence classification, token classification or question answering. For tasks such as text generation you should look at models like GPT-2. ### How to use Here is how to use this model to get the features of a given text in PyTorch: ```python from transformers import BertTokenizer, BertModel tokenizer = BertTokenizer.from_pretrained('multiberts-seed-0') model = BertModel.from_pretrained("multiberts-seed-0") text = "Replace me with any text you'd like." encoded_input = tokenizer(text, return_tensors='pt') output = model(**encoded_input) ``` ### Limitations and bias Even if the training data used for this model could be characterized as fairly neutral, this model can have biased predictions. This bias will also affect all fine-tuned versions of this model. 
For an understanding of the bias of this particular checkpoint, please try out this checkpoint with the snippet present in the [Limitation and bias section](https://huggingface.co/bert-base-uncased#limitations-and-bias) of the [bert-base-uncased](https://huggingface.co/bert-base-uncased) checkpoint. ## Training data The MultiBERTs models were pretrained on [BookCorpus](https://yknzhu.wixsite.com/mbweb), a dataset consisting of 11,038 unpublished books and [English Wikipedia](https://en.wikipedia.org/wiki/English_Wikipedia) (excluding lists, tables and headers). ## Training procedure ### Preprocessing The texts are lowercased and tokenized using WordPiece and a vocabulary size of 30,000. The inputs of the model are then of the form: ``` [CLS] Sentence A [SEP] Sentence B [SEP] ``` With probability 0.5, sentence A and sentence B correspond to two consecutive sentences in the original corpus, and in the other cases, it's another random sentence in the corpus. Note that what is considered a sentence here is a consecutive span of text usually longer than a single sentence. The only constraint is that the two "sentences" have a combined length of less than 512 tokens. The details of the masking procedure for each sentence are the following: - 15% of the tokens are masked. - In 80% of the cases, the masked tokens are replaced by `[MASK]`. - In 10% of the cases, the masked tokens are replaced by a random token (different from the one they replace). - In the 10% remaining cases, the masked tokens are left as is. ### Pretraining The model was trained on 16 Cloud TPU v2 chips for two million steps with a batch size of 256. The sequence length was set to 512 throughout. The optimizer used is Adam with a learning rate of 1e-4, \\(\beta_{1} = 0.9\\) and \\(\beta_{2} = 0.999\\), a weight decay of 0.01, learning rate warmup for 10,000 steps and linear decay of the learning rate after. ### BibTeX entry and citation info ```bibtex @article{DBLP:journals/corr/abs-2106-16163, author = {Thibault Sellam and Steve Yadlowsky and Jason Wei and Naomi Saphra and Alexander D'Amour and Tal Linzen and Jasmijn Bastings and Iulia Turc and Jacob Eisenstein and Dipanjan Das and Ian Tenney and Ellie Pavlick}, title = {The MultiBERTs: {BERT} Reproductions for Robustness Analysis}, journal = {CoRR}, volume = {abs/2106.16163}, year = {2021}, url = {https://arxiv.org/abs/2106.16163}, eprinttype = {arXiv}, eprint = {2106.16163}, timestamp = {Mon, 05 Jul 2021 15:15:50 +0200}, biburl = {https://dblp.org/rec/journals/corr/abs-2106-16163.bib}, bibsource = {dblp computer science bibliography, https://dblp.org} } ``` <a href="https://huggingface.co/exbert/?model=multiberts"> <img width="300px" src="https://cdn-media.huggingface.co/exbert/button.png"> </a>
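The pretraining schedule described above (an Adam-style optimizer with weight decay, a peak learning rate of 1e-4, 10,000 warmup steps and linear decay over two million steps) maps directly onto the warmup-then-linear-decay helper in `transformers`. A minimal sketch, assuming `torch.optim.AdamW` as a stand-in for the original optimizer and eliding the actual training loop; this is not the original 16-chip TPU setup.

```python
import torch
from transformers import BertForPreTraining, get_linear_schedule_with_warmup

# Reconstruct the published schedule: peak LR 1e-4, betas (0.9, 0.999),
# weight decay 0.01, 10,000 warmup steps, linear decay over 2,000,000 steps.
# BertForPreTraining carries both the MLM and NSP heads used during pretraining.
model = BertForPreTraining.from_pretrained("bert-base-uncased")  # illustrative checkpoint

optimizer = torch.optim.AdamW(
    model.parameters(), lr=1e-4, betas=(0.9, 0.999), weight_decay=0.01
)
scheduler = get_linear_schedule_with_warmup(
    optimizer, num_warmup_steps=10_000, num_training_steps=2_000_000
)

for step in range(3):  # stand-in for the real two-million-step loop
    # ... forward/backward pass on a batch of 256 sequences of length 512 ...
    optimizer.step()
    scheduler.step()
    optimizer.zero_grad()
```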
{"language": "en", "license": "apache-2.0", "tags": ["exbert", "multiberts"], "datasets": ["bookcorpus", "wikipedia"]}
null
MultiBertGunjanPatrick/multiberts-seed-5
[ "transformers", "pytorch", "bert", "pretraining", "exbert", "multiberts", "en", "dataset:bookcorpus", "dataset:wikipedia", "arxiv:2106.16163", "license:apache-2.0", "endpoints_compatible", "region:us" ]
2022-03-02T23:29:04+00:00
[ "2106.16163" ]
[ "en" ]
TAGS #transformers #pytorch #bert #pretraining #exbert #multiberts #en #dataset-bookcorpus #dataset-wikipedia #arxiv-2106.16163 #license-apache-2.0 #endpoints_compatible #region-us
# MultiBERTs Seed 0 (uncased) Seed 0 MultiBERTs (pretrained BERT) model on English language using a masked language modeling (MLM) objective. It was introduced in this paper and first released in this repository. This model is uncased: it does not make a difference between english and English. Disclaimer: The team releasing MultiBERTs did not write a model card for this model so this model card has been written by gchhablani. ## Model description MultiBERTs models are transformers model pretrained on a large corpus of English data in a self-supervised fashion. This means it was pretrained on the raw texts only, with no humans labelling them in any way (which is why it can use lots of publicly available data) with an automatic process to generate inputs and labels from those texts. More precisely, it was pretrained with two objectives: - Masked language modeling (MLM): taking a sentence, the model randomly masks 15% of the words in the input then run the entire masked sentence through the model and has to predict the masked words. This is different from traditional recurrent neural networks (RNNs) that usually see the words one after the other, or from autoregressive models like GPT which internally mask the future tokens. It allows the model to learn a bidirectional representation of the sentence. - Next sentence prediction (NSP): the models concatenates two masked sentences as inputs during pretraining. Sometimes they correspond to sentences that were next to each other in the original text, sometimes not. The model then has to predict if the two sentences were following each other or not. This way, the model learns an inner representation of the English language that can then be used to extract features useful for downstream tasks: if you have a dataset of labeled sentences for instance, you can train a standard classifier using the features produced by the MultiBERTs model as inputs. ## Intended uses & limitations You can use the raw model for either masked language modeling or next sentence prediction, but it's mostly intended to be fine-tuned on a downstream task. See the model hub to look for fine-tuned versions on a task that interests you. Note that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked) to make decisions, such as sequence classification, token classification or question answering. For tasks such as text generation you should look at model like GPT2. ### How to use Here is how to use this model to get the features of a given text in PyTorch: ### Limitations and bias Even if the training data used for this model could be characterized as fairly neutral, this model can have biased predictions. This bias will also affect all fine-tuned versions of this model. For an understanding of bias of this particular checkpoint, please try out this checkpoint with the snippet present in the Limitation and bias section of the bert-base-uncased checkpoint. ## Training data The MultiBERTs models were pretrained on BookCorpus, a dataset consisting of 11,038 unpublished books and English Wikipedia (excluding lists, tables and headers). ## Training procedure ### Preprocessing The texts are lowercased and tokenized using WordPiece and a vocabulary size of 30,000. The inputs of the model are then of the form: With probability 0.5, sentence A and sentence B correspond to two consecutive sentences in the original corpus and in the other cases, it's another random sentence in the corpus. 
Note that what is considered a sentence here is a consecutive span of text usually longer than a single sentence. The only constrain is that the result with the two "sentences" has a combined length of less than 512 tokens. The details of the masking procedure for each sentence are the following: - 15% of the tokens are masked. - In 80% of the cases, the masked tokens are replaced by '[MASK]'. - In 10% of the cases, the masked tokens are replaced by a random token (different) from the one they replace. - In the 10% remaining cases, the masked tokens are left as is. ### Pretraining The model was trained on 16 Cloud TPU v2 chips for two million steps with a batch size of 256. The sequence length was set to 512 throughout. The optimizer used is Adam with a learning rate of 1e-4, \\(\beta_{1} = 0.9\\) and \\(\beta_{2} = 0.999\\), a weight decay of 0.01, learning rate warmup for 10,000 steps and linear decay of the learning rate after. ### BibTeX entry and citation info <a href="URL <img width="300px" src="URL </a>
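The next-sentence-prediction objective described in the card above can also be probed directly through the dedicated head. A small sketch, assuming the `bert-base-uncased` checkpoint (which ships an NSP head) and two invented sentences; a MultiBERTs pretraining checkpoint could be substituted where available.

```python
import torch
from transformers import BertTokenizer, BertForNextSentencePrediction

# Score whether Sentence B plausibly follows Sentence A, as in the NSP objective.
tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertForNextSentencePrediction.from_pretrained("bert-base-uncased")

sentence_a = "The children walked to the park."
sentence_b = "They played on the swings for an hour."

inputs = tokenizer(sentence_a, sentence_b, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits  # shape (1, 2): index 0 = "is the next sentence"

probs = torch.softmax(logits, dim=-1)
print(f"P(sentence B follows sentence A) = {probs[0, 0].item():.3f}")
```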
[ "# MultiBERTs Seed 0 (uncased)\n\nSeed 0 MultiBERTs (pretrained BERT) model on English language using a masked language modeling (MLM) objective. It was introduced in\nthis paper and first released in\nthis repository. This model is uncased: it does not make a difference\nbetween english and English.\n\nDisclaimer: The team releasing MultiBERTs did not write a model card for this model so this model card has been written by gchhablani.", "## Model description\n\nMultiBERTs models are transformers model pretrained on a large corpus of English data in a self-supervised fashion. This means it\nwas pretrained on the raw texts only, with no humans labelling them in any way (which is why it can use lots of\npublicly available data) with an automatic process to generate inputs and labels from those texts. More precisely, it\nwas pretrained with two objectives:\n\n- Masked language modeling (MLM): taking a sentence, the model randomly masks 15% of the words in the input then run\n the entire masked sentence through the model and has to predict the masked words. This is different from traditional\n recurrent neural networks (RNNs) that usually see the words one after the other, or from autoregressive models like\n GPT which internally mask the future tokens. It allows the model to learn a bidirectional representation of the\n sentence.\n- Next sentence prediction (NSP): the models concatenates two masked sentences as inputs during pretraining. Sometimes\n they correspond to sentences that were next to each other in the original text, sometimes not. The model then has to\n predict if the two sentences were following each other or not.\n This way, the model learns an inner representation of the English language that can then be used to extract features\n useful for downstream tasks: if you have a dataset of labeled sentences for instance, you can train a standard\n classifier using the features produced by the MultiBERTs model as inputs.", "## Intended uses & limitations\n\nYou can use the raw model for either masked language modeling or next sentence prediction, but it's mostly intended to\nbe fine-tuned on a downstream task. See the model hub to look for\nfine-tuned versions on a task that interests you.\nNote that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked)\nto make decisions, such as sequence classification, token classification or question answering. For tasks such as text\ngeneration you should look at model like GPT2.", "### How to use\n\nHere is how to use this model to get the features of a given text in PyTorch:", "### Limitations and bias\n\nEven if the training data used for this model could be characterized as fairly neutral, this model can have biased\npredictions. This bias will also affect all fine-tuned versions of this model. For an understanding of bias of this particular\ncheckpoint, please try out this checkpoint with the snippet present in the Limitation and bias section of the bert-base-uncased checkpoint.", "## Training data\n\nThe MultiBERTs models were pretrained on BookCorpus, a dataset consisting of 11,038\nunpublished books and English Wikipedia (excluding lists, tables and\nheaders).", "## Training procedure", "### Preprocessing\n\nThe texts are lowercased and tokenized using WordPiece and a vocabulary size of 30,000. 
The inputs of the model are\nthen of the form:\n\n\n\nWith probability 0.5, sentence A and sentence B correspond to two consecutive sentences in the original corpus and in\nthe other cases, it's another random sentence in the corpus. Note that what is considered a sentence here is a\nconsecutive span of text usually longer than a single sentence. The only constrain is that the result with the two\n\"sentences\" has a combined length of less than 512 tokens.\nThe details of the masking procedure for each sentence are the following:\n\n- 15% of the tokens are masked.\n- In 80% of the cases, the masked tokens are replaced by '[MASK]'.\n- In 10% of the cases, the masked tokens are replaced by a random token (different) from the one they replace.\n- In the 10% remaining cases, the masked tokens are left as is.", "### Pretraining\n\nThe model was trained on 16 Cloud TPU v2 chips for two million steps with a batch size\nof 256. The sequence length was set to 512 throughout. The optimizer\nused is Adam with a learning rate of 1e-4, \\\\(\\beta_{1} = 0.9\\\\) and \\\\(\\beta_{2} = 0.999\\\\), a weight decay of 0.01,\nlearning rate warmup for 10,000 steps and linear decay of the learning rate after.", "### BibTeX entry and citation info\n\n\n\n<a href=\"URL\n\t<img width=\"300px\" src=\"URL\n</a>" ]
[ "TAGS\n#transformers #pytorch #bert #pretraining #exbert #multiberts #en #dataset-bookcorpus #dataset-wikipedia #arxiv-2106.16163 #license-apache-2.0 #endpoints_compatible #region-us \n", "# MultiBERTs Seed 0 (uncased)\n\nSeed 0 MultiBERTs (pretrained BERT) model on English language using a masked language modeling (MLM) objective. It was introduced in\nthis paper and first released in\nthis repository. This model is uncased: it does not make a difference\nbetween english and English.\n\nDisclaimer: The team releasing MultiBERTs did not write a model card for this model so this model card has been written by gchhablani.", "## Model description\n\nMultiBERTs models are transformers model pretrained on a large corpus of English data in a self-supervised fashion. This means it\nwas pretrained on the raw texts only, with no humans labelling them in any way (which is why it can use lots of\npublicly available data) with an automatic process to generate inputs and labels from those texts. More precisely, it\nwas pretrained with two objectives:\n\n- Masked language modeling (MLM): taking a sentence, the model randomly masks 15% of the words in the input then run\n the entire masked sentence through the model and has to predict the masked words. This is different from traditional\n recurrent neural networks (RNNs) that usually see the words one after the other, or from autoregressive models like\n GPT which internally mask the future tokens. It allows the model to learn a bidirectional representation of the\n sentence.\n- Next sentence prediction (NSP): the models concatenates two masked sentences as inputs during pretraining. Sometimes\n they correspond to sentences that were next to each other in the original text, sometimes not. The model then has to\n predict if the two sentences were following each other or not.\n This way, the model learns an inner representation of the English language that can then be used to extract features\n useful for downstream tasks: if you have a dataset of labeled sentences for instance, you can train a standard\n classifier using the features produced by the MultiBERTs model as inputs.", "## Intended uses & limitations\n\nYou can use the raw model for either masked language modeling or next sentence prediction, but it's mostly intended to\nbe fine-tuned on a downstream task. See the model hub to look for\nfine-tuned versions on a task that interests you.\nNote that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked)\nto make decisions, such as sequence classification, token classification or question answering. For tasks such as text\ngeneration you should look at model like GPT2.", "### How to use\n\nHere is how to use this model to get the features of a given text in PyTorch:", "### Limitations and bias\n\nEven if the training data used for this model could be characterized as fairly neutral, this model can have biased\npredictions. This bias will also affect all fine-tuned versions of this model. For an understanding of bias of this particular\ncheckpoint, please try out this checkpoint with the snippet present in the Limitation and bias section of the bert-base-uncased checkpoint.", "## Training data\n\nThe MultiBERTs models were pretrained on BookCorpus, a dataset consisting of 11,038\nunpublished books and English Wikipedia (excluding lists, tables and\nheaders).", "## Training procedure", "### Preprocessing\n\nThe texts are lowercased and tokenized using WordPiece and a vocabulary size of 30,000. 
The inputs of the model are\nthen of the form:\n\n\n\nWith probability 0.5, sentence A and sentence B correspond to two consecutive sentences in the original corpus and in\nthe other cases, it's another random sentence in the corpus. Note that what is considered a sentence here is a\nconsecutive span of text usually longer than a single sentence. The only constrain is that the result with the two\n\"sentences\" has a combined length of less than 512 tokens.\nThe details of the masking procedure for each sentence are the following:\n\n- 15% of the tokens are masked.\n- In 80% of the cases, the masked tokens are replaced by '[MASK]'.\n- In 10% of the cases, the masked tokens are replaced by a random token (different) from the one they replace.\n- In the 10% remaining cases, the masked tokens are left as is.", "### Pretraining\n\nThe model was trained on 16 Cloud TPU v2 chips for two million steps with a batch size\nof 256. The sequence length was set to 512 throughout. The optimizer\nused is Adam with a learning rate of 1e-4, \\\\(\\beta_{1} = 0.9\\\\) and \\\\(\\beta_{2} = 0.999\\\\), a weight decay of 0.01,\nlearning rate warmup for 10,000 steps and linear decay of the learning rate after.", "### BibTeX entry and citation info\n\n\n\n<a href=\"URL\n\t<img width=\"300px\" src=\"URL\n</a>" ]
[ 63, 111, 335, 134, 25, 95, 48, 3, 222, 110, 34 ]
[ "passage: TAGS\n#transformers #pytorch #bert #pretraining #exbert #multiberts #en #dataset-bookcorpus #dataset-wikipedia #arxiv-2106.16163 #license-apache-2.0 #endpoints_compatible #region-us \n# MultiBERTs Seed 0 (uncased)\n\nSeed 0 MultiBERTs (pretrained BERT) model on English language using a masked language modeling (MLM) objective. It was introduced in\nthis paper and first released in\nthis repository. This model is uncased: it does not make a difference\nbetween english and English.\n\nDisclaimer: The team releasing MultiBERTs did not write a model card for this model so this model card has been written by gchhablani.", "passage: ## Model description\n\nMultiBERTs models are transformers model pretrained on a large corpus of English data in a self-supervised fashion. This means it\nwas pretrained on the raw texts only, with no humans labelling them in any way (which is why it can use lots of\npublicly available data) with an automatic process to generate inputs and labels from those texts. More precisely, it\nwas pretrained with two objectives:\n\n- Masked language modeling (MLM): taking a sentence, the model randomly masks 15% of the words in the input then run\n the entire masked sentence through the model and has to predict the masked words. This is different from traditional\n recurrent neural networks (RNNs) that usually see the words one after the other, or from autoregressive models like\n GPT which internally mask the future tokens. It allows the model to learn a bidirectional representation of the\n sentence.\n- Next sentence prediction (NSP): the models concatenates two masked sentences as inputs during pretraining. Sometimes\n they correspond to sentences that were next to each other in the original text, sometimes not. The model then has to\n predict if the two sentences were following each other or not.\n This way, the model learns an inner representation of the English language that can then be used to extract features\n useful for downstream tasks: if you have a dataset of labeled sentences for instance, you can train a standard\n classifier using the features produced by the MultiBERTs model as inputs.## Intended uses & limitations\n\nYou can use the raw model for either masked language modeling or next sentence prediction, but it's mostly intended to\nbe fine-tuned on a downstream task. See the model hub to look for\nfine-tuned versions on a task that interests you.\nNote that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked)\nto make decisions, such as sequence classification, token classification or question answering. For tasks such as text\ngeneration you should look at model like GPT2.### How to use\n\nHere is how to use this model to get the features of a given text in PyTorch:### Limitations and bias\n\nEven if the training data used for this model could be characterized as fairly neutral, this model can have biased\npredictions. This bias will also affect all fine-tuned versions of this model. For an understanding of bias of this particular\ncheckpoint, please try out this checkpoint with the snippet present in the Limitation and bias section of the bert-base-uncased checkpoint.## Training data\n\nThe MultiBERTs models were pretrained on BookCorpus, a dataset consisting of 11,038\nunpublished books and English Wikipedia (excluding lists, tables and\nheaders).## Training procedure" ]
[ -0.06825247406959534, 0.027449268847703934, -0.0021626802626997232, 0.09413602948188782, 0.07635393738746643, 0.026495488360524178, 0.15437674522399902, 0.029963307082653046, -0.03573239967226982, 0.021267801523208618, 0.10619504749774933, 0.03782356157898903, 0.03388210013508797, 0.035308390855789185, 0.066785529255867, -0.2578813433647156, 0.07567903399467468, -0.05793163925409317, 0.040864333510398865, 0.059090327471494675, 0.10602577030658722, -0.07069262117147446, 0.07895290851593018, 0.04403890669345856, -0.0756942480802536, -0.027663996443152428, -0.005503433756530285, -0.034674178808927536, 0.07060743123292923, 0.09438986331224442, 0.05877054110169411, -0.008264455944299698, 0.05975931137800217, -0.087635338306427, 0.019257638603448868, 0.024562222883105278, -0.007006383966654539, 0.036696210503578186, 0.025804642587900162, -0.009673221036791801, 0.11283443868160248, 0.02619457244873047, 0.08560121059417725, 0.04041407257318497, -0.08754345774650574, -0.09977805614471436, -0.0694802924990654, 0.09317219257354736, 0.02764834463596344, 0.04353900998830795, -0.0063711777329444885, 0.07313166558742523, -0.006663286592811346, 0.058924756944179535, 0.08212147653102875, -0.23674309253692627, -0.023082595318555832, 0.05118638277053833, 0.04846370965242386, 0.04278615117073059, 0.013536407612264156, 0.031959742307662964, 0.005570597946643829, 0.04724816232919693, 0.006345914676785469, -0.028150685131549835, 0.13924768567085266, -0.053803253918886185, -0.13665056228637695, -0.03023041971027851, 0.15811696648597717, 0.02479265071451664, -0.11351540684700012, -0.11277355998754501, 0.0016996730118989944, 0.1693311333656311, -0.0019645756110548973, -0.007584595121443272, -0.009904063306748867, -0.0030730916187167168, 0.024124154821038246, -0.1230793297290802, -0.08302900195121765, -0.02286745235323906, -0.06280194967985153, 0.15275688469409943, 0.047940537333488464, 0.07110750675201416, -0.06045709177851677, 0.04197261482477188, -0.14955590665340424, -0.036801956593990326, -0.04978496953845024, -0.09940676391124725, 0.017188318073749542, 0.02796531654894352, -0.044329117983579636, -0.11630523204803467, -0.03652356192469597, 0.0725361704826355, 0.038227953016757965, 0.03685189411044121, -0.005693042650818825, 0.029456961899995804, 0.10580474138259888, 0.10501816868782043, -0.0562795028090477, 0.07449519634246826, 0.020974641665816307, -0.020636841654777527, 0.03971032053232193, -0.05628065764904022, -0.12330584228038788, 0.0744452103972435, -0.034096408635377884, 0.018313465639948845, 0.023749854415655136, 0.04198585823178291, -0.012982374057173729, -0.0767536610364914, 0.14133483171463013, -0.09305756539106369, 0.0004417812451720238, -0.0035654937382787466, 0.016869794577360153, 0.08157093822956085, 0.02621583268046379, 0.0021266003604978323, -0.059168532490730286, -0.03080003336071968, -0.06315429508686066, -0.027340907603502274, -0.06021827086806297, -0.13162744045257568, 0.0013580089434981346, -0.020953699946403503, -0.014699130319058895, -0.10742536187171936, -0.17884144186973572, -0.01402769424021244, 0.07123412191867828, -0.014155296608805656, 0.011412929743528366, -0.0021266068797558546, 0.012132527306675911, -0.004981525242328644, 0.032173626124858856, -0.03745890408754349, 0.00908223818987608, -0.012201073579490185, -0.06731266528367996, 0.039806246757507324, -0.12071730941534042, 0.04209677502512932, -0.05578748881816864, 0.011489223688840866, -0.19638846814632416, 0.10738702118396759, -0.02783583477139473, -0.04278886318206787, -0.04810495674610138, -0.05834455043077469, 
0.0188974030315876, 0.045517146587371826, -0.015527524054050446, 0.10550028085708618, -0.12357760965824127, -0.0512409433722496, 0.15865573287010193, -0.1566506326198578, 0.016810515895485878, 0.10513904690742493, -0.06748288869857788, 0.042335763573646545, 0.14426475763320923, 0.07841357588768005, 0.07015632092952728, -0.04069618880748749, 0.017828572541475296, 0.060336943715810776, -0.0458533950150013, 0.0799841359257698, 0.10583654791116714, -0.015437023714184761, -0.13057377934455872, 0.030710875988006592, -0.06833602488040924, -0.03600694239139557, -0.022659340873360634, -0.024447504431009293, 0.014145502820611, -0.052795182913541794, 0.05715940147638321, -0.010484781116247177, 0.006331292912364006, -0.0232611745595932, -0.07422537356615067, 0.07731874287128448, 0.07671873271465302, -0.08619971573352814, 0.018436623737215996, -0.0909656435251236, 0.03130660206079483, -0.06597552448511124, -0.005088436417281628, -0.14390107989311218, -0.04274594411253929, 0.031965915113687515, -0.0805630162358284, 0.09851419925689697, 0.11271710693836212, 0.008409101516008377, 0.11310183256864548, -0.04617488384246826, 0.02628052979707718, -0.012368079274892807, -0.006386269349604845, -0.044110074639320374, -0.14293555915355682, -0.06652771681547165, -0.06382939964532852, 0.0834670290350914, -0.059091683477163315, 0.020797124132514, -0.08205804973840714, -0.041816260665655136, -0.0250774584710598, -0.04668354615569115, 0.005325498059391975, 0.00811565201729536, -0.013542650267481804, -0.030526084825396538, 0.04050645977258682, 0.027077049016952515, -0.0918835997581482, 0.08847370743751526, -0.1236613318324089, -0.0576145313680172, 0.06846176087856293, -0.0069316960871219635, -0.04083865508437157, 0.09554298222064972, 0.011831864714622498, -0.01123481709510088, -0.057707928121089935, -0.04657518118619919, 0.22045092284679413, -0.020844273269176483, 0.08364406228065491, -0.11240328848361969, 0.004931592382490635, 0.03506753221154213, -0.06102532893419266, -0.05918964743614197, 0.07589934766292572, 0.038565460592508316, -0.2161455750465393, 0.024600330740213394, 0.07306224852800369, 0.061481211334466934, 0.1421050727367401, 0.02417578175663948, -0.02878376469016075, -0.06042608246207237, -0.017261460423469543, -0.012187670916318893, 0.05919060483574867, -0.04688645899295807, 0.0030246214009821415, 0.0510857030749321, 0.05463946610689163, 0.018327711150050163, -0.06600221991539001, 0.02497151307761669, 0.05208776146173477, -0.017216674983501434, -0.06310763210058212, -0.05255124717950821, -0.03947900980710983, 0.0736318975687027, 0.041184503585100174, 0.0495072677731514, 0.0537080317735672, -0.019612858071923256, -0.1381978541612625, 0.16529735922813416, -0.13489660620689392, -0.2240476906299591, -0.12759706377983093, -0.07904494553804398, -0.07838001847267151, 0.039492446929216385, 0.0373598076403141, -0.03468242287635803, -0.05113789439201355, -0.10579567402601242, 0.06591805815696716, -0.11658145487308502, -0.057194799184799194, 0.014129210263490677, -0.056258611381053925, -0.005652858875691891, -0.1268719583749771, -0.010539324954152107, -0.026957646012306213, -0.07912764698266983, 0.004068336449563503, -0.04539388418197632, 0.010077799670398235, 0.13516394793987274, 0.008290649391710758, -0.009709829464554787, -0.015056753531098366, 0.19663433730602264, 0.0314871110022068, 0.04356053099036217, 0.12803813815116882, -0.06543856859207153, 0.05768699571490288, 0.02060154639184475, 0.037481535226106644, -0.04913286864757538, -0.0007067807018756866, -0.027622418478131294, -0.11730992794036865, 
-0.207548126578331, -0.06663559377193451, 0.007457428611814976, 0.008368045091629028, 0.01904660277068615, 0.015689538791775703, 0.024972863495349884, 0.05414750799536705, -0.031031470745801926, 0.03179151564836502, 0.033982276916503906, 0.05688050761818886, 0.06225617602467537, -0.06120002269744873, 0.09507381916046143, -0.07100313901901245, 0.027307022362947464, 0.10875560343265533, -0.07062242925167084, 0.16170385479927063, 0.04285769164562225, 0.05423576757311821, 0.09659373760223389, 0.0006577670574188232, 0.0585428811609745, 0.10273323953151703, -0.06317441910505295, 0.019947808235883713, -0.07513642311096191, -0.05752627179026604, -0.04452991858124733, 0.060025766491889954, 0.037611961364746094, -0.000131998211145401, -0.10182826220989227, 0.03220826014876366, -0.036235980689525604, 0.07729616016149521, 0.06343917548656464, -0.10670174658298492, -0.10046673566102982, 0.045665811747312546, -0.04038289934396744, -0.08793723583221436, 0.03426353633403778, 0.08077984303236008, -0.14119762182235718, 0.06124391779303551, 0.018283551558852196, 0.07126335799694061, -0.09752818942070007, 0.01132874470204115, -0.06905651092529297, 0.016318362206220627, 0.005033754277974367, 0.0913831889629364, -0.1432204693555832, 0.10583388805389404, 0.02708813175559044, 0.04597454518079758, -0.09043684601783752, 0.01613154262304306, -0.01261853240430355, 0.07669144868850708, 0.12108297646045685, 0.04203776270151138, -0.05836430937051773, -0.018112843856215477, -0.06768153607845306, 0.034427788108587265, 0.07278922200202942, -0.04098799079656601, 0.038899462670087814, 0.0012810318730771542, 0.016169004142284393, -0.008310851640999317, 0.020610321313142776, -0.13600048422813416, -0.14560562372207642, 0.0705970749258995, -0.06633393466472626, -0.08288760483264923, -0.03709196671843529, -0.06633897125720978, -0.0868702232837677, 0.15359032154083252, -0.0773216113448143, -0.1108812615275383, -0.10497688502073288, 0.004697326570749283, 0.06842926889657974, -0.06570008397102356, 0.05184205248951912, -0.05175790935754776, 0.09120817482471466, -0.03778978809714317, -0.10993549227714539, 0.017024382948875427, -0.09169412404298782, -0.11230003088712692, -0.030281051993370056, 0.09025070071220398, 0.15063974261283875, 0.05137326568365097, 0.024738965556025505, 0.016462495550513268, 0.0016304273158311844, -0.12906411290168762, 0.004929570481181145, 0.143439382314682, 0.01773710548877716, 0.0976557806134224, -0.06279069185256958, -0.02821265161037445, -0.012585094198584557, -0.0009578559547662735, 0.13525930047035217, 0.1579957902431488, -0.06031216308474541, 0.15296214818954468, 0.227834090590477, -0.10105094313621521, -0.19415637850761414, -0.07397069036960602, 0.0032560182735323906, 0.04487091302871704, 0.045912403613328934, -0.19948574900627136, 0.09972882270812988, 0.04975741356611252, -0.013423530384898186, -0.03354128822684288, -0.18906579911708832, -0.1023210883140564, 0.1062556803226471, 0.06369950622320175, 0.19807088375091553, -0.06803785264492035, -0.04169449210166931, -0.04189038649201393, -0.05597612261772156, 0.09557583183050156, -0.011712346225976944, 0.0822327509522438, 0.01643332466483116, 0.014923296868801117, -0.0019287541508674622, -0.008046919479966164, 0.11012726277112961, 0.04542766511440277, 0.018416037783026695, -0.07320156693458557, -0.0423104427754879, 0.10889390110969543, -0.03202357143163681, 0.12254303693771362, 0.03122953698039055, 0.05849093571305275, -0.0764583870768547, -0.06015930324792862, -0.08313038945198059, 0.012603376060724258, -0.04008830338716507, -0.05228453874588013, 
-0.051481351256370544, 0.03643445670604706, 0.02559221349656582, 0.013383354060351849, -0.010037007741630077, -0.0581706240773201, 0.009901179000735283, 0.0659501925110817, 0.15930500626564026, -0.013111893087625504, -0.06732219457626343, -0.07006201148033142, -0.060269180685281754, 0.04847850278019905, -0.10283331573009491, 0.0321035273373127, 0.020586064085364342, -0.0036565132904797792, 0.11348927021026611, 0.03316955640912056, -0.11396678537130356, 0.013628019951283932, 0.005912423133850098, -0.09849600493907928, -0.1485224962234497, -0.016377072781324387, 0.05456313490867615, -0.0583408921957016, 0.03962210938334465, 0.1586087942123413, -0.02749052457511425, -0.033682480454444885, -0.05674935132265091, 0.032430585473775864, -0.034874096512794495, 0.03596019372344017, 0.08030854165554047, 0.016163216903805733, -0.08148041367530823, 0.06100435554981232, 0.04497561603784561, -0.01565445587038994, 0.06611718982458115, 0.01751827821135521, -0.07064318656921387, -0.08515681326389313, -0.06657058000564575, 0.11521587520837784, -0.04193677753210068, -0.06614658236503601, 0.0494990199804306, -0.10936599224805832, 0.06512928009033203, 0.09400998800992966, 0.03727183863520622, 0.046071093529462814, -0.08464010059833527, 0.006473809480667114, -0.037655625492334366, 0.03303447365760803, -0.03967699408531189, -0.03299032896757126, -0.04207788407802582, 0.02865336276590824, 0.0594131164252758, 0.09625885635614395, -0.03653799742460251, -0.07748300582170486, -0.08829360455274582, -0.013138281181454659, -0.10569687932729721, -0.006850461475551128, -0.06914658099412918, 0.00014194706454873085, 0.007000140380114317, -0.02822837233543396, 0.030307123437523842, 0.033606212586164474, -0.0512661337852478, -0.008813504129648209, -0.02892981842160225, 0.05861987918615341, -0.07071447372436523, 0.012725180014967918, 0.015199657529592514, -0.01911322958767414, 0.09222348034381866, 0.047224029898643494, -0.03322954475879669, 0.05148611217737198, -0.03994745388627052, 0.03518182411789894, -0.04691552743315697, 0.007639196235686541, -0.02100628986954689, -0.11349901556968689, -0.021261068060994148, 0.010819608345627785, -0.023444410413503647, 0.01614448055624962, 0.07291702181100845, -0.051247432827949524, 0.0827048048377037, 0.06047651544213295, -0.049000177532434464, -0.055763885378837585, 0.04004162549972534, 0.0009079426527023315, 0.017973260954022408, 0.0793890655040741, 0.0011681190226227045, 0.053140703588724136, -0.08328671008348465, 0.0013423850759863853, 0.0043635861948132515, -0.016782283782958984, -0.019065728411078453, -0.07158057391643524, -0.000623882282525301, 0.009545178152620792, 0.17526990175247192, -0.004971030168235302, -0.019934196025133133, 0.005758095532655716, 0.06719693541526794, 0.033424317836761475, 0.004426124505698681, 0.08463965356349945, -0.018342992290854454, -0.01793844997882843, -0.017587680369615555, 0.026691239327192307, -0.01080797053873539, 0.016537122428417206, 0.1315390020608902, 0.04961226135492325, 0.11255703866481781, 0.07479852437973022, 0.05499632656574249, 0.052345164120197296, -0.10784098505973816, -0.06925129890441895, 0.03605833277106285, 0.05536176264286041, -0.034931864589452744, 0.02555268630385399, 0.05937255546450615, -0.09513229876756668, 0.0820266455411911, 0.046595025807619095, -0.05803784728050232, -0.1295481026172638, -0.2191641926765442, -0.042123790830373764, -0.010218853130936623, -0.020777955651283264, -0.10785381495952606, 0.027329251170158386, 0.0930030569434166, 0.03945063054561615, -0.02234741672873497, 0.0657259151339531, -0.15022647380828857, 
-0.03686198964715004, 0.03966449946165085, -0.014821960590779781, 0.022462747991085052, 0.048782214522361755, 0.01900356635451317, 0.014281739480793476, 0.0744381994009018, 0.051359422504901886, 0.043146438896656036, 0.054591625928878784, 0.02954341098666191, -0.04896369203925133, -0.08800899237394333, -0.04467042535543442, 0.0032379510812461376, 0.058675315231084824, 0.12987293303012848, 0.010792074725031853, -0.06998851895332336, 0.0024203723296523094, 0.06055322289466858, -0.01847190037369728, -0.08398778736591339, -0.11259135603904724, 0.21841737627983093, -0.022776726633310318, 0.011702751740813255, -0.0013669170439243317, -0.03545460104942322, 0.020076904445886612, 0.20618940889835358, 0.26152077317237854, -0.02222667820751667, -0.01586262136697769, 0.010568449273705482, 0.0001846584491431713, 0.03695659339427948, 0.12577201426029205, -0.02777884714305401, 0.22359472513198853, -0.046777449548244476, 0.06737222522497177, -0.05537553131580353, -0.014299402013421059, -0.07450424134731293, 0.061424657702445984, -0.001578204333782196, -0.01836337149143219, -0.014155775308609009, 0.06984956562519073, -0.04071302339434624, -0.12650424242019653, -0.029551919549703598, 0.005514103919267654, -0.058359190821647644, 0.011046874336898327, 0.0020564431324601173, 0.03376493230462074, 0.07748642563819885, -0.01588892936706543, -0.0020990539342164993, 0.13050198554992676, 0.01098928228020668, -0.10912102460861206, -0.037600722163915634, 0.12838557362556458, 0.018519911915063858, 0.1340782791376114, 0.04876743629574776, 0.08712469041347504, 0.07130827009677887, 0.015149479731917381, -0.06677284836769104, 0.03636588156223297, -0.028407320380210876, 0.019770564511418343, 0.004539488349109888, 0.10587862133979797, -0.010519773699343204, 0.07475674152374268, 0.016607699915766716, -0.0808752030134201, 0.05683104693889618, 0.008673112839460373, -0.07627810537815094, -0.03255736455321312, 0.1042289137840271, -0.11158230900764465, 0.14271792769432068, 0.13774631917476654, -0.005030146799981594, -0.07176224142313004, -0.012138426303863525, 0.027100618928670883, -0.008060954511165619, 0.04774492606520653, -0.029893167316913605, -0.13074781000614166, 0.00018004095181822777, -0.09478544443845749, 0.04576292634010315, -0.24173954129219055, -0.06664414703845978, 0.016213994473218918, -0.000884735956788063, -0.028645452111959457, 0.030585195869207382, 0.061639197170734406, -0.0040400829166173935, -0.03497268259525299, 0.029452037066221237, -0.028589975088834763, 0.03562405705451965, -0.07439378648996353, -0.0681467354297638 ]
null
null
transformers
# MultiBERTs Seed 0 (uncased) Seed 0 MultiBERTs (pretrained BERT) model on English language using a masked language modeling (MLM) objective. It was introduced in [this paper](https://arxiv.org/pdf/2106.16163.pdf) and first released in [this repository](https://github.com/google-research/language/tree/master/language/multiberts). This model is uncased: it does not make a difference between english and English. Disclaimer: The team releasing MultiBERTs did not write a model card for this model so this model card has been written by [gchhablani](https://hf.co/gchhablani). ## Model description MultiBERTs models are transformers model pretrained on a large corpus of English data in a self-supervised fashion. This means it was pretrained on the raw texts only, with no humans labelling them in any way (which is why it can use lots of publicly available data) with an automatic process to generate inputs and labels from those texts. More precisely, it was pretrained with two objectives: - Masked language modeling (MLM): taking a sentence, the model randomly masks 15% of the words in the input then run the entire masked sentence through the model and has to predict the masked words. This is different from traditional recurrent neural networks (RNNs) that usually see the words one after the other, or from autoregressive models like GPT which internally mask the future tokens. It allows the model to learn a bidirectional representation of the sentence. - Next sentence prediction (NSP): the models concatenates two masked sentences as inputs during pretraining. Sometimes they correspond to sentences that were next to each other in the original text, sometimes not. The model then has to predict if the two sentences were following each other or not. This way, the model learns an inner representation of the English language that can then be used to extract features useful for downstream tasks: if you have a dataset of labeled sentences for instance, you can train a standard classifier using the features produced by the MultiBERTs model as inputs. ## Intended uses & limitations You can use the raw model for either masked language modeling or next sentence prediction, but it's mostly intended to be fine-tuned on a downstream task. See the [model hub](https://huggingface.co/models?filter=multiberts) to look for fine-tuned versions on a task that interests you. Note that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked) to make decisions, such as sequence classification, token classification or question answering. For tasks such as text generation you should look at model like GPT2. ### How to use Here is how to use this model to get the features of a given text in PyTorch: ```python from transformers import BertTokenizer, BertModel tokenizer = BertTokenizer.from_pretrained('multiberts-seed-0') model = BertModel.from_pretrained("multiberts-seed-0") text = "Replace me by any text you'd like." encoded_input = tokenizer(text, return_tensors='pt') output = model(**encoded_input) ``` ### Limitations and bias Even if the training data used for this model could be characterized as fairly neutral, this model can have biased predictions. This bias will also affect all fine-tuned versions of this model. 
For an understanding of bias of this particular checkpoint, please try out this checkpoint with the snippet present in the [Limitation and bias section](https://huggingface.co/bert-base-uncased#limitations-and-bias) of the [bert-base-uncased](https://huggingface.co/bert-base-uncased) checkpoint. ## Training data The MultiBERTs models were pretrained on [BookCorpus](https://yknzhu.wixsite.com/mbweb), a dataset consisting of 11,038 unpublished books and [English Wikipedia](https://en.wikipedia.org/wiki/English_Wikipedia) (excluding lists, tables and headers). ## Training procedure ### Preprocessing The texts are lowercased and tokenized using WordPiece and a vocabulary size of 30,000. The inputs of the model are then of the form: ``` [CLS] Sentence A [SEP] Sentence B [SEP] ``` With probability 0.5, sentence A and sentence B correspond to two consecutive sentences in the original corpus, and in the other cases sentence B is a random sentence from the corpus. Note that what is considered a sentence here is a consecutive span of text usually longer than a single sentence. The only constraint is that the result with the two "sentences" has a combined length of less than 512 tokens. The details of the masking procedure for each sentence are the following: - 15% of the tokens are masked. - In 80% of the cases, the masked tokens are replaced by `[MASK]`. - In 10% of the cases, the masked tokens are replaced by a random token (different from the one they replace). - In the remaining 10% of cases, the masked tokens are left as is. ### Pretraining The model was trained on 16 Cloud TPU v2 chips for two million steps with a batch size of 256. The sequence length was set to 512 throughout. The optimizer used is Adam with a learning rate of 1e-4, \\(\beta_{1} = 0.9\\) and \\(\beta_{2} = 0.999\\), a weight decay of 0.01, learning rate warmup for 10,000 steps and linear decay of the learning rate after. ### BibTeX entry and citation info ```bibtex @article{DBLP:journals/corr/abs-2106-16163, author = {Thibault Sellam and Steve Yadlowsky and Jason Wei and Naomi Saphra and Alexander D'Amour and Tal Linzen and Jasmijn Bastings and Iulia Turc and Jacob Eisenstein and Dipanjan Das and Ian Tenney and Ellie Pavlick}, title = {The MultiBERTs: {BERT} Reproductions for Robustness Analysis}, journal = {CoRR}, volume = {abs/2106.16163}, year = {2021}, url = {https://arxiv.org/abs/2106.16163}, eprinttype = {arXiv}, eprint = {2106.16163}, timestamp = {Mon, 05 Jul 2021 15:15:50 +0200}, biburl = {https://dblp.org/rec/journals/corr/abs-2106-16163.bib}, bibsource = {dblp computer science bibliography, https://dblp.org} } ``` <a href="https://huggingface.co/exbert/?model=multiberts"> <img width="300px" src="https://cdn-media.huggingface.co/exbert/button.png"> </a>
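The 15% / 80-10-10 masking rule described above can be illustrated with a short PyTorch sketch. This is not the original TensorFlow pretraining code; the helper name `mask_tokens`, the 1-D tensor handling, and the use of the `multiberts-seed-0` checkpoint (taken from the card's own snippet) are illustrative assumptions.

```python
import torch
from transformers import BertTokenizer

# Illustrative only: mirrors the 15% masking with the 80/10/10 replacement rule
# described in the Preprocessing section; the real pretraining used the BERT
# TensorFlow codebase, not this helper.
tokenizer = BertTokenizer.from_pretrained("multiberts-seed-0")

def mask_tokens(input_ids: torch.Tensor, mlm_probability: float = 0.15):
    """Apply the masking rule to a 1-D tensor of token ids (hypothetical helper)."""
    labels = input_ids.clone()

    # Pick 15% of positions, never touching special tokens such as [CLS] / [SEP].
    probs = torch.full(labels.shape, mlm_probability)
    special = torch.tensor(
        tokenizer.get_special_tokens_mask(labels.tolist(), already_has_special_tokens=True),
        dtype=torch.bool,
    )
    probs.masked_fill_(special, 0.0)
    masked = torch.bernoulli(probs).bool()
    labels[~masked] = -100  # loss is only computed on masked positions

    # 80% of the masked positions become [MASK].
    replaced = torch.bernoulli(torch.full(labels.shape, 0.8)).bool() & masked
    input_ids[replaced] = tokenizer.mask_token_id

    # Half of the remaining 20% (i.e. 10% overall) become a random token.
    randomized = torch.bernoulli(torch.full(labels.shape, 0.5)).bool() & masked & ~replaced
    random_ids = torch.randint(len(tokenizer), labels.shape, dtype=torch.long)
    input_ids[randomized] = random_ids[randomized]

    # The last 10% of masked positions are left unchanged.
    return input_ids, labels

encoded = tokenizer("Replace me by any text you'd like.", return_tensors="pt")
masked_ids, labels = mask_tokens(encoded["input_ids"][0])
```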
{"language": "en", "license": "apache-2.0", "tags": ["exbert", "multiberts"], "datasets": ["bookcorpus", "wikipedia"]}
null
MultiBertGunjanPatrick/multiberts-seed-6
[ "transformers", "pytorch", "bert", "pretraining", "exbert", "multiberts", "en", "dataset:bookcorpus", "dataset:wikipedia", "arxiv:2106.16163", "license:apache-2.0", "endpoints_compatible", "region:us" ]
2022-03-02T23:29:04+00:00
[ "2106.16163" ]
[ "en" ]
TAGS #transformers #pytorch #bert #pretraining #exbert #multiberts #en #dataset-bookcorpus #dataset-wikipedia #arxiv-2106.16163 #license-apache-2.0 #endpoints_compatible #region-us
# MultiBERTs Seed 0 (uncased) Seed 0 MultiBERTs (pretrained BERT) model on English language using a masked language modeling (MLM) objective. It was introduced in this paper and first released in this repository. This model is uncased: it does not make a difference between english and English. Disclaimer: The team releasing MultiBERTs did not write a model card for this model so this model card has been written by gchhablani. ## Model description MultiBERTs models are transformers model pretrained on a large corpus of English data in a self-supervised fashion. This means it was pretrained on the raw texts only, with no humans labelling them in any way (which is why it can use lots of publicly available data) with an automatic process to generate inputs and labels from those texts. More precisely, it was pretrained with two objectives: - Masked language modeling (MLM): taking a sentence, the model randomly masks 15% of the words in the input then run the entire masked sentence through the model and has to predict the masked words. This is different from traditional recurrent neural networks (RNNs) that usually see the words one after the other, or from autoregressive models like GPT which internally mask the future tokens. It allows the model to learn a bidirectional representation of the sentence. - Next sentence prediction (NSP): the models concatenates two masked sentences as inputs during pretraining. Sometimes they correspond to sentences that were next to each other in the original text, sometimes not. The model then has to predict if the two sentences were following each other or not. This way, the model learns an inner representation of the English language that can then be used to extract features useful for downstream tasks: if you have a dataset of labeled sentences for instance, you can train a standard classifier using the features produced by the MultiBERTs model as inputs. ## Intended uses & limitations You can use the raw model for either masked language modeling or next sentence prediction, but it's mostly intended to be fine-tuned on a downstream task. See the model hub to look for fine-tuned versions on a task that interests you. Note that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked) to make decisions, such as sequence classification, token classification or question answering. For tasks such as text generation you should look at model like GPT2. ### How to use Here is how to use this model to get the features of a given text in PyTorch: ### Limitations and bias Even if the training data used for this model could be characterized as fairly neutral, this model can have biased predictions. This bias will also affect all fine-tuned versions of this model. For an understanding of bias of this particular checkpoint, please try out this checkpoint with the snippet present in the Limitation and bias section of the bert-base-uncased checkpoint. ## Training data The MultiBERTs models were pretrained on BookCorpus, a dataset consisting of 11,038 unpublished books and English Wikipedia (excluding lists, tables and headers). ## Training procedure ### Preprocessing The texts are lowercased and tokenized using WordPiece and a vocabulary size of 30,000. The inputs of the model are then of the form: With probability 0.5, sentence A and sentence B correspond to two consecutive sentences in the original corpus and in the other cases, it's another random sentence in the corpus. 
Note that what is considered a sentence here is a consecutive span of text usually longer than a single sentence. The only constrain is that the result with the two "sentences" has a combined length of less than 512 tokens. The details of the masking procedure for each sentence are the following: - 15% of the tokens are masked. - In 80% of the cases, the masked tokens are replaced by '[MASK]'. - In 10% of the cases, the masked tokens are replaced by a random token (different) from the one they replace. - In the 10% remaining cases, the masked tokens are left as is. ### Pretraining The model was trained on 16 Cloud TPU v2 chips for two million steps with a batch size of 256. The sequence length was set to 512 throughout. The optimizer used is Adam with a learning rate of 1e-4, \\(\beta_{1} = 0.9\\) and \\(\beta_{2} = 0.999\\), a weight decay of 0.01, learning rate warmup for 10,000 steps and linear decay of the learning rate after. ### BibTeX entry and citation info <a href="URL <img width="300px" src="URL </a>
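The optimization recipe summarized above (Adam, learning rate 1e-4, betas 0.9/0.999, weight decay 0.01, 10,000 warmup steps, linear decay over two million steps) can be restated as a short PyTorch sketch. The original run used Cloud TPUs and the BERT training code, so the choice of `torch.optim.AdamW` and the `multiberts-seed-0` checkpoint here are approximations for illustration only.

```python
import torch
from transformers import BertModel, get_linear_schedule_with_warmup

# Sketch of the schedule described above; not the original TPU training script.
model = BertModel.from_pretrained("multiberts-seed-0")
optimizer = torch.optim.AdamW(
    model.parameters(), lr=1e-4, betas=(0.9, 0.999), weight_decay=0.01
)
scheduler = get_linear_schedule_with_warmup(
    optimizer, num_warmup_steps=10_000, num_training_steps=2_000_000
)

# Inside a training loop one would then call, per step:
#   loss.backward(); optimizer.step(); scheduler.step(); optimizer.zero_grad()
```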
[ "# MultiBERTs Seed 0 (uncased)\n\nSeed 0 MultiBERTs (pretrained BERT) model on English language using a masked language modeling (MLM) objective. It was introduced in\nthis paper and first released in\nthis repository. This model is uncased: it does not make a difference\nbetween english and English.\n\nDisclaimer: The team releasing MultiBERTs did not write a model card for this model so this model card has been written by gchhablani.", "## Model description\n\nMultiBERTs models are transformers model pretrained on a large corpus of English data in a self-supervised fashion. This means it\nwas pretrained on the raw texts only, with no humans labelling them in any way (which is why it can use lots of\npublicly available data) with an automatic process to generate inputs and labels from those texts. More precisely, it\nwas pretrained with two objectives:\n\n- Masked language modeling (MLM): taking a sentence, the model randomly masks 15% of the words in the input then run\n the entire masked sentence through the model and has to predict the masked words. This is different from traditional\n recurrent neural networks (RNNs) that usually see the words one after the other, or from autoregressive models like\n GPT which internally mask the future tokens. It allows the model to learn a bidirectional representation of the\n sentence.\n- Next sentence prediction (NSP): the models concatenates two masked sentences as inputs during pretraining. Sometimes\n they correspond to sentences that were next to each other in the original text, sometimes not. The model then has to\n predict if the two sentences were following each other or not.\n This way, the model learns an inner representation of the English language that can then be used to extract features\n useful for downstream tasks: if you have a dataset of labeled sentences for instance, you can train a standard\n classifier using the features produced by the MultiBERTs model as inputs.", "## Intended uses & limitations\n\nYou can use the raw model for either masked language modeling or next sentence prediction, but it's mostly intended to\nbe fine-tuned on a downstream task. See the model hub to look for\nfine-tuned versions on a task that interests you.\nNote that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked)\nto make decisions, such as sequence classification, token classification or question answering. For tasks such as text\ngeneration you should look at model like GPT2.", "### How to use\n\nHere is how to use this model to get the features of a given text in PyTorch:", "### Limitations and bias\n\nEven if the training data used for this model could be characterized as fairly neutral, this model can have biased\npredictions. This bias will also affect all fine-tuned versions of this model. For an understanding of bias of this particular\ncheckpoint, please try out this checkpoint with the snippet present in the Limitation and bias section of the bert-base-uncased checkpoint.", "## Training data\n\nThe MultiBERTs models were pretrained on BookCorpus, a dataset consisting of 11,038\nunpublished books and English Wikipedia (excluding lists, tables and\nheaders).", "## Training procedure", "### Preprocessing\n\nThe texts are lowercased and tokenized using WordPiece and a vocabulary size of 30,000. 
The inputs of the model are\nthen of the form:\n\n\n\nWith probability 0.5, sentence A and sentence B correspond to two consecutive sentences in the original corpus and in\nthe other cases, it's another random sentence in the corpus. Note that what is considered a sentence here is a\nconsecutive span of text usually longer than a single sentence. The only constrain is that the result with the two\n\"sentences\" has a combined length of less than 512 tokens.\nThe details of the masking procedure for each sentence are the following:\n\n- 15% of the tokens are masked.\n- In 80% of the cases, the masked tokens are replaced by '[MASK]'.\n- In 10% of the cases, the masked tokens are replaced by a random token (different) from the one they replace.\n- In the 10% remaining cases, the masked tokens are left as is.", "### Pretraining\n\nThe model was trained on 16 Cloud TPU v2 chips for two million steps with a batch size\nof 256. The sequence length was set to 512 throughout. The optimizer\nused is Adam with a learning rate of 1e-4, \\\\(\\beta_{1} = 0.9\\\\) and \\\\(\\beta_{2} = 0.999\\\\), a weight decay of 0.01,\nlearning rate warmup for 10,000 steps and linear decay of the learning rate after.", "### BibTeX entry and citation info\n\n\n\n<a href=\"URL\n\t<img width=\"300px\" src=\"URL\n</a>" ]
[ "TAGS\n#transformers #pytorch #bert #pretraining #exbert #multiberts #en #dataset-bookcorpus #dataset-wikipedia #arxiv-2106.16163 #license-apache-2.0 #endpoints_compatible #region-us \n", "# MultiBERTs Seed 0 (uncased)\n\nSeed 0 MultiBERTs (pretrained BERT) model on English language using a masked language modeling (MLM) objective. It was introduced in\nthis paper and first released in\nthis repository. This model is uncased: it does not make a difference\nbetween english and English.\n\nDisclaimer: The team releasing MultiBERTs did not write a model card for this model so this model card has been written by gchhablani.", "## Model description\n\nMultiBERTs models are transformers model pretrained on a large corpus of English data in a self-supervised fashion. This means it\nwas pretrained on the raw texts only, with no humans labelling them in any way (which is why it can use lots of\npublicly available data) with an automatic process to generate inputs and labels from those texts. More precisely, it\nwas pretrained with two objectives:\n\n- Masked language modeling (MLM): taking a sentence, the model randomly masks 15% of the words in the input then run\n the entire masked sentence through the model and has to predict the masked words. This is different from traditional\n recurrent neural networks (RNNs) that usually see the words one after the other, or from autoregressive models like\n GPT which internally mask the future tokens. It allows the model to learn a bidirectional representation of the\n sentence.\n- Next sentence prediction (NSP): the models concatenates two masked sentences as inputs during pretraining. Sometimes\n they correspond to sentences that were next to each other in the original text, sometimes not. The model then has to\n predict if the two sentences were following each other or not.\n This way, the model learns an inner representation of the English language that can then be used to extract features\n useful for downstream tasks: if you have a dataset of labeled sentences for instance, you can train a standard\n classifier using the features produced by the MultiBERTs model as inputs.", "## Intended uses & limitations\n\nYou can use the raw model for either masked language modeling or next sentence prediction, but it's mostly intended to\nbe fine-tuned on a downstream task. See the model hub to look for\nfine-tuned versions on a task that interests you.\nNote that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked)\nto make decisions, such as sequence classification, token classification or question answering. For tasks such as text\ngeneration you should look at model like GPT2.", "### How to use\n\nHere is how to use this model to get the features of a given text in PyTorch:", "### Limitations and bias\n\nEven if the training data used for this model could be characterized as fairly neutral, this model can have biased\npredictions. This bias will also affect all fine-tuned versions of this model. For an understanding of bias of this particular\ncheckpoint, please try out this checkpoint with the snippet present in the Limitation and bias section of the bert-base-uncased checkpoint.", "## Training data\n\nThe MultiBERTs models were pretrained on BookCorpus, a dataset consisting of 11,038\nunpublished books and English Wikipedia (excluding lists, tables and\nheaders).", "## Training procedure", "### Preprocessing\n\nThe texts are lowercased and tokenized using WordPiece and a vocabulary size of 30,000. 
The inputs of the model are\nthen of the form:\n\n\n\nWith probability 0.5, sentence A and sentence B correspond to two consecutive sentences in the original corpus and in\nthe other cases, it's another random sentence in the corpus. Note that what is considered a sentence here is a\nconsecutive span of text usually longer than a single sentence. The only constrain is that the result with the two\n\"sentences\" has a combined length of less than 512 tokens.\nThe details of the masking procedure for each sentence are the following:\n\n- 15% of the tokens are masked.\n- In 80% of the cases, the masked tokens are replaced by '[MASK]'.\n- In 10% of the cases, the masked tokens are replaced by a random token (different) from the one they replace.\n- In the 10% remaining cases, the masked tokens are left as is.", "### Pretraining\n\nThe model was trained on 16 Cloud TPU v2 chips for two million steps with a batch size\nof 256. The sequence length was set to 512 throughout. The optimizer\nused is Adam with a learning rate of 1e-4, \\\\(\\beta_{1} = 0.9\\\\) and \\\\(\\beta_{2} = 0.999\\\\), a weight decay of 0.01,\nlearning rate warmup for 10,000 steps and linear decay of the learning rate after.", "### BibTeX entry and citation info\n\n\n\n<a href=\"URL\n\t<img width=\"300px\" src=\"URL\n</a>" ]
[ 63, 111, 335, 134, 25, 95, 48, 3, 222, 110, 34 ]
[ "passage: TAGS\n#transformers #pytorch #bert #pretraining #exbert #multiberts #en #dataset-bookcorpus #dataset-wikipedia #arxiv-2106.16163 #license-apache-2.0 #endpoints_compatible #region-us \n# MultiBERTs Seed 0 (uncased)\n\nSeed 0 MultiBERTs (pretrained BERT) model on English language using a masked language modeling (MLM) objective. It was introduced in\nthis paper and first released in\nthis repository. This model is uncased: it does not make a difference\nbetween english and English.\n\nDisclaimer: The team releasing MultiBERTs did not write a model card for this model so this model card has been written by gchhablani.", "passage: ## Model description\n\nMultiBERTs models are transformers model pretrained on a large corpus of English data in a self-supervised fashion. This means it\nwas pretrained on the raw texts only, with no humans labelling them in any way (which is why it can use lots of\npublicly available data) with an automatic process to generate inputs and labels from those texts. More precisely, it\nwas pretrained with two objectives:\n\n- Masked language modeling (MLM): taking a sentence, the model randomly masks 15% of the words in the input then run\n the entire masked sentence through the model and has to predict the masked words. This is different from traditional\n recurrent neural networks (RNNs) that usually see the words one after the other, or from autoregressive models like\n GPT which internally mask the future tokens. It allows the model to learn a bidirectional representation of the\n sentence.\n- Next sentence prediction (NSP): the models concatenates two masked sentences as inputs during pretraining. Sometimes\n they correspond to sentences that were next to each other in the original text, sometimes not. The model then has to\n predict if the two sentences were following each other or not.\n This way, the model learns an inner representation of the English language that can then be used to extract features\n useful for downstream tasks: if you have a dataset of labeled sentences for instance, you can train a standard\n classifier using the features produced by the MultiBERTs model as inputs.## Intended uses & limitations\n\nYou can use the raw model for either masked language modeling or next sentence prediction, but it's mostly intended to\nbe fine-tuned on a downstream task. See the model hub to look for\nfine-tuned versions on a task that interests you.\nNote that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked)\nto make decisions, such as sequence classification, token classification or question answering. For tasks such as text\ngeneration you should look at model like GPT2.### How to use\n\nHere is how to use this model to get the features of a given text in PyTorch:### Limitations and bias\n\nEven if the training data used for this model could be characterized as fairly neutral, this model can have biased\npredictions. This bias will also affect all fine-tuned versions of this model. For an understanding of bias of this particular\ncheckpoint, please try out this checkpoint with the snippet present in the Limitation and bias section of the bert-base-uncased checkpoint.## Training data\n\nThe MultiBERTs models were pretrained on BookCorpus, a dataset consisting of 11,038\nunpublished books and English Wikipedia (excluding lists, tables and\nheaders).## Training procedure" ]
[ -0.06825247406959534, 0.027449268847703934, -0.0021626802626997232, 0.09413602948188782, 0.07635393738746643, 0.026495488360524178, 0.15437674522399902, 0.029963307082653046, -0.03573239967226982, 0.021267801523208618, 0.10619504749774933, 0.03782356157898903, 0.03388210013508797, 0.035308390855789185, 0.066785529255867, -0.2578813433647156, 0.07567903399467468, -0.05793163925409317, 0.040864333510398865, 0.059090327471494675, 0.10602577030658722, -0.07069262117147446, 0.07895290851593018, 0.04403890669345856, -0.0756942480802536, -0.027663996443152428, -0.005503433756530285, -0.034674178808927536, 0.07060743123292923, 0.09438986331224442, 0.05877054110169411, -0.008264455944299698, 0.05975931137800217, -0.087635338306427, 0.019257638603448868, 0.024562222883105278, -0.007006383966654539, 0.036696210503578186, 0.025804642587900162, -0.009673221036791801, 0.11283443868160248, 0.02619457244873047, 0.08560121059417725, 0.04041407257318497, -0.08754345774650574, -0.09977805614471436, -0.0694802924990654, 0.09317219257354736, 0.02764834463596344, 0.04353900998830795, -0.0063711777329444885, 0.07313166558742523, -0.006663286592811346, 0.058924756944179535, 0.08212147653102875, -0.23674309253692627, -0.023082595318555832, 0.05118638277053833, 0.04846370965242386, 0.04278615117073059, 0.013536407612264156, 0.031959742307662964, 0.005570597946643829, 0.04724816232919693, 0.006345914676785469, -0.028150685131549835, 0.13924768567085266, -0.053803253918886185, -0.13665056228637695, -0.03023041971027851, 0.15811696648597717, 0.02479265071451664, -0.11351540684700012, -0.11277355998754501, 0.0016996730118989944, 0.1693311333656311, -0.0019645756110548973, -0.007584595121443272, -0.009904063306748867, -0.0030730916187167168, 0.024124154821038246, -0.1230793297290802, -0.08302900195121765, -0.02286745235323906, -0.06280194967985153, 0.15275688469409943, 0.047940537333488464, 0.07110750675201416, -0.06045709177851677, 0.04197261482477188, -0.14955590665340424, -0.036801956593990326, -0.04978496953845024, -0.09940676391124725, 0.017188318073749542, 0.02796531654894352, -0.044329117983579636, -0.11630523204803467, -0.03652356192469597, 0.0725361704826355, 0.038227953016757965, 0.03685189411044121, -0.005693042650818825, 0.029456961899995804, 0.10580474138259888, 0.10501816868782043, -0.0562795028090477, 0.07449519634246826, 0.020974641665816307, -0.020636841654777527, 0.03971032053232193, -0.05628065764904022, -0.12330584228038788, 0.0744452103972435, -0.034096408635377884, 0.018313465639948845, 0.023749854415655136, 0.04198585823178291, -0.012982374057173729, -0.0767536610364914, 0.14133483171463013, -0.09305756539106369, 0.0004417812451720238, -0.0035654937382787466, 0.016869794577360153, 0.08157093822956085, 0.02621583268046379, 0.0021266003604978323, -0.059168532490730286, -0.03080003336071968, -0.06315429508686066, -0.027340907603502274, -0.06021827086806297, -0.13162744045257568, 0.0013580089434981346, -0.020953699946403503, -0.014699130319058895, -0.10742536187171936, -0.17884144186973572, -0.01402769424021244, 0.07123412191867828, -0.014155296608805656, 0.011412929743528366, -0.0021266068797558546, 0.012132527306675911, -0.004981525242328644, 0.032173626124858856, -0.03745890408754349, 0.00908223818987608, -0.012201073579490185, -0.06731266528367996, 0.039806246757507324, -0.12071730941534042, 0.04209677502512932, -0.05578748881816864, 0.011489223688840866, -0.19638846814632416, 0.10738702118396759, -0.02783583477139473, -0.04278886318206787, -0.04810495674610138, -0.05834455043077469, 
0.0188974030315876, 0.045517146587371826, -0.015527524054050446, 0.10550028085708618, -0.12357760965824127, -0.0512409433722496, 0.15865573287010193, -0.1566506326198578, 0.016810515895485878, 0.10513904690742493, -0.06748288869857788, 0.042335763573646545, 0.14426475763320923, 0.07841357588768005, 0.07015632092952728, -0.04069618880748749, 0.017828572541475296, 0.060336943715810776, -0.0458533950150013, 0.0799841359257698, 0.10583654791116714, -0.015437023714184761, -0.13057377934455872, 0.030710875988006592, -0.06833602488040924, -0.03600694239139557, -0.022659340873360634, -0.024447504431009293, 0.014145502820611, -0.052795182913541794, 0.05715940147638321, -0.010484781116247177, 0.006331292912364006, -0.0232611745595932, -0.07422537356615067, 0.07731874287128448, 0.07671873271465302, -0.08619971573352814, 0.018436623737215996, -0.0909656435251236, 0.03130660206079483, -0.06597552448511124, -0.005088436417281628, -0.14390107989311218, -0.04274594411253929, 0.031965915113687515, -0.0805630162358284, 0.09851419925689697, 0.11271710693836212, 0.008409101516008377, 0.11310183256864548, -0.04617488384246826, 0.02628052979707718, -0.012368079274892807, -0.006386269349604845, -0.044110074639320374, -0.14293555915355682, -0.06652771681547165, -0.06382939964532852, 0.0834670290350914, -0.059091683477163315, 0.020797124132514, -0.08205804973840714, -0.041816260665655136, -0.0250774584710598, -0.04668354615569115, 0.005325498059391975, 0.00811565201729536, -0.013542650267481804, -0.030526084825396538, 0.04050645977258682, 0.027077049016952515, -0.0918835997581482, 0.08847370743751526, -0.1236613318324089, -0.0576145313680172, 0.06846176087856293, -0.0069316960871219635, -0.04083865508437157, 0.09554298222064972, 0.011831864714622498, -0.01123481709510088, -0.057707928121089935, -0.04657518118619919, 0.22045092284679413, -0.020844273269176483, 0.08364406228065491, -0.11240328848361969, 0.004931592382490635, 0.03506753221154213, -0.06102532893419266, -0.05918964743614197, 0.07589934766292572, 0.038565460592508316, -0.2161455750465393, 0.024600330740213394, 0.07306224852800369, 0.061481211334466934, 0.1421050727367401, 0.02417578175663948, -0.02878376469016075, -0.06042608246207237, -0.017261460423469543, -0.012187670916318893, 0.05919060483574867, -0.04688645899295807, 0.0030246214009821415, 0.0510857030749321, 0.05463946610689163, 0.018327711150050163, -0.06600221991539001, 0.02497151307761669, 0.05208776146173477, -0.017216674983501434, -0.06310763210058212, -0.05255124717950821, -0.03947900980710983, 0.0736318975687027, 0.041184503585100174, 0.0495072677731514, 0.0537080317735672, -0.019612858071923256, -0.1381978541612625, 0.16529735922813416, -0.13489660620689392, -0.2240476906299591, -0.12759706377983093, -0.07904494553804398, -0.07838001847267151, 0.039492446929216385, 0.0373598076403141, -0.03468242287635803, -0.05113789439201355, -0.10579567402601242, 0.06591805815696716, -0.11658145487308502, -0.057194799184799194, 0.014129210263490677, -0.056258611381053925, -0.005652858875691891, -0.1268719583749771, -0.010539324954152107, -0.026957646012306213, -0.07912764698266983, 0.004068336449563503, -0.04539388418197632, 0.010077799670398235, 0.13516394793987274, 0.008290649391710758, -0.009709829464554787, -0.015056753531098366, 0.19663433730602264, 0.0314871110022068, 0.04356053099036217, 0.12803813815116882, -0.06543856859207153, 0.05768699571490288, 0.02060154639184475, 0.037481535226106644, -0.04913286864757538, -0.0007067807018756866, -0.027622418478131294, -0.11730992794036865, 
-0.207548126578331, -0.06663559377193451, 0.007457428611814976, 0.008368045091629028, 0.01904660277068615, 0.015689538791775703, 0.024972863495349884, 0.05414750799536705, -0.031031470745801926, 0.03179151564836502, 0.033982276916503906, 0.05688050761818886, 0.06225617602467537, -0.06120002269744873, 0.09507381916046143, -0.07100313901901245, 0.027307022362947464, 0.10875560343265533, -0.07062242925167084, 0.16170385479927063, 0.04285769164562225, 0.05423576757311821, 0.09659373760223389, 0.0006577670574188232, 0.0585428811609745, 0.10273323953151703, -0.06317441910505295, 0.019947808235883713, -0.07513642311096191, -0.05752627179026604, -0.04452991858124733, 0.060025766491889954, 0.037611961364746094, -0.000131998211145401, -0.10182826220989227, 0.03220826014876366, -0.036235980689525604, 0.07729616016149521, 0.06343917548656464, -0.10670174658298492, -0.10046673566102982, 0.045665811747312546, -0.04038289934396744, -0.08793723583221436, 0.03426353633403778, 0.08077984303236008, -0.14119762182235718, 0.06124391779303551, 0.018283551558852196, 0.07126335799694061, -0.09752818942070007, 0.01132874470204115, -0.06905651092529297, 0.016318362206220627, 0.005033754277974367, 0.0913831889629364, -0.1432204693555832, 0.10583388805389404, 0.02708813175559044, 0.04597454518079758, -0.09043684601783752, 0.01613154262304306, -0.01261853240430355, 0.07669144868850708, 0.12108297646045685, 0.04203776270151138, -0.05836430937051773, -0.018112843856215477, -0.06768153607845306, 0.034427788108587265, 0.07278922200202942, -0.04098799079656601, 0.038899462670087814, 0.0012810318730771542, 0.016169004142284393, -0.008310851640999317, 0.020610321313142776, -0.13600048422813416, -0.14560562372207642, 0.0705970749258995, -0.06633393466472626, -0.08288760483264923, -0.03709196671843529, -0.06633897125720978, -0.0868702232837677, 0.15359032154083252, -0.0773216113448143, -0.1108812615275383, -0.10497688502073288, 0.004697326570749283, 0.06842926889657974, -0.06570008397102356, 0.05184205248951912, -0.05175790935754776, 0.09120817482471466, -0.03778978809714317, -0.10993549227714539, 0.017024382948875427, -0.09169412404298782, -0.11230003088712692, -0.030281051993370056, 0.09025070071220398, 0.15063974261283875, 0.05137326568365097, 0.024738965556025505, 0.016462495550513268, 0.0016304273158311844, -0.12906411290168762, 0.004929570481181145, 0.143439382314682, 0.01773710548877716, 0.0976557806134224, -0.06279069185256958, -0.02821265161037445, -0.012585094198584557, -0.0009578559547662735, 0.13525930047035217, 0.1579957902431488, -0.06031216308474541, 0.15296214818954468, 0.227834090590477, -0.10105094313621521, -0.19415637850761414, -0.07397069036960602, 0.0032560182735323906, 0.04487091302871704, 0.045912403613328934, -0.19948574900627136, 0.09972882270812988, 0.04975741356611252, -0.013423530384898186, -0.03354128822684288, -0.18906579911708832, -0.1023210883140564, 0.1062556803226471, 0.06369950622320175, 0.19807088375091553, -0.06803785264492035, -0.04169449210166931, -0.04189038649201393, -0.05597612261772156, 0.09557583183050156, -0.011712346225976944, 0.0822327509522438, 0.01643332466483116, 0.014923296868801117, -0.0019287541508674622, -0.008046919479966164, 0.11012726277112961, 0.04542766511440277, 0.018416037783026695, -0.07320156693458557, -0.0423104427754879, 0.10889390110969543, -0.03202357143163681, 0.12254303693771362, 0.03122953698039055, 0.05849093571305275, -0.0764583870768547, -0.06015930324792862, -0.08313038945198059, 0.012603376060724258, -0.04008830338716507, -0.05228453874588013, 
-0.051481351256370544, 0.03643445670604706, 0.02559221349656582, 0.013383354060351849, -0.010037007741630077, -0.0581706240773201, 0.009901179000735283, 0.0659501925110817, 0.15930500626564026, -0.013111893087625504, -0.06732219457626343, -0.07006201148033142, -0.060269180685281754, 0.04847850278019905, -0.10283331573009491, 0.0321035273373127, 0.020586064085364342, -0.0036565132904797792, 0.11348927021026611, 0.03316955640912056, -0.11396678537130356, 0.013628019951283932, 0.005912423133850098, -0.09849600493907928, -0.1485224962234497, -0.016377072781324387, 0.05456313490867615, -0.0583408921957016, 0.03962210938334465, 0.1586087942123413, -0.02749052457511425, -0.033682480454444885, -0.05674935132265091, 0.032430585473775864, -0.034874096512794495, 0.03596019372344017, 0.08030854165554047, 0.016163216903805733, -0.08148041367530823, 0.06100435554981232, 0.04497561603784561, -0.01565445587038994, 0.06611718982458115, 0.01751827821135521, -0.07064318656921387, -0.08515681326389313, -0.06657058000564575, 0.11521587520837784, -0.04193677753210068, -0.06614658236503601, 0.0494990199804306, -0.10936599224805832, 0.06512928009033203, 0.09400998800992966, 0.03727183863520622, 0.046071093529462814, -0.08464010059833527, 0.006473809480667114, -0.037655625492334366, 0.03303447365760803, -0.03967699408531189, -0.03299032896757126, -0.04207788407802582, 0.02865336276590824, 0.0594131164252758, 0.09625885635614395, -0.03653799742460251, -0.07748300582170486, -0.08829360455274582, -0.013138281181454659, -0.10569687932729721, -0.006850461475551128, -0.06914658099412918, 0.00014194706454873085, 0.007000140380114317, -0.02822837233543396, 0.030307123437523842, 0.033606212586164474, -0.0512661337852478, -0.008813504129648209, -0.02892981842160225, 0.05861987918615341, -0.07071447372436523, 0.012725180014967918, 0.015199657529592514, -0.01911322958767414, 0.09222348034381866, 0.047224029898643494, -0.03322954475879669, 0.05148611217737198, -0.03994745388627052, 0.03518182411789894, -0.04691552743315697, 0.007639196235686541, -0.02100628986954689, -0.11349901556968689, -0.021261068060994148, 0.010819608345627785, -0.023444410413503647, 0.01614448055624962, 0.07291702181100845, -0.051247432827949524, 0.0827048048377037, 0.06047651544213295, -0.049000177532434464, -0.055763885378837585, 0.04004162549972534, 0.0009079426527023315, 0.017973260954022408, 0.0793890655040741, 0.0011681190226227045, 0.053140703588724136, -0.08328671008348465, 0.0013423850759863853, 0.0043635861948132515, -0.016782283782958984, -0.019065728411078453, -0.07158057391643524, -0.000623882282525301, 0.009545178152620792, 0.17526990175247192, -0.004971030168235302, -0.019934196025133133, 0.005758095532655716, 0.06719693541526794, 0.033424317836761475, 0.004426124505698681, 0.08463965356349945, -0.018342992290854454, -0.01793844997882843, -0.017587680369615555, 0.026691239327192307, -0.01080797053873539, 0.016537122428417206, 0.1315390020608902, 0.04961226135492325, 0.11255703866481781, 0.07479852437973022, 0.05499632656574249, 0.052345164120197296, -0.10784098505973816, -0.06925129890441895, 0.03605833277106285, 0.05536176264286041, -0.034931864589452744, 0.02555268630385399, 0.05937255546450615, -0.09513229876756668, 0.0820266455411911, 0.046595025807619095, -0.05803784728050232, -0.1295481026172638, -0.2191641926765442, -0.042123790830373764, -0.010218853130936623, -0.020777955651283264, -0.10785381495952606, 0.027329251170158386, 0.0930030569434166, 0.03945063054561615, -0.02234741672873497, 0.0657259151339531, -0.15022647380828857, 
-0.03686198964715004, 0.03966449946165085, -0.014821960590779781, 0.022462747991085052, 0.048782214522361755, 0.01900356635451317, 0.014281739480793476, 0.0744381994009018, 0.051359422504901886, 0.043146438896656036, 0.054591625928878784, 0.02954341098666191, -0.04896369203925133, -0.08800899237394333, -0.04467042535543442, 0.0032379510812461376, 0.058675315231084824, 0.12987293303012848, 0.010792074725031853, -0.06998851895332336, 0.0024203723296523094, 0.06055322289466858, -0.01847190037369728, -0.08398778736591339, -0.11259135603904724, 0.21841737627983093, -0.022776726633310318, 0.011702751740813255, -0.0013669170439243317, -0.03545460104942322, 0.020076904445886612, 0.20618940889835358, 0.26152077317237854, -0.02222667820751667, -0.01586262136697769, 0.010568449273705482, 0.0001846584491431713, 0.03695659339427948, 0.12577201426029205, -0.02777884714305401, 0.22359472513198853, -0.046777449548244476, 0.06737222522497177, -0.05537553131580353, -0.014299402013421059, -0.07450424134731293, 0.061424657702445984, -0.001578204333782196, -0.01836337149143219, -0.014155775308609009, 0.06984956562519073, -0.04071302339434624, -0.12650424242019653, -0.029551919549703598, 0.005514103919267654, -0.058359190821647644, 0.011046874336898327, 0.0020564431324601173, 0.03376493230462074, 0.07748642563819885, -0.01588892936706543, -0.0020990539342164993, 0.13050198554992676, 0.01098928228020668, -0.10912102460861206, -0.037600722163915634, 0.12838557362556458, 0.018519911915063858, 0.1340782791376114, 0.04876743629574776, 0.08712469041347504, 0.07130827009677887, 0.015149479731917381, -0.06677284836769104, 0.03636588156223297, -0.028407320380210876, 0.019770564511418343, 0.004539488349109888, 0.10587862133979797, -0.010519773699343204, 0.07475674152374268, 0.016607699915766716, -0.0808752030134201, 0.05683104693889618, 0.008673112839460373, -0.07627810537815094, -0.03255736455321312, 0.1042289137840271, -0.11158230900764465, 0.14271792769432068, 0.13774631917476654, -0.005030146799981594, -0.07176224142313004, -0.012138426303863525, 0.027100618928670883, -0.008060954511165619, 0.04774492606520653, -0.029893167316913605, -0.13074781000614166, 0.00018004095181822777, -0.09478544443845749, 0.04576292634010315, -0.24173954129219055, -0.06664414703845978, 0.016213994473218918, -0.000884735956788063, -0.028645452111959457, 0.030585195869207382, 0.061639197170734406, -0.0040400829166173935, -0.03497268259525299, 0.029452037066221237, -0.028589975088834763, 0.03562405705451965, -0.07439378648996353, -0.0681467354297638 ]
null
null
transformers
# MultiBERTs Seed 0 (uncased) Seed 0 MultiBERTs (pretrained BERT) model on English language using a masked language modeling (MLM) objective. It was introduced in [this paper](https://arxiv.org/pdf/2106.16163.pdf) and first released in [this repository](https://github.com/google-research/language/tree/master/language/multiberts). This model is uncased: it does not make a difference between english and English. Disclaimer: The team releasing MultiBERTs did not write a model card for this model so this model card has been written by [gchhablani](https://hf.co/gchhablani). ## Model description MultiBERTs models are transformers model pretrained on a large corpus of English data in a self-supervised fashion. This means it was pretrained on the raw texts only, with no humans labelling them in any way (which is why it can use lots of publicly available data) with an automatic process to generate inputs and labels from those texts. More precisely, it was pretrained with two objectives: - Masked language modeling (MLM): taking a sentence, the model randomly masks 15% of the words in the input then run the entire masked sentence through the model and has to predict the masked words. This is different from traditional recurrent neural networks (RNNs) that usually see the words one after the other, or from autoregressive models like GPT which internally mask the future tokens. It allows the model to learn a bidirectional representation of the sentence. - Next sentence prediction (NSP): the models concatenates two masked sentences as inputs during pretraining. Sometimes they correspond to sentences that were next to each other in the original text, sometimes not. The model then has to predict if the two sentences were following each other or not. This way, the model learns an inner representation of the English language that can then be used to extract features useful for downstream tasks: if you have a dataset of labeled sentences for instance, you can train a standard classifier using the features produced by the MultiBERTs model as inputs. ## Intended uses & limitations You can use the raw model for either masked language modeling or next sentence prediction, but it's mostly intended to be fine-tuned on a downstream task. See the [model hub](https://huggingface.co/models?filter=multiberts) to look for fine-tuned versions on a task that interests you. Note that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked) to make decisions, such as sequence classification, token classification or question answering. For tasks such as text generation you should look at model like GPT2. ### How to use Here is how to use this model to get the features of a given text in PyTorch: ```python from transformers import BertTokenizer, BertModel tokenizer = BertTokenizer.from_pretrained('multiberts-seed-0') model = BertModel.from_pretrained("multiberts-seed-0") text = "Replace me by any text you'd like." encoded_input = tokenizer(text, return_tensors='pt') output = model(**encoded_input) ``` ### Limitations and bias Even if the training data used for this model could be characterized as fairly neutral, this model can have biased predictions. This bias will also affect all fine-tuned versions of this model. 
For an understanding of bias of this particular checkpoint, please try out this checkpoint with the snippet present in the [Limitations and bias section](https://huggingface.co/bert-base-uncased#limitations-and-bias) of the [bert-base-uncased](https://huggingface.co/bert-base-uncased) checkpoint. ## Training data The MultiBERTs models were pretrained on [BookCorpus](https://yknzhu.wixsite.com/mbweb), a dataset consisting of 11,038 unpublished books and [English Wikipedia](https://en.wikipedia.org/wiki/English_Wikipedia) (excluding lists, tables and headers). ## Training procedure ### Preprocessing The texts are lowercased and tokenized using WordPiece and a vocabulary size of 30,000. The inputs of the model are then of the form: ``` [CLS] Sentence A [SEP] Sentence B [SEP] ``` With probability 0.5, sentence A and sentence B correspond to two consecutive sentences in the original corpus and in the other cases, it's another random sentence in the corpus. Note that what is considered a sentence here is a consecutive span of text usually longer than a single sentence. The only constraint is that the result with the two "sentences" has a combined length of less than 512 tokens. The details of the masking procedure for each sentence are the following: - 15% of the tokens are masked. - In 80% of the cases, the masked tokens are replaced by `[MASK]`. - In 10% of the cases, the masked tokens are replaced by a random token (different) from the one they replace. - In the 10% remaining cases, the masked tokens are left as is. ### Pretraining The model was trained on 16 Cloud TPU v2 chips for two million steps with a batch size of 256. The sequence length was set to 512 throughout. The optimizer used is Adam with a learning rate of 1e-4, \\(\beta_{1} = 0.9\\) and \\(\beta_{2} = 0.999\\), a weight decay of 0.01, learning rate warmup for 10,000 steps and linear decay of the learning rate after. ### BibTeX entry and citation info ```bibtex @article{DBLP:journals/corr/abs-2106-16163, author = {Thibault Sellam and Steve Yadlowsky and Jason Wei and Naomi Saphra and Alexander D'Amour and Tal Linzen and Jasmijn Bastings and Iulia Turc and Jacob Eisenstein and Dipanjan Das and Ian Tenney and Ellie Pavlick}, title = {The MultiBERTs: {BERT} Reproductions for Robustness Analysis}, journal = {CoRR}, volume = {abs/2106.16163}, year = {2021}, url = {https://arxiv.org/abs/2106.16163}, eprinttype = {arXiv}, eprint = {2106.16163}, timestamp = {Mon, 05 Jul 2021 15:15:50 +0200}, biburl = {https://dblp.org/rec/journals/corr/abs-2106-16163.bib}, bibsource = {dblp computer science bibliography, https://dblp.org} } ``` <a href="https://huggingface.co/exbert/?model=multiberts"> <img width="300px" src="https://cdn-media.huggingface.co/exbert/button.png"> </a>
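As a complement to the feature-extraction snippet in the card above, the sketch below shows how the same checkpoint could be queried for masked-token predictions, i.e. the MLM objective the card describes. It is only an illustration: the checkpoint name `'multiberts-seed-0'` is copied from the card's own example and may need to be replaced by the full hub identifier, and it assumes the checkpoint ships the MLM head weights (otherwise that head would be randomly initialized).

```python
from transformers import BertTokenizer, BertForMaskedLM
import torch

# Checkpoint name taken from the card's usage example; adjust to the actual hub id if needed.
tokenizer = BertTokenizer.from_pretrained('multiberts-seed-0')
model = BertForMaskedLM.from_pretrained('multiberts-seed-0')

text = "The capital of France is [MASK]."
inputs = tokenizer(text, return_tensors='pt')
with torch.no_grad():
    logits = model(**inputs).logits

# Locate the [MASK] position and decode the highest-scoring token for it.
mask_pos = (inputs['input_ids'][0] == tokenizer.mask_token_id).nonzero(as_tuple=True)[0]
predicted_ids = logits[0, mask_pos].argmax(dim=-1)
print(tokenizer.decode(predicted_ids))
```

The actual prediction will of course depend on the checkpoint and seed being loaded.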
{"language": "en", "license": "apache-2.0", "tags": ["exbert", "multiberts"], "datasets": ["bookcorpus", "wikipedia"]}
null
MultiBertGunjanPatrick/multiberts-seed-7
[ "transformers", "pytorch", "bert", "pretraining", "exbert", "multiberts", "en", "dataset:bookcorpus", "dataset:wikipedia", "arxiv:2106.16163", "license:apache-2.0", "endpoints_compatible", "region:us" ]
2022-03-02T23:29:04+00:00
[ "2106.16163" ]
[ "en" ]
TAGS #transformers #pytorch #bert #pretraining #exbert #multiberts #en #dataset-bookcorpus #dataset-wikipedia #arxiv-2106.16163 #license-apache-2.0 #endpoints_compatible #region-us
# MultiBERTs Seed 0 (uncased) Seed 0 MultiBERTs (pretrained BERT) model on English language using a masked language modeling (MLM) objective. It was introduced in this paper and first released in this repository. This model is uncased: it does not make a difference between english and English. Disclaimer: The team releasing MultiBERTs did not write a model card for this model so this model card has been written by gchhablani. ## Model description MultiBERTs models are transformers model pretrained on a large corpus of English data in a self-supervised fashion. This means it was pretrained on the raw texts only, with no humans labelling them in any way (which is why it can use lots of publicly available data) with an automatic process to generate inputs and labels from those texts. More precisely, it was pretrained with two objectives: - Masked language modeling (MLM): taking a sentence, the model randomly masks 15% of the words in the input then run the entire masked sentence through the model and has to predict the masked words. This is different from traditional recurrent neural networks (RNNs) that usually see the words one after the other, or from autoregressive models like GPT which internally mask the future tokens. It allows the model to learn a bidirectional representation of the sentence. - Next sentence prediction (NSP): the models concatenates two masked sentences as inputs during pretraining. Sometimes they correspond to sentences that were next to each other in the original text, sometimes not. The model then has to predict if the two sentences were following each other or not. This way, the model learns an inner representation of the English language that can then be used to extract features useful for downstream tasks: if you have a dataset of labeled sentences for instance, you can train a standard classifier using the features produced by the MultiBERTs model as inputs. ## Intended uses & limitations You can use the raw model for either masked language modeling or next sentence prediction, but it's mostly intended to be fine-tuned on a downstream task. See the model hub to look for fine-tuned versions on a task that interests you. Note that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked) to make decisions, such as sequence classification, token classification or question answering. For tasks such as text generation you should look at model like GPT2. ### How to use Here is how to use this model to get the features of a given text in PyTorch: ### Limitations and bias Even if the training data used for this model could be characterized as fairly neutral, this model can have biased predictions. This bias will also affect all fine-tuned versions of this model. For an understanding of bias of this particular checkpoint, please try out this checkpoint with the snippet present in the Limitation and bias section of the bert-base-uncased checkpoint. ## Training data The MultiBERTs models were pretrained on BookCorpus, a dataset consisting of 11,038 unpublished books and English Wikipedia (excluding lists, tables and headers). ## Training procedure ### Preprocessing The texts are lowercased and tokenized using WordPiece and a vocabulary size of 30,000. The inputs of the model are then of the form: With probability 0.5, sentence A and sentence B correspond to two consecutive sentences in the original corpus and in the other cases, it's another random sentence in the corpus. 
Note that what is considered a sentence here is a consecutive span of text usually longer than a single sentence. The only constraint is that the result with the two "sentences" has a combined length of less than 512 tokens. The details of the masking procedure for each sentence are the following: - 15% of the tokens are masked. - In 80% of the cases, the masked tokens are replaced by '[MASK]'. - In 10% of the cases, the masked tokens are replaced by a random token (different) from the one they replace. - In the 10% remaining cases, the masked tokens are left as is. ### Pretraining The model was trained on 16 Cloud TPU v2 chips for two million steps with a batch size of 256. The sequence length was set to 512 throughout. The optimizer used is Adam with a learning rate of 1e-4, \\(\beta_{1} = 0.9\\) and \\(\beta_{2} = 0.999\\), a weight decay of 0.01, learning rate warmup for 10,000 steps and linear decay of the learning rate after. ### BibTeX entry and citation info <a href="URL <img width="300px" src="URL </a>
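The 15% / 80-10-10 masking scheme described in the card can be made concrete with a short sketch. This is not the original preprocessing code, just an illustrative re-implementation of the stated rule; the helper name and signature are made up for the example.

```python
import random

def mask_tokens(tokens, vocab, mask_token="[MASK]", mlm_prob=0.15):
    """Illustrative sketch of the masking rule described above."""
    masked, labels = list(tokens), [None] * len(tokens)
    for i, tok in enumerate(tokens):
        if random.random() >= mlm_prob:
            continue  # token not selected for prediction
        labels[i] = tok  # the model must recover the original token here
        roll = random.random()
        if roll < 0.8:
            masked[i] = mask_token            # 80%: replace with [MASK]
        elif roll < 0.9:
            masked[i] = random.choice(vocab)  # 10%: replace with a random token
        # remaining 10%: keep the original token as is
    return masked, labels
```

For example, `mask_tokens(["the", "cat", "sat"], vocab=["dog", "mat"])` returns the corrupted token list together with the positions the model is trained to predict.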
[ "# MultiBERTs Seed 0 (uncased)\n\nSeed 0 MultiBERTs (pretrained BERT) model on English language using a masked language modeling (MLM) objective. It was introduced in\nthis paper and first released in\nthis repository. This model is uncased: it does not make a difference\nbetween english and English.\n\nDisclaimer: The team releasing MultiBERTs did not write a model card for this model so this model card has been written by gchhablani.", "## Model description\n\nMultiBERTs models are transformers model pretrained on a large corpus of English data in a self-supervised fashion. This means it\nwas pretrained on the raw texts only, with no humans labelling them in any way (which is why it can use lots of\npublicly available data) with an automatic process to generate inputs and labels from those texts. More precisely, it\nwas pretrained with two objectives:\n\n- Masked language modeling (MLM): taking a sentence, the model randomly masks 15% of the words in the input then run\n the entire masked sentence through the model and has to predict the masked words. This is different from traditional\n recurrent neural networks (RNNs) that usually see the words one after the other, or from autoregressive models like\n GPT which internally mask the future tokens. It allows the model to learn a bidirectional representation of the\n sentence.\n- Next sentence prediction (NSP): the models concatenates two masked sentences as inputs during pretraining. Sometimes\n they correspond to sentences that were next to each other in the original text, sometimes not. The model then has to\n predict if the two sentences were following each other or not.\n This way, the model learns an inner representation of the English language that can then be used to extract features\n useful for downstream tasks: if you have a dataset of labeled sentences for instance, you can train a standard\n classifier using the features produced by the MultiBERTs model as inputs.", "## Intended uses & limitations\n\nYou can use the raw model for either masked language modeling or next sentence prediction, but it's mostly intended to\nbe fine-tuned on a downstream task. See the model hub to look for\nfine-tuned versions on a task that interests you.\nNote that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked)\nto make decisions, such as sequence classification, token classification or question answering. For tasks such as text\ngeneration you should look at model like GPT2.", "### How to use\n\nHere is how to use this model to get the features of a given text in PyTorch:", "### Limitations and bias\n\nEven if the training data used for this model could be characterized as fairly neutral, this model can have biased\npredictions. This bias will also affect all fine-tuned versions of this model. For an understanding of bias of this particular\ncheckpoint, please try out this checkpoint with the snippet present in the Limitation and bias section of the bert-base-uncased checkpoint.", "## Training data\n\nThe MultiBERTs models were pretrained on BookCorpus, a dataset consisting of 11,038\nunpublished books and English Wikipedia (excluding lists, tables and\nheaders).", "## Training procedure", "### Preprocessing\n\nThe texts are lowercased and tokenized using WordPiece and a vocabulary size of 30,000. 
The inputs of the model are\nthen of the form:\n\n\n\nWith probability 0.5, sentence A and sentence B correspond to two consecutive sentences in the original corpus and in\nthe other cases, it's another random sentence in the corpus. Note that what is considered a sentence here is a\nconsecutive span of text usually longer than a single sentence. The only constrain is that the result with the two\n\"sentences\" has a combined length of less than 512 tokens.\nThe details of the masking procedure for each sentence are the following:\n\n- 15% of the tokens are masked.\n- In 80% of the cases, the masked tokens are replaced by '[MASK]'.\n- In 10% of the cases, the masked tokens are replaced by a random token (different) from the one they replace.\n- In the 10% remaining cases, the masked tokens are left as is.", "### Pretraining\n\nThe model was trained on 16 Cloud TPU v2 chips for two million steps with a batch size\nof 256. The sequence length was set to 512 throughout. The optimizer\nused is Adam with a learning rate of 1e-4, \\\\(\\beta_{1} = 0.9\\\\) and \\\\(\\beta_{2} = 0.999\\\\), a weight decay of 0.01,\nlearning rate warmup for 10,000 steps and linear decay of the learning rate after.", "### BibTeX entry and citation info\n\n\n\n<a href=\"URL\n\t<img width=\"300px\" src=\"URL\n</a>" ]
[ "TAGS\n#transformers #pytorch #bert #pretraining #exbert #multiberts #en #dataset-bookcorpus #dataset-wikipedia #arxiv-2106.16163 #license-apache-2.0 #endpoints_compatible #region-us \n", "# MultiBERTs Seed 0 (uncased)\n\nSeed 0 MultiBERTs (pretrained BERT) model on English language using a masked language modeling (MLM) objective. It was introduced in\nthis paper and first released in\nthis repository. This model is uncased: it does not make a difference\nbetween english and English.\n\nDisclaimer: The team releasing MultiBERTs did not write a model card for this model so this model card has been written by gchhablani.", "## Model description\n\nMultiBERTs models are transformers model pretrained on a large corpus of English data in a self-supervised fashion. This means it\nwas pretrained on the raw texts only, with no humans labelling them in any way (which is why it can use lots of\npublicly available data) with an automatic process to generate inputs and labels from those texts. More precisely, it\nwas pretrained with two objectives:\n\n- Masked language modeling (MLM): taking a sentence, the model randomly masks 15% of the words in the input then run\n the entire masked sentence through the model and has to predict the masked words. This is different from traditional\n recurrent neural networks (RNNs) that usually see the words one after the other, or from autoregressive models like\n GPT which internally mask the future tokens. It allows the model to learn a bidirectional representation of the\n sentence.\n- Next sentence prediction (NSP): the models concatenates two masked sentences as inputs during pretraining. Sometimes\n they correspond to sentences that were next to each other in the original text, sometimes not. The model then has to\n predict if the two sentences were following each other or not.\n This way, the model learns an inner representation of the English language that can then be used to extract features\n useful for downstream tasks: if you have a dataset of labeled sentences for instance, you can train a standard\n classifier using the features produced by the MultiBERTs model as inputs.", "## Intended uses & limitations\n\nYou can use the raw model for either masked language modeling or next sentence prediction, but it's mostly intended to\nbe fine-tuned on a downstream task. See the model hub to look for\nfine-tuned versions on a task that interests you.\nNote that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked)\nto make decisions, such as sequence classification, token classification or question answering. For tasks such as text\ngeneration you should look at model like GPT2.", "### How to use\n\nHere is how to use this model to get the features of a given text in PyTorch:", "### Limitations and bias\n\nEven if the training data used for this model could be characterized as fairly neutral, this model can have biased\npredictions. This bias will also affect all fine-tuned versions of this model. For an understanding of bias of this particular\ncheckpoint, please try out this checkpoint with the snippet present in the Limitation and bias section of the bert-base-uncased checkpoint.", "## Training data\n\nThe MultiBERTs models were pretrained on BookCorpus, a dataset consisting of 11,038\nunpublished books and English Wikipedia (excluding lists, tables and\nheaders).", "## Training procedure", "### Preprocessing\n\nThe texts are lowercased and tokenized using WordPiece and a vocabulary size of 30,000. 
The inputs of the model are\nthen of the form:\n\n\n\nWith probability 0.5, sentence A and sentence B correspond to two consecutive sentences in the original corpus and in\nthe other cases, it's another random sentence in the corpus. Note that what is considered a sentence here is a\nconsecutive span of text usually longer than a single sentence. The only constrain is that the result with the two\n\"sentences\" has a combined length of less than 512 tokens.\nThe details of the masking procedure for each sentence are the following:\n\n- 15% of the tokens are masked.\n- In 80% of the cases, the masked tokens are replaced by '[MASK]'.\n- In 10% of the cases, the masked tokens are replaced by a random token (different) from the one they replace.\n- In the 10% remaining cases, the masked tokens are left as is.", "### Pretraining\n\nThe model was trained on 16 Cloud TPU v2 chips for two million steps with a batch size\nof 256. The sequence length was set to 512 throughout. The optimizer\nused is Adam with a learning rate of 1e-4, \\\\(\\beta_{1} = 0.9\\\\) and \\\\(\\beta_{2} = 0.999\\\\), a weight decay of 0.01,\nlearning rate warmup for 10,000 steps and linear decay of the learning rate after.", "### BibTeX entry and citation info\n\n\n\n<a href=\"URL\n\t<img width=\"300px\" src=\"URL\n</a>" ]
[ 63, 111, 335, 134, 25, 95, 48, 3, 222, 110, 34 ]
[ "passage: TAGS\n#transformers #pytorch #bert #pretraining #exbert #multiberts #en #dataset-bookcorpus #dataset-wikipedia #arxiv-2106.16163 #license-apache-2.0 #endpoints_compatible #region-us \n# MultiBERTs Seed 0 (uncased)\n\nSeed 0 MultiBERTs (pretrained BERT) model on English language using a masked language modeling (MLM) objective. It was introduced in\nthis paper and first released in\nthis repository. This model is uncased: it does not make a difference\nbetween english and English.\n\nDisclaimer: The team releasing MultiBERTs did not write a model card for this model so this model card has been written by gchhablani.", "passage: ## Model description\n\nMultiBERTs models are transformers model pretrained on a large corpus of English data in a self-supervised fashion. This means it\nwas pretrained on the raw texts only, with no humans labelling them in any way (which is why it can use lots of\npublicly available data) with an automatic process to generate inputs and labels from those texts. More precisely, it\nwas pretrained with two objectives:\n\n- Masked language modeling (MLM): taking a sentence, the model randomly masks 15% of the words in the input then run\n the entire masked sentence through the model and has to predict the masked words. This is different from traditional\n recurrent neural networks (RNNs) that usually see the words one after the other, or from autoregressive models like\n GPT which internally mask the future tokens. It allows the model to learn a bidirectional representation of the\n sentence.\n- Next sentence prediction (NSP): the models concatenates two masked sentences as inputs during pretraining. Sometimes\n they correspond to sentences that were next to each other in the original text, sometimes not. The model then has to\n predict if the two sentences were following each other or not.\n This way, the model learns an inner representation of the English language that can then be used to extract features\n useful for downstream tasks: if you have a dataset of labeled sentences for instance, you can train a standard\n classifier using the features produced by the MultiBERTs model as inputs.## Intended uses & limitations\n\nYou can use the raw model for either masked language modeling or next sentence prediction, but it's mostly intended to\nbe fine-tuned on a downstream task. See the model hub to look for\nfine-tuned versions on a task that interests you.\nNote that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked)\nto make decisions, such as sequence classification, token classification or question answering. For tasks such as text\ngeneration you should look at model like GPT2.### How to use\n\nHere is how to use this model to get the features of a given text in PyTorch:### Limitations and bias\n\nEven if the training data used for this model could be characterized as fairly neutral, this model can have biased\npredictions. This bias will also affect all fine-tuned versions of this model. For an understanding of bias of this particular\ncheckpoint, please try out this checkpoint with the snippet present in the Limitation and bias section of the bert-base-uncased checkpoint.## Training data\n\nThe MultiBERTs models were pretrained on BookCorpus, a dataset consisting of 11,038\nunpublished books and English Wikipedia (excluding lists, tables and\nheaders).## Training procedure" ]
[ -0.06825247406959534, 0.027449268847703934, -0.0021626802626997232, 0.09413602948188782, 0.07635393738746643, 0.026495488360524178, 0.15437674522399902, 0.029963307082653046, -0.03573239967226982, 0.021267801523208618, 0.10619504749774933, 0.03782356157898903, 0.03388210013508797, 0.035308390855789185, 0.066785529255867, -0.2578813433647156, 0.07567903399467468, -0.05793163925409317, 0.040864333510398865, 0.059090327471494675, 0.10602577030658722, -0.07069262117147446, 0.07895290851593018, 0.04403890669345856, -0.0756942480802536, -0.027663996443152428, -0.005503433756530285, -0.034674178808927536, 0.07060743123292923, 0.09438986331224442, 0.05877054110169411, -0.008264455944299698, 0.05975931137800217, -0.087635338306427, 0.019257638603448868, 0.024562222883105278, -0.007006383966654539, 0.036696210503578186, 0.025804642587900162, -0.009673221036791801, 0.11283443868160248, 0.02619457244873047, 0.08560121059417725, 0.04041407257318497, -0.08754345774650574, -0.09977805614471436, -0.0694802924990654, 0.09317219257354736, 0.02764834463596344, 0.04353900998830795, -0.0063711777329444885, 0.07313166558742523, -0.006663286592811346, 0.058924756944179535, 0.08212147653102875, -0.23674309253692627, -0.023082595318555832, 0.05118638277053833, 0.04846370965242386, 0.04278615117073059, 0.013536407612264156, 0.031959742307662964, 0.005570597946643829, 0.04724816232919693, 0.006345914676785469, -0.028150685131549835, 0.13924768567085266, -0.053803253918886185, -0.13665056228637695, -0.03023041971027851, 0.15811696648597717, 0.02479265071451664, -0.11351540684700012, -0.11277355998754501, 0.0016996730118989944, 0.1693311333656311, -0.0019645756110548973, -0.007584595121443272, -0.009904063306748867, -0.0030730916187167168, 0.024124154821038246, -0.1230793297290802, -0.08302900195121765, -0.02286745235323906, -0.06280194967985153, 0.15275688469409943, 0.047940537333488464, 0.07110750675201416, -0.06045709177851677, 0.04197261482477188, -0.14955590665340424, -0.036801956593990326, -0.04978496953845024, -0.09940676391124725, 0.017188318073749542, 0.02796531654894352, -0.044329117983579636, -0.11630523204803467, -0.03652356192469597, 0.0725361704826355, 0.038227953016757965, 0.03685189411044121, -0.005693042650818825, 0.029456961899995804, 0.10580474138259888, 0.10501816868782043, -0.0562795028090477, 0.07449519634246826, 0.020974641665816307, -0.020636841654777527, 0.03971032053232193, -0.05628065764904022, -0.12330584228038788, 0.0744452103972435, -0.034096408635377884, 0.018313465639948845, 0.023749854415655136, 0.04198585823178291, -0.012982374057173729, -0.0767536610364914, 0.14133483171463013, -0.09305756539106369, 0.0004417812451720238, -0.0035654937382787466, 0.016869794577360153, 0.08157093822956085, 0.02621583268046379, 0.0021266003604978323, -0.059168532490730286, -0.03080003336071968, -0.06315429508686066, -0.027340907603502274, -0.06021827086806297, -0.13162744045257568, 0.0013580089434981346, -0.020953699946403503, -0.014699130319058895, -0.10742536187171936, -0.17884144186973572, -0.01402769424021244, 0.07123412191867828, -0.014155296608805656, 0.011412929743528366, -0.0021266068797558546, 0.012132527306675911, -0.004981525242328644, 0.032173626124858856, -0.03745890408754349, 0.00908223818987608, -0.012201073579490185, -0.06731266528367996, 0.039806246757507324, -0.12071730941534042, 0.04209677502512932, -0.05578748881816864, 0.011489223688840866, -0.19638846814632416, 0.10738702118396759, -0.02783583477139473, -0.04278886318206787, -0.04810495674610138, -0.05834455043077469, 
0.0188974030315876, 0.045517146587371826, -0.015527524054050446, 0.10550028085708618, -0.12357760965824127, -0.0512409433722496, 0.15865573287010193, -0.1566506326198578, 0.016810515895485878, 0.10513904690742493, -0.06748288869857788, 0.042335763573646545, 0.14426475763320923, 0.07841357588768005, 0.07015632092952728, -0.04069618880748749, 0.017828572541475296, 0.060336943715810776, -0.0458533950150013, 0.0799841359257698, 0.10583654791116714, -0.015437023714184761, -0.13057377934455872, 0.030710875988006592, -0.06833602488040924, -0.03600694239139557, -0.022659340873360634, -0.024447504431009293, 0.014145502820611, -0.052795182913541794, 0.05715940147638321, -0.010484781116247177, 0.006331292912364006, -0.0232611745595932, -0.07422537356615067, 0.07731874287128448, 0.07671873271465302, -0.08619971573352814, 0.018436623737215996, -0.0909656435251236, 0.03130660206079483, -0.06597552448511124, -0.005088436417281628, -0.14390107989311218, -0.04274594411253929, 0.031965915113687515, -0.0805630162358284, 0.09851419925689697, 0.11271710693836212, 0.008409101516008377, 0.11310183256864548, -0.04617488384246826, 0.02628052979707718, -0.012368079274892807, -0.006386269349604845, -0.044110074639320374, -0.14293555915355682, -0.06652771681547165, -0.06382939964532852, 0.0834670290350914, -0.059091683477163315, 0.020797124132514, -0.08205804973840714, -0.041816260665655136, -0.0250774584710598, -0.04668354615569115, 0.005325498059391975, 0.00811565201729536, -0.013542650267481804, -0.030526084825396538, 0.04050645977258682, 0.027077049016952515, -0.0918835997581482, 0.08847370743751526, -0.1236613318324089, -0.0576145313680172, 0.06846176087856293, -0.0069316960871219635, -0.04083865508437157, 0.09554298222064972, 0.011831864714622498, -0.01123481709510088, -0.057707928121089935, -0.04657518118619919, 0.22045092284679413, -0.020844273269176483, 0.08364406228065491, -0.11240328848361969, 0.004931592382490635, 0.03506753221154213, -0.06102532893419266, -0.05918964743614197, 0.07589934766292572, 0.038565460592508316, -0.2161455750465393, 0.024600330740213394, 0.07306224852800369, 0.061481211334466934, 0.1421050727367401, 0.02417578175663948, -0.02878376469016075, -0.06042608246207237, -0.017261460423469543, -0.012187670916318893, 0.05919060483574867, -0.04688645899295807, 0.0030246214009821415, 0.0510857030749321, 0.05463946610689163, 0.018327711150050163, -0.06600221991539001, 0.02497151307761669, 0.05208776146173477, -0.017216674983501434, -0.06310763210058212, -0.05255124717950821, -0.03947900980710983, 0.0736318975687027, 0.041184503585100174, 0.0495072677731514, 0.0537080317735672, -0.019612858071923256, -0.1381978541612625, 0.16529735922813416, -0.13489660620689392, -0.2240476906299591, -0.12759706377983093, -0.07904494553804398, -0.07838001847267151, 0.039492446929216385, 0.0373598076403141, -0.03468242287635803, -0.05113789439201355, -0.10579567402601242, 0.06591805815696716, -0.11658145487308502, -0.057194799184799194, 0.014129210263490677, -0.056258611381053925, -0.005652858875691891, -0.1268719583749771, -0.010539324954152107, -0.026957646012306213, -0.07912764698266983, 0.004068336449563503, -0.04539388418197632, 0.010077799670398235, 0.13516394793987274, 0.008290649391710758, -0.009709829464554787, -0.015056753531098366, 0.19663433730602264, 0.0314871110022068, 0.04356053099036217, 0.12803813815116882, -0.06543856859207153, 0.05768699571490288, 0.02060154639184475, 0.037481535226106644, -0.04913286864757538, -0.0007067807018756866, -0.027622418478131294, -0.11730992794036865, 
-0.207548126578331, -0.06663559377193451, 0.007457428611814976, 0.008368045091629028, 0.01904660277068615, 0.015689538791775703, 0.024972863495349884, 0.05414750799536705, -0.031031470745801926, 0.03179151564836502, 0.033982276916503906, 0.05688050761818886, 0.06225617602467537, -0.06120002269744873, 0.09507381916046143, -0.07100313901901245, 0.027307022362947464, 0.10875560343265533, -0.07062242925167084, 0.16170385479927063, 0.04285769164562225, 0.05423576757311821, 0.09659373760223389, 0.0006577670574188232, 0.0585428811609745, 0.10273323953151703, -0.06317441910505295, 0.019947808235883713, -0.07513642311096191, -0.05752627179026604, -0.04452991858124733, 0.060025766491889954, 0.037611961364746094, -0.000131998211145401, -0.10182826220989227, 0.03220826014876366, -0.036235980689525604, 0.07729616016149521, 0.06343917548656464, -0.10670174658298492, -0.10046673566102982, 0.045665811747312546, -0.04038289934396744, -0.08793723583221436, 0.03426353633403778, 0.08077984303236008, -0.14119762182235718, 0.06124391779303551, 0.018283551558852196, 0.07126335799694061, -0.09752818942070007, 0.01132874470204115, -0.06905651092529297, 0.016318362206220627, 0.005033754277974367, 0.0913831889629364, -0.1432204693555832, 0.10583388805389404, 0.02708813175559044, 0.04597454518079758, -0.09043684601783752, 0.01613154262304306, -0.01261853240430355, 0.07669144868850708, 0.12108297646045685, 0.04203776270151138, -0.05836430937051773, -0.018112843856215477, -0.06768153607845306, 0.034427788108587265, 0.07278922200202942, -0.04098799079656601, 0.038899462670087814, 0.0012810318730771542, 0.016169004142284393, -0.008310851640999317, 0.020610321313142776, -0.13600048422813416, -0.14560562372207642, 0.0705970749258995, -0.06633393466472626, -0.08288760483264923, -0.03709196671843529, -0.06633897125720978, -0.0868702232837677, 0.15359032154083252, -0.0773216113448143, -0.1108812615275383, -0.10497688502073288, 0.004697326570749283, 0.06842926889657974, -0.06570008397102356, 0.05184205248951912, -0.05175790935754776, 0.09120817482471466, -0.03778978809714317, -0.10993549227714539, 0.017024382948875427, -0.09169412404298782, -0.11230003088712692, -0.030281051993370056, 0.09025070071220398, 0.15063974261283875, 0.05137326568365097, 0.024738965556025505, 0.016462495550513268, 0.0016304273158311844, -0.12906411290168762, 0.004929570481181145, 0.143439382314682, 0.01773710548877716, 0.0976557806134224, -0.06279069185256958, -0.02821265161037445, -0.012585094198584557, -0.0009578559547662735, 0.13525930047035217, 0.1579957902431488, -0.06031216308474541, 0.15296214818954468, 0.227834090590477, -0.10105094313621521, -0.19415637850761414, -0.07397069036960602, 0.0032560182735323906, 0.04487091302871704, 0.045912403613328934, -0.19948574900627136, 0.09972882270812988, 0.04975741356611252, -0.013423530384898186, -0.03354128822684288, -0.18906579911708832, -0.1023210883140564, 0.1062556803226471, 0.06369950622320175, 0.19807088375091553, -0.06803785264492035, -0.04169449210166931, -0.04189038649201393, -0.05597612261772156, 0.09557583183050156, -0.011712346225976944, 0.0822327509522438, 0.01643332466483116, 0.014923296868801117, -0.0019287541508674622, -0.008046919479966164, 0.11012726277112961, 0.04542766511440277, 0.018416037783026695, -0.07320156693458557, -0.0423104427754879, 0.10889390110969543, -0.03202357143163681, 0.12254303693771362, 0.03122953698039055, 0.05849093571305275, -0.0764583870768547, -0.06015930324792862, -0.08313038945198059, 0.012603376060724258, -0.04008830338716507, -0.05228453874588013, 
-0.051481351256370544, 0.03643445670604706, 0.02559221349656582, 0.013383354060351849, -0.010037007741630077, -0.0581706240773201, 0.009901179000735283, 0.0659501925110817, 0.15930500626564026, -0.013111893087625504, -0.06732219457626343, -0.07006201148033142, -0.060269180685281754, 0.04847850278019905, -0.10283331573009491, 0.0321035273373127, 0.020586064085364342, -0.0036565132904797792, 0.11348927021026611, 0.03316955640912056, -0.11396678537130356, 0.013628019951283932, 0.005912423133850098, -0.09849600493907928, -0.1485224962234497, -0.016377072781324387, 0.05456313490867615, -0.0583408921957016, 0.03962210938334465, 0.1586087942123413, -0.02749052457511425, -0.033682480454444885, -0.05674935132265091, 0.032430585473775864, -0.034874096512794495, 0.03596019372344017, 0.08030854165554047, 0.016163216903805733, -0.08148041367530823, 0.06100435554981232, 0.04497561603784561, -0.01565445587038994, 0.06611718982458115, 0.01751827821135521, -0.07064318656921387, -0.08515681326389313, -0.06657058000564575, 0.11521587520837784, -0.04193677753210068, -0.06614658236503601, 0.0494990199804306, -0.10936599224805832, 0.06512928009033203, 0.09400998800992966, 0.03727183863520622, 0.046071093529462814, -0.08464010059833527, 0.006473809480667114, -0.037655625492334366, 0.03303447365760803, -0.03967699408531189, -0.03299032896757126, -0.04207788407802582, 0.02865336276590824, 0.0594131164252758, 0.09625885635614395, -0.03653799742460251, -0.07748300582170486, -0.08829360455274582, -0.013138281181454659, -0.10569687932729721, -0.006850461475551128, -0.06914658099412918, 0.00014194706454873085, 0.007000140380114317, -0.02822837233543396, 0.030307123437523842, 0.033606212586164474, -0.0512661337852478, -0.008813504129648209, -0.02892981842160225, 0.05861987918615341, -0.07071447372436523, 0.012725180014967918, 0.015199657529592514, -0.01911322958767414, 0.09222348034381866, 0.047224029898643494, -0.03322954475879669, 0.05148611217737198, -0.03994745388627052, 0.03518182411789894, -0.04691552743315697, 0.007639196235686541, -0.02100628986954689, -0.11349901556968689, -0.021261068060994148, 0.010819608345627785, -0.023444410413503647, 0.01614448055624962, 0.07291702181100845, -0.051247432827949524, 0.0827048048377037, 0.06047651544213295, -0.049000177532434464, -0.055763885378837585, 0.04004162549972534, 0.0009079426527023315, 0.017973260954022408, 0.0793890655040741, 0.0011681190226227045, 0.053140703588724136, -0.08328671008348465, 0.0013423850759863853, 0.0043635861948132515, -0.016782283782958984, -0.019065728411078453, -0.07158057391643524, -0.000623882282525301, 0.009545178152620792, 0.17526990175247192, -0.004971030168235302, -0.019934196025133133, 0.005758095532655716, 0.06719693541526794, 0.033424317836761475, 0.004426124505698681, 0.08463965356349945, -0.018342992290854454, -0.01793844997882843, -0.017587680369615555, 0.026691239327192307, -0.01080797053873539, 0.016537122428417206, 0.1315390020608902, 0.04961226135492325, 0.11255703866481781, 0.07479852437973022, 0.05499632656574249, 0.052345164120197296, -0.10784098505973816, -0.06925129890441895, 0.03605833277106285, 0.05536176264286041, -0.034931864589452744, 0.02555268630385399, 0.05937255546450615, -0.09513229876756668, 0.0820266455411911, 0.046595025807619095, -0.05803784728050232, -0.1295481026172638, -0.2191641926765442, -0.042123790830373764, -0.010218853130936623, -0.020777955651283264, -0.10785381495952606, 0.027329251170158386, 0.0930030569434166, 0.03945063054561615, -0.02234741672873497, 0.0657259151339531, -0.15022647380828857, 
-0.03686198964715004, 0.03966449946165085, -0.014821960590779781, 0.022462747991085052, 0.048782214522361755, 0.01900356635451317, 0.014281739480793476, 0.0744381994009018, 0.051359422504901886, 0.043146438896656036, 0.054591625928878784, 0.02954341098666191, -0.04896369203925133, -0.08800899237394333, -0.04467042535543442, 0.0032379510812461376, 0.058675315231084824, 0.12987293303012848, 0.010792074725031853, -0.06998851895332336, 0.0024203723296523094, 0.06055322289466858, -0.01847190037369728, -0.08398778736591339, -0.11259135603904724, 0.21841737627983093, -0.022776726633310318, 0.011702751740813255, -0.0013669170439243317, -0.03545460104942322, 0.020076904445886612, 0.20618940889835358, 0.26152077317237854, -0.02222667820751667, -0.01586262136697769, 0.010568449273705482, 0.0001846584491431713, 0.03695659339427948, 0.12577201426029205, -0.02777884714305401, 0.22359472513198853, -0.046777449548244476, 0.06737222522497177, -0.05537553131580353, -0.014299402013421059, -0.07450424134731293, 0.061424657702445984, -0.001578204333782196, -0.01836337149143219, -0.014155775308609009, 0.06984956562519073, -0.04071302339434624, -0.12650424242019653, -0.029551919549703598, 0.005514103919267654, -0.058359190821647644, 0.011046874336898327, 0.0020564431324601173, 0.03376493230462074, 0.07748642563819885, -0.01588892936706543, -0.0020990539342164993, 0.13050198554992676, 0.01098928228020668, -0.10912102460861206, -0.037600722163915634, 0.12838557362556458, 0.018519911915063858, 0.1340782791376114, 0.04876743629574776, 0.08712469041347504, 0.07130827009677887, 0.015149479731917381, -0.06677284836769104, 0.03636588156223297, -0.028407320380210876, 0.019770564511418343, 0.004539488349109888, 0.10587862133979797, -0.010519773699343204, 0.07475674152374268, 0.016607699915766716, -0.0808752030134201, 0.05683104693889618, 0.008673112839460373, -0.07627810537815094, -0.03255736455321312, 0.1042289137840271, -0.11158230900764465, 0.14271792769432068, 0.13774631917476654, -0.005030146799981594, -0.07176224142313004, -0.012138426303863525, 0.027100618928670883, -0.008060954511165619, 0.04774492606520653, -0.029893167316913605, -0.13074781000614166, 0.00018004095181822777, -0.09478544443845749, 0.04576292634010315, -0.24173954129219055, -0.06664414703845978, 0.016213994473218918, -0.000884735956788063, -0.028645452111959457, 0.030585195869207382, 0.061639197170734406, -0.0040400829166173935, -0.03497268259525299, 0.029452037066221237, -0.028589975088834763, 0.03562405705451965, -0.07439378648996353, -0.0681467354297638 ]
null
null
transformers
# MultiBERTs Seed 0 (uncased) Seed 0 MultiBERTs (pretrained BERT) model on English language using a masked language modeling (MLM) objective. It was introduced in [this paper](https://arxiv.org/pdf/2106.16163.pdf) and first released in [this repository](https://github.com/google-research/language/tree/master/language/multiberts). This model is uncased: it does not make a difference between english and English. Disclaimer: The team releasing MultiBERTs did not write a model card for this model so this model card has been written by [gchhablani](https://hf.co/gchhablani). ## Model description MultiBERTs models are transformers model pretrained on a large corpus of English data in a self-supervised fashion. This means it was pretrained on the raw texts only, with no humans labelling them in any way (which is why it can use lots of publicly available data) with an automatic process to generate inputs and labels from those texts. More precisely, it was pretrained with two objectives: - Masked language modeling (MLM): taking a sentence, the model randomly masks 15% of the words in the input then run the entire masked sentence through the model and has to predict the masked words. This is different from traditional recurrent neural networks (RNNs) that usually see the words one after the other, or from autoregressive models like GPT which internally mask the future tokens. It allows the model to learn a bidirectional representation of the sentence. - Next sentence prediction (NSP): the models concatenates two masked sentences as inputs during pretraining. Sometimes they correspond to sentences that were next to each other in the original text, sometimes not. The model then has to predict if the two sentences were following each other or not. This way, the model learns an inner representation of the English language that can then be used to extract features useful for downstream tasks: if you have a dataset of labeled sentences for instance, you can train a standard classifier using the features produced by the MultiBERTs model as inputs. ## Intended uses & limitations You can use the raw model for either masked language modeling or next sentence prediction, but it's mostly intended to be fine-tuned on a downstream task. See the [model hub](https://huggingface.co/models?filter=multiberts) to look for fine-tuned versions on a task that interests you. Note that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked) to make decisions, such as sequence classification, token classification or question answering. For tasks such as text generation you should look at model like GPT2. ### How to use Here is how to use this model to get the features of a given text in PyTorch: ```python from transformers import BertTokenizer, BertModel tokenizer = BertTokenizer.from_pretrained('multiberts-seed-0') model = BertModel.from_pretrained("multiberts-seed-0") text = "Replace me by any text you'd like." encoded_input = tokenizer(text, return_tensors='pt') output = model(**encoded_input) ``` ### Limitations and bias Even if the training data used for this model could be characterized as fairly neutral, this model can have biased predictions. This bias will also affect all fine-tuned versions of this model. 
For an understanding of bias of this particular checkpoint, please try out this checkpoint with the snippet present in the [Limitations and bias section](https://huggingface.co/bert-base-uncased#limitations-and-bias) of the [bert-base-uncased](https://huggingface.co/bert-base-uncased) checkpoint. ## Training data The MultiBERTs models were pretrained on [BookCorpus](https://yknzhu.wixsite.com/mbweb), a dataset consisting of 11,038 unpublished books and [English Wikipedia](https://en.wikipedia.org/wiki/English_Wikipedia) (excluding lists, tables and headers). ## Training procedure ### Preprocessing The texts are lowercased and tokenized using WordPiece and a vocabulary size of 30,000. The inputs of the model are then of the form: ``` [CLS] Sentence A [SEP] Sentence B [SEP] ``` With probability 0.5, sentence A and sentence B correspond to two consecutive sentences in the original corpus and in the other cases, it's another random sentence in the corpus. Note that what is considered a sentence here is a consecutive span of text usually longer than a single sentence. The only constraint is that the result with the two "sentences" has a combined length of less than 512 tokens. The details of the masking procedure for each sentence are the following: - 15% of the tokens are masked. - In 80% of the cases, the masked tokens are replaced by `[MASK]`. - In 10% of the cases, the masked tokens are replaced by a random token (different) from the one they replace. - In the 10% remaining cases, the masked tokens are left as is. ### Pretraining The model was trained on 16 Cloud TPU v2 chips for two million steps with a batch size of 256. The sequence length was set to 512 throughout. The optimizer used is Adam with a learning rate of 1e-4, \\(\beta_{1} = 0.9\\) and \\(\beta_{2} = 0.999\\), a weight decay of 0.01, learning rate warmup for 10,000 steps and linear decay of the learning rate after. ### BibTeX entry and citation info ```bibtex @article{DBLP:journals/corr/abs-2106-16163, author = {Thibault Sellam and Steve Yadlowsky and Jason Wei and Naomi Saphra and Alexander D'Amour and Tal Linzen and Jasmijn Bastings and Iulia Turc and Jacob Eisenstein and Dipanjan Das and Ian Tenney and Ellie Pavlick}, title = {The MultiBERTs: {BERT} Reproductions for Robustness Analysis}, journal = {CoRR}, volume = {abs/2106.16163}, year = {2021}, url = {https://arxiv.org/abs/2106.16163}, eprinttype = {arXiv}, eprint = {2106.16163}, timestamp = {Mon, 05 Jul 2021 15:15:50 +0200}, biburl = {https://dblp.org/rec/journals/corr/abs-2106-16163.bib}, bibsource = {dblp computer science bibliography, https://dblp.org} } ``` <a href="https://huggingface.co/exbert/?model=multiberts"> <img width="300px" src="https://cdn-media.huggingface.co/exbert/button.png"> </a>
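Since the card also describes the next sentence prediction objective, a small sketch of querying the NSP head is given below. As before, this is a hedged illustration: the checkpoint name is the one used in the card's snippet, the NSP head weights are assumed to be present in the checkpoint, and class index 0 is assumed to follow the standard BERT convention of "sentence B follows sentence A".

```python
from transformers import BertTokenizer, BertForNextSentencePrediction
import torch

# Checkpoint name copied from the card's usage example; replace with the full hub id if needed.
tokenizer = BertTokenizer.from_pretrained('multiberts-seed-0')
model = BertForNextSentencePrediction.from_pretrained('multiberts-seed-0')

sentence_a = "The man went to the store."
sentence_b = "He bought a gallon of milk."
encoding = tokenizer(sentence_a, sentence_b, return_tensors='pt')
with torch.no_grad():
    logits = model(**encoding).logits  # shape (1, 2)

# Index 0 corresponds to "B is the next sentence", index 1 to "B is a random sentence".
print("B follows A:", logits.argmax(dim=-1).item() == 0)
```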
{"language": "en", "license": "apache-2.0", "tags": ["exbert", "multiberts"], "datasets": ["bookcorpus", "wikipedia"]}
null
MultiBertGunjanPatrick/multiberts-seed-8
[ "transformers", "pytorch", "bert", "pretraining", "exbert", "multiberts", "en", "dataset:bookcorpus", "dataset:wikipedia", "arxiv:2106.16163", "license:apache-2.0", "endpoints_compatible", "region:us" ]
2022-03-02T23:29:04+00:00
[ "2106.16163" ]
[ "en" ]
TAGS #transformers #pytorch #bert #pretraining #exbert #multiberts #en #dataset-bookcorpus #dataset-wikipedia #arxiv-2106.16163 #license-apache-2.0 #endpoints_compatible #region-us
# MultiBERTs Seed 0 (uncased) Seed 0 MultiBERTs (pretrained BERT) model on English language using a masked language modeling (MLM) objective. It was introduced in this paper and first released in this repository. This model is uncased: it does not make a difference between english and English. Disclaimer: The team releasing MultiBERTs did not write a model card for this model so this model card has been written by gchhablani. ## Model description MultiBERTs models are transformers model pretrained on a large corpus of English data in a self-supervised fashion. This means it was pretrained on the raw texts only, with no humans labelling them in any way (which is why it can use lots of publicly available data) with an automatic process to generate inputs and labels from those texts. More precisely, it was pretrained with two objectives: - Masked language modeling (MLM): taking a sentence, the model randomly masks 15% of the words in the input then run the entire masked sentence through the model and has to predict the masked words. This is different from traditional recurrent neural networks (RNNs) that usually see the words one after the other, or from autoregressive models like GPT which internally mask the future tokens. It allows the model to learn a bidirectional representation of the sentence. - Next sentence prediction (NSP): the models concatenates two masked sentences as inputs during pretraining. Sometimes they correspond to sentences that were next to each other in the original text, sometimes not. The model then has to predict if the two sentences were following each other or not. This way, the model learns an inner representation of the English language that can then be used to extract features useful for downstream tasks: if you have a dataset of labeled sentences for instance, you can train a standard classifier using the features produced by the MultiBERTs model as inputs. ## Intended uses & limitations You can use the raw model for either masked language modeling or next sentence prediction, but it's mostly intended to be fine-tuned on a downstream task. See the model hub to look for fine-tuned versions on a task that interests you. Note that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked) to make decisions, such as sequence classification, token classification or question answering. For tasks such as text generation you should look at model like GPT2. ### How to use Here is how to use this model to get the features of a given text in PyTorch: ### Limitations and bias Even if the training data used for this model could be characterized as fairly neutral, this model can have biased predictions. This bias will also affect all fine-tuned versions of this model. For an understanding of bias of this particular checkpoint, please try out this checkpoint with the snippet present in the Limitation and bias section of the bert-base-uncased checkpoint. ## Training data The MultiBERTs models were pretrained on BookCorpus, a dataset consisting of 11,038 unpublished books and English Wikipedia (excluding lists, tables and headers). ## Training procedure ### Preprocessing The texts are lowercased and tokenized using WordPiece and a vocabulary size of 30,000. The inputs of the model are then of the form: With probability 0.5, sentence A and sentence B correspond to two consecutive sentences in the original corpus and in the other cases, it's another random sentence in the corpus. 
Note that what is considered a sentence here is a consecutive span of text usually longer than a single sentence. The only constraint is that the result with the two "sentences" has a combined length of less than 512 tokens. The details of the masking procedure for each sentence are the following: - 15% of the tokens are masked. - In 80% of the cases, the masked tokens are replaced by '[MASK]'. - In 10% of the cases, the masked tokens are replaced by a random token (different) from the one they replace. - In the 10% remaining cases, the masked tokens are left as is. ### Pretraining The model was trained on 16 Cloud TPU v2 chips for two million steps with a batch size of 256. The sequence length was set to 512 throughout. The optimizer used is Adam with a learning rate of 1e-4, \\(\beta_{1} = 0.9\\) and \\(\beta_{2} = 0.999\\), a weight decay of 0.01, learning rate warmup for 10,000 steps and linear decay of the learning rate after. ### BibTeX entry and citation info <a href="URL <img width="300px" src="URL </a>
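The pretraining section above specifies Adam with a peak learning rate of 1e-4, linear warmup over the first 10,000 steps and linear decay afterwards, over two million steps in total. A small sketch of that schedule is shown below; it is illustrative only, not taken from the original training code, and it assumes the decay reaches zero at the final step as in the original BERT recipe.

```python
def multiberts_lr(step, peak_lr=1e-4, warmup_steps=10_000, total_steps=2_000_000):
    """Linear warmup to peak_lr over warmup_steps, then linear decay to 0 at total_steps."""
    if step < warmup_steps:
        return peak_lr * step / warmup_steps
    return peak_lr * max(0.0, (total_steps - step) / (total_steps - warmup_steps))

# e.g. multiberts_lr(5_000)  -> 5e-5 (halfway through warmup)
#      multiberts_lr(1_005_000) -> ~5e-5 (halfway through the decay phase)
```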
[ "# MultiBERTs Seed 0 (uncased)\n\nSeed 0 MultiBERTs (pretrained BERT) model on English language using a masked language modeling (MLM) objective. It was introduced in\nthis paper and first released in\nthis repository. This model is uncased: it does not make a difference\nbetween english and English.\n\nDisclaimer: The team releasing MultiBERTs did not write a model card for this model so this model card has been written by gchhablani.", "## Model description\n\nMultiBERTs models are transformers model pretrained on a large corpus of English data in a self-supervised fashion. This means it\nwas pretrained on the raw texts only, with no humans labelling them in any way (which is why it can use lots of\npublicly available data) with an automatic process to generate inputs and labels from those texts. More precisely, it\nwas pretrained with two objectives:\n\n- Masked language modeling (MLM): taking a sentence, the model randomly masks 15% of the words in the input then run\n the entire masked sentence through the model and has to predict the masked words. This is different from traditional\n recurrent neural networks (RNNs) that usually see the words one after the other, or from autoregressive models like\n GPT which internally mask the future tokens. It allows the model to learn a bidirectional representation of the\n sentence.\n- Next sentence prediction (NSP): the models concatenates two masked sentences as inputs during pretraining. Sometimes\n they correspond to sentences that were next to each other in the original text, sometimes not. The model then has to\n predict if the two sentences were following each other or not.\n This way, the model learns an inner representation of the English language that can then be used to extract features\n useful for downstream tasks: if you have a dataset of labeled sentences for instance, you can train a standard\n classifier using the features produced by the MultiBERTs model as inputs.", "## Intended uses & limitations\n\nYou can use the raw model for either masked language modeling or next sentence prediction, but it's mostly intended to\nbe fine-tuned on a downstream task. See the model hub to look for\nfine-tuned versions on a task that interests you.\nNote that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked)\nto make decisions, such as sequence classification, token classification or question answering. For tasks such as text\ngeneration you should look at model like GPT2.", "### How to use\n\nHere is how to use this model to get the features of a given text in PyTorch:", "### Limitations and bias\n\nEven if the training data used for this model could be characterized as fairly neutral, this model can have biased\npredictions. This bias will also affect all fine-tuned versions of this model. For an understanding of bias of this particular\ncheckpoint, please try out this checkpoint with the snippet present in the Limitation and bias section of the bert-base-uncased checkpoint.", "## Training data\n\nThe MultiBERTs models were pretrained on BookCorpus, a dataset consisting of 11,038\nunpublished books and English Wikipedia (excluding lists, tables and\nheaders).", "## Training procedure", "### Preprocessing\n\nThe texts are lowercased and tokenized using WordPiece and a vocabulary size of 30,000. 
The inputs of the model are\nthen of the form:\n\n\n\nWith probability 0.5, sentence A and sentence B correspond to two consecutive sentences in the original corpus and in\nthe other cases, it's another random sentence in the corpus. Note that what is considered a sentence here is a\nconsecutive span of text usually longer than a single sentence. The only constrain is that the result with the two\n\"sentences\" has a combined length of less than 512 tokens.\nThe details of the masking procedure for each sentence are the following:\n\n- 15% of the tokens are masked.\n- In 80% of the cases, the masked tokens are replaced by '[MASK]'.\n- In 10% of the cases, the masked tokens are replaced by a random token (different) from the one they replace.\n- In the 10% remaining cases, the masked tokens are left as is.", "### Pretraining\n\nThe model was trained on 16 Cloud TPU v2 chips for two million steps with a batch size\nof 256. The sequence length was set to 512 throughout. The optimizer\nused is Adam with a learning rate of 1e-4, \\\\(\\beta_{1} = 0.9\\\\) and \\\\(\\beta_{2} = 0.999\\\\), a weight decay of 0.01,\nlearning rate warmup for 10,000 steps and linear decay of the learning rate after.", "### BibTeX entry and citation info\n\n\n\n<a href=\"URL\n\t<img width=\"300px\" src=\"URL\n</a>" ]
[ "TAGS\n#transformers #pytorch #bert #pretraining #exbert #multiberts #en #dataset-bookcorpus #dataset-wikipedia #arxiv-2106.16163 #license-apache-2.0 #endpoints_compatible #region-us \n", "# MultiBERTs Seed 0 (uncased)\n\nSeed 0 MultiBERTs (pretrained BERT) model on English language using a masked language modeling (MLM) objective. It was introduced in\nthis paper and first released in\nthis repository. This model is uncased: it does not make a difference\nbetween english and English.\n\nDisclaimer: The team releasing MultiBERTs did not write a model card for this model so this model card has been written by gchhablani.", "## Model description\n\nMultiBERTs models are transformers model pretrained on a large corpus of English data in a self-supervised fashion. This means it\nwas pretrained on the raw texts only, with no humans labelling them in any way (which is why it can use lots of\npublicly available data) with an automatic process to generate inputs and labels from those texts. More precisely, it\nwas pretrained with two objectives:\n\n- Masked language modeling (MLM): taking a sentence, the model randomly masks 15% of the words in the input then run\n the entire masked sentence through the model and has to predict the masked words. This is different from traditional\n recurrent neural networks (RNNs) that usually see the words one after the other, or from autoregressive models like\n GPT which internally mask the future tokens. It allows the model to learn a bidirectional representation of the\n sentence.\n- Next sentence prediction (NSP): the models concatenates two masked sentences as inputs during pretraining. Sometimes\n they correspond to sentences that were next to each other in the original text, sometimes not. The model then has to\n predict if the two sentences were following each other or not.\n This way, the model learns an inner representation of the English language that can then be used to extract features\n useful for downstream tasks: if you have a dataset of labeled sentences for instance, you can train a standard\n classifier using the features produced by the MultiBERTs model as inputs.", "## Intended uses & limitations\n\nYou can use the raw model for either masked language modeling or next sentence prediction, but it's mostly intended to\nbe fine-tuned on a downstream task. See the model hub to look for\nfine-tuned versions on a task that interests you.\nNote that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked)\nto make decisions, such as sequence classification, token classification or question answering. For tasks such as text\ngeneration you should look at model like GPT2.", "### How to use\n\nHere is how to use this model to get the features of a given text in PyTorch:", "### Limitations and bias\n\nEven if the training data used for this model could be characterized as fairly neutral, this model can have biased\npredictions. This bias will also affect all fine-tuned versions of this model. For an understanding of bias of this particular\ncheckpoint, please try out this checkpoint with the snippet present in the Limitation and bias section of the bert-base-uncased checkpoint.", "## Training data\n\nThe MultiBERTs models were pretrained on BookCorpus, a dataset consisting of 11,038\nunpublished books and English Wikipedia (excluding lists, tables and\nheaders).", "## Training procedure", "### Preprocessing\n\nThe texts are lowercased and tokenized using WordPiece and a vocabulary size of 30,000. 
The inputs of the model are\nthen of the form:\n\n\n\nWith probability 0.5, sentence A and sentence B correspond to two consecutive sentences in the original corpus and in\nthe other cases, it's another random sentence in the corpus. Note that what is considered a sentence here is a\nconsecutive span of text usually longer than a single sentence. The only constrain is that the result with the two\n\"sentences\" has a combined length of less than 512 tokens.\nThe details of the masking procedure for each sentence are the following:\n\n- 15% of the tokens are masked.\n- In 80% of the cases, the masked tokens are replaced by '[MASK]'.\n- In 10% of the cases, the masked tokens are replaced by a random token (different) from the one they replace.\n- In the 10% remaining cases, the masked tokens are left as is.", "### Pretraining\n\nThe model was trained on 16 Cloud TPU v2 chips for two million steps with a batch size\nof 256. The sequence length was set to 512 throughout. The optimizer\nused is Adam with a learning rate of 1e-4, \\\\(\\beta_{1} = 0.9\\\\) and \\\\(\\beta_{2} = 0.999\\\\), a weight decay of 0.01,\nlearning rate warmup for 10,000 steps and linear decay of the learning rate after.", "### BibTeX entry and citation info\n\n\n\n<a href=\"URL\n\t<img width=\"300px\" src=\"URL\n</a>" ]
[ 63, 111, 335, 134, 25, 95, 48, 3, 222, 110, 34 ]
[ "passage: TAGS\n#transformers #pytorch #bert #pretraining #exbert #multiberts #en #dataset-bookcorpus #dataset-wikipedia #arxiv-2106.16163 #license-apache-2.0 #endpoints_compatible #region-us \n# MultiBERTs Seed 0 (uncased)\n\nSeed 0 MultiBERTs (pretrained BERT) model on English language using a masked language modeling (MLM) objective. It was introduced in\nthis paper and first released in\nthis repository. This model is uncased: it does not make a difference\nbetween english and English.\n\nDisclaimer: The team releasing MultiBERTs did not write a model card for this model so this model card has been written by gchhablani.", "passage: ## Model description\n\nMultiBERTs models are transformers model pretrained on a large corpus of English data in a self-supervised fashion. This means it\nwas pretrained on the raw texts only, with no humans labelling them in any way (which is why it can use lots of\npublicly available data) with an automatic process to generate inputs and labels from those texts. More precisely, it\nwas pretrained with two objectives:\n\n- Masked language modeling (MLM): taking a sentence, the model randomly masks 15% of the words in the input then run\n the entire masked sentence through the model and has to predict the masked words. This is different from traditional\n recurrent neural networks (RNNs) that usually see the words one after the other, or from autoregressive models like\n GPT which internally mask the future tokens. It allows the model to learn a bidirectional representation of the\n sentence.\n- Next sentence prediction (NSP): the models concatenates two masked sentences as inputs during pretraining. Sometimes\n they correspond to sentences that were next to each other in the original text, sometimes not. The model then has to\n predict if the two sentences were following each other or not.\n This way, the model learns an inner representation of the English language that can then be used to extract features\n useful for downstream tasks: if you have a dataset of labeled sentences for instance, you can train a standard\n classifier using the features produced by the MultiBERTs model as inputs.## Intended uses & limitations\n\nYou can use the raw model for either masked language modeling or next sentence prediction, but it's mostly intended to\nbe fine-tuned on a downstream task. See the model hub to look for\nfine-tuned versions on a task that interests you.\nNote that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked)\nto make decisions, such as sequence classification, token classification or question answering. For tasks such as text\ngeneration you should look at model like GPT2.### How to use\n\nHere is how to use this model to get the features of a given text in PyTorch:### Limitations and bias\n\nEven if the training data used for this model could be characterized as fairly neutral, this model can have biased\npredictions. This bias will also affect all fine-tuned versions of this model. For an understanding of bias of this particular\ncheckpoint, please try out this checkpoint with the snippet present in the Limitation and bias section of the bert-base-uncased checkpoint.## Training data\n\nThe MultiBERTs models were pretrained on BookCorpus, a dataset consisting of 11,038\nunpublished books and English Wikipedia (excluding lists, tables and\nheaders).## Training procedure" ]
[ -0.06825247406959534, 0.027449268847703934, -0.0021626802626997232, 0.09413602948188782, 0.07635393738746643, 0.026495488360524178, 0.15437674522399902, 0.029963307082653046, -0.03573239967226982, 0.021267801523208618, 0.10619504749774933, 0.03782356157898903, 0.03388210013508797, 0.035308390855789185, 0.066785529255867, -0.2578813433647156, 0.07567903399467468, -0.05793163925409317, 0.040864333510398865, 0.059090327471494675, 0.10602577030658722, -0.07069262117147446, 0.07895290851593018, 0.04403890669345856, -0.0756942480802536, -0.027663996443152428, -0.005503433756530285, -0.034674178808927536, 0.07060743123292923, 0.09438986331224442, 0.05877054110169411, -0.008264455944299698, 0.05975931137800217, -0.087635338306427, 0.019257638603448868, 0.024562222883105278, -0.007006383966654539, 0.036696210503578186, 0.025804642587900162, -0.009673221036791801, 0.11283443868160248, 0.02619457244873047, 0.08560121059417725, 0.04041407257318497, -0.08754345774650574, -0.09977805614471436, -0.0694802924990654, 0.09317219257354736, 0.02764834463596344, 0.04353900998830795, -0.0063711777329444885, 0.07313166558742523, -0.006663286592811346, 0.058924756944179535, 0.08212147653102875, -0.23674309253692627, -0.023082595318555832, 0.05118638277053833, 0.04846370965242386, 0.04278615117073059, 0.013536407612264156, 0.031959742307662964, 0.005570597946643829, 0.04724816232919693, 0.006345914676785469, -0.028150685131549835, 0.13924768567085266, -0.053803253918886185, -0.13665056228637695, -0.03023041971027851, 0.15811696648597717, 0.02479265071451664, -0.11351540684700012, -0.11277355998754501, 0.0016996730118989944, 0.1693311333656311, -0.0019645756110548973, -0.007584595121443272, -0.009904063306748867, -0.0030730916187167168, 0.024124154821038246, -0.1230793297290802, -0.08302900195121765, -0.02286745235323906, -0.06280194967985153, 0.15275688469409943, 0.047940537333488464, 0.07110750675201416, -0.06045709177851677, 0.04197261482477188, -0.14955590665340424, -0.036801956593990326, -0.04978496953845024, -0.09940676391124725, 0.017188318073749542, 0.02796531654894352, -0.044329117983579636, -0.11630523204803467, -0.03652356192469597, 0.0725361704826355, 0.038227953016757965, 0.03685189411044121, -0.005693042650818825, 0.029456961899995804, 0.10580474138259888, 0.10501816868782043, -0.0562795028090477, 0.07449519634246826, 0.020974641665816307, -0.020636841654777527, 0.03971032053232193, -0.05628065764904022, -0.12330584228038788, 0.0744452103972435, -0.034096408635377884, 0.018313465639948845, 0.023749854415655136, 0.04198585823178291, -0.012982374057173729, -0.0767536610364914, 0.14133483171463013, -0.09305756539106369, 0.0004417812451720238, -0.0035654937382787466, 0.016869794577360153, 0.08157093822956085, 0.02621583268046379, 0.0021266003604978323, -0.059168532490730286, -0.03080003336071968, -0.06315429508686066, -0.027340907603502274, -0.06021827086806297, -0.13162744045257568, 0.0013580089434981346, -0.020953699946403503, -0.014699130319058895, -0.10742536187171936, -0.17884144186973572, -0.01402769424021244, 0.07123412191867828, -0.014155296608805656, 0.011412929743528366, -0.0021266068797558546, 0.012132527306675911, -0.004981525242328644, 0.032173626124858856, -0.03745890408754349, 0.00908223818987608, -0.012201073579490185, -0.06731266528367996, 0.039806246757507324, -0.12071730941534042, 0.04209677502512932, -0.05578748881816864, 0.011489223688840866, -0.19638846814632416, 0.10738702118396759, -0.02783583477139473, -0.04278886318206787, -0.04810495674610138, -0.05834455043077469, 
0.0188974030315876, 0.045517146587371826, -0.015527524054050446, 0.10550028085708618, -0.12357760965824127, -0.0512409433722496, 0.15865573287010193, -0.1566506326198578, 0.016810515895485878, 0.10513904690742493, -0.06748288869857788, 0.042335763573646545, 0.14426475763320923, 0.07841357588768005, 0.07015632092952728, -0.04069618880748749, 0.017828572541475296, 0.060336943715810776, -0.0458533950150013, 0.0799841359257698, 0.10583654791116714, -0.015437023714184761, -0.13057377934455872, 0.030710875988006592, -0.06833602488040924, -0.03600694239139557, -0.022659340873360634, -0.024447504431009293, 0.014145502820611, -0.052795182913541794, 0.05715940147638321, -0.010484781116247177, 0.006331292912364006, -0.0232611745595932, -0.07422537356615067, 0.07731874287128448, 0.07671873271465302, -0.08619971573352814, 0.018436623737215996, -0.0909656435251236, 0.03130660206079483, -0.06597552448511124, -0.005088436417281628, -0.14390107989311218, -0.04274594411253929, 0.031965915113687515, -0.0805630162358284, 0.09851419925689697, 0.11271710693836212, 0.008409101516008377, 0.11310183256864548, -0.04617488384246826, 0.02628052979707718, -0.012368079274892807, -0.006386269349604845, -0.044110074639320374, -0.14293555915355682, -0.06652771681547165, -0.06382939964532852, 0.0834670290350914, -0.059091683477163315, 0.020797124132514, -0.08205804973840714, -0.041816260665655136, -0.0250774584710598, -0.04668354615569115, 0.005325498059391975, 0.00811565201729536, -0.013542650267481804, -0.030526084825396538, 0.04050645977258682, 0.027077049016952515, -0.0918835997581482, 0.08847370743751526, -0.1236613318324089, -0.0576145313680172, 0.06846176087856293, -0.0069316960871219635, -0.04083865508437157, 0.09554298222064972, 0.011831864714622498, -0.01123481709510088, -0.057707928121089935, -0.04657518118619919, 0.22045092284679413, -0.020844273269176483, 0.08364406228065491, -0.11240328848361969, 0.004931592382490635, 0.03506753221154213, -0.06102532893419266, -0.05918964743614197, 0.07589934766292572, 0.038565460592508316, -0.2161455750465393, 0.024600330740213394, 0.07306224852800369, 0.061481211334466934, 0.1421050727367401, 0.02417578175663948, -0.02878376469016075, -0.06042608246207237, -0.017261460423469543, -0.012187670916318893, 0.05919060483574867, -0.04688645899295807, 0.0030246214009821415, 0.0510857030749321, 0.05463946610689163, 0.018327711150050163, -0.06600221991539001, 0.02497151307761669, 0.05208776146173477, -0.017216674983501434, -0.06310763210058212, -0.05255124717950821, -0.03947900980710983, 0.0736318975687027, 0.041184503585100174, 0.0495072677731514, 0.0537080317735672, -0.019612858071923256, -0.1381978541612625, 0.16529735922813416, -0.13489660620689392, -0.2240476906299591, -0.12759706377983093, -0.07904494553804398, -0.07838001847267151, 0.039492446929216385, 0.0373598076403141, -0.03468242287635803, -0.05113789439201355, -0.10579567402601242, 0.06591805815696716, -0.11658145487308502, -0.057194799184799194, 0.014129210263490677, -0.056258611381053925, -0.005652858875691891, -0.1268719583749771, -0.010539324954152107, -0.026957646012306213, -0.07912764698266983, 0.004068336449563503, -0.04539388418197632, 0.010077799670398235, 0.13516394793987274, 0.008290649391710758, -0.009709829464554787, -0.015056753531098366, 0.19663433730602264, 0.0314871110022068, 0.04356053099036217, 0.12803813815116882, -0.06543856859207153, 0.05768699571490288, 0.02060154639184475, 0.037481535226106644, -0.04913286864757538, -0.0007067807018756866, -0.027622418478131294, -0.11730992794036865, 
-0.207548126578331, -0.06663559377193451, 0.007457428611814976, 0.008368045091629028, 0.01904660277068615, 0.015689538791775703, 0.024972863495349884, 0.05414750799536705, -0.031031470745801926, 0.03179151564836502, 0.033982276916503906, 0.05688050761818886, 0.06225617602467537, -0.06120002269744873, 0.09507381916046143, -0.07100313901901245, 0.027307022362947464, 0.10875560343265533, -0.07062242925167084, 0.16170385479927063, 0.04285769164562225, 0.05423576757311821, 0.09659373760223389, 0.0006577670574188232, 0.0585428811609745, 0.10273323953151703, -0.06317441910505295, 0.019947808235883713, -0.07513642311096191, -0.05752627179026604, -0.04452991858124733, 0.060025766491889954, 0.037611961364746094, -0.000131998211145401, -0.10182826220989227, 0.03220826014876366, -0.036235980689525604, 0.07729616016149521, 0.06343917548656464, -0.10670174658298492, -0.10046673566102982, 0.045665811747312546, -0.04038289934396744, -0.08793723583221436, 0.03426353633403778, 0.08077984303236008, -0.14119762182235718, 0.06124391779303551, 0.018283551558852196, 0.07126335799694061, -0.09752818942070007, 0.01132874470204115, -0.06905651092529297, 0.016318362206220627, 0.005033754277974367, 0.0913831889629364, -0.1432204693555832, 0.10583388805389404, 0.02708813175559044, 0.04597454518079758, -0.09043684601783752, 0.01613154262304306, -0.01261853240430355, 0.07669144868850708, 0.12108297646045685, 0.04203776270151138, -0.05836430937051773, -0.018112843856215477, -0.06768153607845306, 0.034427788108587265, 0.07278922200202942, -0.04098799079656601, 0.038899462670087814, 0.0012810318730771542, 0.016169004142284393, -0.008310851640999317, 0.020610321313142776, -0.13600048422813416, -0.14560562372207642, 0.0705970749258995, -0.06633393466472626, -0.08288760483264923, -0.03709196671843529, -0.06633897125720978, -0.0868702232837677, 0.15359032154083252, -0.0773216113448143, -0.1108812615275383, -0.10497688502073288, 0.004697326570749283, 0.06842926889657974, -0.06570008397102356, 0.05184205248951912, -0.05175790935754776, 0.09120817482471466, -0.03778978809714317, -0.10993549227714539, 0.017024382948875427, -0.09169412404298782, -0.11230003088712692, -0.030281051993370056, 0.09025070071220398, 0.15063974261283875, 0.05137326568365097, 0.024738965556025505, 0.016462495550513268, 0.0016304273158311844, -0.12906411290168762, 0.004929570481181145, 0.143439382314682, 0.01773710548877716, 0.0976557806134224, -0.06279069185256958, -0.02821265161037445, -0.012585094198584557, -0.0009578559547662735, 0.13525930047035217, 0.1579957902431488, -0.06031216308474541, 0.15296214818954468, 0.227834090590477, -0.10105094313621521, -0.19415637850761414, -0.07397069036960602, 0.0032560182735323906, 0.04487091302871704, 0.045912403613328934, -0.19948574900627136, 0.09972882270812988, 0.04975741356611252, -0.013423530384898186, -0.03354128822684288, -0.18906579911708832, -0.1023210883140564, 0.1062556803226471, 0.06369950622320175, 0.19807088375091553, -0.06803785264492035, -0.04169449210166931, -0.04189038649201393, -0.05597612261772156, 0.09557583183050156, -0.011712346225976944, 0.0822327509522438, 0.01643332466483116, 0.014923296868801117, -0.0019287541508674622, -0.008046919479966164, 0.11012726277112961, 0.04542766511440277, 0.018416037783026695, -0.07320156693458557, -0.0423104427754879, 0.10889390110969543, -0.03202357143163681, 0.12254303693771362, 0.03122953698039055, 0.05849093571305275, -0.0764583870768547, -0.06015930324792862, -0.08313038945198059, 0.012603376060724258, -0.04008830338716507, -0.05228453874588013, 
-0.051481351256370544, 0.03643445670604706, 0.02559221349656582, 0.013383354060351849, -0.010037007741630077, -0.0581706240773201, 0.009901179000735283, 0.0659501925110817, 0.15930500626564026, -0.013111893087625504, -0.06732219457626343, -0.07006201148033142, -0.060269180685281754, 0.04847850278019905, -0.10283331573009491, 0.0321035273373127, 0.020586064085364342, -0.0036565132904797792, 0.11348927021026611, 0.03316955640912056, -0.11396678537130356, 0.013628019951283932, 0.005912423133850098, -0.09849600493907928, -0.1485224962234497, -0.016377072781324387, 0.05456313490867615, -0.0583408921957016, 0.03962210938334465, 0.1586087942123413, -0.02749052457511425, -0.033682480454444885, -0.05674935132265091, 0.032430585473775864, -0.034874096512794495, 0.03596019372344017, 0.08030854165554047, 0.016163216903805733, -0.08148041367530823, 0.06100435554981232, 0.04497561603784561, -0.01565445587038994, 0.06611718982458115, 0.01751827821135521, -0.07064318656921387, -0.08515681326389313, -0.06657058000564575, 0.11521587520837784, -0.04193677753210068, -0.06614658236503601, 0.0494990199804306, -0.10936599224805832, 0.06512928009033203, 0.09400998800992966, 0.03727183863520622, 0.046071093529462814, -0.08464010059833527, 0.006473809480667114, -0.037655625492334366, 0.03303447365760803, -0.03967699408531189, -0.03299032896757126, -0.04207788407802582, 0.02865336276590824, 0.0594131164252758, 0.09625885635614395, -0.03653799742460251, -0.07748300582170486, -0.08829360455274582, -0.013138281181454659, -0.10569687932729721, -0.006850461475551128, -0.06914658099412918, 0.00014194706454873085, 0.007000140380114317, -0.02822837233543396, 0.030307123437523842, 0.033606212586164474, -0.0512661337852478, -0.008813504129648209, -0.02892981842160225, 0.05861987918615341, -0.07071447372436523, 0.012725180014967918, 0.015199657529592514, -0.01911322958767414, 0.09222348034381866, 0.047224029898643494, -0.03322954475879669, 0.05148611217737198, -0.03994745388627052, 0.03518182411789894, -0.04691552743315697, 0.007639196235686541, -0.02100628986954689, -0.11349901556968689, -0.021261068060994148, 0.010819608345627785, -0.023444410413503647, 0.01614448055624962, 0.07291702181100845, -0.051247432827949524, 0.0827048048377037, 0.06047651544213295, -0.049000177532434464, -0.055763885378837585, 0.04004162549972534, 0.0009079426527023315, 0.017973260954022408, 0.0793890655040741, 0.0011681190226227045, 0.053140703588724136, -0.08328671008348465, 0.0013423850759863853, 0.0043635861948132515, -0.016782283782958984, -0.019065728411078453, -0.07158057391643524, -0.000623882282525301, 0.009545178152620792, 0.17526990175247192, -0.004971030168235302, -0.019934196025133133, 0.005758095532655716, 0.06719693541526794, 0.033424317836761475, 0.004426124505698681, 0.08463965356349945, -0.018342992290854454, -0.01793844997882843, -0.017587680369615555, 0.026691239327192307, -0.01080797053873539, 0.016537122428417206, 0.1315390020608902, 0.04961226135492325, 0.11255703866481781, 0.07479852437973022, 0.05499632656574249, 0.052345164120197296, -0.10784098505973816, -0.06925129890441895, 0.03605833277106285, 0.05536176264286041, -0.034931864589452744, 0.02555268630385399, 0.05937255546450615, -0.09513229876756668, 0.0820266455411911, 0.046595025807619095, -0.05803784728050232, -0.1295481026172638, -0.2191641926765442, -0.042123790830373764, -0.010218853130936623, -0.020777955651283264, -0.10785381495952606, 0.027329251170158386, 0.0930030569434166, 0.03945063054561615, -0.02234741672873497, 0.0657259151339531, -0.15022647380828857, 
-0.03686198964715004, 0.03966449946165085, -0.014821960590779781, 0.022462747991085052, 0.048782214522361755, 0.01900356635451317, 0.014281739480793476, 0.0744381994009018, 0.051359422504901886, 0.043146438896656036, 0.054591625928878784, 0.02954341098666191, -0.04896369203925133, -0.08800899237394333, -0.04467042535543442, 0.0032379510812461376, 0.058675315231084824, 0.12987293303012848, 0.010792074725031853, -0.06998851895332336, 0.0024203723296523094, 0.06055322289466858, -0.01847190037369728, -0.08398778736591339, -0.11259135603904724, 0.21841737627983093, -0.022776726633310318, 0.011702751740813255, -0.0013669170439243317, -0.03545460104942322, 0.020076904445886612, 0.20618940889835358, 0.26152077317237854, -0.02222667820751667, -0.01586262136697769, 0.010568449273705482, 0.0001846584491431713, 0.03695659339427948, 0.12577201426029205, -0.02777884714305401, 0.22359472513198853, -0.046777449548244476, 0.06737222522497177, -0.05537553131580353, -0.014299402013421059, -0.07450424134731293, 0.061424657702445984, -0.001578204333782196, -0.01836337149143219, -0.014155775308609009, 0.06984956562519073, -0.04071302339434624, -0.12650424242019653, -0.029551919549703598, 0.005514103919267654, -0.058359190821647644, 0.011046874336898327, 0.0020564431324601173, 0.03376493230462074, 0.07748642563819885, -0.01588892936706543, -0.0020990539342164993, 0.13050198554992676, 0.01098928228020668, -0.10912102460861206, -0.037600722163915634, 0.12838557362556458, 0.018519911915063858, 0.1340782791376114, 0.04876743629574776, 0.08712469041347504, 0.07130827009677887, 0.015149479731917381, -0.06677284836769104, 0.03636588156223297, -0.028407320380210876, 0.019770564511418343, 0.004539488349109888, 0.10587862133979797, -0.010519773699343204, 0.07475674152374268, 0.016607699915766716, -0.0808752030134201, 0.05683104693889618, 0.008673112839460373, -0.07627810537815094, -0.03255736455321312, 0.1042289137840271, -0.11158230900764465, 0.14271792769432068, 0.13774631917476654, -0.005030146799981594, -0.07176224142313004, -0.012138426303863525, 0.027100618928670883, -0.008060954511165619, 0.04774492606520653, -0.029893167316913605, -0.13074781000614166, 0.00018004095181822777, -0.09478544443845749, 0.04576292634010315, -0.24173954129219055, -0.06664414703845978, 0.016213994473218918, -0.000884735956788063, -0.028645452111959457, 0.030585195869207382, 0.061639197170734406, -0.0040400829166173935, -0.03497268259525299, 0.029452037066221237, -0.028589975088834763, 0.03562405705451965, -0.07439378648996353, -0.0681467354297638 ]
null
null
transformers
# MultiBERTs Seed 0 (uncased) Seed 0 MultiBERTs (pretrained BERT) model on English language using a masked language modeling (MLM) objective. It was introduced in [this paper](https://arxiv.org/pdf/2106.16163.pdf) and first released in [this repository](https://github.com/google-research/language/tree/master/language/multiberts). This model is uncased: it does not make a difference between english and English. Disclaimer: The team releasing MultiBERTs did not write a model card for this model so this model card has been written by [gchhablani](https://hf.co/gchhablani). ## Model description MultiBERTs models are transformers model pretrained on a large corpus of English data in a self-supervised fashion. This means it was pretrained on the raw texts only, with no humans labelling them in any way (which is why it can use lots of publicly available data) with an automatic process to generate inputs and labels from those texts. More precisely, it was pretrained with two objectives: - Masked language modeling (MLM): taking a sentence, the model randomly masks 15% of the words in the input then run the entire masked sentence through the model and has to predict the masked words. This is different from traditional recurrent neural networks (RNNs) that usually see the words one after the other, or from autoregressive models like GPT which internally mask the future tokens. It allows the model to learn a bidirectional representation of the sentence. - Next sentence prediction (NSP): the models concatenates two masked sentences as inputs during pretraining. Sometimes they correspond to sentences that were next to each other in the original text, sometimes not. The model then has to predict if the two sentences were following each other or not. This way, the model learns an inner representation of the English language that can then be used to extract features useful for downstream tasks: if you have a dataset of labeled sentences for instance, you can train a standard classifier using the features produced by the MultiBERTs model as inputs. ## Intended uses & limitations You can use the raw model for either masked language modeling or next sentence prediction, but it's mostly intended to be fine-tuned on a downstream task. See the [model hub](https://huggingface.co/models?filter=multiberts) to look for fine-tuned versions on a task that interests you. Note that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked) to make decisions, such as sequence classification, token classification or question answering. For tasks such as text generation you should look at model like GPT2. ### How to use Here is how to use this model to get the features of a given text in PyTorch: ```python from transformers import BertTokenizer, BertModel tokenizer = BertTokenizer.from_pretrained('multiberts-seed-0') model = BertModel.from_pretrained("multiberts-seed-0") text = "Replace me by any text you'd like." encoded_input = tokenizer(text, return_tensors='pt') output = model(**encoded_input) ``` ### Limitations and bias Even if the training data used for this model could be characterized as fairly neutral, this model can have biased predictions. This bias will also affect all fine-tuned versions of this model. 
For an understanding of bias of this particular checkpoint, please try out this checkpoint with the snippet present in the [Limitations and bias section](https://huggingface.co/bert-base-uncased#limitations-and-bias) of the [bert-base-uncased](https://huggingface.co/bert-base-uncased) checkpoint.

## Training data

The MultiBERTs models were pretrained on [BookCorpus](https://yknzhu.wixsite.com/mbweb), a dataset consisting of 11,038 unpublished books and [English Wikipedia](https://en.wikipedia.org/wiki/English_Wikipedia) (excluding lists, tables and headers).

## Training procedure

### Preprocessing

The texts are lowercased and tokenized using WordPiece and a vocabulary size of 30,000. The inputs of the model are then of the form:

```
[CLS] Sentence A [SEP] Sentence B [SEP]
```

With probability 0.5, sentence A and sentence B correspond to two consecutive sentences in the original corpus and in the other cases, it's another random sentence in the corpus. Note that what is considered a sentence here is a consecutive span of text usually longer than a single sentence. The only constraint is that the result with the two "sentences" has a combined length of less than 512 tokens.

The details of the masking procedure for each sentence are the following:
- 15% of the tokens are masked.
- In 80% of the cases, the masked tokens are replaced by `[MASK]`.
- In 10% of the cases, the masked tokens are replaced by a random token (different) from the one they replace.
- In the 10% remaining cases, the masked tokens are left as is.

### Pretraining

The model was trained on 16 Cloud TPU v2 chips for two million steps with a batch size of 256. The sequence length was set to 512 throughout. The optimizer used is Adam with a learning rate of 1e-4, \\(\beta_{1} = 0.9\\) and \\(\beta_{2} = 0.999\\), a weight decay of 0.01, learning rate warmup for 10,000 steps and linear decay of the learning rate after.

### BibTeX entry and citation info

```bibtex
@article{DBLP:journals/corr/abs-2106-16163,
  author    = {Thibault Sellam and Steve Yadlowsky and Jason Wei and Naomi Saphra and Alexander D'Amour and Tal Linzen and Jasmijn Bastings and Iulia Turc and Jacob Eisenstein and Dipanjan Das and Ian Tenney and Ellie Pavlick},
  title     = {The MultiBERTs: {BERT} Reproductions for Robustness Analysis},
  journal   = {CoRR},
  volume    = {abs/2106.16163},
  year      = {2021},
  url       = {https://arxiv.org/abs/2106.16163},
  eprinttype = {arXiv},
  eprint    = {2106.16163},
  timestamp = {Mon, 05 Jul 2021 15:15:50 +0200},
  biburl    = {https://dblp.org/rec/journals/corr/abs-2106-16163.bib},
  bibsource = {dblp computer science bibliography, https://dblp.org}
}
```

<a href="https://huggingface.co/exbert/?model=multiberts">
	<img width="300px" src="https://cdn-media.huggingface.co/exbert/button.png">
</a>
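The masking scheme above (15% of tokens selected, then 80% `[MASK]` / 10% random token / 10% left unchanged) can be reproduced in a few lines of PyTorch. Below is a minimal sketch following the standard `transformers` MLM data-collation recipe; the helper name `mask_tokens` is illustrative, and the checkpoint string reuses the one from this card's "How to use" snippet.

```python
import torch
from transformers import BertTokenizer

def mask_tokens(inputs: torch.Tensor, tokenizer, mlm_probability: float = 0.15):
    """Apply the 80/10/10 MLM masking scheme described above (modifies `inputs` in place)."""
    labels = inputs.clone()

    # Sample 15% of the (non-special) token positions for masking.
    probability_matrix = torch.full(labels.shape, mlm_probability)
    special_tokens_mask = torch.tensor(
        [tokenizer.get_special_tokens_mask(row, already_has_special_tokens=True) for row in labels.tolist()],
        dtype=torch.bool,
    )
    probability_matrix.masked_fill_(special_tokens_mask, value=0.0)
    masked_indices = torch.bernoulli(probability_matrix).bool()
    labels[~masked_indices] = -100  # loss is only computed on the masked positions

    # 80% of the masked positions become [MASK].
    indices_replaced = torch.bernoulli(torch.full(labels.shape, 0.8)).bool() & masked_indices
    inputs[indices_replaced] = tokenizer.convert_tokens_to_ids(tokenizer.mask_token)

    # 10% become a random token; the remaining 10% stay unchanged.
    indices_random = torch.bernoulli(torch.full(labels.shape, 0.5)).bool() & masked_indices & ~indices_replaced
    inputs[indices_random] = torch.randint(len(tokenizer), labels.shape, dtype=torch.long)[indices_random]

    return inputs, labels

tokenizer = BertTokenizer.from_pretrained('multiberts-seed-0')
batch = tokenizer(["Replace me by any text you'd like."], return_tensors='pt')
masked_input_ids, labels = mask_tokens(batch['input_ids'], tokenizer)
```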
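Similarly, the optimizer setup described in the Pretraining section (Adam, learning rate 1e-4, \\(\beta_{1} = 0.9\\), \\(\beta_{2} = 0.999\\), weight decay 0.01, 10,000 warmup steps, then linear decay over the two million training steps) can be sketched with standard `transformers` utilities. This is an illustrative approximation, not the original TPU training code; in particular, the use of `AdamW` for decoupled weight decay is an assumption.

```python
import torch
from transformers import BertModel, get_linear_schedule_with_warmup

model = BertModel.from_pretrained("multiberts-seed-0")

# Adam with the hyper-parameters listed above; AdamW applies the 0.01 weight decay in decoupled form.
optimizer = torch.optim.AdamW(
    model.parameters(), lr=1e-4, betas=(0.9, 0.999), weight_decay=0.01
)

# 10,000 warmup steps followed by linear decay over two million total steps.
scheduler = get_linear_schedule_with_warmup(
    optimizer, num_warmup_steps=10_000, num_training_steps=2_000_000
)
```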
{"language": "en", "license": "apache-2.0", "tags": ["exbert", "multiberts"], "datasets": ["bookcorpus", "wikipedia"]}
null
MultiBertGunjanPatrick/multiberts-seed-9
[ "transformers", "pytorch", "bert", "pretraining", "exbert", "multiberts", "en", "dataset:bookcorpus", "dataset:wikipedia", "arxiv:2106.16163", "license:apache-2.0", "endpoints_compatible", "region:us" ]
2022-03-02T23:29:04+00:00
[ "2106.16163" ]
[ "en" ]
TAGS #transformers #pytorch #bert #pretraining #exbert #multiberts #en #dataset-bookcorpus #dataset-wikipedia #arxiv-2106.16163 #license-apache-2.0 #endpoints_compatible #region-us
# MultiBERTs Seed 0 (uncased) Seed 0 MultiBERTs (pretrained BERT) model on English language using a masked language modeling (MLM) objective. It was introduced in this paper and first released in this repository. This model is uncased: it does not make a difference between english and English. Disclaimer: The team releasing MultiBERTs did not write a model card for this model so this model card has been written by gchhablani. ## Model description MultiBERTs models are transformers model pretrained on a large corpus of English data in a self-supervised fashion. This means it was pretrained on the raw texts only, with no humans labelling them in any way (which is why it can use lots of publicly available data) with an automatic process to generate inputs and labels from those texts. More precisely, it was pretrained with two objectives: - Masked language modeling (MLM): taking a sentence, the model randomly masks 15% of the words in the input then run the entire masked sentence through the model and has to predict the masked words. This is different from traditional recurrent neural networks (RNNs) that usually see the words one after the other, or from autoregressive models like GPT which internally mask the future tokens. It allows the model to learn a bidirectional representation of the sentence. - Next sentence prediction (NSP): the models concatenates two masked sentences as inputs during pretraining. Sometimes they correspond to sentences that were next to each other in the original text, sometimes not. The model then has to predict if the two sentences were following each other or not. This way, the model learns an inner representation of the English language that can then be used to extract features useful for downstream tasks: if you have a dataset of labeled sentences for instance, you can train a standard classifier using the features produced by the MultiBERTs model as inputs. ## Intended uses & limitations You can use the raw model for either masked language modeling or next sentence prediction, but it's mostly intended to be fine-tuned on a downstream task. See the model hub to look for fine-tuned versions on a task that interests you. Note that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked) to make decisions, such as sequence classification, token classification or question answering. For tasks such as text generation you should look at model like GPT2. ### How to use Here is how to use this model to get the features of a given text in PyTorch: ### Limitations and bias Even if the training data used for this model could be characterized as fairly neutral, this model can have biased predictions. This bias will also affect all fine-tuned versions of this model. For an understanding of bias of this particular checkpoint, please try out this checkpoint with the snippet present in the Limitation and bias section of the bert-base-uncased checkpoint. ## Training data The MultiBERTs models were pretrained on BookCorpus, a dataset consisting of 11,038 unpublished books and English Wikipedia (excluding lists, tables and headers). ## Training procedure ### Preprocessing The texts are lowercased and tokenized using WordPiece and a vocabulary size of 30,000. The inputs of the model are then of the form: With probability 0.5, sentence A and sentence B correspond to two consecutive sentences in the original corpus and in the other cases, it's another random sentence in the corpus. 
Note that what is considered a sentence here is a consecutive span of text usually longer than a single sentence. The only constrain is that the result with the two "sentences" has a combined length of less than 512 tokens. The details of the masking procedure for each sentence are the following: - 15% of the tokens are masked. - In 80% of the cases, the masked tokens are replaced by '[MASK]'. - In 10% of the cases, the masked tokens are replaced by a random token (different) from the one they replace. - In the 10% remaining cases, the masked tokens are left as is. ### Pretraining The model was trained on 16 Cloud TPU v2 chips for two million steps with a batch size of 256. The sequence length was set to 512 throughout. The optimizer used is Adam with a learning rate of 1e-4, \\(\beta_{1} = 0.9\\) and \\(\beta_{2} = 0.999\\), a weight decay of 0.01, learning rate warmup for 10,000 steps and linear decay of the learning rate after. ### BibTeX entry and citation info <a href="URL <img width="300px" src="URL </a>
[ "# MultiBERTs Seed 0 (uncased)\n\nSeed 0 MultiBERTs (pretrained BERT) model on English language using a masked language modeling (MLM) objective. It was introduced in\nthis paper and first released in\nthis repository. This model is uncased: it does not make a difference\nbetween english and English.\n\nDisclaimer: The team releasing MultiBERTs did not write a model card for this model so this model card has been written by gchhablani.", "## Model description\n\nMultiBERTs models are transformers model pretrained on a large corpus of English data in a self-supervised fashion. This means it\nwas pretrained on the raw texts only, with no humans labelling them in any way (which is why it can use lots of\npublicly available data) with an automatic process to generate inputs and labels from those texts. More precisely, it\nwas pretrained with two objectives:\n\n- Masked language modeling (MLM): taking a sentence, the model randomly masks 15% of the words in the input then run\n the entire masked sentence through the model and has to predict the masked words. This is different from traditional\n recurrent neural networks (RNNs) that usually see the words one after the other, or from autoregressive models like\n GPT which internally mask the future tokens. It allows the model to learn a bidirectional representation of the\n sentence.\n- Next sentence prediction (NSP): the models concatenates two masked sentences as inputs during pretraining. Sometimes\n they correspond to sentences that were next to each other in the original text, sometimes not. The model then has to\n predict if the two sentences were following each other or not.\n This way, the model learns an inner representation of the English language that can then be used to extract features\n useful for downstream tasks: if you have a dataset of labeled sentences for instance, you can train a standard\n classifier using the features produced by the MultiBERTs model as inputs.", "## Intended uses & limitations\n\nYou can use the raw model for either masked language modeling or next sentence prediction, but it's mostly intended to\nbe fine-tuned on a downstream task. See the model hub to look for\nfine-tuned versions on a task that interests you.\nNote that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked)\nto make decisions, such as sequence classification, token classification or question answering. For tasks such as text\ngeneration you should look at model like GPT2.", "### How to use\n\nHere is how to use this model to get the features of a given text in PyTorch:", "### Limitations and bias\n\nEven if the training data used for this model could be characterized as fairly neutral, this model can have biased\npredictions. This bias will also affect all fine-tuned versions of this model. For an understanding of bias of this particular\ncheckpoint, please try out this checkpoint with the snippet present in the Limitation and bias section of the bert-base-uncased checkpoint.", "## Training data\n\nThe MultiBERTs models were pretrained on BookCorpus, a dataset consisting of 11,038\nunpublished books and English Wikipedia (excluding lists, tables and\nheaders).", "## Training procedure", "### Preprocessing\n\nThe texts are lowercased and tokenized using WordPiece and a vocabulary size of 30,000. 
The inputs of the model are\nthen of the form:\n\n\n\nWith probability 0.5, sentence A and sentence B correspond to two consecutive sentences in the original corpus and in\nthe other cases, it's another random sentence in the corpus. Note that what is considered a sentence here is a\nconsecutive span of text usually longer than a single sentence. The only constrain is that the result with the two\n\"sentences\" has a combined length of less than 512 tokens.\nThe details of the masking procedure for each sentence are the following:\n\n- 15% of the tokens are masked.\n- In 80% of the cases, the masked tokens are replaced by '[MASK]'.\n- In 10% of the cases, the masked tokens are replaced by a random token (different) from the one they replace.\n- In the 10% remaining cases, the masked tokens are left as is.", "### Pretraining\n\nThe model was trained on 16 Cloud TPU v2 chips for two million steps with a batch size\nof 256. The sequence length was set to 512 throughout. The optimizer\nused is Adam with a learning rate of 1e-4, \\\\(\\beta_{1} = 0.9\\\\) and \\\\(\\beta_{2} = 0.999\\\\), a weight decay of 0.01,\nlearning rate warmup for 10,000 steps and linear decay of the learning rate after.", "### BibTeX entry and citation info\n\n\n\n<a href=\"URL\n\t<img width=\"300px\" src=\"URL\n</a>" ]
[ "TAGS\n#transformers #pytorch #bert #pretraining #exbert #multiberts #en #dataset-bookcorpus #dataset-wikipedia #arxiv-2106.16163 #license-apache-2.0 #endpoints_compatible #region-us \n", "# MultiBERTs Seed 0 (uncased)\n\nSeed 0 MultiBERTs (pretrained BERT) model on English language using a masked language modeling (MLM) objective. It was introduced in\nthis paper and first released in\nthis repository. This model is uncased: it does not make a difference\nbetween english and English.\n\nDisclaimer: The team releasing MultiBERTs did not write a model card for this model so this model card has been written by gchhablani.", "## Model description\n\nMultiBERTs models are transformers model pretrained on a large corpus of English data in a self-supervised fashion. This means it\nwas pretrained on the raw texts only, with no humans labelling them in any way (which is why it can use lots of\npublicly available data) with an automatic process to generate inputs and labels from those texts. More precisely, it\nwas pretrained with two objectives:\n\n- Masked language modeling (MLM): taking a sentence, the model randomly masks 15% of the words in the input then run\n the entire masked sentence through the model and has to predict the masked words. This is different from traditional\n recurrent neural networks (RNNs) that usually see the words one after the other, or from autoregressive models like\n GPT which internally mask the future tokens. It allows the model to learn a bidirectional representation of the\n sentence.\n- Next sentence prediction (NSP): the models concatenates two masked sentences as inputs during pretraining. Sometimes\n they correspond to sentences that were next to each other in the original text, sometimes not. The model then has to\n predict if the two sentences were following each other or not.\n This way, the model learns an inner representation of the English language that can then be used to extract features\n useful for downstream tasks: if you have a dataset of labeled sentences for instance, you can train a standard\n classifier using the features produced by the MultiBERTs model as inputs.", "## Intended uses & limitations\n\nYou can use the raw model for either masked language modeling or next sentence prediction, but it's mostly intended to\nbe fine-tuned on a downstream task. See the model hub to look for\nfine-tuned versions on a task that interests you.\nNote that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked)\nto make decisions, such as sequence classification, token classification or question answering. For tasks such as text\ngeneration you should look at model like GPT2.", "### How to use\n\nHere is how to use this model to get the features of a given text in PyTorch:", "### Limitations and bias\n\nEven if the training data used for this model could be characterized as fairly neutral, this model can have biased\npredictions. This bias will also affect all fine-tuned versions of this model. For an understanding of bias of this particular\ncheckpoint, please try out this checkpoint with the snippet present in the Limitation and bias section of the bert-base-uncased checkpoint.", "## Training data\n\nThe MultiBERTs models were pretrained on BookCorpus, a dataset consisting of 11,038\nunpublished books and English Wikipedia (excluding lists, tables and\nheaders).", "## Training procedure", "### Preprocessing\n\nThe texts are lowercased and tokenized using WordPiece and a vocabulary size of 30,000. 
The inputs of the model are\nthen of the form:\n\n\n\nWith probability 0.5, sentence A and sentence B correspond to two consecutive sentences in the original corpus and in\nthe other cases, it's another random sentence in the corpus. Note that what is considered a sentence here is a\nconsecutive span of text usually longer than a single sentence. The only constrain is that the result with the two\n\"sentences\" has a combined length of less than 512 tokens.\nThe details of the masking procedure for each sentence are the following:\n\n- 15% of the tokens are masked.\n- In 80% of the cases, the masked tokens are replaced by '[MASK]'.\n- In 10% of the cases, the masked tokens are replaced by a random token (different) from the one they replace.\n- In the 10% remaining cases, the masked tokens are left as is.", "### Pretraining\n\nThe model was trained on 16 Cloud TPU v2 chips for two million steps with a batch size\nof 256. The sequence length was set to 512 throughout. The optimizer\nused is Adam with a learning rate of 1e-4, \\\\(\\beta_{1} = 0.9\\\\) and \\\\(\\beta_{2} = 0.999\\\\), a weight decay of 0.01,\nlearning rate warmup for 10,000 steps and linear decay of the learning rate after.", "### BibTeX entry and citation info\n\n\n\n<a href=\"URL\n\t<img width=\"300px\" src=\"URL\n</a>" ]
[ 63, 111, 335, 134, 25, 95, 48, 3, 222, 110, 34 ]
[ "passage: TAGS\n#transformers #pytorch #bert #pretraining #exbert #multiberts #en #dataset-bookcorpus #dataset-wikipedia #arxiv-2106.16163 #license-apache-2.0 #endpoints_compatible #region-us \n# MultiBERTs Seed 0 (uncased)\n\nSeed 0 MultiBERTs (pretrained BERT) model on English language using a masked language modeling (MLM) objective. It was introduced in\nthis paper and first released in\nthis repository. This model is uncased: it does not make a difference\nbetween english and English.\n\nDisclaimer: The team releasing MultiBERTs did not write a model card for this model so this model card has been written by gchhablani.", "passage: ## Model description\n\nMultiBERTs models are transformers model pretrained on a large corpus of English data in a self-supervised fashion. This means it\nwas pretrained on the raw texts only, with no humans labelling them in any way (which is why it can use lots of\npublicly available data) with an automatic process to generate inputs and labels from those texts. More precisely, it\nwas pretrained with two objectives:\n\n- Masked language modeling (MLM): taking a sentence, the model randomly masks 15% of the words in the input then run\n the entire masked sentence through the model and has to predict the masked words. This is different from traditional\n recurrent neural networks (RNNs) that usually see the words one after the other, or from autoregressive models like\n GPT which internally mask the future tokens. It allows the model to learn a bidirectional representation of the\n sentence.\n- Next sentence prediction (NSP): the models concatenates two masked sentences as inputs during pretraining. Sometimes\n they correspond to sentences that were next to each other in the original text, sometimes not. The model then has to\n predict if the two sentences were following each other or not.\n This way, the model learns an inner representation of the English language that can then be used to extract features\n useful for downstream tasks: if you have a dataset of labeled sentences for instance, you can train a standard\n classifier using the features produced by the MultiBERTs model as inputs.## Intended uses & limitations\n\nYou can use the raw model for either masked language modeling or next sentence prediction, but it's mostly intended to\nbe fine-tuned on a downstream task. See the model hub to look for\nfine-tuned versions on a task that interests you.\nNote that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked)\nto make decisions, such as sequence classification, token classification or question answering. For tasks such as text\ngeneration you should look at model like GPT2.### How to use\n\nHere is how to use this model to get the features of a given text in PyTorch:### Limitations and bias\n\nEven if the training data used for this model could be characterized as fairly neutral, this model can have biased\npredictions. This bias will also affect all fine-tuned versions of this model. For an understanding of bias of this particular\ncheckpoint, please try out this checkpoint with the snippet present in the Limitation and bias section of the bert-base-uncased checkpoint.## Training data\n\nThe MultiBERTs models were pretrained on BookCorpus, a dataset consisting of 11,038\nunpublished books and English Wikipedia (excluding lists, tables and\nheaders).## Training procedure" ]
[ -0.06825247406959534, 0.027449268847703934, -0.0021626802626997232, 0.09413602948188782, 0.07635393738746643, 0.026495488360524178, 0.15437674522399902, 0.029963307082653046, -0.03573239967226982, 0.021267801523208618, 0.10619504749774933, 0.03782356157898903, 0.03388210013508797, 0.035308390855789185, 0.066785529255867, -0.2578813433647156, 0.07567903399467468, -0.05793163925409317, 0.040864333510398865, 0.059090327471494675, 0.10602577030658722, -0.07069262117147446, 0.07895290851593018, 0.04403890669345856, -0.0756942480802536, -0.027663996443152428, -0.005503433756530285, -0.034674178808927536, 0.07060743123292923, 0.09438986331224442, 0.05877054110169411, -0.008264455944299698, 0.05975931137800217, -0.087635338306427, 0.019257638603448868, 0.024562222883105278, -0.007006383966654539, 0.036696210503578186, 0.025804642587900162, -0.009673221036791801, 0.11283443868160248, 0.02619457244873047, 0.08560121059417725, 0.04041407257318497, -0.08754345774650574, -0.09977805614471436, -0.0694802924990654, 0.09317219257354736, 0.02764834463596344, 0.04353900998830795, -0.0063711777329444885, 0.07313166558742523, -0.006663286592811346, 0.058924756944179535, 0.08212147653102875, -0.23674309253692627, -0.023082595318555832, 0.05118638277053833, 0.04846370965242386, 0.04278615117073059, 0.013536407612264156, 0.031959742307662964, 0.005570597946643829, 0.04724816232919693, 0.006345914676785469, -0.028150685131549835, 0.13924768567085266, -0.053803253918886185, -0.13665056228637695, -0.03023041971027851, 0.15811696648597717, 0.02479265071451664, -0.11351540684700012, -0.11277355998754501, 0.0016996730118989944, 0.1693311333656311, -0.0019645756110548973, -0.007584595121443272, -0.009904063306748867, -0.0030730916187167168, 0.024124154821038246, -0.1230793297290802, -0.08302900195121765, -0.02286745235323906, -0.06280194967985153, 0.15275688469409943, 0.047940537333488464, 0.07110750675201416, -0.06045709177851677, 0.04197261482477188, -0.14955590665340424, -0.036801956593990326, -0.04978496953845024, -0.09940676391124725, 0.017188318073749542, 0.02796531654894352, -0.044329117983579636, -0.11630523204803467, -0.03652356192469597, 0.0725361704826355, 0.038227953016757965, 0.03685189411044121, -0.005693042650818825, 0.029456961899995804, 0.10580474138259888, 0.10501816868782043, -0.0562795028090477, 0.07449519634246826, 0.020974641665816307, -0.020636841654777527, 0.03971032053232193, -0.05628065764904022, -0.12330584228038788, 0.0744452103972435, -0.034096408635377884, 0.018313465639948845, 0.023749854415655136, 0.04198585823178291, -0.012982374057173729, -0.0767536610364914, 0.14133483171463013, -0.09305756539106369, 0.0004417812451720238, -0.0035654937382787466, 0.016869794577360153, 0.08157093822956085, 0.02621583268046379, 0.0021266003604978323, -0.059168532490730286, -0.03080003336071968, -0.06315429508686066, -0.027340907603502274, -0.06021827086806297, -0.13162744045257568, 0.0013580089434981346, -0.020953699946403503, -0.014699130319058895, -0.10742536187171936, -0.17884144186973572, -0.01402769424021244, 0.07123412191867828, -0.014155296608805656, 0.011412929743528366, -0.0021266068797558546, 0.012132527306675911, -0.004981525242328644, 0.032173626124858856, -0.03745890408754349, 0.00908223818987608, -0.012201073579490185, -0.06731266528367996, 0.039806246757507324, -0.12071730941534042, 0.04209677502512932, -0.05578748881816864, 0.011489223688840866, -0.19638846814632416, 0.10738702118396759, -0.02783583477139473, -0.04278886318206787, -0.04810495674610138, -0.05834455043077469, 
0.0188974030315876, 0.045517146587371826, -0.015527524054050446, 0.10550028085708618, -0.12357760965824127, -0.0512409433722496, 0.15865573287010193, -0.1566506326198578, 0.016810515895485878, 0.10513904690742493, -0.06748288869857788, 0.042335763573646545, 0.14426475763320923, 0.07841357588768005, 0.07015632092952728, -0.04069618880748749, 0.017828572541475296, 0.060336943715810776, -0.0458533950150013, 0.0799841359257698, 0.10583654791116714, -0.015437023714184761, -0.13057377934455872, 0.030710875988006592, -0.06833602488040924, -0.03600694239139557, -0.022659340873360634, -0.024447504431009293, 0.014145502820611, -0.052795182913541794, 0.05715940147638321, -0.010484781116247177, 0.006331292912364006, -0.0232611745595932, -0.07422537356615067, 0.07731874287128448, 0.07671873271465302, -0.08619971573352814, 0.018436623737215996, -0.0909656435251236, 0.03130660206079483, -0.06597552448511124, -0.005088436417281628, -0.14390107989311218, -0.04274594411253929, 0.031965915113687515, -0.0805630162358284, 0.09851419925689697, 0.11271710693836212, 0.008409101516008377, 0.11310183256864548, -0.04617488384246826, 0.02628052979707718, -0.012368079274892807, -0.006386269349604845, -0.044110074639320374, -0.14293555915355682, -0.06652771681547165, -0.06382939964532852, 0.0834670290350914, -0.059091683477163315, 0.020797124132514, -0.08205804973840714, -0.041816260665655136, -0.0250774584710598, -0.04668354615569115, 0.005325498059391975, 0.00811565201729536, -0.013542650267481804, -0.030526084825396538, 0.04050645977258682, 0.027077049016952515, -0.0918835997581482, 0.08847370743751526, -0.1236613318324089, -0.0576145313680172, 0.06846176087856293, -0.0069316960871219635, -0.04083865508437157, 0.09554298222064972, 0.011831864714622498, -0.01123481709510088, -0.057707928121089935, -0.04657518118619919, 0.22045092284679413, -0.020844273269176483, 0.08364406228065491, -0.11240328848361969, 0.004931592382490635, 0.03506753221154213, -0.06102532893419266, -0.05918964743614197, 0.07589934766292572, 0.038565460592508316, -0.2161455750465393, 0.024600330740213394, 0.07306224852800369, 0.061481211334466934, 0.1421050727367401, 0.02417578175663948, -0.02878376469016075, -0.06042608246207237, -0.017261460423469543, -0.012187670916318893, 0.05919060483574867, -0.04688645899295807, 0.0030246214009821415, 0.0510857030749321, 0.05463946610689163, 0.018327711150050163, -0.06600221991539001, 0.02497151307761669, 0.05208776146173477, -0.017216674983501434, -0.06310763210058212, -0.05255124717950821, -0.03947900980710983, 0.0736318975687027, 0.041184503585100174, 0.0495072677731514, 0.0537080317735672, -0.019612858071923256, -0.1381978541612625, 0.16529735922813416, -0.13489660620689392, -0.2240476906299591, -0.12759706377983093, -0.07904494553804398, -0.07838001847267151, 0.039492446929216385, 0.0373598076403141, -0.03468242287635803, -0.05113789439201355, -0.10579567402601242, 0.06591805815696716, -0.11658145487308502, -0.057194799184799194, 0.014129210263490677, -0.056258611381053925, -0.005652858875691891, -0.1268719583749771, -0.010539324954152107, -0.026957646012306213, -0.07912764698266983, 0.004068336449563503, -0.04539388418197632, 0.010077799670398235, 0.13516394793987274, 0.008290649391710758, -0.009709829464554787, -0.015056753531098366, 0.19663433730602264, 0.0314871110022068, 0.04356053099036217, 0.12803813815116882, -0.06543856859207153, 0.05768699571490288, 0.02060154639184475, 0.037481535226106644, -0.04913286864757538, -0.0007067807018756866, -0.027622418478131294, -0.11730992794036865, 
-0.207548126578331, -0.06663559377193451, 0.007457428611814976, 0.008368045091629028, 0.01904660277068615, 0.015689538791775703, 0.024972863495349884, 0.05414750799536705, -0.031031470745801926, 0.03179151564836502, 0.033982276916503906, 0.05688050761818886, 0.06225617602467537, -0.06120002269744873, 0.09507381916046143, -0.07100313901901245, 0.027307022362947464, 0.10875560343265533, -0.07062242925167084, 0.16170385479927063, 0.04285769164562225, 0.05423576757311821, 0.09659373760223389, 0.0006577670574188232, 0.0585428811609745, 0.10273323953151703, -0.06317441910505295, 0.019947808235883713, -0.07513642311096191, -0.05752627179026604, -0.04452991858124733, 0.060025766491889954, 0.037611961364746094, -0.000131998211145401, -0.10182826220989227, 0.03220826014876366, -0.036235980689525604, 0.07729616016149521, 0.06343917548656464, -0.10670174658298492, -0.10046673566102982, 0.045665811747312546, -0.04038289934396744, -0.08793723583221436, 0.03426353633403778, 0.08077984303236008, -0.14119762182235718, 0.06124391779303551, 0.018283551558852196, 0.07126335799694061, -0.09752818942070007, 0.01132874470204115, -0.06905651092529297, 0.016318362206220627, 0.005033754277974367, 0.0913831889629364, -0.1432204693555832, 0.10583388805389404, 0.02708813175559044, 0.04597454518079758, -0.09043684601783752, 0.01613154262304306, -0.01261853240430355, 0.07669144868850708, 0.12108297646045685, 0.04203776270151138, -0.05836430937051773, -0.018112843856215477, -0.06768153607845306, 0.034427788108587265, 0.07278922200202942, -0.04098799079656601, 0.038899462670087814, 0.0012810318730771542, 0.016169004142284393, -0.008310851640999317, 0.020610321313142776, -0.13600048422813416, -0.14560562372207642, 0.0705970749258995, -0.06633393466472626, -0.08288760483264923, -0.03709196671843529, -0.06633897125720978, -0.0868702232837677, 0.15359032154083252, -0.0773216113448143, -0.1108812615275383, -0.10497688502073288, 0.004697326570749283, 0.06842926889657974, -0.06570008397102356, 0.05184205248951912, -0.05175790935754776, 0.09120817482471466, -0.03778978809714317, -0.10993549227714539, 0.017024382948875427, -0.09169412404298782, -0.11230003088712692, -0.030281051993370056, 0.09025070071220398, 0.15063974261283875, 0.05137326568365097, 0.024738965556025505, 0.016462495550513268, 0.0016304273158311844, -0.12906411290168762, 0.004929570481181145, 0.143439382314682, 0.01773710548877716, 0.0976557806134224, -0.06279069185256958, -0.02821265161037445, -0.012585094198584557, -0.0009578559547662735, 0.13525930047035217, 0.1579957902431488, -0.06031216308474541, 0.15296214818954468, 0.227834090590477, -0.10105094313621521, -0.19415637850761414, -0.07397069036960602, 0.0032560182735323906, 0.04487091302871704, 0.045912403613328934, -0.19948574900627136, 0.09972882270812988, 0.04975741356611252, -0.013423530384898186, -0.03354128822684288, -0.18906579911708832, -0.1023210883140564, 0.1062556803226471, 0.06369950622320175, 0.19807088375091553, -0.06803785264492035, -0.04169449210166931, -0.04189038649201393, -0.05597612261772156, 0.09557583183050156, -0.011712346225976944, 0.0822327509522438, 0.01643332466483116, 0.014923296868801117, -0.0019287541508674622, -0.008046919479966164, 0.11012726277112961, 0.04542766511440277, 0.018416037783026695, -0.07320156693458557, -0.0423104427754879, 0.10889390110969543, -0.03202357143163681, 0.12254303693771362, 0.03122953698039055, 0.05849093571305275, -0.0764583870768547, -0.06015930324792862, -0.08313038945198059, 0.012603376060724258, -0.04008830338716507, -0.05228453874588013, 
-0.051481351256370544, 0.03643445670604706, 0.02559221349656582, 0.013383354060351849, -0.010037007741630077, -0.0581706240773201, 0.009901179000735283, 0.0659501925110817, 0.15930500626564026, -0.013111893087625504, -0.06732219457626343, -0.07006201148033142, -0.060269180685281754, 0.04847850278019905, -0.10283331573009491, 0.0321035273373127, 0.020586064085364342, -0.0036565132904797792, 0.11348927021026611, 0.03316955640912056, -0.11396678537130356, 0.013628019951283932, 0.005912423133850098, -0.09849600493907928, -0.1485224962234497, -0.016377072781324387, 0.05456313490867615, -0.0583408921957016, 0.03962210938334465, 0.1586087942123413, -0.02749052457511425, -0.033682480454444885, -0.05674935132265091, 0.032430585473775864, -0.034874096512794495, 0.03596019372344017, 0.08030854165554047, 0.016163216903805733, -0.08148041367530823, 0.06100435554981232, 0.04497561603784561, -0.01565445587038994, 0.06611718982458115, 0.01751827821135521, -0.07064318656921387, -0.08515681326389313, -0.06657058000564575, 0.11521587520837784, -0.04193677753210068, -0.06614658236503601, 0.0494990199804306, -0.10936599224805832, 0.06512928009033203, 0.09400998800992966, 0.03727183863520622, 0.046071093529462814, -0.08464010059833527, 0.006473809480667114, -0.037655625492334366, 0.03303447365760803, -0.03967699408531189, -0.03299032896757126, -0.04207788407802582, 0.02865336276590824, 0.0594131164252758, 0.09625885635614395, -0.03653799742460251, -0.07748300582170486, -0.08829360455274582, -0.013138281181454659, -0.10569687932729721, -0.006850461475551128, -0.06914658099412918, 0.00014194706454873085, 0.007000140380114317, -0.02822837233543396, 0.030307123437523842, 0.033606212586164474, -0.0512661337852478, -0.008813504129648209, -0.02892981842160225, 0.05861987918615341, -0.07071447372436523, 0.012725180014967918, 0.015199657529592514, -0.01911322958767414, 0.09222348034381866, 0.047224029898643494, -0.03322954475879669, 0.05148611217737198, -0.03994745388627052, 0.03518182411789894, -0.04691552743315697, 0.007639196235686541, -0.02100628986954689, -0.11349901556968689, -0.021261068060994148, 0.010819608345627785, -0.023444410413503647, 0.01614448055624962, 0.07291702181100845, -0.051247432827949524, 0.0827048048377037, 0.06047651544213295, -0.049000177532434464, -0.055763885378837585, 0.04004162549972534, 0.0009079426527023315, 0.017973260954022408, 0.0793890655040741, 0.0011681190226227045, 0.053140703588724136, -0.08328671008348465, 0.0013423850759863853, 0.0043635861948132515, -0.016782283782958984, -0.019065728411078453, -0.07158057391643524, -0.000623882282525301, 0.009545178152620792, 0.17526990175247192, -0.004971030168235302, -0.019934196025133133, 0.005758095532655716, 0.06719693541526794, 0.033424317836761475, 0.004426124505698681, 0.08463965356349945, -0.018342992290854454, -0.01793844997882843, -0.017587680369615555, 0.026691239327192307, -0.01080797053873539, 0.016537122428417206, 0.1315390020608902, 0.04961226135492325, 0.11255703866481781, 0.07479852437973022, 0.05499632656574249, 0.052345164120197296, -0.10784098505973816, -0.06925129890441895, 0.03605833277106285, 0.05536176264286041, -0.034931864589452744, 0.02555268630385399, 0.05937255546450615, -0.09513229876756668, 0.0820266455411911, 0.046595025807619095, -0.05803784728050232, -0.1295481026172638, -0.2191641926765442, -0.042123790830373764, -0.010218853130936623, -0.020777955651283264, -0.10785381495952606, 0.027329251170158386, 0.0930030569434166, 0.03945063054561615, -0.02234741672873497, 0.0657259151339531, -0.15022647380828857, 
-0.03686198964715004, 0.03966449946165085, -0.014821960590779781, 0.022462747991085052, 0.048782214522361755, 0.01900356635451317, 0.014281739480793476, 0.0744381994009018, 0.051359422504901886, 0.043146438896656036, 0.054591625928878784, 0.02954341098666191, -0.04896369203925133, -0.08800899237394333, -0.04467042535543442, 0.0032379510812461376, 0.058675315231084824, 0.12987293303012848, 0.010792074725031853, -0.06998851895332336, 0.0024203723296523094, 0.06055322289466858, -0.01847190037369728, -0.08398778736591339, -0.11259135603904724, 0.21841737627983093, -0.022776726633310318, 0.011702751740813255, -0.0013669170439243317, -0.03545460104942322, 0.020076904445886612, 0.20618940889835358, 0.26152077317237854, -0.02222667820751667, -0.01586262136697769, 0.010568449273705482, 0.0001846584491431713, 0.03695659339427948, 0.12577201426029205, -0.02777884714305401, 0.22359472513198853, -0.046777449548244476, 0.06737222522497177, -0.05537553131580353, -0.014299402013421059, -0.07450424134731293, 0.061424657702445984, -0.001578204333782196, -0.01836337149143219, -0.014155775308609009, 0.06984956562519073, -0.04071302339434624, -0.12650424242019653, -0.029551919549703598, 0.005514103919267654, -0.058359190821647644, 0.011046874336898327, 0.0020564431324601173, 0.03376493230462074, 0.07748642563819885, -0.01588892936706543, -0.0020990539342164993, 0.13050198554992676, 0.01098928228020668, -0.10912102460861206, -0.037600722163915634, 0.12838557362556458, 0.018519911915063858, 0.1340782791376114, 0.04876743629574776, 0.08712469041347504, 0.07130827009677887, 0.015149479731917381, -0.06677284836769104, 0.03636588156223297, -0.028407320380210876, 0.019770564511418343, 0.004539488349109888, 0.10587862133979797, -0.010519773699343204, 0.07475674152374268, 0.016607699915766716, -0.0808752030134201, 0.05683104693889618, 0.008673112839460373, -0.07627810537815094, -0.03255736455321312, 0.1042289137840271, -0.11158230900764465, 0.14271792769432068, 0.13774631917476654, -0.005030146799981594, -0.07176224142313004, -0.012138426303863525, 0.027100618928670883, -0.008060954511165619, 0.04774492606520653, -0.029893167316913605, -0.13074781000614166, 0.00018004095181822777, -0.09478544443845749, 0.04576292634010315, -0.24173954129219055, -0.06664414703845978, 0.016213994473218918, -0.000884735956788063, -0.028645452111959457, 0.030585195869207382, 0.061639197170734406, -0.0040400829166173935, -0.03497268259525299, 0.029452037066221237, -0.028589975088834763, 0.03562405705451965, -0.07439378648996353, -0.0681467354297638 ]
null
null
transformers
# UmBERTo Commoncrawl Cased

[UmBERTo](https://github.com/musixmatchresearch/umberto) is a RoBERTa-based language model trained on large Italian corpora using two innovative approaches: SentencePiece and Whole Word Masking. Now available at [github.com/huggingface/transformers](https://huggingface.co/Musixmatch/umberto-commoncrawl-cased-v1)

<p align="center">
    <img src="https://user-images.githubusercontent.com/7140210/72913702-d55a8480-3d3d-11ea-99fc-f2ef29af4e72.jpg" width="700"> <br>
    Marco Lodola, Monument to Umberto Eco, Alessandria 2019
</p>

## Dataset

UmBERTo-Commoncrawl-Cased uses the Italian subcorpus of [OSCAR](https://traces1.inria.fr/oscar/) as the training set for the language model. We used the deduplicated version of the Italian corpus, which consists of 70 GB of plain text data (210M sentences, 11B words); the sentences were filtered and shuffled at line level so they can be used for NLP research.

## Pre-trained model

| Model | WWM | Cased | Tokenizer | Vocab Size | Train Steps | Download |
| ------ | ------ | ------ | ------ | ------ | ------ | ------ |
| `umberto-commoncrawl-cased-v1` | YES | YES | SPM | 32K | 125k | [Link](http://bit.ly/35zO7GH) |

This model was trained with [SentencePiece](https://github.com/google/sentencepiece) and Whole Word Masking.

## Downstream Tasks

These results refer to the umberto-commoncrawl-cased model. All details are on the [UmBERTo](https://github.com/musixmatchresearch/umberto) official page.

#### Named Entity Recognition (NER)

| Dataset | F1 | Precision | Recall | Accuracy |
| ------ | ------ | ------ | ------ | ------ |
| **ICAB-EvalITA07** | **87.565** | 86.596 | 88.556 | 98.690 |
| **WikiNER-ITA** | **92.531** | 92.509 | 92.553 | 99.136 |

#### Part of Speech (POS)

| Dataset | F1 | Precision | Recall | Accuracy |
| ------ | ------ | ------ | ------ | ------ |
| **UD_Italian-ISDT** | 98.870 | 98.861 | 98.879 | **98.977** |
| **UD_Italian-ParTUT** | 98.786 | 98.812 | 98.760 | **98.903** |

## Usage

##### Load UmBERTo with AutoModel, AutoTokenizer:

```python
import torch
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("Musixmatch/umberto-commoncrawl-cased-v1")
umberto = AutoModel.from_pretrained("Musixmatch/umberto-commoncrawl-cased-v1")

encoded_input = tokenizer.encode("Umberto Eco è stato un grande scrittore")
input_ids = torch.tensor(encoded_input).unsqueeze(0)  # Batch size 1
outputs = umberto(input_ids)
last_hidden_states = outputs[0]  # The last hidden-state is the first element of the output
```

##### Predict masked token:

```python
from transformers import pipeline

fill_mask = pipeline(
    "fill-mask",
    model="Musixmatch/umberto-commoncrawl-cased-v1",
    tokenizer="Musixmatch/umberto-commoncrawl-cased-v1"
)

result = fill_mask("Umberto Eco è <mask> un grande scrittore")
# {'sequence': '<s> Umberto Eco è considerato un grande scrittore</s>', 'score': 0.18599839508533478, 'token': 5032}
# {'sequence': '<s> Umberto Eco è stato un grande scrittore</s>', 'score': 0.17816807329654694, 'token': 471}
# {'sequence': '<s> Umberto Eco è sicuramente un grande scrittore</s>', 'score': 0.16565583646297455, 'token': 2654}
# {'sequence': '<s> Umberto Eco è indubbiamente un grande scrittore</s>', 'score': 0.0932890921831131, 'token': 17908}
# {'sequence': '<s> Umberto Eco è certamente un grande scrittore</s>', 'score': 0.054701317101716995, 'token': 5269}
```

## Citation

All of the original datasets are publicly available or were released with the owners' grant.
The datasets are all released under a CC0 or CC-BY license.

* UD Italian-ISDT Dataset [Github](https://github.com/UniversalDependencies/UD_Italian-ISDT)
* UD Italian-ParTUT Dataset [Github](https://github.com/UniversalDependencies/UD_Italian-ParTUT)
* I-CAB (Italian Content Annotation Bank), EvalITA [Page](http://www.evalita.it/)
* WIKINER [Page](https://figshare.com/articles/Learning_multilingual_named_entity_recognition_from_Wikipedia/5462500), [Paper](https://www.sciencedirect.com/science/article/pii/S0004370212000276?via%3Dihub)

```
@inproceedings{magnini2006annotazione,
  title = {Annotazione di contenuti concettuali in un corpus italiano: I-CAB},
  author = {Magnini, Bernardo and Cappelli, Amedeo and Pianta, Emanuele and Speranza, Manuela and Bartalesi Lenzi, V and Sprugnoli, Rachele and Romano, Lorenza and Girardi, Christian and Negri, Matteo},
  booktitle = {Proc. of SILFI 2006},
  year = {2006}
}
@inproceedings{magnini2006cab,
  title = {I-CAB: the Italian Content Annotation Bank.},
  author = {Magnini, Bernardo and Pianta, Emanuele and Girardi, Christian and Negri, Matteo and Romano, Lorenza and Speranza, Manuela and Lenzi, Valentina Bartalesi and Sprugnoli, Rachele},
  booktitle = {LREC},
  pages = {963--968},
  year = {2006},
  organization = {Citeseer}
}
```

## Authors

**Loreto Parisi**: `loreto at musixmatch dot com`, [loretoparisi](https://github.com/loretoparisi)
**Simone Francia**: `simone.francia at musixmatch dot com`, [simonefrancia](https://github.com/simonefrancia)
**Paolo Magnani**: `paul.magnani95 at gmail dot com`, [paulthemagno](https://github.com/paulthemagno)

## About Musixmatch AI

![Musixmatch AI mac app icon-128](https://user-images.githubusercontent.com/163333/72244273-396aa380-35ee-11ea-894b-4ea48230c02b.png)

We do Machine Learning and Artificial Intelligence @[musixmatch](https://twitter.com/Musixmatch)

Follow us on [Twitter](https://twitter.com/musixmatchai) [Github](https://github.com/musixmatchresearch)
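The downstream NER and POS scores reported above come from fine-tuned variants of UmBERTo, but the card does not ship the fine-tuning code. As a rough illustration only, here is a minimal token-classification sketch with the Hugging Face `Trainer`: the two-sentence toy corpus, the label set, and the hyperparameters are hypothetical placeholders standing in for a real ICAB or WikiNER-ITA setup, and a fast tokenizer is assumed so that `word_ids()` is available for label alignment.

```python
# Hedged sketch only: the toy corpus, label set and hyperparameters below are
# placeholders, not the setup used for the scores reported in the card.
import torch
from torch.utils.data import Dataset
from transformers import (
    AutoTokenizer,
    AutoModelForTokenClassification,
    Trainer,
    TrainingArguments,
)

labels = ["O", "B-PER", "I-PER", "B-LOC", "I-LOC"]  # hypothetical tag set

# A fast tokenizer is assumed, so that word_ids() is available for alignment.
tokenizer = AutoTokenizer.from_pretrained("Musixmatch/umberto-commoncrawl-cased-v1")
model = AutoModelForTokenClassification.from_pretrained(
    "Musixmatch/umberto-commoncrawl-cased-v1", num_labels=len(labels)
)

# Two toy sentences standing in for a real corpus such as ICAB or WikiNER-ITA.
examples = [
    (["Umberto", "Eco", "nacque", "ad", "Alessandria"], ["B-PER", "I-PER", "O", "O", "B-LOC"]),
    (["Marco", "Lodola", "vive", "a", "Pavia"], ["B-PER", "I-PER", "O", "O", "B-LOC"]),
]

class ToyNerDataset(Dataset):
    """Tokenizes pre-split words and aligns word-level tags to sub-tokens."""

    def __init__(self, data):
        self.data = data

    def __len__(self):
        return len(self.data)

    def __getitem__(self, idx):
        words, tags = self.data[idx]
        enc = tokenizer(words, is_split_into_words=True, truncation=True,
                        padding="max_length", max_length=32)
        aligned, prev = [], None
        for wid in enc.word_ids():
            # Only the first sub-token of each word gets a label; the rest are ignored (-100).
            aligned.append(-100 if wid is None or wid == prev else labels.index(tags[wid]))
            prev = wid
        enc["labels"] = aligned
        return {k: torch.tensor(v) for k, v in enc.items()}

args = TrainingArguments(output_dir="umberto-ner", num_train_epochs=1,
                         per_device_train_batch_size=2, logging_steps=1)
Trainer(model=model, args=args, train_dataset=ToyNerDataset(examples)).train()
```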
{"language": "it"}
fill-mask
Musixmatch/umberto-commoncrawl-cased-v1
[ "transformers", "pytorch", "camembert", "fill-mask", "it", "autotrain_compatible", "endpoints_compatible", "has_space", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[ "it" ]
TAGS #transformers #pytorch #camembert #fill-mask #it #autotrain_compatible #endpoints_compatible #has_space #region-us
UmBERTo Commoncrawl Cased ========================= UmBERTo is a Roberta-based Language Model trained on large Italian Corpora and uses two innovative approaches: SentencePiece and Whole Word Masking. Now available at URL ![](URL width=) Marco Lodola, Monument to Umberto Eco, Alessandria 2019 Dataset ------- UmBERTo-Commoncrawl-Cased utilizes the Italian subcorpus of OSCAR as training set of the language model. We used deduplicated version of the Italian corpus that consists in 70 GB of plain text data, 210M sentences with 11B words where the sentences have been filtered and shuffled at line level in order to be used for NLP research. Pre-trained model ----------------- This model was trained with SentencePiece and Whole Word Masking. Downstream Tasks ---------------- These results refers to umberto-commoncrawl-cased model. All details are at Umberto Official Page. #### Named Entity Recognition (NER) #### Part of Speech (POS) Usage ----- ##### Load UmBERTo with AutoModel, Autotokenizer: ##### Predict masked token: All of the original datasets are publicly available or were released with the owners' grant. The datasets are all released under a CC0 or CCBY license. * UD Italian-ISDT Dataset Github * UD Italian-ParTUT Dataset Github * I-CAB (Italian Content Annotation Bank), EvalITA Page * WIKINER Page , Paper Authors ------- Loreto Parisi: 'loreto at musixmatch dot com', loretoparisi Simone Francia: 'simone.francia at musixmatch dot com', simonefrancia Paolo Magnani: 'paul.magnani95 at gmail dot com', paulthemagno About Musixmatch AI ------------------- !Musxmatch Ai mac app icon-128 We do Machine Learning and Artificial Intelligence @musixmatch Follow us on Twitter Github
[ "#### Named Entity Recognition (NER)", "#### Part of Speech (POS)\n\n\n\nUsage\n-----", "##### Load UmBERTo with AutoModel, Autotokenizer:", "##### Predict masked token:\n\n\nAll of the original datasets are publicly available or were released with the owners' grant. The datasets are all released under a CC0 or CCBY license.\n\n\n* UD Italian-ISDT Dataset Github\n* UD Italian-ParTUT Dataset Github\n* I-CAB (Italian Content Annotation Bank), EvalITA Page\n* WIKINER Page , Paper\n\n\nAuthors\n-------\n\n\nLoreto Parisi: 'loreto at musixmatch dot com', loretoparisi\nSimone Francia: 'simone.francia at musixmatch dot com', simonefrancia\nPaolo Magnani: 'paul.magnani95 at gmail dot com', paulthemagno\n\n\nAbout Musixmatch AI\n-------------------\n\n\n!Musxmatch Ai mac app icon-128\nWe do Machine Learning and Artificial Intelligence @musixmatch\nFollow us on Twitter Github" ]
[ "TAGS\n#transformers #pytorch #camembert #fill-mask #it #autotrain_compatible #endpoints_compatible #has_space #region-us \n", "#### Named Entity Recognition (NER)", "#### Part of Speech (POS)\n\n\n\nUsage\n-----", "##### Load UmBERTo with AutoModel, Autotokenizer:", "##### Predict masked token:\n\n\nAll of the original datasets are publicly available or were released with the owners' grant. The datasets are all released under a CC0 or CCBY license.\n\n\n* UD Italian-ISDT Dataset Github\n* UD Italian-ParTUT Dataset Github\n* I-CAB (Italian Content Annotation Bank), EvalITA Page\n* WIKINER Page , Paper\n\n\nAuthors\n-------\n\n\nLoreto Parisi: 'loreto at musixmatch dot com', loretoparisi\nSimone Francia: 'simone.francia at musixmatch dot com', simonefrancia\nPaolo Magnani: 'paul.magnani95 at gmail dot com', paulthemagno\n\n\nAbout Musixmatch AI\n-------------------\n\n\n!Musxmatch Ai mac app icon-128\nWe do Machine Learning and Artificial Intelligence @musixmatch\nFollow us on Twitter Github" ]
[ 44, 12, 12, 16, 194 ]
[ "passage: TAGS\n#transformers #pytorch #camembert #fill-mask #it #autotrain_compatible #endpoints_compatible #has_space #region-us \n#### Named Entity Recognition (NER)#### Part of Speech (POS)\n\n\n\nUsage\n-----##### Load UmBERTo with AutoModel, Autotokenizer:##### Predict masked token:\n\n\nAll of the original datasets are publicly available or were released with the owners' grant. The datasets are all released under a CC0 or CCBY license.\n\n\n* UD Italian-ISDT Dataset Github\n* UD Italian-ParTUT Dataset Github\n* I-CAB (Italian Content Annotation Bank), EvalITA Page\n* WIKINER Page , Paper\n\n\nAuthors\n-------\n\n\nLoreto Parisi: 'loreto at musixmatch dot com', loretoparisi\nSimone Francia: 'simone.francia at musixmatch dot com', simonefrancia\nPaolo Magnani: 'paul.magnani95 at gmail dot com', paulthemagno\n\n\nAbout Musixmatch AI\n-------------------\n\n\n!Musxmatch Ai mac app icon-128\nWe do Machine Learning and Artificial Intelligence @musixmatch\nFollow us on Twitter Github" ]
[ -0.053635384887456894, 0.2596154808998108, -0.0032980842515826225, 0.06754529476165771, 0.12512007355690002, 0.005711376667022705, 0.15705914795398712, 0.09812825918197632, 0.10461301356554031, 0.10137905925512314, 0.04463786631822586, 0.07858213782310486, 0.09936758130788803, 0.18162623047828674, -0.01563975028693676, -0.17024467885494232, 0.0679347887635231, -0.10280302911996841, 0.005889069754630327, 0.06358014792203903, 0.11444951593875885, -0.044670648872852325, 0.11399026215076447, 0.00013060684432275593, -0.06456052511930466, 0.019618051126599312, -0.013853400945663452, -0.08937392383813858, 0.048236556351184845, 0.07329815626144409, 0.023963239043951035, 0.01960008218884468, 0.008904749527573586, -0.08454129099845886, 0.029065659269690514, 0.043541375547647476, -0.009976238012313843, 0.00740034831687808, 0.07513819634914398, -0.05551377683877945, 0.14782802760601044, -0.030182532966136932, 0.026549426838755608, 0.014315688982605934, -0.1202094629406929, 0.01948183961212635, -0.09693959355354309, 0.028672493994235992, -0.019704975187778473, 0.08598294109106064, -0.03885979577898979, 0.1449979990720749, -0.0653429701924324, 0.029880940914154053, 0.1640925407409668, -0.2160177081823349, -0.0707932561635971, -0.007875097915530205, 0.019315993413329124, 0.02853679656982422, -0.04502001777291298, 0.07252483814954758, 0.00872635468840599, -0.01635061763226986, -0.063369520008564, -0.014726866967976093, 0.05550333485007286, -0.04430103302001953, -0.08527024835348129, -0.03622985631227493, 0.2600316107273102, -0.004696860909461975, -0.04536586254835129, -0.10993841290473938, 0.021021706983447075, 0.1108676865696907, -0.03145593777298927, 0.0064324671402573586, -0.005389663390815258, 0.023759793490171432, 0.006842812057584524, 0.00027615000726655126, -0.05393292382359505, 0.00906533282250166, -0.06674528867006302, 0.05766196921467781, -0.0014453728217631578, 0.013628985732793808, 0.010716114193201065, -0.021984664723277092, -0.10118691623210907, -0.08296288549900055, -0.007854863069951534, -0.0674416646361351, -0.0011851069284603, -0.030267275869846344, -0.05981459468603134, -0.08396275341510773, 0.1292116343975067, 0.1504175215959549, -0.017671780660748482, -0.01353096216917038, -0.015549471601843834, 0.06996813416481018, 0.0982128456234932, 0.08772698044776917, -0.19404594600200653, -0.10689135640859604, 0.015377975068986416, -0.0611373595893383, 0.02812604419887066, -0.06069406867027283, -0.04430130496621132, 0.06524262577295303, -0.05987270176410675, 0.047228675335645676, 0.09795386344194412, 0.04686320573091507, -0.12408894300460815, -0.04867749661207199, 0.1347188502550125, -0.09813085943460464, 0.07368793338537216, 0.028600988909602165, -0.08183404058218002, 0.007844461128115654, -0.057861313223838806, 0.036392197012901306, -0.07255015522241592, 0.04728923738002777, -0.060848746448755264, -0.027283474802970886, -0.06305336207151413, -0.08749568462371826, 0.07467513531446457, -0.03531072661280632, -0.029755771160125732, -0.1363694965839386, -0.05517423152923584, -0.0764935165643692, 0.0608937032520771, -0.08833468705415726, -0.00847796630114317, -0.024015352129936218, 0.02089037001132965, 0.016267791390419006, 0.013030941598117352, -0.06628505140542984, -0.07474170625209808, 0.037473492324352264, -0.10892686992883682, 0.09036338329315186, -0.09507536888122559, 0.03894992545247078, -0.08501328527927399, 0.002216252265498042, -0.16935661435127258, 0.060486141592264175, -0.0953226163983345, 0.08137739449739456, -0.106615349650383, 0.02459385059773922, 0.0055116391740739346, 
0.007457825355231762, 0.09255161881446838, 0.15309658646583557, -0.10709476470947266, -0.048606183379888535, 0.12169357389211655, -0.08967006951570511, -0.06352817267179489, 0.20276321470737457, 0.0230251494795084, -0.005467496812343597, 0.07124797254800797, 0.18535897135734558, 0.036866918206214905, -0.12050352245569229, -0.03551593795418739, -0.060109514743089676, -0.02172097936272621, 0.03184807673096657, 0.08957408368587494, -0.04240098595619202, 0.027936456725001335, 0.001532113179564476, -0.04630575701594353, -0.00018247804837301373, -0.025693977251648903, -0.07169878482818604, 0.021623805165290833, -0.10937272757291794, -0.0033824241254478693, 0.032892920076847076, 0.008627955801784992, -0.06628067046403885, -0.046068623661994934, -0.16798356175422668, 0.05222359672188759, -0.02122604101896286, -0.024305695667862892, -0.1220371350646019, 0.21208003163337708, 0.11126864701509476, 0.014769171364605427, -0.08526870608329773, -0.04464627802371979, 0.05966722592711449, -0.015617622062563896, 0.050132960081100464, -0.07325121015310287, 0.014248496852815151, 0.025832587853074074, -0.0014602941228076816, 0.02617386169731617, 0.03723517432808876, 0.004881330765783787, 0.0040063802152872086, -0.15805742144584656, 0.048741426318883896, -0.039845965802669525, 0.13585370779037476, -0.0851626917719841, 0.009302271530032158, 0.0823434442281723, 0.10583695024251938, 0.006751794368028641, -0.048937320709228516, -0.015243550762534142, 0.038956865668296814, -0.009655268862843513, -0.038198765367269516, 0.04437240958213806, 0.02636057510972023, 0.005774789955466986, 0.13798734545707703, -0.05592784658074379, 0.04085268825292587, 0.13921648263931274, -0.013238887302577496, -0.04820146784186363, 0.062346551567316055, 0.001882305252365768, -0.003454387653619051, -0.05358635261654854, -0.03128242492675781, 0.06964441388845444, 0.008695550262928009, 0.03945969045162201, -0.0861147791147232, -0.007302230689674616, 0.0038789117243140936, -0.03126920014619827, -0.15066109597682953, 0.09651312232017517, 0.14142179489135742, -0.16393893957138062, 0.1146683469414711, 0.11240179091691971, 0.025745758786797523, 0.21021443605422974, 0.022176772356033325, -0.06290135532617569, -0.018895676359534264, 0.028014982119202614, 0.039436887949705124, 0.04580638185143471, -0.13957332074642181, -0.02677217870950699, 0.024759141728281975, -0.04322635754942894, 0.02531718648970127, -0.06130511313676834, -0.024993350729346275, -0.02323252707719803, -0.01905222423374653, -0.03676708787679672, 0.026683984324336052, -0.015958290547132492, 0.10444775223731995, 0.04132593795657158, -0.08862008154392242, -0.02236628718674183, -0.040856510400772095, -0.07414023578166962, 0.08981311321258545, -0.0983462706208229, -0.29242801666259766, -0.0540282279253006, -0.02930295653641224, -0.0633009821176529, 0.03642147406935692, 0.009215225465595722, 0.007182321045547724, -0.08197944611310959, -0.06966431438922882, -0.06259313225746155, 0.05404682084918022, -0.04520461708307266, 0.04692081734538078, -0.00606577517464757, -0.03108818084001541, -0.07752390205860138, -0.03338680788874626, 0.01854833960533142, -0.07425878196954727, 0.011957000009715557, -0.08568330854177475, 0.15950508415699005, 0.0513000525534153, -0.03270501643419266, 0.01557939127087593, -0.020598361268639565, 0.2732645869255066, -0.13853326439857483, 0.002591530093923211, 0.08067045360803604, 0.030911922454833984, 0.04864275082945824, 0.19810780882835388, 0.034440163522958755, -0.07962434738874435, 0.017683887854218483, -0.008503175340592861, -0.020844263955950737, 
-0.1635783165693283, -0.1384727656841278, -0.04697675630450249, 0.07678227871656418, 0.06306841224431992, 0.047507453709840775, 0.025056833401322365, 0.013183469884097576, -0.05380938574671745, -0.027435043826699257, 0.08442894369363785, 0.04534665495157242, 0.00692219752818346, 0.006285638548433781, 0.06727425754070282, -0.0342877060174942, -0.01455126702785492, 0.16739681363105774, 0.03488612174987793, 0.12024573236703873, 0.04155639186501503, 0.14102666079998016, 0.026462366804480553, 0.08343733102083206, 0.020050471648573875, 0.006828709971159697, 0.018100561574101448, 0.018546290695667267, -0.027828313410282135, -0.04685131087899208, -0.02701975405216217, 0.04457911103963852, 0.1762409508228302, -0.16333001852035522, -0.054580770432949066, -0.042399194091558456, 0.1301276981830597, 0.222300186753273, 0.05124424025416374, -0.17673613131046295, -0.014521192759275436, 0.02003025822341442, -0.004034474492073059, -0.05078660696744919, 0.018211932852864265, 0.10402018576860428, -0.14797860383987427, 0.05359750613570213, 0.031182773411273956, 0.06551482528448105, -0.11111821979284286, -0.021779483184218407, 0.021890949457883835, -0.007423666771501303, -0.02890004962682724, 0.02641100063920021, -0.2435724288225174, 0.1798037439584732, 0.010506105609238148, 0.054880719631910324, -0.07822664082050323, -0.023419655859470367, 0.0711149275302887, 0.07269991189241409, 0.13589520752429962, 0.019185306504368782, -0.15664122998714447, -0.10407654196023941, -0.1003042533993721, -0.018879828974604607, 0.05781416967511177, -0.07053611427545547, -0.0010562725365161896, -0.010873518884181976, -0.030416961759328842, -0.06589080393314362, 0.052863769233226776, -0.12796272337436676, -0.18103650212287903, 0.059266071766614914, 0.08560081571340561, 0.07050584256649017, -0.041386980563402176, -0.05660141631960869, -0.12066943198442459, 0.17944061756134033, -0.01808772049844265, -0.019060833379626274, -0.10103770345449448, 0.014398648403584957, 0.010947931557893753, -0.03726542741060257, 0.032539140433073044, -0.037816304713487625, 0.09065067023038864, -0.04538026824593544, -0.027134256437420845, 0.08205760270357132, -0.10767095535993576, -0.08783642947673798, -0.061385490000247955, 0.03830675780773163, 0.06726197898387909, 0.028367629274725914, 0.06543655693531036, 0.024822069332003593, 0.011821017600595951, -0.08786816149950027, 0.025972692295908928, 0.04556987062096596, -0.005154143553227186, 0.11855977028608322, -0.037310902029275894, -0.23718786239624023, -0.11158662289381027, -0.07451741397380829, 0.14019905030727386, 0.1728702038526535, -0.036439016461372375, 0.07183392345905304, 0.17201916873455048, -0.06976880133152008, -0.27884745597839355, -0.027467088773846626, -0.043099842965602875, -0.04095816612243652, -0.020372219383716583, -0.18187619745731354, -0.006450964137911797, 0.09496625512838364, -0.020487656816840172, 0.052341341972351074, -0.2632521092891693, -0.08589313179254532, 0.040816571563482285, 0.03532873094081879, 0.11148006469011307, -0.11321261525154114, -0.05295595899224281, -0.07479023933410645, -0.021989380940794945, 0.09286967664957047, 0.014564740471541882, 0.0724741742014885, 0.07762125134468079, -0.012078757397830486, 0.0006970807444304228, -0.019764160737395287, 0.1196105033159256, -0.017564084380865097, 0.006646864116191864, -0.07307464629411697, -0.01817277818918228, 0.08326806128025055, 0.01127347256988287, 0.06688345968723297, -0.013080821372568607, -0.04206451028585434, -0.047734230756759644, -0.026974929496645927, -0.040437351912260056, 0.10703042894601822, 
-0.014097188599407673, 0.013536534272134304, -0.054818764328956604, 0.1347501575946808, 0.04970965161919594, 0.051214899867773056, -0.06950497627258301, -0.051506467163562775, -0.01620061881840229, 0.03387066349387169, 0.05487174913287163, -0.036653757095336914, 0.07834137976169586, -0.006972052622586489, -0.02841697633266449, 0.05315074324607849, 0.049752507358789444, 0.032614268362522125, 0.07120255380868912, -0.025790924206376076, 0.09397371858358383, 0.0007764050969853997, -0.12042934447526932, 0.03325251489877701, 0.1397397220134735, -0.14867551624774933, -0.08300196379423141, 0.002197552239522338, 0.05370068550109863, 0.044983573257923126, -0.01690134033560753, 0.13199611008167267, 0.000649065594188869, -0.036058444529771805, -0.013469476252794266, 0.07040909677743912, -0.0019313953816890717, 0.09870240092277527, -0.02378697693347931, -0.038796935230493546, -0.1107570081949234, 0.10488581657409668, 0.12651456892490387, -0.1122131273150444, 0.03095903806388378, 0.09883163124322891, -0.05662218853831291, -0.07478557527065277, -0.045982688665390015, 0.03213515132665634, 0.01074162032455206, -0.11038029938936234, -0.03918750584125519, -0.029039211571216583, 0.05185534805059433, 0.11235979944467545, 0.028632160276174545, 0.03902020305395126, 0.06703170388936996, -0.027875274419784546, -0.03435426950454712, 0.07890061289072037, 0.051996827125549316, -0.01949257031083107, -0.007916552014648914, -0.06292033940553665, -0.0006879816646687686, -0.045191358774900436, -0.01197389978915453, -0.048937439918518066, -0.16252166032791138, -0.006168333813548088, 0.08183258026838303, -0.0064117759466171265, -0.11641146242618561, -0.0019716823007911444, 0.000538328371476382, -0.05374816432595253, -0.06834559887647629, -0.015409850515425205, -0.059601858258247375, -0.05974690243601799, -0.0004549362347461283, 0.1066395565867424, -0.18146808445453644, -0.019001679494976997, 0.0857049897313118, -0.04088912531733513, 0.10481434315443039, 0.04015742987394333, -0.014502759091556072, -0.044775303453207016, -0.2617116868495941, -0.05158178508281708, -0.05929145589470863, 0.03385396674275398, 0.030043678358197212, -0.18737459182739258, 0.04619124159216881, 0.0502990186214447, -0.03877883404493332, 0.037847474217414856, 0.1045544296503067, -0.05591418221592903, 0.04998480901122093, 0.05487211048603058, -0.13948862254619598, -0.01597931981086731, 0.044129371643066406, 0.1303294152021408, -0.023579344153404236, 0.11990711092948914, -0.0765673816204071, 0.013496833853423595, 0.007544134743511677, 0.03779811039566994, 0.0339653454720974, -0.0728052482008934, -0.040231380611658096, 0.0026435735635459423, 0.0593251995742321, 0.008635505102574825, 0.167591392993927, 0.11742817610502243, -0.05257339030504227, 0.06085134670138359, 0.047002099454402924, -0.07300543785095215, 0.06946801394224167, 0.049213580787181854, 0.020129960030317307, -0.03267262503504753, -0.042092543095350266, 0.028415612876415253, 0.0024318979121744633, 0.11756565421819687, 0.08337884396314621, 0.15922148525714874, 0.1524980366230011, 0.03335241600871086, 0.07748729735612869, -0.02533240057528019, -0.09411658346652985, 0.008060077205300331, -0.054404549300670624, 0.08759169280529022, -0.07301989942789078, 0.04509185627102852, 0.058656658977270126, -0.17514130473136902, 0.04396575316786766, -0.06730777770280838, -0.0491420142352581, -0.04554876685142517, -0.20814979076385498, -0.055321335792541504, -0.06533502787351608, 0.021343408152461052, -0.10496805608272552, 0.10195240378379822, 0.0299597829580307, 0.05040450021624565, -0.07842954248189926, 
0.029749739915132523, -0.15906184911727905, -0.03405182063579559, 0.12920065224170685, -0.018914395943284035, 0.0191965289413929, -0.05726931244134903, -0.0578932948410511, -0.08756888657808304, 0.06711245328187943, 0.045980099588632584, 0.0769980326294899, 0.01574457809329033, 0.015587564557790756, -0.08125868439674377, -0.09475865960121155, -0.007980112917721272, -0.04555497318506241, -0.05026138573884964, 0.11692202836275101, 0.0520687960088253, 0.002884549554437399, 0.058860793709754944, 0.13513056933879852, 0.008260191418230534, -0.02197383902966976, -0.23313911259174347, 0.012138527818024158, -0.02390475757420063, 0.02156272903084755, -0.010153008624911308, -0.09708713740110397, -0.04318860173225403, 0.23510417342185974, 0.20151680707931519, -0.06856711953878403, 0.0011177083943039179, -0.028007078915834427, 0.0175408273935318, 0.01881510019302368, 0.06269026547670364, 0.04899470508098602, 0.1734807938337326, -0.022048959508538246, 0.0337190106511116, -0.054411087185144424, -0.019582075998187065, -0.1442088484764099, 0.08846279233694077, 0.024379748851060867, -0.01903119683265686, -0.0924152210354805, 0.1291869878768921, -0.10244811326265335, -0.12020532786846161, 0.003963773604482412, -0.11675041168928146, -0.10319387912750244, 0.018341168761253357, -0.05515943467617035, 0.06237202137708664, 0.0958707258105278, 0.044865623116493225, -0.034648869186639786, 0.030286328867077827, 0.030866196379065514, -0.07840163260698318, -0.07681985199451447, 0.037938911467790604, -0.07344905287027359, 0.28260505199432373, -0.02407880499958992, 0.019409243017435074, 0.09984464198350906, 0.027492765337228775, -0.09806215018033981, 0.00978496391326189, 0.05938725918531418, -0.0223337821662426, 0.0006781925330869853, 0.06725609302520752, -0.03915638476610184, 0.15318500995635986, 0.08519251644611359, 0.016057521104812622, 0.07960443198680878, -0.06222141906619072, 0.08644918352365494, -0.07912586629390717, 0.09829834848642349, -0.14452046155929565, 0.13834211230278015, 0.12444320321083069, -0.011607258580625057, 0.011462614871561527, -0.01590726152062416, 0.02730046771466732, -0.016048870980739594, -0.012874389998614788, -0.07103081792593002, -0.19356517493724823, -0.009122694842517376, -0.04932200163602829, 0.10542239248752594, -0.11318404227495193, -0.01649579592049122, -0.10347190499305725, 0.02120295539498329, -0.04864317551255226, 0.09658100455999374, 0.11921146512031555, -0.05850217863917351, -0.04520450532436371, -0.09899234026670456, 0.032788652926683426, 0.08229575306177139, -0.025662139058113098, -0.04937511309981346 ]
null
null
transformers
# UmBERTo Wikipedia Uncased

[UmBERTo](https://github.com/musixmatchresearch/umberto) is a RoBERTa-based language model trained on large Italian corpora using two innovative approaches: SentencePiece and Whole Word Masking. Now available at [github.com/huggingface/transformers](https://huggingface.co/Musixmatch/umberto-commoncrawl-cased-v1)

<p align="center">
    <img src="https://user-images.githubusercontent.com/7140210/72913702-d55a8480-3d3d-11ea-99fc-f2ef29af4e72.jpg" width="700"> <br>
    Marco Lodola, Monument to Umberto Eco, Alessandria 2019
</p>

## Dataset

UmBERTo-Wikipedia-Uncased is trained on a relatively small corpus (~7 GB) extracted from [Wikipedia-ITA](https://linguatools.org/tools/corpora/wikipedia-monolingual-corpora/).

## Pre-trained model

| Model | WWM | Cased | Tokenizer | Vocab Size | Train Steps | Download |
| ------ | ------ | ------ | ------ | ------ | ------ | ------ |
| `umberto-wikipedia-uncased-v1` | YES | NO | SPM | 32K | 100k | [Link](http://bit.ly/35wbSj6) |

This model was trained with [SentencePiece](https://github.com/google/sentencepiece) and Whole Word Masking.

## Downstream Tasks

These results refer to the umberto-wikipedia-uncased model. All details are on the [UmBERTo](https://github.com/musixmatchresearch/umberto) official page.

#### Named Entity Recognition (NER)

| Dataset | F1 | Precision | Recall | Accuracy |
| ------ | ------ | ------ | ------ | ------ |
| **ICAB-EvalITA07** | **86.240** | 85.939 | 86.544 | 98.534 |
| **WikiNER-ITA** | **90.483** | 90.328 | 90.638 | 98.661 |

#### Part of Speech (POS)

| Dataset | F1 | Precision | Recall | Accuracy |
| ------ | ------ | ------ | ------ | ------ |
| **UD_Italian-ISDT** | 98.563 | 98.508 | 98.618 | **98.717** |
| **UD_Italian-ParTUT** | 97.810 | 97.835 | 97.784 | **98.060** |

## Usage

##### Load UmBERTo Wikipedia Uncased with AutoModel, AutoTokenizer:

```python
import torch
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("Musixmatch/umberto-wikipedia-uncased-v1")
umberto = AutoModel.from_pretrained("Musixmatch/umberto-wikipedia-uncased-v1")

encoded_input = tokenizer.encode("Umberto Eco è stato un grande scrittore")
input_ids = torch.tensor(encoded_input).unsqueeze(0)  # Batch size 1
outputs = umberto(input_ids)
last_hidden_states = outputs[0]  # The last hidden-state is the first element of the output
```

##### Predict masked token:

```python
from transformers import pipeline

fill_mask = pipeline(
    "fill-mask",
    model="Musixmatch/umberto-wikipedia-uncased-v1",
    tokenizer="Musixmatch/umberto-wikipedia-uncased-v1"
)

result = fill_mask("Umberto Eco è <mask> un grande scrittore")
# {'sequence': '<s> umberto eco è stato un grande scrittore</s>', 'score': 0.5784581303596497, 'token': 361}
# {'sequence': '<s> umberto eco è anche un grande scrittore</s>', 'score': 0.33813193440437317, 'token': 269}
# {'sequence': '<s> umberto eco è considerato un grande scrittore</s>', 'score': 0.027196012437343597, 'token': 3236}
# {'sequence': '<s> umberto eco è diventato un grande scrittore</s>', 'score': 0.013716378249228, 'token': 5742}
# {'sequence': '<s> umberto eco è inoltre un grande scrittore</s>', 'score': 0.010662357322871685, 'token': 1030}
```

## Citation

All of the original datasets are publicly available or were released with the owners' grant. The datasets are all released under a CC0 or CC-BY license.
* UD Italian-ISDT Dataset [Github](https://github.com/UniversalDependencies/UD_Italian-ISDT)
* UD Italian-ParTUT Dataset [Github](https://github.com/UniversalDependencies/UD_Italian-ParTUT)
* I-CAB (Italian Content Annotation Bank), EvalITA [Page](http://www.evalita.it/)
* WIKINER [Page](https://figshare.com/articles/Learning_multilingual_named_entity_recognition_from_Wikipedia/5462500), [Paper](https://www.sciencedirect.com/science/article/pii/S0004370212000276?via%3Dihub)

```
@inproceedings{magnini2006annotazione,
  title = {Annotazione di contenuti concettuali in un corpus italiano: I-CAB},
  author = {Magnini, Bernardo and Cappelli, Amedeo and Pianta, Emanuele and Speranza, Manuela and Bartalesi Lenzi, V and Sprugnoli, Rachele and Romano, Lorenza and Girardi, Christian and Negri, Matteo},
  booktitle = {Proc. of SILFI 2006},
  year = {2006}
}
@inproceedings{magnini2006cab,
  title = {I-CAB: the Italian Content Annotation Bank.},
  author = {Magnini, Bernardo and Pianta, Emanuele and Girardi, Christian and Negri, Matteo and Romano, Lorenza and Speranza, Manuela and Lenzi, Valentina Bartalesi and Sprugnoli, Rachele},
  booktitle = {LREC},
  pages = {963--968},
  year = {2006},
  organization = {Citeseer}
}
```

## Authors

**Loreto Parisi**: `loreto at musixmatch dot com`, [loretoparisi](https://github.com/loretoparisi)
**Simone Francia**: `simone.francia at musixmatch dot com`, [simonefrancia](https://github.com/simonefrancia)
**Paolo Magnani**: `paul.magnani95 at gmail dot com`, [paulthemagno](https://github.com/paulthemagno)

## About Musixmatch AI

![Musixmatch AI mac app icon-128](https://user-images.githubusercontent.com/163333/72244273-396aa380-35ee-11ea-894b-4ea48230c02b.png)

We do Machine Learning and Artificial Intelligence @[musixmatch](https://twitter.com/Musixmatch)

Follow us on [Twitter](https://twitter.com/musixmatchai) [Github](https://github.com/musixmatchresearch)
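The usage example in the card above stops at `last_hidden_state`. A common follow-on, which the card itself does not prescribe, is to mean-pool the token vectors into a single sentence embedding; the snippet below is a small sketch of that heuristic, with arbitrarily chosen example sentences.

```python
# Hedged sketch only: mean pooling is one common heuristic for sentence
# embeddings; the card itself does not prescribe it.
import torch
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("Musixmatch/umberto-wikipedia-uncased-v1")
umberto = AutoModel.from_pretrained("Musixmatch/umberto-wikipedia-uncased-v1")
umberto.eval()

sentences = [
    "Umberto Eco è stato un grande scrittore",
    "Marco Lodola è uno scultore italiano",
]

# Tokenize as a padded batch; attention_mask marks the real (non-padding) tokens.
batch = tokenizer(sentences, padding=True, truncation=True, return_tensors="pt")

with torch.no_grad():
    hidden = umberto(**batch).last_hidden_state        # (batch, seq_len, hidden_size)

# Average only over real tokens to get one fixed-size vector per sentence.
mask = batch["attention_mask"].unsqueeze(-1).float()   # (batch, seq_len, 1)
sentence_embeddings = (hidden * mask).sum(dim=1) / mask.sum(dim=1)

similarity = torch.nn.functional.cosine_similarity(
    sentence_embeddings[0], sentence_embeddings[1], dim=0
)
print(sentence_embeddings.shape, float(similarity))
```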
{"language": "it"}
fill-mask
Musixmatch/umberto-wikipedia-uncased-v1
[ "transformers", "pytorch", "camembert", "fill-mask", "it", "autotrain_compatible", "endpoints_compatible", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[ "it" ]
TAGS #transformers #pytorch #camembert #fill-mask #it #autotrain_compatible #endpoints_compatible #region-us
UmBERTo Wikipedia Uncased ========================= UmBERTo is a Roberta-based Language Model trained on large Italian Corpora and uses two innovative approaches: SentencePiece and Whole Word Masking. Now available at URL ![](URL width=) Marco Lodola, Monument to Umberto Eco, Alessandria 2019 Dataset ------- UmBERTo-Wikipedia-Uncased Training is trained on a relative small corpus (~7GB) extracted from Wikipedia-ITA. Pre-trained model ----------------- This model was trained with SentencePiece and Whole Word Masking. Downstream Tasks ---------------- These results refers to umberto-wikipedia-uncased model. All details are at Umberto Official Page. #### Named Entity Recognition (NER) #### Part of Speech (POS) Usage ----- ##### Load UmBERTo Wikipedia Uncased with AutoModel, Autotokenizer: ##### Predict masked token: All of the original datasets are publicly available or were released with the owners' grant. The datasets are all released under a CC0 or CCBY license. * UD Italian-ISDT Dataset Github * UD Italian-ParTUT Dataset Github * I-CAB (Italian Content Annotation Bank), EvalITA Page * WIKINER Page , Paper Authors ------- Loreto Parisi: 'loreto at musixmatch dot com', loretoparisi Simone Francia: 'simone.francia at musixmatch dot com', simonefrancia Paolo Magnani: 'paul.magnani95 at gmail dot com', paulthemagno About Musixmatch AI ------------------- !Musxmatch Ai mac app icon-128 We do Machine Learning and Artificial Intelligence @musixmatch Follow us on Twitter Github
[ "#### Named Entity Recognition (NER)", "#### Part of Speech (POS)\n\n\n\nUsage\n-----", "##### Load UmBERTo Wikipedia Uncased with AutoModel, Autotokenizer:", "##### Predict masked token:\n\n\nAll of the original datasets are publicly available or were released with the owners' grant. The datasets are all released under a CC0 or CCBY license.\n\n\n* UD Italian-ISDT Dataset Github\n* UD Italian-ParTUT Dataset Github\n* I-CAB (Italian Content Annotation Bank), EvalITA Page\n* WIKINER Page , Paper\n\n\nAuthors\n-------\n\n\nLoreto Parisi: 'loreto at musixmatch dot com', loretoparisi\nSimone Francia: 'simone.francia at musixmatch dot com', simonefrancia\nPaolo Magnani: 'paul.magnani95 at gmail dot com', paulthemagno\n\n\nAbout Musixmatch AI\n-------------------\n\n\n!Musxmatch Ai mac app icon-128\nWe do Machine Learning and Artificial Intelligence @musixmatch\nFollow us on Twitter Github" ]
[ "TAGS\n#transformers #pytorch #camembert #fill-mask #it #autotrain_compatible #endpoints_compatible #region-us \n", "#### Named Entity Recognition (NER)", "#### Part of Speech (POS)\n\n\n\nUsage\n-----", "##### Load UmBERTo Wikipedia Uncased with AutoModel, Autotokenizer:", "##### Predict masked token:\n\n\nAll of the original datasets are publicly available or were released with the owners' grant. The datasets are all released under a CC0 or CCBY license.\n\n\n* UD Italian-ISDT Dataset Github\n* UD Italian-ParTUT Dataset Github\n* I-CAB (Italian Content Annotation Bank), EvalITA Page\n* WIKINER Page , Paper\n\n\nAuthors\n-------\n\n\nLoreto Parisi: 'loreto at musixmatch dot com', loretoparisi\nSimone Francia: 'simone.francia at musixmatch dot com', simonefrancia\nPaolo Magnani: 'paul.magnani95 at gmail dot com', paulthemagno\n\n\nAbout Musixmatch AI\n-------------------\n\n\n!Musxmatch Ai mac app icon-128\nWe do Machine Learning and Artificial Intelligence @musixmatch\nFollow us on Twitter Github" ]
[ 40, 12, 12, 20, 194 ]
[ "passage: TAGS\n#transformers #pytorch #camembert #fill-mask #it #autotrain_compatible #endpoints_compatible #region-us \n#### Named Entity Recognition (NER)#### Part of Speech (POS)\n\n\n\nUsage\n-----##### Load UmBERTo Wikipedia Uncased with AutoModel, Autotokenizer:##### Predict masked token:\n\n\nAll of the original datasets are publicly available or were released with the owners' grant. The datasets are all released under a CC0 or CCBY license.\n\n\n* UD Italian-ISDT Dataset Github\n* UD Italian-ParTUT Dataset Github\n* I-CAB (Italian Content Annotation Bank), EvalITA Page\n* WIKINER Page , Paper\n\n\nAuthors\n-------\n\n\nLoreto Parisi: 'loreto at musixmatch dot com', loretoparisi\nSimone Francia: 'simone.francia at musixmatch dot com', simonefrancia\nPaolo Magnani: 'paul.magnani95 at gmail dot com', paulthemagno\n\n\nAbout Musixmatch AI\n-------------------\n\n\n!Musxmatch Ai mac app icon-128\nWe do Machine Learning and Artificial Intelligence @musixmatch\nFollow us on Twitter Github" ]
[ -0.05890237167477608, 0.2755753993988037, -0.0035437115002423525, 0.07187652587890625, 0.1290571242570877, 0.007013743277639151, 0.15732090175151825, 0.09867870807647705, 0.12197273224592209, 0.09762001037597656, 0.044749584048986435, 0.08905934542417526, 0.10306596755981445, 0.1700955480337143, -0.01677008531987667, -0.16142380237579346, 0.06495124846696854, -0.10245347023010254, 0.017704684287309647, 0.058777324855327606, 0.10582847893238068, -0.03488778695464134, 0.11732762306928635, -0.0033920733258128166, -0.06183332949876785, 0.023396441712975502, -0.0102846035733819, -0.08712301403284073, 0.05268184840679169, 0.0699053406715393, 0.01603478193283081, 0.01647060364484787, 0.008354109711945057, -0.09637068212032318, 0.028766050934791565, 0.04157367721199989, -0.00821128860116005, 0.0044220504350960255, 0.07881050556898117, -0.05819560959935188, 0.13431492447853088, -0.030406733974814415, 0.031402405351400375, 0.01573832519352436, -0.11413020640611649, 0.01834568753838539, -0.09864311665296555, 0.03919177129864693, -0.017404476180672646, 0.08301416784524918, -0.029309188947081566, 0.13514861464500427, -0.05853007361292839, 0.029811181128025055, 0.15942420065402985, -0.2045900970697403, -0.06515083461999893, -0.01711265556514263, 0.0010822908952832222, 0.017588580027222633, -0.04764197766780853, 0.07264679670333862, 0.003930604085326195, -0.010867663659155369, -0.06474995613098145, -0.019875183701515198, 0.03957955911755562, -0.048617638647556305, -0.07443253695964813, -0.033337101340293884, 0.24772992730140686, -0.005605990532785654, -0.04979891702532768, -0.1026599109172821, 0.02098378911614418, 0.1262182593345642, -0.0316474623978138, 0.0076026287861168385, -0.002774657215923071, 0.024556409567594528, 0.005699275992810726, -0.0065962341614067554, -0.042969755828380585, 0.008579857647418976, -0.06739602982997894, 0.051255665719509125, 0.0066434601321816444, 0.021831395104527473, 0.009424767456948757, -0.026088012382388115, -0.11973080039024353, -0.08115134388208389, -0.009946648962795734, -0.06497214734554291, -0.0001648303441470489, -0.035260673612356186, -0.044775549322366714, -0.08403128385543823, 0.13156838715076447, 0.16095314919948578, -0.025698116049170494, -0.013365374878048897, -0.02060176432132721, 0.06654184311628342, 0.0947023406624794, 0.07868741452693939, -0.1924906224012375, -0.10652010887861252, 0.02919125184416771, -0.06690780073404312, 0.023998010903596878, -0.05494300276041031, -0.03816302493214607, 0.06384439766407013, -0.05291004478931427, 0.0535186231136322, 0.09308944642543793, 0.04055844992399216, -0.12016582489013672, -0.049524370580911636, 0.13975802063941956, -0.09533798694610596, 0.0693010538816452, 0.02753444015979767, -0.08134157210588455, 0.020062893629074097, -0.0548289529979229, 0.03122139908373356, -0.08096816390752792, 0.0501612089574337, -0.061279281973838806, -0.02301199920475483, -0.0481216236948967, -0.07675681263208389, 0.07522884756326675, -0.041397932916879654, -0.02968950942158699, -0.1335774064064026, -0.07687098532915115, -0.07099554687738419, 0.06547819077968597, -0.08695931732654572, -0.001382616232149303, -0.028021488338708878, 0.024949420243501663, 0.009689540602266788, 0.017466723918914795, -0.08689578622579575, -0.07603438943624496, 0.0335354283452034, -0.10631163418292999, 0.08894141763448715, -0.09855878353118896, 0.038761597126722336, -0.09450078755617142, 0.0036298539489507675, -0.16619716584682465, 0.06087719649076462, -0.0959935411810875, 0.0839328020811081, -0.11125854402780533, 0.028589846566319466, 0.010227769613265991, 
0.0022169032599776983, 0.09389146417379379, 0.16043737530708313, -0.10112562030553818, -0.052064746618270874, 0.12714208662509918, -0.09177502989768982, -0.06673292070627213, 0.206623837351799, 0.025530623272061348, -0.005037693772464991, 0.08195944130420685, 0.17735663056373596, 0.03874850273132324, -0.11346203088760376, -0.03630945086479187, -0.06339152902364731, -0.01957649365067482, 0.026706121861934662, 0.07617773860692978, -0.04683653637766838, 0.028668424114584923, 0.0067553091794252396, -0.05315203592181206, 0.0023495990317314863, -0.026084935292601585, -0.06963581591844559, 0.016232691705226898, -0.11562787741422653, -0.0028496028389781713, 0.031025761738419533, 0.007858308963477612, -0.061496224254369736, -0.03382537141442299, -0.17534203827381134, 0.04574691876769066, -0.02350800484418869, -0.021854860708117485, -0.11529537290334702, 0.21501505374908447, 0.10400579124689102, 0.017739035189151764, -0.08480411022901535, -0.04472046718001366, 0.06704787909984589, -0.021737847477197647, 0.05125415325164795, -0.10897728055715561, 0.008679019287228584, 0.03274740278720856, -0.0031728697940707207, 0.01919451914727688, 0.04453446716070175, 0.0011790477437898517, 0.006177574396133423, -0.15596118569374084, 0.05846861004829407, -0.032360613346099854, 0.11822355538606644, -0.08671895414590836, 0.006454596295952797, 0.07814134657382965, 0.10671573877334595, 0.008520033210515976, -0.04969659447669983, -0.014862923882901669, 0.03783585503697395, -0.011703066527843475, -0.044659554958343506, 0.04409331455826759, 0.021835895255208015, 0.006785497069358826, 0.1311223804950714, -0.032530564814805984, 0.03244316205382347, 0.12847402691841125, -0.013827966526150703, -0.053708016872406006, 0.06957906484603882, 0.007076536305248737, -0.0071369619108736515, -0.061422497034072876, -0.021339815109968185, 0.08900195360183716, 0.012791721150279045, 0.04416299983859062, -0.08939392119646072, -0.005208825226873159, 0.008054635487496853, -0.03322594612836838, -0.14566537737846375, 0.09025277942419052, 0.15218646824359894, -0.17533041536808014, 0.11161977797746658, 0.09994364529848099, 0.02228555455803871, 0.2093670517206192, 0.02620934136211872, -0.06269192695617676, -0.01205811183899641, 0.03053172491490841, 0.04184925556182861, 0.044395893812179565, -0.12959370017051697, -0.026714066043496132, 0.02179962396621704, -0.04234945774078369, 0.018807606771588326, -0.06073354557156563, -0.021361803635954857, -0.027064332738518715, -0.021515097469091415, -0.035274654626846313, 0.020149046555161476, -0.015230318531394005, 0.10428053885698318, 0.040032822638750076, -0.08895710855722427, -0.015763860195875168, -0.03688760846853256, -0.07719238102436066, 0.09621623903512955, -0.09649936109781265, -0.2970665395259857, -0.059631675481796265, -0.032096944749355316, -0.07725105434656143, 0.03055685944855213, 0.005694875027984381, 0.008952876552939415, -0.08506673574447632, -0.06676745414733887, -0.06549646705389023, 0.06352927535772324, -0.05070311576128006, 0.04178253933787346, -0.0030298747587949038, -0.03524850681424141, -0.07026786357164383, -0.03448314964771271, 0.022239642217755318, -0.07130163162946701, 0.010972664691507816, -0.09405425190925598, 0.15632615983486176, 0.040440015494823456, -0.03413361310958862, 0.014621628448367119, -0.014328298158943653, 0.26668697595596313, -0.1369834691286087, 0.00828242301940918, 0.07615996152162552, 0.030329076573252678, 0.05247711017727852, 0.19617974758148193, 0.03755681589245796, -0.08217696845531464, 0.013407157734036446, -0.005745663307607174, -0.01813516393303871, 
-0.17265547811985016, -0.12957793474197388, -0.04111959785223007, 0.07520604133605957, 0.06653640419244766, 0.050032418221235275, 0.03851215913891792, 0.015924878418445587, -0.055170938372612, -0.014721037819981575, 0.08563698828220367, 0.05254920944571495, 0.007669067941606045, 0.019241919741034508, 0.05742871016263962, -0.032345838844776154, -0.006238587200641632, 0.16614149510860443, 0.041448015719652176, 0.13723422586917877, 0.04400159791111946, 0.15105518698692322, 0.023569811135530472, 0.07852118462324142, 0.016905825585126877, 0.010602910071611404, 0.014923825860023499, 0.02051292173564434, -0.021425886079669, -0.04177975654602051, -0.02274731919169426, 0.036908917129039764, 0.16095373034477234, -0.15362298488616943, -0.055512432008981705, -0.03362478315830231, 0.12500441074371338, 0.23726463317871094, 0.045279186218976974, -0.1774531602859497, -0.01679607667028904, 0.013655953109264374, -0.011614284478127956, -0.054686255753040314, 0.02160998061299324, 0.09418739378452301, -0.15535205602645874, 0.05999205261468887, 0.028286432847380638, 0.06771102547645569, -0.1292550414800644, -0.026667088270187378, 0.02445938065648079, -0.004593482706695795, -0.030212556943297386, 0.024601956829428673, -0.2328687310218811, 0.17538748681545258, 0.011637701652944088, 0.060311175882816315, -0.07443694025278091, -0.02559412457048893, 0.07267243415117264, 0.08563084900379181, 0.14421479403972626, 0.02004859782755375, -0.14146684110164642, -0.09557849913835526, -0.09930230677127838, -0.01950942538678646, 0.057606179267168045, -0.06909435987472534, -0.004068886861205101, -0.014348594471812248, -0.034447696059942245, -0.07596012949943542, 0.03207000717520714, -0.13413771986961365, -0.17651799321174622, 0.05423557385802269, 0.07985863089561462, 0.07000281661748886, -0.03791193664073944, -0.052508242428302765, -0.10290919244289398, 0.18884491920471191, -0.02287057600915432, -0.024906693026423454, -0.1031138226389885, 0.006751218345016241, 0.011403878219425678, -0.0413086824119091, 0.03491818904876709, -0.044399019330739975, 0.09173668175935745, -0.04896247386932373, -0.032524365931749344, 0.07720984518527985, -0.11030762642621994, -0.08797691762447357, -0.06418723613023758, 0.03327242657542229, 0.06918492913246155, 0.0183012243360281, 0.07010841369628906, 0.021023059263825417, 0.02043513022363186, -0.08265169709920883, 0.021865928545594215, 0.05324612185359001, -0.009413708932697773, 0.126052588224411, -0.03750846907496452, -0.2428620308637619, -0.12340794503688812, -0.07332804054021835, 0.13465361297130585, 0.1670444905757904, -0.02610921859741211, 0.07710084319114685, 0.18401671946048737, -0.07776229828596115, -0.2763667404651642, -0.02929564006626606, -0.048292770981788635, -0.05000727251172066, -0.02736714668571949, -0.1843879222869873, 0.002982545644044876, 0.09816823899745941, -0.014705313369631767, 0.06780604273080826, -0.26098859310150146, -0.08230675011873245, 0.0378975011408329, 0.04485223442316055, 0.10195616632699966, -0.11235376447439194, -0.05892826244235039, -0.0801834762096405, -0.021086513996124268, 0.095210000872612, 0.029795566573739052, 0.06932056695222855, 0.06430274993181229, -0.007905283942818642, -0.0012477199779823422, -0.017386872321367264, 0.110043965280056, -0.011884172447025776, 0.010442541912198067, -0.06787082552909851, -0.009625723585486412, 0.08198966085910797, 0.008179889060556889, 0.07485463470220566, 0.0016603900585323572, -0.036260806024074554, -0.029266299679875374, -0.029322540387511253, -0.04429911822080612, 0.10627132654190063, -0.012982548214495182, 
0.01729770191013813, -0.04898376017808914, 0.13212950527668, 0.053072236478328705, 0.05447772517800331, -0.06337765604257584, -0.04253311827778816, -0.02096729539334774, 0.01990962214767933, 0.060744427144527435, -0.03378349915146828, 0.06543850153684616, -0.014050289988517761, -0.030636217445135117, 0.049478717148303986, 0.04714974761009216, 0.034088246524333954, 0.07207733392715454, -0.01931867189705372, 0.09324859827756882, -0.0019054836593568325, -0.12245676666498184, 0.028164207935333252, 0.1301642805337906, -0.1482393443584442, -0.07117437571287155, 0.0019273018697276711, 0.049594249576330185, 0.053026534616947174, -0.009350510314106941, 0.12927469611167908, -0.00031322534778155386, -0.039093438535928726, -0.013483409769833088, 0.06983068585395813, 0.0002451159234624356, 0.09575426578521729, -0.02438715100288391, -0.04434770345687866, -0.1117127388715744, 0.12023802101612091, 0.12088670581579208, -0.10431336611509323, 0.028115255758166313, 0.10721214860677719, -0.06387388706207275, -0.07167311012744904, -0.047071702778339386, 0.04180101305246353, -0.0025402551982551813, -0.11342228949069977, -0.04170684143900871, -0.020333504304289818, 0.049354176968336105, 0.12397734075784683, 0.022350545972585678, 0.03591934219002724, 0.06503646820783615, -0.03622319921851158, -0.03923313319683075, 0.08069669455289841, 0.04374801367521286, -0.018233247101306915, -0.009521350264549255, -0.08824366331100464, 0.0029150554910302162, -0.041379693895578384, -0.008068637922406197, -0.04811062663793564, -0.15980100631713867, -0.005723336711525917, 0.06926358491182327, 0.0027892193756997585, -0.11711124330759048, 0.00031683919951319695, -0.003130336059257388, -0.05129013955593109, -0.06256389617919922, -0.014509460888803005, -0.05350181832909584, -0.06122366711497307, 0.005302098114043474, 0.10401833057403564, -0.18314114212989807, -0.023739464581012726, 0.08733741194009781, -0.04171933978796005, 0.10280293971300125, 0.03986010700464249, -0.01758882962167263, -0.038413576781749725, -0.27105098962783813, -0.05076800286769867, -0.06286072731018066, 0.030965812504291534, 0.032440729439258575, -0.18440426886081696, 0.04249624162912369, 0.05129252374172211, -0.04146520793437958, 0.038404498249292374, 0.097946397960186, -0.051978614181280136, 0.039176009595394135, 0.05399877950549126, -0.13892246782779694, -0.016572466120123863, 0.04579127952456474, 0.13993872702121735, -0.02606065198779106, 0.11874666064977646, -0.08026555925607681, 0.01636478863656521, 0.013814142905175686, 0.0349700041115284, 0.0351719930768013, -0.07001157104969025, -0.04229002073407173, 0.006504993885755539, 0.05384359881281853, 0.012429225258529186, 0.16755740344524384, 0.10411842912435532, -0.03589252382516861, 0.05781782791018486, 0.04667338356375694, -0.08454324305057526, 0.06699235737323761, 0.0411098413169384, 0.015997890383005142, -0.03702465444803238, -0.043595779687166214, 0.029889684170484543, -0.00011747235839720815, 0.11214584857225418, 0.08036382496356964, 0.14736522734165192, 0.14703764021396637, 0.030088935047388077, 0.07596339285373688, -0.015295824967324734, -0.08164076507091522, -0.004055627156049013, -0.050131987780332565, 0.0837642028927803, -0.05716640129685402, 0.05087396502494812, 0.06617618352174759, -0.17409710586071014, 0.04692041873931885, -0.05592403933405876, -0.05558355897665024, -0.0452238954603672, -0.21937450766563416, -0.05637790635228157, -0.06064537912607193, 0.02602190338075161, -0.10736660659313202, 0.10104938596487045, 0.029577238485217094, 0.05341754108667374, -0.0814959779381752, 0.0312623456120491, 
-0.1465759426355362, -0.03830568864941597, 0.13551415503025055, -0.017107252031564713, 0.010105786845088005, -0.048223089426755905, -0.05261576920747757, -0.08866510540246964, 0.06279228627681732, 0.04661983251571655, 0.08209802210330963, 0.019941117614507675, 0.018062466755509377, -0.07666619122028351, -0.0963917151093483, -0.004284991417080164, -0.04611829295754433, -0.06466080248355865, 0.11317400634288788, 0.05674010142683983, 0.00046282430412247777, 0.06347379833459854, 0.12850908935070038, 0.016241000965237617, -0.015367133542895317, -0.2298569232225418, 0.025652945041656494, -0.024053238332271576, 0.01907050982117653, -0.0038050857838243246, -0.09753759205341339, -0.034252408891916275, 0.23700445890426636, 0.18753300607204437, -0.058429401367902756, -0.0031233951449394226, -0.03311794623732567, 0.01493571512401104, 0.018224967643618584, 0.060828324407339096, 0.04678034409880638, 0.17088723182678223, -0.02257726527750492, 0.034374259412288666, -0.05610279366374016, -0.017622921615839005, -0.15314272046089172, 0.08479229360818863, 0.024516642093658447, -0.009908094070851803, -0.09999019652605057, 0.1348010003566742, -0.11751022934913635, -0.12517370283603668, 0.014148632995784283, -0.10354828834533691, -0.10319029539823532, 0.006186278071254492, -0.07562097907066345, 0.057527218014001846, 0.08763842284679413, 0.04701537638902664, -0.03508678451180458, 0.036230914294719696, 0.03175469860434532, -0.08620631694793701, -0.08312641084194183, 0.030216895043849945, -0.08512550592422485, 0.28335344791412354, -0.01934368535876274, 0.025546450167894363, 0.09520753473043442, 0.028088988736271858, -0.09516505151987076, 0.006340735126286745, 0.055790793150663376, -0.007281204219907522, -0.0015665000537410378, 0.079017274081707, -0.03506786748766899, 0.15093347430229187, 0.08429450541734695, 0.029232051223516464, 0.07549240440130234, -0.048112042248249054, 0.08422350138425827, -0.08100403845310211, 0.10460808873176575, -0.1452098786830902, 0.13599306344985962, 0.12098479270935059, -0.012334439903497696, 0.016406457871198654, -0.021763211116194725, 0.029758816584944725, -0.01226838119328022, -0.02274361439049244, -0.06501144170761108, -0.19097240269184113, -0.0035658928100019693, -0.046622663736343384, 0.10968592762947083, -0.11896063387393951, -0.01233839150518179, -0.10471917688846588, 0.017284153029322624, -0.05526326596736908, 0.09735111147165298, 0.11325821280479431, -0.05124481022357941, -0.047054946422576904, -0.10417430847883224, 0.03309047222137451, 0.07892917841672897, -0.02215997874736786, -0.053197648376226425 ]
null
null
null
Source language: Finnish
Target language: Swedish
Training dataset: https://opus.nlpl.eu/
Framework toolkit: Fairseq
Model architecture: transformer_vaswani_wmt_en_de_big
Repository: https://github.com/MusserO/BERT-fused_fi-sv
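The card names the toolkit (Fairseq) and the architecture but does not show how a trained checkpoint would be loaded. The sketch below uses Fairseq's standard hub interface under several assumptions: the checkpoint directory, data-bin path, and SentencePiece model are hypothetical placeholders, and the checkpoint is assumed to be a plain Transformer rather than the BERT-fused variant suggested by the repository name (in which case that repository's own loading instructions would apply).

```python
# Hedged sketch only: all paths and preprocessing settings below are placeholders.
from fairseq.models.transformer import TransformerModel

fi2sv = TransformerModel.from_pretrained(
    "checkpoints/",                        # hypothetical directory containing the trained model
    checkpoint_file="checkpoint_best.pt",  # hypothetical checkpoint name
    data_name_or_path="data-bin/fi-sv",    # binarized data dir holding the fi/sv dictionaries
    bpe="sentencepiece",                   # assumption about how the corpus was preprocessed
    sentencepiece_model="spm.fi-sv.model", # hypothetical SentencePiece model path
)
fi2sv.eval()

# Translate a Finnish sentence; prints the model's Swedish output.
print(fi2sv.translate("Hyvää huomenta!"))
```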
{}
null
MusserO/transformer-opus-fi-sv
[ "region:us" ]
2022-03-02T23:29:04+00:00
[]
[]
TAGS #region-us
Source language: Finnish Target language: Swedish Training dataset: URL Framework toolkit: Fairseq Model architecture: transformer_vaswani_wmt_en_de_big URL
[]
[ "TAGS\n#region-us \n" ]
[ 6 ]
[ "passage: TAGS\n#region-us \n" ]
[ 0.024608636274933815, -0.026205500587821007, -0.009666500613093376, -0.10395516455173492, 0.08638657629489899, 0.059816278517246246, 0.01882290467619896, 0.020661840215325356, 0.23975107073783875, -0.005599027033895254, 0.1219947561621666, 0.0015615287702530622, -0.037353623658418655, 0.03733762726187706, -0.0035912662278860807, -0.17583473026752472, 0.03876631706953049, -0.018274923786520958, 0.01843859627842903, 0.026470553129911423, -0.07776834815740585, -0.07564429938793182, 0.015296397730708122, -0.10247814655303955, -0.083692267537117, 0.11002834886312485, 0.031466204673051834, -0.019670886918902397, 0.10779199749231339, -0.04243955761194229, 0.18699054419994354, -0.011512263678014278, -0.11213519424200058, -0.2536850869655609, 0.021806683391332626, -0.01765260472893715, -0.08747660368680954, 0.01506110467016697, 0.0665089413523674, -0.09014441072940826, -0.0588928684592247, 0.0795099288225174, -0.01132340170443058, 0.04246443510055542, -0.27593839168548584, -0.12684126198291779, -0.05297930911183357, -0.1421966552734375, 0.08651168644428253, 0.04035491496324539, 0.008764253929257393, 0.15506891906261444, -0.20897391438484192, 0.004104613792151213, 0.08255259692668915, -0.2538507878780365, 0.05591634660959244, 0.17671173810958862, 0.03623908758163452, 0.18037272989749908, 0.0060391901060938835, 0.11029672622680664, 0.0716743916273117, -0.024263937026262283, -0.17590197920799255, -0.08127854019403458, -0.04696211963891983, 0.16642488539218903, -0.06727185100317001, -0.14248386025428772, 0.34701237082481384, 0.00015008423360995948, 0.009657775051891804, 0.16921205818653107, -0.059524230659008026, -0.09972117841243744, 0.07259953022003174, 0.016484731808304787, 0.018492350354790688, 0.1471305936574936, 0.16307872533798218, -0.0458691343665123, -0.13837823271751404, -0.018630273640155792, -0.22798998653888702, 0.17510560154914856, -0.03248048573732376, 0.13137903809547424, -0.27447956800460815, 0.01684025302529335, -0.2570667266845703, 0.0032130838371813297, 0.04178816080093384, -0.06004921346902847, -0.0226522795855999, -0.013265985064208508, -0.08018817007541656, 0.004899587947875261, 0.06192673370242119, 0.1266920566558838, -0.06128726154565811, 0.06128238886594772, -0.09319206327199936, 0.141696035861969, 0.07166698575019836, 0.07868369668722153, 0.13037432730197906, 0.041205424815416336, -0.07187089323997498, -0.21872246265411377, -0.0026476888451725245, -0.06275863200426102, -0.09502086788415909, -0.0020165652967989445, -0.11606067419052124, 0.17244569957256317, -0.030802514404058456, -0.09825427830219269, -0.11208184063434601, 0.09148659557104111, -0.032992321997880936, -0.03437839448451996, -0.03552987426519394, -0.020977836102247238, 0.019381176680326462, 0.04704452306032181, -0.1548958420753479, -0.005131472367793322, 0.07039852440357208, 0.11502562463283539, -0.1346137970685959, -0.003783059772104025, -0.07908964157104492, 0.03039063885807991, 0.07654735445976257, -0.16510222852230072, 0.03158547356724739, -0.1124754324555397, -0.07531405985355377, 0.002912673633545637, -0.015710093080997467, -0.016202643513679504, 0.166526660323143, -0.0020451415330171585, 0.0714716836810112, -0.026345307007431984, -0.05890209600329399, -0.11243434250354767, -0.08489254862070084, 0.05390460044145584, 0.03670717030763626, 0.03266148269176483, -0.2193479984998703, 0.014805203303694725, -0.12762966752052307, 0.1360815018415451, -0.10566820204257965, -0.04705966264009476, -0.022842247039079666, 0.20562705397605896, 0.037286072969436646, 0.08762791007757187, -0.22171171009540558, 
0.039756543934345245, -0.05404696613550186, 0.18480908870697021, -0.1502426266670227, -0.0799463614821434, 0.20813211798667908, -0.07964949309825897, -0.10115210711956024, 0.021235812455415726, 0.020391687750816345, 0.026287272572517395, 0.0766737088561058, 0.4564172327518463, -0.09766800701618195, -0.09146861732006073, 0.10178250074386597, 0.17055274546146393, -0.12427149713039398, -0.1827561855316162, 0.06446871906518936, -0.16666454076766968, -0.1973118633031845, 0.0018917324487119913, 0.09222044050693512, 0.038269978016614914, -0.07875611633062363, -0.020746968686580658, 0.06325206160545349, -0.0007678253459744155, 0.09095914661884308, 0.03755716234445572, 0.09034032374620438, -0.08716782182455063, 0.11115926504135132, -0.05017651244997978, 0.004037132486701012, 0.1343354731798172, 0.027325427159667015, -0.03223329409956932, 0.08694463223218918, -0.0485352948307991, 0.05295134335756302, -0.1662379503250122, -0.15068690478801727, 0.03398871049284935, 0.06283251196146011, 0.03186952322721481, 0.1280253529548645, 0.08141885697841644, -0.10732853412628174, 0.022690722718834877, -0.004228927195072174, 0.058398615568876266, 0.03891623765230179, 0.006107209715992212, 0.008764320984482765, 0.0961301177740097, -0.10607069730758667, -0.13589619100093842, -0.07336436957120895, -0.014715781435370445, 0.14371353387832642, -0.0302802175283432, 0.07690227776765823, -0.004240254405885935, 0.00013200697139836848, 0.06930823624134064, 0.08137880265712738, 0.016412746161222458, 0.08971183747053146, -0.05237193778157234, -0.05160155147314072, 0.10863113403320312, -0.13533565402030945, 0.17837053537368774, 0.14053137600421906, -0.20532016456127167, 0.029453208670020103, -0.06838275492191315, 0.03670361638069153, -0.008162540383636951, 0.0975119024515152, -0.08272241055965424, -0.02106042578816414, 0.013134466484189034, 0.0052274600602686405, -0.013007243163883686, 0.017682146281003952, -0.07295988500118256, -0.07787393033504486, -0.10233919322490692, 0.08436838537454605, 0.11562882363796234, -0.10282530635595322, 0.14214380085468292, 0.4384984076023102, 0.11495281755924225, 0.21582984924316406, -0.09581480920314789, -0.0412987545132637, 0.007486371789127588, 0.0001535322517156601, -0.04476691037416458, 0.08031861484050751, -0.15973517298698425, -0.038901735097169876, 0.027348900213837624, 0.07128690183162689, 0.11475157737731934, -0.14959022402763367, -0.09639324247837067, -0.00793045200407505, 0.0022841424215584993, -0.1249532699584961, 0.023905446752905846, -0.03974650055170059, 0.04015624523162842, 0.07232289016246796, -0.021535737439990044, 0.13939237594604492, -0.04166141897439957, -0.0639561116695404, 0.07585346698760986, -0.2017085999250412, -0.23179671168327332, -0.12309670448303223, -0.14680525660514832, 0.04366797208786011, 0.05154111236333847, 0.01726446859538555, -0.17635835707187653, -0.015074856579303741, 0.07706750929355621, 0.07820965349674225, -0.20886357128620148, -0.022814949974417686, -0.004290030337870121, 0.0895976573228836, -0.10227091610431671, -0.0017130117630586028, -0.04419664293527603, -0.10150232166051865, 0.0017003051470965147, 0.07279510796070099, -0.137485533952713, 0.13807645440101624, 0.21589438617229462, 0.07225540280342102, 0.07359948754310608, -0.019093448296189308, 0.09936179965734482, -0.10856141895055771, -0.16549113392829895, 0.08348225057125092, -0.06234746053814888, 0.047262318432331085, 0.17534415423870087, 0.03307317942380905, -0.13904969394207, -0.015682822093367577, -0.0402069091796875, -0.15603256225585938, -0.238995760679245, -0.09178274869918823, 
-0.1182505264878273, 0.16442428529262543, 0.0009358620154671371, 0.06651917099952698, 0.08258313685655594, -0.022042419761419296, 0.16447891294956207, -0.07379321753978729, -0.07578866183757782, -0.006978808436542749, 0.12375060468912125, -0.056660156697034836, -0.03080669604241848, -0.10566964000463486, -0.008295975625514984, 0.1151021271944046, 0.15304014086723328, 0.12214863300323486, 0.2957419455051422, 0.08268889784812927, 0.026645636186003685, 0.08958091586828232, 0.17622539401054382, 0.09495089203119278, 0.07838419824838638, -0.045413073152303696, -0.014814783819019794, 0.014317171648144722, -0.04022889584302902, 0.010141594335436821, 0.14683100581169128, -0.2679629921913147, -0.006678564939647913, -0.2710230350494385, 0.0965198427438736, -0.10913380235433578, 0.11837165057659149, -0.01015760749578476, 0.10194015502929688, 0.11082887649536133, 0.03233652561903, -0.03858073800802231, 0.16613617539405823, 0.08450309932231903, -0.11277695000171661, 0.001758623169735074, 0.03737903758883476, 0.09715615212917328, -0.02818971499800682, 0.12721189856529236, -0.11048974841833115, -0.1464834064245224, 0.013753619976341724, 0.07152791321277618, -0.15373679995536804, 0.3138748109340668, 0.012069208547472954, -0.13481520116329193, -0.01481647603213787, -0.09957809001207352, -0.006440147757530212, 0.1254177987575531, 0.09333524852991104, 0.07935678958892822, -0.2185502052307129, -0.13339371979236603, 0.05872276425361633, -0.00575496768578887, 0.22408108413219452, -0.034034017473459244, -0.11356475204229355, -0.027013886719942093, 0.04241163283586502, -0.06043251231312752, 0.08524788916110992, 0.023536119610071182, -0.08113526552915573, -0.032957352697849274, 0.05323701351881027, 0.012368366122245789, 0.00524376705288887, 0.09360801428556442, 0.020107939839363098, -0.0009265501867048442, 0.01785753294825554, 0.047885000705718994, -0.0675911232829094, -0.1984109878540039, 0.09357594698667526, -0.05215044692158699, 0.0015536568826064467, -0.08013670891523361, -0.15122665464878082, -0.08837161958217621, -0.16009655594825745, 0.12540200352668762, -0.034406669437885284, 0.12700119614601135, -0.06619787961244583, 0.17341409623622894, -0.07871770113706589, 0.04481020197272301, -0.047349292784929276, 0.050332702696323395, -0.007268077693879604, -0.07756082713603973, 0.16585899889469147, -0.15564003586769104, 0.01809087023139, 0.19572502374649048, -0.018915493041276932, 0.07177707552909851, 0.021322092041373253, -0.0636206790804863, 0.23147478699684143, 0.3014698624610901, 0.008138049393892288, 0.1665448248386383, 0.3018903136253357, -0.07466315478086472, -0.2642788887023926, -0.05505012720823288, -0.2841376066207886, -0.05371501296758652, 0.10716094076633453, -0.22523896396160126, 0.06986407935619354, 0.14383509755134583, -0.06471995264291763, 0.30228954553604126, -0.21825523674488068, 0.012589273042976856, 0.15434536337852478, -0.08868814259767532, 0.5515313148498535, -0.1133413165807724, -0.17677772045135498, -0.008122089318931103, -0.08741296827793121, 0.10602109134197235, -0.0340677872300148, 0.06877441704273224, 0.013465235009789467, 0.04797380417585373, 0.048932258039712906, -0.03111894056200981, 0.22701001167297363, 0.008710170164704323, 0.09015397727489471, -0.07378865778446198, -0.18624304234981537, 0.11639340221881866, -0.04359482601284981, -0.08891059458255768, 0.0849778801202774, -0.05942516401410103, -0.11078983545303345, 0.04663389176130295, -0.07950539886951447, -0.024862350896000862, 0.08423490077257156, -0.04678233340382576, -0.042606171220541, -0.008054176345467567, -0.1618063747882843, 
-0.0002289071271661669, 0.31360217928886414, -0.07096036523580551, 0.16695955395698547, 0.03677211329340935, 0.00038613268407061696, -0.11027684062719345, 0.030288029462099075, -0.05203165486454964, -0.021576624363660812, 0.09578979015350342, -0.11096979677677155, 0.03204701095819473, 0.14160704612731934, -0.04864364117383957, 0.05846960097551346, 0.09256096184253693, -0.0849417969584465, 0.007583672646433115, 0.17753590643405914, -0.17537221312522888, -0.1273445188999176, -0.006135711446404457, -0.09862716495990753, 0.14055661857128143, 0.04394126310944557, 0.05191568285226822, 0.16669964790344238, 0.03967129811644554, -0.029474308714270592, -0.02817419543862343, -0.1153380498290062, -0.0201893113553524, 0.040153320878744125, 0.00045633706031367183, -0.08791285753250122, 0.2262638509273529, 0.06409153342247009, -0.1328488290309906, -0.051157206296920776, 0.2161225974559784, -0.06805316358804703, -0.04911920800805092, -0.223562553524971, 0.10752306133508682, -0.07112517952919006, -0.0965060144662857, 0.05453834682703018, -0.02270081453025341, 0.005106312222778797, 0.181985542178154, 0.03941008821129799, 0.11070270836353302, 0.03738937899470329, -0.02448922023177147, 0.15798696875572205, -0.142850860953331, -0.14191335439682007, -0.025354057550430298, -0.08757315576076508, -0.13844476640224457, -0.026804137974977493, 0.1617041826248169, -0.09177309274673462, -0.14772607386112213, -0.2621181011199951, 0.10968475043773651, -0.16432365775108337, -0.10192688554525375, -0.03469514101743698, -0.08968492597341537, 0.0696166530251503, 0.030301768332719803, -0.03093348816037178, -0.06706760823726654, -0.18593791127204895, 0.0816768929362297, 0.06349513679742813, 0.045533183962106705, -0.017847947776317596, 0.0067379772663116455, 0.1720137596130371, 0.025955144315958023, 0.10040043294429779, 0.16762186586856842, 0.011397695168852806, 0.2246655523777008, -0.1671202927827835, -0.11496317386627197, 0.1336962729692459, -0.026543032377958298, 0.06762003898620605, 0.16792191565036774, -0.0772583931684494, 0.015526676550507545, -0.028136352077126503, 0.07066910713911057, -0.11003983020782471, -0.105624258518219, 0.007937257178127766, 0.02567129209637642, -0.2755882740020752, -0.005599735304713249, -0.19717298448085785, 0.14788752794265747, 0.02579621411859989, 0.03297143429517746, 0.10257530212402344, 0.10404334217309952, 0.08312062919139862, -0.0017710148822516203, 0.03226327523589134, -0.1176818460226059, 0.02753005363047123, -0.059239376336336136, -0.020663779228925705, 0.017624232918024063, 0.36952024698257446, -0.03603357449173927, -0.046802736818790436, 0.003710439894348383, 0.1307835876941681, -0.02139742486178875, 0.017395347356796265, 0.13209912180900574, 0.12607666850090027, -0.08595693111419678, -0.1504845917224884, 0.04888554662466049, -0.04565655067563057, -0.02836887165904045, 0.1464131623506546, 0.05905961990356445, 0.1050296202301979, 0.0908031314611435, -0.014463032595813274, -0.00318976235575974, 0.012856799177825451, -0.15486004948616028, 0.06223496049642563, -0.010558074340224266, 0.012565906159579754, 0.017934376373887062, 0.15238402783870697, -0.005540105979889631, 0.07739730179309845, -0.09889880567789078, 0.004208535887300968, -0.13498884439468384, -0.07913459837436676, 0.03617347031831741, -0.13393273949623108, 0.04141177982091904, -0.01871878281235695, 0.029611799865961075, 0.30386561155319214, 0.02558239921927452, -0.020639164373278618, 0.12512871623039246, -0.1214587539434433, -0.12050267308950424, -0.001594188273884356, -0.029960084706544876, 0.0791488066315651, 
-0.02633434161543846, -0.0997740775346756, -0.1001306027173996, -0.15166029334068298, -0.09759195148944855, 0.05182836204767227, -0.04993441700935364, -0.059362251311540604, -0.17634081840515137, -0.05707859992980957, -0.05147340148687363, 0.14025864005088806, -0.12263951450586319, 0.15159130096435547, -0.014490418136119843, 0.004084470681846142, 0.04405883327126503, 0.1950942426919937, -0.03644494712352753, 0.08714226633310318, 0.0154351145029068, 0.1522706001996994, -0.05119588226079941, 0.14720745384693146, -0.10931728035211563, -0.04014137014746666, -0.06710435450077057, 0.21513493359088898, 0.25630924105644226, -0.06136954948306084, -0.008937356993556023, -0.012760217301547527, 0.058654606342315674, 0.1073930487036705, 0.16049085557460785, 0.002326392102986574, 0.2802925705909729, -0.03133585304021835, 0.04815128445625305, 0.02901598811149597, 0.013607407920062542, -0.06336209923028946, 0.03397751972079277, 0.07539387792348862, -0.035039983689785004, -0.1412304788827896, 0.15837742388248444, -0.21980468928813934, 0.18157227337360382, 0.11640069633722305, -0.19996967911720276, -0.013728445395827293, -0.04882071167230606, 0.1689416468143463, -0.0856364443898201, 0.1637246012687683, -0.0903693437576294, -0.2108195722103119, -0.2056000679731369, 0.03867346793413162, -0.34623071551322937, -0.254462867975235, 0.10422009229660034, 0.1488201916217804, 0.04015883058309555, -0.018507536500692368, -0.019967829808592796, -0.018367022275924683, 0.04877542704343796, -0.0067357709631323814, 0.06014643982052803, 0.031397558748722076, -0.02988368645310402, -0.24127542972564697, -0.029804671183228493, 0.023964406922459602, -0.07093082368373871, 0.07464958727359772, -0.06874357163906097, -0.022495782002806664, 0.08059766888618469, -0.03066304884850979, 0.03298592567443848, -0.035373736172914505, -0.16326889395713806, 0.027529051527380943, 0.03900543600320816, 0.036012712866067886, 0.00634160777553916, 0.0008072225609794259, -0.03455270454287529, 0.0644603744149208, -0.16716794669628143, -0.16015739738941193, 0.14140215516090393, -0.06745140254497528, 0.2779497504234314, -0.05812826007604599, -0.0809100940823555, 0.04766704887151718, -0.03426874056458473, 0.1807648241519928, -0.07756473124027252, 0.047254521399736404, 0.12766779959201813, 0.011127962730824947, 0.03121316432952881, -0.3092964291572571, 0.11082969605922699, -0.000795336440205574, -0.006093299947679043, -0.07581598311662674 ]
null
null
transformers
## BERT model of the Explainable AI project
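The card does not include a usage snippet. A minimal sketch, assuming the checkpoint loads with the standard `transformers` text-classification pipeline; the label set of this classifier is not documented in the card:

```python
from transformers import pipeline

# Hypothetical usage sketch; the labels this classifier predicts are not
# documented in the card, so the output is only illustrative in shape.
classifier = pipeline("text-classification", model="Mustang/BERT_responsible_AI")
print(classifier("Example sentence to classify."))  # -> [{'label': ..., 'score': ...}]
```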
{"license": "eupl-1.1"}
text-classification
Mustang/BERT_responsible_AI
[ "transformers", "pytorch", "bert", "text-classification", "license:eupl-1.1", "autotrain_compatible", "endpoints_compatible", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[]
TAGS #transformers #pytorch #bert #text-classification #license-eupl-1.1 #autotrain_compatible #endpoints_compatible #region-us
## BERT model van het project Explainable AI
[ "## BERT model van het project Explainable AI" ]
[ "TAGS\n#transformers #pytorch #bert #text-classification #license-eupl-1.1 #autotrain_compatible #endpoints_compatible #region-us \n", "## BERT model van het project Explainable AI" ]
[ 44, 11 ]
[ "passage: TAGS\n#transformers #pytorch #bert #text-classification #license-eupl-1.1 #autotrain_compatible #endpoints_compatible #region-us \n## BERT model van het project Explainable AI" ]
[ -0.07914558798074722, 0.13757078349590302, -0.0025713436771184206, 0.05440594628453255, 0.14106519520282745, 0.002749096602201462, 0.2089289277791977, 0.06281349807977676, 0.11997716128826141, 0.0005304320948198438, 0.17358577251434326, 0.19970089197158813, 0.01358811091631651, 0.16352316737174988, -0.08461399376392365, -0.325396329164505, 0.13225533068180084, 0.08782961964607239, 0.04465632513165474, 0.068843774497509, 0.1116596981883049, -0.10170816630125046, 0.10880497097969055, 0.05886688828468323, -0.0847342312335968, -0.027204088866710663, -0.0225742906332016, -0.13552531599998474, 0.09188175201416016, 0.07015057653188705, 0.05888651683926582, 0.03770527243614197, 0.07525437325239182, -0.11245083063840866, 0.010165130719542503, -0.07176172733306885, -0.051889002323150635, 0.10778103768825531, 0.052248477935791016, -0.028220681473612785, 0.04815058782696724, 0.14732594788074493, 0.028146598488092422, 0.021764786913990974, -0.05083560571074486, 0.011278185062110424, 0.03145113214850426, 0.08246269077062607, 0.06579340994358063, 0.08422023802995682, 0.0035849662963300943, 0.1491708755493164, -0.10759346932172775, 0.09280181676149368, 0.1154346689581871, -0.3702864944934845, -0.021888984367251396, 0.039939701557159424, 0.005965175572782755, -0.07511568814516068, 0.014758961275219917, 0.0996624156832695, 0.08872158080339432, -0.009708445519208908, 0.03483809530735016, -0.03962263837456703, -0.017138775438070297, -0.006018497049808502, -0.08813703805208206, -0.011681445874273777, 0.27303507924079895, -0.009235787205398083, 0.002027187030762434, -0.060861069709062576, -0.03429631516337395, 0.007450438104569912, -0.05458858609199524, -0.030115770176053047, -0.0008568177581764758, 0.02516835741698742, 0.07639390230178833, 0.017556561157107353, -0.06425179541110992, 0.018851973116397858, -0.15782132744789124, 0.27878308296203613, 0.028474213555455208, 0.07089512795209885, -0.13159328699111938, 0.05648425966501236, 0.011302707716822624, -0.08200636506080627, 0.0441020205616951, -0.04361061379313469, 0.03640149533748627, 0.002026103902608156, -0.024736961349844933, 0.024458056315779686, 0.042466238141059875, 0.10207100212574005, 0.06541827321052551, -0.03137494623661041, 0.020633937790989876, 0.09394222497940063, 0.07712990790605545, 0.20108579099178314, -0.10560199618339539, -0.008767465129494667, 0.03500426933169365, -0.047671519219875336, -0.011995825916528702, -0.06114639341831207, -0.2337319552898407, 0.040489666163921356, 0.04285629838705063, 0.04334767162799835, -0.029486265033483505, 0.12377069890499115, -0.11309564858675003, -0.05536400154232979, 0.07253686338663101, -0.05116076022386551, 0.02247810736298561, 0.007082037162035704, -0.026266509667038918, 0.03527002036571503, -0.0018894975073635578, 0.02098679542541504, -0.0821136087179184, 0.09597442299127579, -0.06515692919492722, -0.022631827741861343, -0.08062991499900818, -0.0973789170384407, 0.026470545679330826, -0.06392756849527359, 0.09280290454626083, -0.17842617630958557, -0.1598852425813675, 0.04245436564087868, 0.058631643652915955, -0.061968933790922165, -0.050037868320941925, -0.06672597676515579, -0.028901884332299232, -0.012604216113686562, -0.10480400919914246, -0.07397233694791794, -0.05976157262921333, 0.03295007720589638, -0.08535894006490707, 0.06027368828654289, -0.19111377000808716, 0.10760310292243958, -0.10075005888938904, 0.0005850828601978719, -0.10395251214504242, 0.02660338021814823, -0.062654510140419, 0.09860331565141678, -0.025088030844926834, -0.05873760208487511, 0.09706313908100128, 
0.052499301731586456, 0.0012695299228653312, 0.14036674797534943, -0.08548538386821747, -0.05932800471782684, 0.02937357686460018, -0.08591665327548981, -0.18719688057899475, 0.1342870593070984, -0.018724583089351654, 0.11194086074829102, 0.08003002405166626, 0.2189742922782898, -0.04699892923235893, -0.03368433564901352, 0.021080831065773964, 0.08693685382604599, -0.03321273624897003, -0.08565570414066315, 0.04581203684210777, 0.08576439321041107, -0.1694551557302475, 0.0633508712053299, -0.029624756425619125, 0.015804195776581764, -0.0907055214047432, -0.06278002262115479, 0.04557762295007706, -0.08970300853252411, 0.09128749370574951, 0.014193175360560417, 0.15468163788318634, -0.028210822492837906, -0.015465393662452698, 0.0372205525636673, 0.042398642748594284, -0.043706852942705154, -0.015764538198709488, -0.16818289458751678, 0.17050684988498688, -0.04661523550748825, 0.02764877676963806, -0.18607014417648315, -0.02797696553170681, 0.04475315660238266, 0.050254061818122864, 0.043541572988033295, 0.15611496567726135, -0.01479744166135788, 0.0192018561065197, 0.02324165590107441, -0.036767348647117615, 0.1068279966711998, 0.06157611683011055, -0.11560550332069397, -0.14397692680358887, -0.01471400074660778, -0.0731172263622284, -0.06392771750688553, -0.057910095900297165, 0.014316429384052753, 0.009479441680014133, 0.0864877998828888, -0.021457470953464508, 0.08134813606739044, -0.020200546830892563, 0.07367923110723495, -0.08183570951223373, 0.04739515483379364, 0.11400266736745834, -0.022214673459529877, -0.11235059052705765, 0.1829422563314438, -0.08662712574005127, 0.26482999324798584, 0.1642679125070572, -0.1880950778722763, -0.01774252951145172, 0.006077356170862913, -0.025720803067088127, 0.03610757365822792, 0.00004614510908140801, -0.004235635977238417, 0.21231268346309662, -0.040916476398706436, 0.18240845203399658, -0.07798236608505249, -0.020728101953864098, 0.011965959332883358, -0.06125596538186073, -0.06616346538066864, 0.12306807935237885, 0.1397409439086914, -0.21762709319591522, 0.18537697196006775, 0.15745626389980316, 0.016294997185468674, 0.14347687363624573, 0.005670387297868729, 0.0031221003737300634, 0.02940881997346878, -0.053796831518411636, -0.011788359843194485, 0.04390285909175873, -0.20616541802883148, -0.050181373953819275, 0.0751115009188652, -0.017594903707504272, 0.01829575002193451, -0.12380954623222351, -0.029315348714590073, -0.027134794741868973, 0.01271336991339922, -0.009919309057295322, 0.03010210022330284, -0.009915674105286598, 0.12887394428253174, -0.02885887585580349, -0.21934352815151215, 0.06651933491230011, 0.022814223542809486, -0.09859415143728256, 0.18548984825611115, -0.06766773760318756, -0.26289764046669006, -0.13647766411304474, -0.18447622656822205, -0.046817854046821594, 0.032257430255413055, 0.06945688277482986, -0.01859496347606182, -0.08227996528148651, -0.00918742548674345, -0.03756647929549217, -0.05408605560660362, 0.013343016617000103, 0.06446760147809982, -0.004718589596450329, -0.06188950315117836, -0.07861585170030594, -0.10356041043996811, -0.059711772948503494, 0.0022148066200315952, 0.10109951347112656, -0.13873076438903809, 0.08742652833461761, 0.11039802432060242, -0.08073452115058899, 0.03462693467736244, -0.05779164656996727, 0.2080819010734558, -0.036125972867012024, -0.0021296602208167315, 0.1309633105993271, -0.09777209907770157, 0.03244323283433914, 0.1642516404390335, 0.06330268085002899, -0.11467202752828598, 0.006885022856295109, -0.12192800641059875, -0.05930156633257866, -0.1993218958377838, 
-0.14565640687942505, -0.06973659247159958, 0.15443895757198334, 0.09191063791513443, 0.02557920105755329, -0.024042509496212006, 0.12328197062015533, 0.05061163008213043, 0.056681957095861435, -0.03341952711343765, 0.09555543959140778, 0.19026009738445282, -0.0372968465089798, 0.15818540751934052, -0.02084038220345974, -0.1418217122554779, 0.08532197773456573, 0.0031653272453695536, 0.1251736283302307, 0.08227118849754333, 0.004396628122776747, 0.032322123646736145, 0.05681450664997101, 0.11457431316375732, 0.1791042983531952, -0.0021262390073388815, -0.027149591594934464, -0.06670208275318146, -0.03473782539367676, -0.061297301203012466, 0.08089335262775421, 0.02629079297184944, -0.102240189909935, -0.093167744576931, -0.07861196994781494, 0.019542111083865166, 0.16324482858181, 0.08696937561035156, -0.27874863147735596, -0.015469112433493137, 0.06103253364562988, -0.047530703246593475, -0.07476239651441574, 0.10186558216810226, -0.028988782316446304, -0.14602982997894287, 0.07429428398609161, -0.0032638150732964277, 0.11994334310293198, -0.0913705825805664, 0.0875265896320343, -0.04636724293231964, -0.12414537370204926, 0.03176073729991913, 0.1368025541305542, -0.24210350215435028, 0.2888815701007843, 0.010379672050476074, -0.014194132760167122, -0.08859797567129135, 0.005278998054563999, 0.021833302453160286, 0.23643268644809723, 0.14967632293701172, -0.01636931113898754, 0.05518004670739174, -0.18902911245822906, -0.09014645963907242, 0.02163095772266388, 0.014414642006158829, -0.05921397730708122, 0.0056341905146837234, 0.009403420612215996, 0.006415842100977898, -0.031932707875967026, -0.00017452644533477724, 0.01704004406929016, -0.14956432580947876, 0.04904009774327278, 0.11342226713895798, 0.1446966975927353, 0.00864587351679802, -0.1012539491057396, -0.16239699721336365, 0.19962240755558014, 0.04087233170866966, -0.04644777253270149, -0.09295641630887985, -0.00042171598761342466, 0.018700923770666122, -0.06133606657385826, -0.0021367238368839025, -0.07126100361347198, 0.00827720481902361, -0.053565267473459244, -0.13120543956756592, 0.15019765496253967, -0.15050619840621948, -0.08437985181808472, -0.06526138633489609, 0.01807740144431591, -0.03970622271299362, 0.06739632040262222, 0.07383056730031967, 0.01016458310186863, -0.08938995748758316, -0.09352134168148041, -0.049095526337623596, 0.05927349254488945, 0.07244819402694702, 0.02661309763789177, -0.15680457651615143, -0.10245291888713837, -0.04132499918341637, -0.023199694231152534, 0.1813141405582428, 0.16662614047527313, -0.10527658462524414, 0.11217688769102097, 0.2500992715358734, -0.03962625563144684, -0.3550412952899933, -0.08349552750587463, -0.0325249545276165, -0.013240567408502102, -0.002616677200421691, -0.1334114819765091, 0.1384895294904709, -0.04688248038291931, -0.07120604068040848, -0.04744163528084755, -0.06584416329860687, -0.10444452613592148, 0.25075143575668335, 0.06141306087374687, 0.31677311658859253, -0.06711118668317795, -0.011338031850755215, -0.05693439394235611, -0.12099973112344742, 0.11882921308279037, 0.11347414553165436, 0.08735141158103943, 0.006992095150053501, 0.09181903302669525, 0.029768476262688637, -0.01484049204736948, 0.07569433003664017, -0.00277440226636827, 0.0018822654383257031, -0.06861894577741623, -0.11733399331569672, 0.03370613604784012, 0.01878131926059723, 0.1519869714975357, -0.053921353071928024, 0.013515638187527657, -0.13940855860710144, -0.08321895450353622, 0.0005611922242678702, 0.04391215741634369, 0.02559664286673069, -0.08484984189271927, -0.014038657769560814, 
0.023970915004611015, 0.006800075527280569, 0.03653797507286072, 0.16904175281524658, -0.00686639966443181, -0.03624871373176575, 0.12509053945541382, 0.19675801694393158, -0.1512732207775116, 0.04995287209749222, -0.06401613354682922, -0.09306849539279938, 0.11128295958042145, -0.033316921442747116, -0.02692367695271969, 0.14533470571041107, -0.07588270306587219, 0.10858403146266937, 0.09722751379013062, 0.008082595653831959, -0.006779614370316267, 0.07916349917650223, -0.16900008916854858, -0.1260170042514801, -0.02262154594063759, 0.14888225495815277, 0.06536610424518585, 0.12063769996166229, 0.1310468465089798, -0.07731715589761734, -0.034673113375902176, 0.0012012760853394866, 0.0049620745703577995, 0.015564045868813992, 0.04045818746089935, -0.02132534608244896, 0.027626177296042442, -0.14598864316940308, 0.04176124557852745, 0.036654435098171234, -0.0923900306224823, 0.052383579313755035, -0.02251463569700718, -0.14609795808792114, -0.11685855686664581, -0.06562303006649017, 0.20782971382141113, -0.14338058233261108, -0.10573264956474304, -0.04923893138766289, -0.16880396008491516, 0.06153317913413048, 0.06042422726750374, 0.1292419731616974, 0.004023937974125147, -0.11250070482492447, -0.07742615789175034, 0.007286877371370792, 0.041383977979421616, 0.00044956625788472593, 0.03566372022032738, -0.08737515658140182, -0.0725008100271225, 0.00648532947525382, 0.12693826854228973, -0.10293906927108765, -0.03855365142226219, -0.1653764545917511, 0.035236697643995285, -0.13725438714027405, -0.020184751600027084, -0.0906396135687828, -0.03464442491531372, 0.002032399410381913, -0.09493833035230637, -0.03298122063279152, -0.01337704248726368, -0.11245988309383392, 0.052317555993795395, 0.016008596867322922, 0.07120311260223389, -0.05002961307764053, -0.018522964790463448, 0.06540702283382416, -0.0367092601954937, 0.1500610113143921, 0.04881833493709564, -0.08607333153486252, 0.04632175713777542, -0.14409048855304718, -0.025638079270720482, 0.08849331736564636, 0.05210799351334572, 0.1391223818063736, -0.018338950350880623, 0.020317193120718002, 0.09325559437274933, 0.0035294138360768557, 0.04439076408743858, 0.1355723738670349, -0.08351220935583115, 0.032696206122636795, 0.035039231181144714, -0.15029366314411163, -0.04414151981472969, 0.019137661904096603, 0.031031493097543716, -0.012991955503821373, 0.16170039772987366, -0.030294861644506454, 0.02354411967098713, 0.0011863011168316007, 0.031282901763916016, -0.011594259180128574, -0.15405993163585663, -0.1271498054265976, -0.12287797033786774, 0.006481429561972618, -0.06489264965057373, 0.25748708844184875, 0.11237718164920807, -0.06285043060779572, 0.020522862672805786, 0.047608062624931335, 0.05724664032459259, 0.032931555062532425, 0.16529794037342072, 0.02441336214542389, 0.02613719180226326, -0.16526798903942108, 0.07275579124689102, 0.007532613351941109, 0.09315773844718933, 0.07743111997842789, -0.005816364660859108, -0.09372548013925552, 0.06943691521883011, 0.03879271447658539, 0.0922299474477768, -0.2180410474538803, -0.1793113350868225, -0.025949476286768913, 0.10782188922166824, 0.01809079945087433, 0.15012699365615845, 0.0839461162686348, -0.010274638421833515, 0.06605446338653564, -0.036641426384449005, -0.05531888082623482, -0.19168567657470703, -0.1818271279335022, -0.08863965421915054, -0.07530435174703598, 0.01633945107460022, -0.11366976052522659, -0.022669868543744087, 0.07134362310171127, 0.07064875960350037, -0.0626174733042717, 0.08766433596611023, -0.06305939704179764, 0.03732664883136749, 0.06248410418629646, 
-0.008613137528300285, 0.016308853402733803, -0.07330639660358429, -0.013311170972883701, -0.1496550291776657, 0.007840984500944614, -0.012702684849500656, 0.06564178317785263, -0.07150253653526306, -0.0021239544730633497, -0.012769199907779694, -0.06456618010997772, -0.04530505836009979, -0.034081533551216125, -0.02383565530180931, 0.11953995376825333, -0.012967311777174473, -0.005582557059824467, 0.03917981684207916, 0.13513943552970886, -0.05409713834524155, -0.025505950674414635, -0.08847974985837936, 0.3214481770992279, -0.03483882173895836, 0.10877367109060287, 0.02290775440633297, 0.038194213062524796, -0.08001651614904404, 0.3232824504375458, 0.3026612401008606, -0.12197466939687729, 0.010505386628210545, -0.007942385040223598, 0.018353627994656563, 0.05392337962985039, 0.11809074878692627, 0.05686010792851448, 0.23679441213607788, -0.07673081755638123, 0.011804342269897461, -0.07951873540878296, -0.03986307978630066, -0.11689775437116623, 0.04939187690615654, 0.09736061096191406, -0.07615860551595688, -0.12088830769062042, 0.04295648634433746, -0.11325333267450333, 0.020784955471754074, -0.06908759474754333, -0.13872472941875458, -0.07121036946773529, 0.0023541671689599752, 0.03901958465576172, 0.01784157007932663, 0.11786255240440369, -0.07114222645759583, -0.026242054998874664, 0.02575785666704178, -0.017071273177862167, -0.17204460501670837, 0.07422149926424026, 0.0931745246052742, -0.04819094017148018, 0.10053479671478271, 0.009193524718284607, 0.05895183980464935, 0.06999705731868744, 0.03137259930372238, -0.042291250079870224, 0.08250173181295395, 0.015027360059320927, -0.017764385789632797, 0.04262614622712135, -0.04413462430238724, -0.017749756574630737, -0.025081541389226913, 0.04551171883940697, -0.13517378270626068, 0.046342235058546066, -0.03707174211740494, -0.09007789939641953, -0.033102333545684814, 0.057150375097990036, -0.06054602563381195, 0.08280204236507416, 0.08926177769899368, -0.014666687697172165, -0.07824762910604477, -0.013818302191793919, 0.025743961334228516, -0.0034627136774361134, -0.17002606391906738, -0.06279469281435013, -0.09931758046150208, -0.06986311078071594, 0.08292855322360992, 0.051630813628435135, -0.2469048649072647, 0.003924115560948849, -0.14715324342250824, 0.04943348839879036, -0.12607064843177795, 0.01700238510966301, 0.11664152145385742, 0.0059979576617479324, -0.024255214259028435, -0.07251985371112823, 0.011491389013826847, 0.01958521082997322, -0.0632353350520134, -0.12820160388946533 ]
null
null
transformers
# Ara-dialect-BERT
We took a pretrained model and further trained it on [HARD-Arabic-Dataset](https://github.com/elnagara/HARD-Arabic-Dataset); the weights were initialized from the [CAMeL-Lab](https://huggingface.co/CAMeL-Lab/bert-base-camelbert-msa-eighth) "bert-base-camelbert-msa-eighth" model.

### Usage
The model weights can be loaded with the `transformers` library by HuggingFace.

```python
from transformers import AutoTokenizer, AutoModel
tokenizer = AutoTokenizer.from_pretrained("MutazYoune/Ara_DialectBERT")
model = AutoModel.from_pretrained("MutazYoune/Ara_DialectBERT")
```

Example using `pipeline`:

```python
from transformers import pipeline
fill_mask = pipeline(
    "fill-mask",
    model="MutazYoune/Ara_DialectBERT",
    tokenizer="MutazYoune/Ara_DialectBERT"
)
fill_mask("الفندق جميل و لكن [MASK] بعيد")
```

```python
{'sequence': 'الفندق جميل و لكن الموقع بعيد', 'score': 0.28233852982521057, 'token': 3221, 'token_str': 'الموقع'}
{'sequence': 'الفندق جميل و لكن موقعه بعيد', 'score': 0.24436227977275848, 'token': 19218, 'token_str': 'موقعه'}
{'sequence': 'الفندق جميل و لكن المكان بعيد', 'score': 0.15372352302074432, 'token': 5401, 'token_str': 'المكان'}
{'sequence': 'الفندق جميل و لكن الفندق بعيد', 'score': 0.029026474803686142, 'token': 11133, 'token_str': 'الفندق'}
{'sequence': 'الفندق جميل و لكن مكانه بعيد', 'score': 0.024554792791604996, 'token': 10701, 'token_str': 'مكانه'}
```
{"language": "ar", "datasets": ["HARD-Arabic-Dataset"]}
fill-mask
MutazYoune/Ara_DialectBERT
[ "transformers", "pytorch", "jax", "bert", "fill-mask", "ar", "dataset:HARD-Arabic-Dataset", "autotrain_compatible", "endpoints_compatible", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[ "ar" ]
TAGS #transformers #pytorch #jax #bert #fill-mask #ar #dataset-HARD-Arabic-Dataset #autotrain_compatible #endpoints_compatible #region-us
# Ara-dialect-BERT We used a pretrained model to further train it on HARD-Arabic-Dataset, the weights were initialized using CAMeL-Lab "bert-base-camelbert-msa-eighth" model ### Usage The model weights can be loaded using 'transformers' library by HuggingFace. Example using 'pipeline': '''python {'sequence': 'الفندق جميل و لكن الموقع بعيد', 'score': 0.28233852982521057, 'token': 3221, 'token_str': 'الموقع'} {'sequence': 'الفندق جميل و لكن موقعه بعيد', 'score': 0.24436227977275848, 'token': 19218, 'token_str': 'موقعه'} {'sequence': 'الفندق جميل و لكن المكان بعيد', 'score': 0.15372352302074432, 'token': 5401, 'token_str': 'المكان'} {'sequence': 'الفندق جميل و لكن الفندق بعيد', 'score': 0.029026474803686142, 'token': 11133, 'token_str': 'الفندق'} {'sequence': 'الفندق جميل و لكن مكانه بعيد', 'score': 0.024554792791604996, 'token': 10701, 'token_str': 'مكانه'}
[ "# Ara-dialect-BERT\nWe used a pretrained model to further train it on HARD-Arabic-Dataset, the weights were initialized using CAMeL-Lab \"bert-base-camelbert-msa-eighth\" model", "### Usage\nThe model weights can be loaded using 'transformers' library by HuggingFace.\n\nExample using 'pipeline':\n\n'''python\n{'sequence': 'الفندق جميل و لكن الموقع بعيد', 'score': 0.28233852982521057, 'token': 3221, 'token_str': 'الموقع'}\n{'sequence': 'الفندق جميل و لكن موقعه بعيد', 'score': 0.24436227977275848, 'token': 19218, 'token_str': 'موقعه'}\n{'sequence': 'الفندق جميل و لكن المكان بعيد', 'score': 0.15372352302074432, 'token': 5401, 'token_str': 'المكان'}\n{'sequence': 'الفندق جميل و لكن الفندق بعيد', 'score': 0.029026474803686142, 'token': 11133, 'token_str': 'الفندق'}\n{'sequence': 'الفندق جميل و لكن مكانه بعيد', 'score': 0.024554792791604996, 'token': 10701, 'token_str': 'مكانه'}" ]
[ "TAGS\n#transformers #pytorch #jax #bert #fill-mask #ar #dataset-HARD-Arabic-Dataset #autotrain_compatible #endpoints_compatible #region-us \n", "# Ara-dialect-BERT\nWe used a pretrained model to further train it on HARD-Arabic-Dataset, the weights were initialized using CAMeL-Lab \"bert-base-camelbert-msa-eighth\" model", "### Usage\nThe model weights can be loaded using 'transformers' library by HuggingFace.\n\nExample using 'pipeline':\n\n'''python\n{'sequence': 'الفندق جميل و لكن الموقع بعيد', 'score': 0.28233852982521057, 'token': 3221, 'token_str': 'الموقع'}\n{'sequence': 'الفندق جميل و لكن موقعه بعيد', 'score': 0.24436227977275848, 'token': 19218, 'token_str': 'موقعه'}\n{'sequence': 'الفندق جميل و لكن المكان بعيد', 'score': 0.15372352302074432, 'token': 5401, 'token_str': 'المكان'}\n{'sequence': 'الفندق جميل و لكن الفندق بعيد', 'score': 0.029026474803686142, 'token': 11133, 'token_str': 'الفندق'}\n{'sequence': 'الفندق جميل و لكن مكانه بعيد', 'score': 0.024554792791604996, 'token': 10701, 'token_str': 'مكانه'}" ]
[ 53, 59, 299 ]
[ "passage: TAGS\n#transformers #pytorch #jax #bert #fill-mask #ar #dataset-HARD-Arabic-Dataset #autotrain_compatible #endpoints_compatible #region-us \n# Ara-dialect-BERT\nWe used a pretrained model to further train it on HARD-Arabic-Dataset, the weights were initialized using CAMeL-Lab \"bert-base-camelbert-msa-eighth\" model### Usage\nThe model weights can be loaded using 'transformers' library by HuggingFace.\n\nExample using 'pipeline':\n\n'''python\n{'sequence': 'الفندق جميل و لكن الموقع بعيد', 'score': 0.28233852982521057, 'token': 3221, 'token_str': 'الموقع'}\n{'sequence': 'الفندق جميل و لكن موقعه بعيد', 'score': 0.24436227977275848, 'token': 19218, 'token_str': 'موقعه'}\n{'sequence': 'الفندق جميل و لكن المكان بعيد', 'score': 0.15372352302074432, 'token': 5401, 'token_str': 'المكان'}\n{'sequence': 'الفندق جميل و لكن الفندق بعيد', 'score': 0.029026474803686142, 'token': 11133, 'token_str': 'الفندق'}\n{'sequence': 'الفندق جميل و لكن مكانه بعيد', 'score': 0.024554792791604996, 'token': 10701, 'token_str': 'مكانه'}" ]
[ -0.05868994817137718, 0.11345567554235458, -0.009622767567634583, 0.05128547176718712, 0.06124165654182434, 0.05706391856074333, 0.21397694945335388, 0.0731922835111618, 0.029195431619882584, 0.04518787935376167, 0.18814973533153534, 0.09216157346963882, 0.09206917136907578, 0.03718554973602295, 0.02119862474501133, -0.17497456073760986, 0.02410176955163479, -0.017670603469014168, 0.022737327963113785, 0.10786454379558563, 0.1220882460474968, 0.0036319291684776545, 0.1449328362941742, -0.0638265311717987, -0.056505922228097916, 0.04038170352578163, 0.0171115230768919, -0.01409020647406578, 0.07892374694347382, 0.08268565684556961, 0.13521936535835266, 0.02078031189739704, 0.0224827341735363, -0.20063792169094086, 0.00007736394763924181, 0.009510853327810764, -0.03609653562307358, 0.0173315592110157, -0.07836046069860458, -0.08532028645277023, 0.2403978854417801, -0.07459099590778351, -0.027382267639040947, -0.02450616843998432, -0.12608827650547028, -0.13438983261585236, 0.0034185268450528383, 0.08024589717388153, 0.03688078373670578, -0.031014245003461838, 0.007352543994784355, 0.047421280294656754, -0.01048517320305109, 0.11746256053447723, 0.1623077392578125, -0.1940159797668457, -0.06952659040689468, 0.05835464596748352, -0.009804794564843178, 0.02539254166185856, -0.038322679698467255, -0.008996638469398022, -0.0039051638450473547, 0.009777316823601723, -0.023699574172496796, -0.0849396213889122, -0.015672121196985245, 0.0022080487105995417, -0.09965584427118301, -0.0656362995505333, 0.1543068140745163, 0.050014153122901917, -0.05675863102078438, 0.0023714841809123755, -0.05319564789533615, -0.11183973401784897, -0.06272228807210922, -0.03480571508407593, -0.031219303607940674, -0.021376388147473335, 0.033774182200431824, 0.04300229623913765, -0.07364581525325775, -0.0651179701089859, -0.059905074536800385, 0.08354540914297104, 0.04551428183913231, 0.03939591720700264, 0.028777441009879112, -0.013833574950695038, -0.08844409137964249, -0.11040976643562317, 0.03494733199477196, -0.014375755563378334, 0.03782972693443298, 0.04499857500195503, 0.02443566545844078, -0.04965697228908539, 0.08919110149145126, 0.08908861130475998, 0.025845034047961235, 0.13497786223888397, 0.038977012038230896, 0.03789785876870155, -0.07830141484737396, 0.04405650123953819, -0.06646867841482162, -0.10290363430976868, -0.07036223262548447, 0.09585057199001312, -0.006754359230399132, -0.0386643148958683, -0.0677437111735344, -0.0244095791131258, 0.02919672802090645, -0.047700509428977966, -0.027690621092915535, 0.1466291844844818, -0.009070869535207748, 0.0061961086466908455, 0.013331963680684566, -0.13464754819869995, -0.022022182121872902, 0.017513450235128403, -0.050611574202775955, 0.0025104344822466373, 0.05655985325574875, 0.06643345206975937, -0.0681627169251442, 0.004611228592693806, -0.03263121470808983, 0.053918495774269104, -0.047323357313871384, -0.08396308869123459, -0.006020889151841402, -0.05414444953203201, -0.018208904191851616, -0.11455786228179932, -0.11395490169525146, -0.06009886786341667, 0.11087881773710251, 0.06528402119874954, 0.04133985936641693, -0.06717058271169662, -0.010515967383980751, 0.04845418035984039, -0.034430500119924545, -0.0648871436715126, -0.05296100303530693, 0.08252336084842682, 0.012186010368168354, 0.06764784455299377, -0.10824604332447052, 0.007344068959355354, -0.040112365037202835, 0.05884094908833504, -0.14717310667037964, 0.08109506219625473, -0.0820489227771759, 0.058014772832393646, -0.17769694328308105, 0.0011881665559485555, -0.049016207456588745, 
0.04631882533431053, 0.10551165044307709, 0.12628911435604095, -0.2116907387971878, -0.07497358322143555, 0.152116596698761, -0.0690099447965622, -0.12524721026420593, 0.15285567939281464, -0.033863168209791183, 0.06287924945354462, 0.08907034993171692, 0.15029840171337128, 0.11774078011512756, -0.0764739066362381, -0.03430624678730965, 0.08711429685354233, 0.038388822227716446, 0.06376589834690094, 0.05851445347070694, -0.027464203536510468, -0.047598857432603836, 0.041538044810295105, -0.017251504585146904, 0.014472702518105507, -0.011165260337293148, -0.07757062464952469, 0.03566427901387215, -0.0773930773139, 0.13726168870925903, 0.09797342121601105, 0.040375858545303345, -0.004640017636120319, -0.05229130759835243, -0.007643561344593763, 0.05083935335278511, -0.03937622532248497, 0.02536677196621895, -0.10491693019866943, -0.006590378005057573, 0.012678208760917187, 0.009627779014408588, -0.06647861003875732, -0.04885217547416687, -0.03401683643460274, 0.0804082602262497, 0.04279109835624695, -0.031768813729286194, 0.10539860278367996, 0.02139139175415039, -0.013030625879764557, 0.029180379584431648, 0.11507758498191833, -0.011920684948563576, -0.05598727613687515, -0.10672218352556229, -0.05643782392144203, 0.008109493181109428, 0.09187294542789459, -0.11838573962450027, 0.05391267314553261, -0.009375050663948059, 0.05214228481054306, -0.045031093060970306, -0.038943544030189514, -0.04708833619952202, 0.0873652994632721, 0.04331852123141289, 0.023858504369854927, 0.016300560906529427, -0.015167572535574436, -0.1157989501953125, 0.022305982187390327, -0.15828317403793335, 0.2236587256193161, 0.08189336955547333, -0.10806204378604889, -0.08206795156002045, -0.05363830551505089, 0.01880604587495327, -0.01921345666050911, 0.1515709012746811, -0.028610670939087868, 0.0923827737569809, -0.0008606478804722428, 0.07313729077577591, -0.052865441888570786, 0.06905363500118256, -0.008671171963214874, -0.05771736800670624, -0.010776404291391373, 0.16863638162612915, 0.0139829246327281, -0.1258748173713684, 0.13242627680301666, 0.09153534471988678, -0.05362587049603462, 0.1381102353334427, 0.0037384536117315292, -0.03783277049660683, -0.11870633810758591, 0.06538932025432587, -0.025883039459586143, 0.011522725224494934, -0.13547785580158234, -0.050191156566143036, -0.004372259136289358, -0.012039120309054852, -0.033860303461551666, -0.03848186135292053, -0.029138894751667976, -0.0038151866756379604, -0.05527244508266449, -0.09199487417936325, -0.033089593052864075, 0.006414151284843683, 0.05390598997473717, 0.03452979028224945, -0.07664761692285538, 0.010603194124996662, -0.009029085747897625, -0.12133916467428207, 0.1768587976694107, -0.10100389271974564, -0.06730148196220398, -0.08074913173913956, -0.18090741336345673, -0.060584597289562225, 0.022000115364789963, -0.02165248431265354, 0.0038845676463097334, -0.0029927780851721764, -0.06131300702691078, 0.03429622948169708, 0.01144685409963131, -0.028631839901208878, -0.016192832961678505, -0.06493383646011353, 0.06320388615131378, -0.10459912568330765, -0.060588289052248, -0.01572394371032715, -0.04630686715245247, -0.015331832692027092, -0.08887890726327896, 0.06554024666547775, 0.01600799523293972, 0.005220335908234119, 0.033376939594745636, 0.0065528228878974915, 0.26934459805488586, -0.0077960616908967495, 0.04555213823914528, 0.04354231059551239, -0.09531763941049576, 0.085196353495121, 0.21518857777118683, 0.09013549238443375, -0.038954656571149826, -0.0031879788730293512, 0.026707705110311508, -0.060547687113285065, -0.04681514576077461, 
-0.019983146339654922, -0.03961961343884468, 0.02535608969628811, 0.06931596994400024, 0.031853269785642624, 0.01864197850227356, 0.1251143515110016, -0.009153924882411957, 0.08929835259914398, 0.008623111061751842, 0.03914729133248329, 0.0632389634847641, 0.008629933930933475, 0.060118403285741806, -0.09983580559492111, -0.020350659266114235, -0.0028679980896413326, 0.025904731824994087, 0.03936850652098656, 0.014455382712185383, 0.13641239702701569, 0.10398013144731522, 0.18582962453365326, 0.02389703504741192, 0.0006400974234566092, -0.03580901026725769, -0.05500999093055725, 0.015344961546361446, -0.07472799718379974, -0.20288808643817902, 0.06587725132703781, 0.05045744404196739, 0.024030521512031555, 0.031693488359451294, 0.06094491109251976, 0.04745399206876755, 0.07493981719017029, 0.025299111381173134, -0.2704492211341858, 0.0014754824806004763, 0.021407311782240868, 0.007877593860030174, -0.056602708995342255, -0.01039828546345234, 0.024524040520191193, -0.08564669638872147, 0.08936150372028351, -0.002319320570677519, 0.03701738268136978, -0.07661639153957367, 0.06095949932932854, -0.07706189155578613, -0.03873947635293007, -0.03763948753476143, 0.005376336630433798, -0.33448585867881775, 0.2649048864841461, 0.05992589145898819, 0.03903796151280403, -0.0390043631196022, 0.01924080401659012, 0.06170975789427757, 0.018120665103197098, 0.14129586517810822, 0.00992896594107151, -0.1619766503572464, -0.18279902637004852, -0.03347892686724663, 0.058931559324264526, 0.12363620102405548, -0.06840439140796661, 0.020371466875076294, -0.015426615253090858, 0.019258152693510056, -0.02012263424694538, 0.12977001070976257, -0.07535602897405624, -0.10799728333950043, 0.03347969055175781, 0.05166633427143097, -0.018581433221697807, -0.007470936980098486, 0.018569981679320335, -0.10426179319620132, 0.09474077075719833, -0.16041365265846252, -0.07183708250522614, -0.04220594838261604, -0.05032581835985184, 0.03125082328915596, -0.13580267131328583, -0.016911931335926056, -0.030308060348033905, -0.1066703125834465, 0.004386089742183685, -0.054708272218704224, 0.053574465215206146, 0.019604599103331566, 0.01180698536336422, -0.024059496819972992, 0.05859579145908356, 0.02357839234173298, 0.05976611375808716, 0.015894921496510506, 0.02283328026533127, 0.007590759079903364, -0.06636153161525726, -0.03901081904768944, -0.034326568245887756, -0.007292993366718292, 0.05460643395781517, -0.024630267173051834, -0.015201562084257603, -0.02618245594203472, 0.028172072023153305, 0.09125643968582153, 0.18369895219802856, -0.036248981952667236, 0.006613961886614561, 0.22406186163425446, -0.03306940197944641, -0.22092139720916748, -0.13703294098377228, 0.09475934505462646, 0.011580147780478, -0.04596246778964996, -0.20239025354385376, 0.05521593615412712, 0.03255169466137886, 0.034741245210170746, -0.03492805361747742, -0.22642774879932404, -0.08099905401468277, 0.18236377835273743, 0.04839767515659332, 0.07944396138191223, -0.18394897878170013, -0.043621573597192764, -0.08571577072143555, -0.223756805062294, 0.03642623499035835, -0.00155931047629565, 0.008750022388994694, 0.008280188776552677, 0.08284623175859451, -0.003079633228480816, -0.01723490096628666, 0.15292207896709442, 0.059898532927036285, 0.06904585659503937, -0.1736428141593933, -0.11378654092550278, -0.002586874645203352, -0.049312908202409744, 0.13768936693668365, -0.18303194642066956, -0.06141861900687218, -0.2013983428478241, 0.017209555953741074, -0.06259646266698837, 0.018451930955052376, -0.04963397607207298, -0.030489882454276085, 
0.011529607698321342, -0.009478796273469925, 0.036778077483177185, 0.05471665412187576, 0.1029224842786789, -0.06200600042939186, 0.14151272177696228, 0.09534552693367004, 0.09246017783880234, -0.035953816026449203, -0.050756022334098816, 0.025206781923770905, 0.029858611524105072, 0.07019244879484177, -0.200913205742836, -0.012482223100960255, 0.09714242815971375, -0.0009237190242856741, 0.10767082870006561, 0.004584466572850943, 0.021119877696037292, 0.003312706481665373, 0.11590690165758133, -0.08417505770921707, -0.07623788714408875, 0.00369999255053699, -0.09790406376123428, -0.152278333902359, 0.0018989474046975374, 0.07939852774143219, 0.01084965467453003, 0.04855635017156601, -0.008132743649184704, 0.013683965429663658, -0.0682559385895729, 0.04668470844626427, 0.10745522379875183, 0.06851555407047272, -0.08345653116703033, 0.04229232668876648, 0.03950214758515358, -0.0013254955410957336, 0.0849669799208641, 0.0639905110001564, -0.051139213144779205, -0.059639208018779755, -0.11975768208503723, 0.15700814127922058, -0.011691179126501083, -0.053778085857629776, -0.01078006997704506, -0.05928550660610199, -0.016994042322039604, 0.13762646913528442, 0.07609831541776657, 0.06435807794332504, 0.011626743711531162, -0.037832800298929214, 0.011430964805185795, 0.037615224719047546, 0.014681831002235413, 0.009602476842701435, -0.03619056195020676, 0.11964727938175201, -0.0426030270755291, 0.07000759243965149, 0.0070701171644032, -0.010159146040678024, -0.14520253241062164, 0.03511591628193855, -0.09489280730485916, 0.023720350116491318, -0.12369537353515625, 0.03134038299322128, 0.05407238379120827, -0.0289516132324934, -0.05307634919881821, -0.042961589992046356, -0.0777503103017807, 0.01684519834816456, 0.05238112807273865, 0.07033174484968185, -0.1186012253165245, -0.02226490154862404, 0.06475949287414551, -0.035784315317869186, 0.031634896993637085, 0.07024277001619339, -0.01740250736474991, 0.09298549592494965, -0.11183614283800125, 0.06338900327682495, 0.004909545183181763, -0.017040099948644638, -0.015353960916399956, 0.07828450947999954, -0.007279771380126476, 0.00775504345074296, 0.07613505423069, 0.050790779292583466, 0.099518321454525, -0.06190288066864014, 0.11657620966434479, 0.01765654794871807, -0.06582492589950562, -0.04340413212776184, 0.06394490599632263, 0.08225959539413452, 0.051807302981615067, 0.08346585929393768, -0.02835502102971077, -0.008612106554210186, -0.1006353422999382, 0.009409630671143532, 0.019561029970645905, -0.0985141396522522, -0.03927567973732948, -0.10913367569446564, 0.03335971757769585, -0.02274504117667675, 0.09286116808652878, -0.019177043810486794, -0.0761708989739418, -0.007880756631493568, -0.03888911381363869, 0.011253478936851025, 0.03761816769838333, 0.12734246253967285, 0.036574624478816986, -0.05512727424502373, -0.0020010063890367746, 0.029383815824985504, -0.026451127603650093, 0.04874516278505325, -0.039823517203330994, 0.13330060243606567, 0.05146561190485954, 0.10732221603393555, 0.024364858865737915, 0.025107720866799355, 0.04140543192625046, 0.015386426821351051, -0.05688878148794174, 0.041304416954517365, -0.03422563895583153, 0.0443672239780426, 0.12806949019432068, -0.08227642625570297, 0.04813709855079651, -0.03786231204867363, -0.09224467724561691, -0.1168932169675827, -0.14477583765983582, -0.0297158882021904, -0.022649414837360382, 0.009751737117767334, -0.06388887017965317, -0.020118897780776024, 0.2213001400232315, -0.0038448048289865255, 0.07082372158765793, 0.18072158098220825, -0.006814929656684399, 
-0.0070205205120146275, 0.022165652364492416, -0.02599041722714901, 0.008643299341201782, -0.02122979611158371, -0.032311443239450455, 0.07108917087316513, -0.006747124250978231, 0.06664643436670303, -0.0073995450511574745, 0.00801345705986023, -0.012784294784069061, -0.03879791125655174, -0.09184654802083969, -0.018820997327566147, 0.04269672557711601, 0.02878635935485363, 0.10598155111074448, 0.046056117862463, -0.038326460868120193, -0.04218408837914467, 0.04394383728504181, 0.00037558210897259414, -0.10158785432577133, -0.15914669632911682, 0.12866200506687164, -0.02998478338122368, 0.03448257967829704, -0.08378420770168304, -0.01156034879386425, 0.03357445076107979, 0.1044493243098259, 0.1943696290254593, -0.04756082594394684, 0.0259972233325243, -0.01750214770436287, 0.04693782329559326, 0.01049869880080223, 0.06846985965967178, 0.002892907941713929, 0.1390175223350525, -0.04161300137639046, -0.013354343362152576, -0.015269603580236435, -0.02841658517718315, -0.06355885416269302, 0.02224884368479252, 0.015480756759643555, 0.008863724768161774, -0.06817516684532166, 0.07722248882055283, -0.11292453110218048, -0.08981351554393768, 0.007987403310835361, -0.08879583328962326, -0.040927913039922714, -0.033986035734415054, 0.044448040425777435, 0.1303531378507614, 0.005413799546658993, -0.00789766013622284, -0.019249327480793, 0.02327096089720726, -0.0032950115855783224, -0.12743820250034332, -0.03542334958910942, 0.10196109861135483, -0.010135119780898094, 0.13146887719631195, -0.0012597121531143785, 0.08751244843006134, 0.09600424021482468, 0.020667294040322304, 0.007519938517361879, 0.058829423040151596, 0.059443358331918716, 0.018297646194696426, -0.05957156419754028, -0.013660590164363384, 0.0031661621760576963, 0.08978123217821121, 0.040115147829055786, -0.03404681012034416, 0.05029639974236488, 0.027235234156250954, -0.03434263542294502, -0.009051639586687088, 0.008154860697686672, -0.060889922082424164, 0.16189105808734894, 0.1902250051498413, 0.016068557277321815, -0.0035224303137511015, -0.03632901981472969, -0.001829329994507134, -0.012186210602521896, -0.13136173784732819, -0.10331807285547256, -0.08745130896568298, -0.019961072131991386, -0.004681202117353678, 0.03710533306002617, -0.1431460678577423, -0.053325433284044266, 0.07868736237287521, 0.013888110406696796, 0.01964399591088295, 0.02832789160311222, 0.07312269508838654, 0.047005705535411835, -0.030112603679299355, -0.21725186705589294, 0.011766307055950165, 0.1020660474896431, -0.0640367716550827, -0.1279371678829193 ]
null
null
transformers
# Modeus DialoGPT Model
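The card is only a title. A minimal sketch of one conversational turn, following the usual DialoGPT generation pattern from the `transformers` documentation; the exact chat formatting this particular fine-tune expects is an assumption:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Hypothetical usage sketch following the standard DialoGPT pattern; the chat
# formatting this fine-tune expects is not documented in the card.
tokenizer = AutoTokenizer.from_pretrained("Mythiie/DialoGPT-small-Modeus")
model = AutoModelForCausalLM.from_pretrained("Mythiie/DialoGPT-small-Modeus")

# Encode one user message terminated by the EOS token, then generate a reply.
input_ids = tokenizer.encode("Hello, who are you?" + tokenizer.eos_token, return_tensors="pt")
reply_ids = model.generate(input_ids, max_length=200, pad_token_id=tokenizer.eos_token_id)
print(tokenizer.decode(reply_ids[:, input_ids.shape[-1]:][0], skip_special_tokens=True))
```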
{"tags": ["conversational"]}
text-generation
Mythiie/DialoGPT-small-Modeus
[ "transformers", "pytorch", "gpt2", "text-generation", "conversational", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[]
TAGS #transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
# Modeus DialoGPT Model
[ "# Modeus DialoGPT Model" ]
[ "TAGS\n#transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "# Modeus DialoGPT Model" ]
[ 51, 8 ]
[ "passage: TAGS\n#transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n# Modeus DialoGPT Model" ]
[ -0.02484733983874321, 0.04575357958674431, -0.005541832651942968, 0.0008820635848678648, 0.14192385971546173, -0.011065944097936153, 0.15960469841957092, 0.11588361859321594, -0.08104562014341354, -0.04054936021566391, 0.10542023181915283, 0.15553952753543854, -0.0037687241565436125, 0.1014564111828804, -0.0730559304356575, -0.2668905556201935, 0.05115610733628273, 0.0362556092441082, 0.04471736028790474, 0.11795453727245331, 0.11849642544984818, -0.043431606143713, 0.0726386308670044, 0.01782143861055374, -0.15383855998516083, 0.02538362331688404, 0.02168363146483898, -0.10988175123929977, 0.1149626225233078, 0.0800345316529274, 0.04948102682828903, 0.017869051545858383, -0.04388028383255005, -0.15004223585128784, 0.029372597113251686, -0.015564779751002789, -0.03225265070796013, 0.03097904659807682, 0.03964744135737419, -0.07406818866729736, 0.09524749219417572, 0.11577683687210083, -0.0006937240832485259, 0.01940443553030491, -0.14453190565109253, 0.009856583550572395, -0.003628726117312908, 0.0951848104596138, 0.08039742708206177, 0.12796616554260254, -0.04989355802536011, 0.11541258543729782, -0.08720594644546509, 0.10853305459022522, 0.08404725790023804, -0.31772369146347046, -0.019621988758444786, 0.12135154008865356, 0.030138181522488594, 0.06867694854736328, -0.029640711843967438, 0.09619782119989395, 0.0010999947553500533, -0.0038953640032559633, -0.03843889757990837, -0.07122606784105301, -0.06970793753862381, -0.003871382214128971, -0.08639466017484665, -0.021843846887350082, 0.2456870973110199, -0.043855223804712296, 0.07622511684894562, -0.1073162630200386, -0.07858158648014069, -0.009605830535292625, -0.036435868591070175, -0.05797761678695679, -0.09265344589948654, 0.08462536334991455, 0.01775805838406086, -0.08256886154413223, -0.13415372371673584, -0.007141910959035158, -0.18104782700538635, 0.14821964502334595, 0.029374754056334496, 0.019434846937656403, -0.20821204781532288, 0.1118445098400116, -0.017008446156978607, -0.09349140524864197, 0.04192446917295456, -0.07963736355304718, -0.0011389714200049639, -0.0008996071992442012, -0.037147097289562225, -0.05229990929365158, 0.06766404211521149, 0.09404557943344116, -0.04752080515027046, 0.013641225174069405, -0.03353934735059738, 0.06125105172395706, 0.05123964697122574, 0.0816589817404747, -0.013631629757583141, -0.0998113751411438, 0.029437312856316566, -0.09039777517318726, -0.007133108098059893, -0.05765218660235405, -0.18775497376918793, -0.02479173243045807, 0.08708227425813675, 0.0727781131863594, 0.029047565534710884, 0.10276828706264496, 0.002277347259223461, -0.06404883414506912, 0.06066959723830223, -0.026569757610559464, -0.01710154488682747, -0.0000369572444469668, 0.015337902121245861, 0.14571838080883026, -0.004413921851664782, 0.060127269476652145, -0.14910739660263062, 0.021878110244870186, -0.045170389115810394, -0.006154520902782679, -0.004786710720509291, -0.058621883392333984, -0.011225773952901363, -0.04411115497350693, -0.009529695846140385, -0.16101554036140442, -0.11303289979696274, -0.00642124330624938, -0.003626120975241065, -0.04630925506353378, -0.10487648099660873, -0.0861188992857933, -0.01398067083209753, 0.06483545154333115, -0.061306435614824295, -0.00703438138589263, -0.04551801085472107, 0.06785394996404648, -0.014057448133826256, 0.06714128702878952, -0.08443135023117065, 0.0962638109922409, -0.07606171071529388, -0.034914642572402954, -0.08573063462972641, 0.12836727499961853, 0.0058534895069897175, 0.039406538009643555, -0.052594538778066635, -0.02283388003706932, 
-0.06845502555370331, 0.0617765411734581, -0.02291514165699482, 0.227077454328537, -0.06389354169368744, -0.10766677558422089, 0.2576356828212738, -0.07775109261274338, -0.10204974561929703, 0.14306321740150452, 0.0008165353792719543, 0.07281100004911423, 0.1286340355873108, 0.18282100558280945, 0.040292076766490936, 0.006383412517607212, 0.09073004871606827, 0.0838192030787468, -0.07341098040342331, -0.007064430043101311, 0.013533258810639381, -0.020137056708335876, -0.05548335984349251, 0.025737445801496506, 0.07097991555929184, 0.06382061541080475, -0.05215989798307419, -0.02893533743917942, 0.0024105971679091454, 0.0035332858096808195, 0.06934987753629684, -0.028736045584082603, 0.12968584895133972, -0.023137817159295082, -0.06034591421484947, -0.010944031178951263, 0.01597089134156704, -0.055713776499032974, 0.02700003609061241, -0.08941742032766342, 0.07905248552560806, -0.018217816948890686, 0.055284492671489716, -0.15130195021629333, -0.07672995328903198, -0.036767199635505676, 0.20231983065605164, 0.05501032620668411, 0.1277698278427124, 0.044624511152505875, -0.04054025560617447, -0.03249721601605415, 0.022144431248307228, 0.16724683344364166, -0.007023862563073635, -0.09347538650035858, -0.09814560413360596, 0.09771571308374405, -0.060340140014886856, 0.09113024175167084, -0.0442836694419384, -0.00215549417771399, 0.03579242154955864, 0.11464284360408783, -0.015925468876957893, 0.026449672877788544, 0.0025056914892047644, -0.0071716210804879665, -0.050770994275808334, 0.011167931370437145, 0.08985043317079544, -0.012789111584424973, -0.07171851396560669, 0.24151493608951569, -0.21191805601119995, 0.18425999581813812, 0.1759915053844452, -0.20632025599479675, -0.02080577053129673, -0.16971400380134583, -0.023974211886525154, -0.0009605471859686077, 0.041928090155124664, -0.04487813264131546, 0.23162929713726044, -0.02203674055635929, 0.16556447744369507, -0.045057591050863266, -0.0351722426712513, -0.027953756973147392, -0.049595870077610016, 0.00775728328153491, 0.08967867493629456, 0.07580162584781647, -0.23056261241436005, 0.1490267813205719, 0.10784956812858582, 0.0429302379488945, 0.18770059943199158, 0.0313616618514061, 0.01541582029312849, 0.044171854853630066, 0.010609783232212067, -0.04265313223004341, -0.10096425563097, -0.29874056577682495, -0.04264207184314728, 0.05824477970600128, 0.06609143316745758, 0.12583592534065247, -0.0973416194319725, -0.016266141086816788, 0.00873582810163498, -0.01925157755613327, 0.09775689989328384, 0.11571140587329865, 0.02419048175215721, 0.11748436838388443, -0.0096598444506526, -0.10028937458992004, 0.0807386115193367, 0.008326472714543343, -0.09625521302223206, 0.17884807288646698, -0.13065440952777863, -0.3977416157722473, -0.10032488405704498, -0.17354126274585724, -0.06597185879945755, 0.036638788878917694, 0.09792552143335342, -0.10389481484889984, -0.01837742328643799, -0.009949414059519768, 0.09076976776123047, -0.08920038491487503, 0.020915811881422997, -0.05918726697564125, 0.004124518949538469, -0.12240128219127655, -0.0956834927201271, -0.05935000628232956, -0.037609800696372986, -0.09712253510951996, 0.09124264121055603, -0.12998275458812714, 0.021696630865335464, 0.2583642303943634, 0.03139936551451683, 0.05817199870944023, -0.03274165093898773, 0.2060229629278183, -0.11193876713514328, 0.011510043404996395, 0.21263037621974945, -0.010694197379052639, 0.0601230263710022, 0.14197571575641632, -0.007432965561747551, -0.06544723361730576, 0.036263685673475266, -0.01444888673722744, -0.06895704567432404, -0.19819582998752594, 
-0.14243991672992706, -0.10780781507492065, 0.021732352674007416, 0.009356081485748291, 0.04083379730582237, 0.19948001205921173, 0.05692164972424507, -0.03694990649819374, -0.007064702920615673, 0.06539712846279144, 0.07236988842487335, 0.29012805223464966, -0.0708857849240303, 0.16923794150352478, -0.006732461974024773, -0.17250707745552063, 0.07252331078052521, 0.06501512974500656, 0.08242391794919968, 0.062442347407341, 0.013063850812613964, 0.01710810698568821, 0.04707520455121994, 0.12775002419948578, 0.04094475880265236, 0.018346671015024185, -0.057221103459596634, -0.0339818149805069, -0.04284851998090744, -0.032374221831560135, 0.04368668794631958, 0.12524822354316711, -0.19083498418331146, -0.044300734996795654, -0.015189990401268005, 0.058634500950574875, 0.03476797044277191, 0.09308560192584991, -0.15726983547210693, -0.021757766604423523, 0.05957690626382828, -0.056922368705272675, -0.11028004437685013, 0.07860026508569717, 0.05219193920493126, -0.12599490582942963, 0.017330529168248177, -0.018912309780716896, 0.11236336082220078, -0.10994415730237961, 0.08746571093797684, -0.1008417084813118, -0.059479862451553345, 0.00869518332183361, 0.09848172217607498, -0.23525409400463104, 0.2179049253463745, -0.01030530221760273, -0.04894866794347763, -0.11313924938440323, 0.00016094105376396328, 0.014829495921730995, 0.153239905834198, 0.08051151782274246, -0.01100757997483015, 0.014356118626892567, 0.013361857272684574, -0.047071561217308044, 0.01276624295860529, 0.11032086610794067, -0.028163982555270195, -0.02408527210354805, -0.04343768209218979, 0.002288870047777891, 0.00014328719407785684, -0.00022085601813159883, -0.03617533668875694, -0.20604856312274933, 0.09828806668519974, 0.11160154640674591, 0.07964559644460678, 0.023350918665528297, -0.0372675396502018, -0.08391579240560532, 0.32743754982948303, 0.007622783072292805, -0.08922732621431351, -0.09334476292133331, -0.043741755187511444, 0.04912648722529411, -0.06034727767109871, 0.018387606367468834, -0.0671447142958641, 0.033950548619031906, -0.06919661909341812, -0.1456422507762909, 0.12378203868865967, -0.09660837054252625, -0.023127790540456772, -0.032112933695316315, 0.21265563368797302, -0.016473714262247086, 0.005372319836169481, 0.06809724867343903, -0.013720561750233173, -0.07849135994911194, -0.10758596658706665, -0.0004233268555253744, 0.025126196444034576, -0.05191047489643097, 0.036003295332193375, -0.007201332598924637, -0.09123919159173965, -0.04740618169307709, -0.026484327390789986, 0.33337822556495667, 0.10824356228113174, -0.03982432186603546, 0.16539530456066132, 0.14463193714618683, -0.06121645122766495, -0.28351375460624695, -0.10552338510751724, -0.059150874614715576, -0.04100519046187401, -0.07985475659370422, -0.16681350767612457, 0.06373310834169388, -0.04481814429163933, -0.021817367523908615, 0.05299728736281395, -0.31983569264411926, -0.10885011404752731, 0.179805725812912, -0.03022007644176483, 0.3678032159805298, -0.09273195266723633, -0.08755868673324585, -0.04706722870469093, -0.1303633153438568, 0.1766328364610672, -0.002068147761747241, 0.13154669106006622, 0.008526752702891827, 0.16343633830547333, 0.04831792786717415, -0.004832999315112829, 0.07281786948442459, 0.01868380978703499, -0.05272644758224487, -0.08520644903182983, -0.06752683222293854, 0.028075294569134712, 0.027904683724045753, 0.03281354531645775, -0.03651496395468712, 0.024583961814641953, -0.1480625420808792, -0.06326479464769363, -0.0714695006608963, 0.042153798043727875, 0.022876590490341187, -0.07393832504749298, 
0.008402896113693714, -0.030088769271969795, -0.005319012328982353, 0.013086624443531036, 0.12937568128108978, -0.11137543618679047, 0.13042141497135162, 0.12863971292972565, 0.13265688717365265, -0.12462736666202545, -0.036296457052230835, -0.0760788694024086, -0.05604807659983635, 0.06403492391109467, -0.04230823367834091, 0.030651310458779335, 0.10267158597707748, -0.0576196052134037, 0.09388498216867447, 0.0793558657169342, -0.008526301942765713, 0.01645124889910221, 0.09401977807283401, -0.24041715264320374, -0.05870051309466362, -0.0895392969250679, 0.0633339211344719, 0.07566811144351959, 0.09890281409025192, 0.24164706468582153, -0.009991806000471115, -0.022186502814292908, 0.0022675825748592615, 0.027063747867941856, -0.05385439842939377, 0.11957108974456787, -0.020390788093209267, 0.021751267835497856, -0.13828711211681366, 0.049063991755247116, 0.02000223658978939, -0.07172244042158127, 0.036154571920633316, 0.17575185000896454, -0.09691330790519714, -0.12661172449588776, -0.08128625154495239, 0.0699852928519249, -0.1509186327457428, -0.029551757499575615, -0.012651452794671059, -0.14139296114444733, 0.07025410234928131, 0.060093801468610764, 0.048083700239658356, 0.07488945126533508, -0.11715375632047653, -0.05481993034482002, -0.006868377793580294, -0.01087404228746891, 0.04643857106566429, -0.028109516948461533, -0.045807331800460815, 0.07875201851129532, -0.037357572466135025, 0.0789017379283905, -0.09746899455785751, -0.1175009235739708, -0.15079858899116516, 0.03795789182186127, -0.12946799397468567, -0.0913204774260521, -0.06767283380031586, -0.04299219697713852, 0.0033058826811611652, -0.017974337562918663, -0.03305766358971596, -0.040207114070653915, -0.09810744971036911, 0.04452070593833923, -0.049644626677036285, 0.017898013815283775, -0.08408920466899872, 0.02574005164206028, 0.03688118979334831, -0.01441617589443922, 0.1801023632287979, 0.11457931250333786, -0.11052750796079636, 0.08095366507768631, -0.2018401175737381, -0.07620641589164734, 0.09860064834356308, 0.021584950387477875, 0.047624289989471436, -0.0013128235004842281, 0.012348596937954426, 0.08060964941978455, 0.04400144889950752, 0.062341175973415375, 0.03182472288608551, -0.09117546677589417, 0.029485244303941727, -0.06250293552875519, -0.14294953644275665, -0.03624865040183067, -0.005772511474788189, 0.04597767814993858, 0.028675740584731102, 0.08567807078361511, -0.0673464760184288, 0.08692016452550888, -0.03786826506257057, 0.03666219115257263, 0.017571864649653435, -0.1531204879283905, -0.02403537929058075, -0.0800633430480957, 0.043996911495923996, 0.015595606528222561, 0.17964957654476166, 0.05112723261117935, 0.007270290516316891, 0.030875036492943764, 0.035393208265304565, 0.05246744304895401, -0.00038517219945788383, 0.1332293599843979, 0.12493935227394104, -0.040666718035936356, -0.09566578269004822, 0.06903214007616043, 0.020218834280967712, 0.029309874400496483, 0.1569403111934662, -0.03763529285788536, -0.01981716975569725, 0.0970258042216301, 0.0045217531733214855, 0.009238419122993946, -0.15089812874794006, -0.13068938255310059, -0.041109517216682434, 0.07402172684669495, -0.08948642760515213, 0.07613159716129303, 0.14304405450820923, -0.014603039249777794, 0.017963266000151634, -0.021906662732362747, -0.07292182743549347, -0.17384043335914612, -0.19898952543735504, -0.07466056942939758, -0.12158551067113876, -0.010639959014952183, -0.13348674774169922, 0.05692591145634651, -0.01785518415272236, 0.10270100831985474, -0.06985083222389221, 0.07527866214513779, 0.01771204173564911, 
-0.10529915243387222, 0.09083409607410431, -0.03292110934853554, 0.11347687244415283, -0.032998424023389816, 0.008444509468972683, -0.07466057687997818, 0.059711843729019165, 0.017357775941491127, 0.04689309000968933, -0.05435991287231445, -0.0017519735265523195, -0.10267779976129532, -0.07927733659744263, -0.05532511696219444, 0.03982683643698692, 0.004630801267921925, 0.17653588950634003, 0.026442380622029305, -0.03758702427148819, 0.016935737803578377, 0.2133873850107193, -0.06596329808235168, -0.1336946189403534, -0.06938841938972473, 0.22262857854366302, 0.0075766537338495255, 0.13519662618637085, -0.028914062306284904, 0.013707861304283142, -0.08698828518390656, 0.32885444164276123, 0.3119402229785919, -0.10151691734790802, 0.02424309216439724, 0.031968943774700165, 0.037516653537750244, 0.10413284599781036, 0.10296992212533951, 0.08500377088785172, 0.32150518894195557, -0.05662783235311508, 0.015111290849745274, -0.027883343398571014, -0.047997165471315384, -0.048767369240522385, 0.013799887150526047, 0.07335586845874786, -0.0473296120762825, -0.03571123629808426, 0.1194688081741333, -0.2572573721408844, 0.08313781768083572, -0.1356150358915329, -0.1739930808544159, -0.08674205094575882, 0.004026134964078665, 0.10779424756765366, 0.019829129800200462, 0.09774257242679596, 0.002456736983731389, -0.06304845213890076, 0.051482751965522766, 0.027698390185832977, -0.16836756467819214, -0.023076584562659264, 0.0830865427851677, -0.034236326813697815, -0.07451208680868149, -0.02607976645231247, 0.07905816286802292, 0.08758004754781723, 0.054634056985378265, 0.0007829240639694035, 0.033714648336172104, 0.0017410630825906992, -0.06312183290719986, 0.026941807940602303, 0.0506800077855587, 0.02614457719027996, -0.043719351291656494, 0.07420985400676727, -0.14717954397201538, 0.048593249171972275, -0.03898182883858681, -0.004659694619476795, -0.014795871451497078, 0.029815714806318283, -0.05675569921731949, 0.0812930092215538, 0.0913800299167633, -0.005000699311494827, -0.013354087248444557, -0.019954675808548927, -0.016330083832144737, -0.035031430423259735, -0.07106205075979233, -0.08658875524997711, -0.18854555487632751, -0.10493125766515732, 0.10289037972688675, 0.01531129889190197, -0.142532080411911, -0.004451577551662922, -0.11279996484518051, 0.0691923126578331, -0.14160972833633423, 0.11260241270065308, 0.09202054888010025, 0.015045542269945145, -0.0011566359316930175, -0.04512900114059448, 0.027975047007203102, 0.06987417489290237, -0.12418871372938156, -0.07465251535177231 ]
null
null
null
# My Awesome Model
{"tags": ["conversational"]}
text-generation
N8Daawg/chat_bot
[ "conversational", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[]
TAGS #conversational #region-us
# My Awesome Model
[ "# My Awesome Model" ]
[ "TAGS\n#conversational #region-us \n", "# My Awesome Model" ]
[ 10, 4 ]
[ "passage: TAGS\n#conversational #region-us \n# My Awesome Model" ]
[ -0.03546040877699852, 0.10198262333869934, -0.009167643263936043, -0.06873539090156555, 0.09720280766487122, 0.08721881359815598, 0.05205019563436508, -0.007754879537969828, 0.1307058483362198, -0.07418710738420486, 0.15552383661270142, 0.09356602281332016, -0.055710893124341965, 0.054602161049842834, 0.023362739011645317, -0.24776877462863922, 0.05416174605488777, -0.018358204513788223, 0.038026146590709686, 0.059432677924633026, -0.008917242288589478, -0.06420617550611496, 0.032797642052173615, -0.054550930857658386, -0.047955937683582306, 0.05154015123844147, -0.015393521636724472, 0.022211134433746338, 0.13093532621860504, -0.025035109370946884, 0.11168728768825531, 0.02722480148077011, -0.08139488101005554, -0.2497675120830536, 0.04726335033774376, -0.03638986125588417, -0.05304733291268349, 0.00051964569138363, 0.040234215557575226, -0.0361054390668869, 0.10917626321315765, 0.2034350037574768, -0.013044852763414383, 0.12713150680065155, -0.29209578037261963, -0.05327993258833885, 0.019334495067596436, -0.017972392961382866, 0.007713220082223415, 0.010176903568208218, -0.0199898611754179, 0.12909340858459473, -0.18584373593330383, -0.03348228707909584, -0.05190730839967728, -0.19455693662166595, 0.02133459411561489, 0.195334330201149, 0.03859388828277588, 0.13404928147792816, -0.03862352669239044, 0.11985649168491364, -0.001176132122054696, -0.02696487121284008, -0.09787412732839584, -0.083623968064785, 0.05212550610303879, 0.10331118106842041, -0.045747701078653336, -0.01321598794311285, 0.2609056234359741, 0.08284381031990051, 0.022602153941988945, 0.052660685032606125, -0.037852801382541656, -0.01829439401626587, 0.0014054370112717152, -0.11398514360189438, -0.012235384434461594, 0.15790092945098877, 0.0919213518500328, -0.11842559278011322, -0.11358761787414551, 0.05386804789304733, -0.19349870085716248, 0.11392602324485779, -0.05408310890197754, 0.09635519981384277, -0.2302265763282776, -0.03165312856435776, -0.23591788113117218, -0.03137180954217911, 0.008059423416852951, -0.1426098644733429, -0.019717667251825333, -0.05638215318322182, 0.06416983157396317, 0.10064838826656342, 0.026653507724404335, 0.05982319638133049, 0.0024214035365730524, 0.03094852901995182, 0.04355722293257713, 0.06075821444392204, 0.13682802021503448, 0.07382544130086899, 0.08105745166540146, -0.015595734119415283, -0.13757064938545227, -0.15028178691864014, -0.042663972824811935, -0.0603194423019886, -0.10847903043031693, 0.1175917387008667, -0.1348201483488083, 0.056088559329509735, -0.008282779715955257, -0.06160689890384674, -0.1302913874387741, 0.07904011011123657, -0.009780440479516983, 0.07031673192977905, -0.0500408336520195, -0.030323538929224014, -0.012846678495407104, 0.09941235184669495, -0.15968181192874908, 0.07661482691764832, 0.08417865633964539, -0.017821768298745155, -0.13900868594646454, -0.0261816568672657, -0.06725679337978363, 0.07161393761634827, 0.027083612978458405, -0.05911887437105179, 0.07335446029901505, -0.1062290146946907, -0.10800924897193909, -0.0462496280670166, 0.05760948359966278, -0.04071643948554993, 0.02777569182217121, -0.08487419039011002, 0.04323674738407135, -0.043959055095911026, -0.011524932458996773, -0.0954136773943901, -0.10528495907783508, 0.03591811656951904, 0.07539372146129608, 0.007705247029662132, -0.16573995351791382, 0.034548647701740265, -0.0835181400179863, 0.049681901931762695, -0.02308676578104496, 0.0353124737739563, 0.043504517525434494, 0.1896352767944336, 0.09132891893386841, 0.11749628186225891, -0.18855012953281403, 0.05541319027543068, 
-0.12649250030517578, 0.2574777901172638, -0.15162399411201477, -0.0263845045119524, 0.23871201276779175, -0.028618283569812775, -0.1612473875284195, 0.031156204640865326, -0.024838248267769814, 0.21029748022556305, 0.15949539840221405, 0.3231875002384186, -0.14657290279865265, -0.040706682950258255, 0.10120458155870438, 0.1641000658273697, -0.0947515219449997, -0.05097932368516922, 0.07063271850347519, -0.08443530648946762, -0.10856795310974121, -0.009461409412324429, 0.06791952252388, 0.08725127577781677, -0.08676911145448685, -0.057048194110393524, 0.04897143691778183, -0.05990632250905037, 0.0539734810590744, 0.09225133061408997, 0.019252249971032143, -0.06463979184627533, 0.03549259155988693, -0.03487003222107887, 0.06323052942752838, 0.1344345659017563, -0.07567445188760757, -0.04621171951293945, 0.05208157002925873, -0.008416261523962021, 0.011982166208326817, -0.07247529923915863, -0.10293032974004745, -0.09282280504703522, 0.15225498378276825, 0.12907147407531738, 0.24349625408649445, 0.07771562039852142, -0.08539113402366638, 0.023976441472768784, 0.05230777710676193, 0.02473422884941101, 0.08408135920763016, 0.006961719132959843, -0.04070304334163666, 0.1336214691400528, -0.06637702882289886, 0.0505339689552784, -0.15414337813854218, -0.06644152104854584, -0.003572809975594282, 0.03696983680129051, 0.08356471359729767, -0.04645514488220215, -0.0019455882720649242, 0.0381651446223259, 0.05654265359044075, 0.010789581574499607, 0.11545533686876297, -0.04579673334956169, -0.07804842293262482, 0.17771583795547485, -0.10181614011526108, 0.1486288160085678, 0.11471833288669586, -0.2789360582828522, 0.027798451483249664, -0.09091047197580338, -0.016147438436746597, 0.03836512193083763, 0.07616107910871506, -0.04532977193593979, 0.0876641571521759, 0.008135076612234116, 0.045742783695459366, 0.030224641785025597, 0.049904048442840576, -0.05050472542643547, -0.038475628942251205, -0.1468115597963333, 0.11378216743469238, 0.14237768948078156, -0.1664484292268753, 0.15665550529956818, 0.344457745552063, 0.2189280241727829, 0.24893705546855927, -0.02391449362039566, -0.0023995572701096535, -0.007929098792374134, -0.05631881207227707, -0.1310952752828598, 0.1292494386434555, -0.3013957738876343, -0.0045976797118783, 0.0019251068588346243, 0.020475171506404877, 0.10425154119729996, -0.11188157647848129, -0.11943034082651138, 0.01928076706826687, 0.013840734958648682, -0.04093893617391586, 0.05837646499276161, -0.08973170816898346, 0.06690972298383713, 0.044297512620687485, -0.09705004096031189, 0.12118203192949295, 0.032428398728370667, -0.027439109981060028, 0.06038731336593628, -0.13205169141292572, -0.161821186542511, -0.014722511172294617, -0.12305234372615814, 0.03822726011276245, 0.02258457988500595, -0.0011937115341424942, -0.11695839464664459, -0.017612792551517487, 0.08651383221149445, 0.06550532579421997, -0.21545930206775665, -0.08663901686668396, -0.05402546748518944, 0.06747249513864517, -0.14086408913135529, -0.005392237100750208, -0.0580231137573719, -0.04639870673418045, -0.015921805053949356, 0.03196299821138382, -0.1326124668121338, 0.05850570648908615, 0.1952214241027832, 0.11960950493812561, 0.08012472093105316, 0.00031379942083731294, 0.25854313373565674, -0.15044859051704407, -0.031102577224373817, 0.04013100266456604, -0.03211164101958275, 0.08570355176925659, 0.2004045695066452, 0.07713621109724045, -0.04407672584056854, -0.032609183341264725, -0.06265520304441452, -0.08382013440132141, -0.17984598875045776, -0.09944407641887665, -0.09241506457328796, 
0.11593815684318542, -0.10455609858036041, 0.02279566414654255, 0.1293002963066101, 0.038562655448913574, 0.10591760277748108, -0.17956066131591797, -0.08083869516849518, -0.015997247770428658, 0.10070767998695374, -0.1476999819278717, -0.036619413644075394, -0.057095691561698914, -0.13042104244232178, 0.09085097908973694, 0.07351890206336975, -0.07598336786031723, 0.2753245234489441, 0.14356902241706848, 0.06460168957710266, 0.0372491329908371, 0.050815973430871964, 0.03296361491084099, 0.06032256409525871, -0.08821487426757812, -0.024852164089679718, 0.008612229488790035, -0.022380370646715164, 0.05025824159383774, 0.21010570228099823, -0.24013914167881012, -0.010044138878583908, -0.12094947695732117, 0.058911196887493134, -0.09226593375205994, 0.15273147821426392, -0.005419398192316294, 0.07938405126333237, 0.13775718212127686, 0.017697615548968315, -0.08790077269077301, 0.10226619243621826, 0.06094779446721077, -0.12483128160238266, -0.00920578371733427, 0.10987824946641922, 0.08385234326124191, -0.016504161059856415, 0.11643458902835846, -0.21195663511753082, -0.13761650025844574, 0.033488254994153976, 0.10529548674821854, -0.1958140879869461, 0.3039077818393707, 0.009235309436917305, -0.1351068764925003, -0.0639132410287857, -0.11496353149414062, -0.012014171108603477, 0.10743112862110138, 0.10711206495761871, 0.042469725012779236, -0.07393775135278702, -0.026096675544977188, 0.009214960969984531, -0.007742607034742832, 0.09298452734947205, -0.08414001762866974, -0.12013377249240875, 0.010150929912924767, 0.03940318152308464, -0.048703476786613464, 0.1009429320693016, -0.08256068825721741, -0.07715889811515808, -0.009262125939130783, -0.013167787343263626, 0.013363508507609367, 0.0613013356924057, 0.09955485910177231, -0.03248724341392517, -0.016322879120707512, 0.17398953437805176, 0.06143142655491829, -0.028618335723876953, -0.15928512811660767, -0.0019777673296630383, -0.04513169452548027, -0.039428479969501495, -0.06899980455636978, -0.07661525160074234, -0.11753853410482407, -0.09040885418653488, 0.12060512602329254, -0.1011100485920906, 0.08252820372581482, -0.07234728336334229, 0.1556718945503235, 0.0449947789311409, 0.027362238615751266, 0.0420842170715332, 0.010504845529794693, -0.0637444332242012, -0.06514158844947815, 0.13852131366729736, -0.17852531373500824, -0.03856880962848663, 0.10458311438560486, 0.07282499223947525, 0.011911713518202305, 0.01789284311234951, -0.12299755960702896, 0.19862310588359833, 0.21264135837554932, -0.00729813938960433, 0.16878525912761688, 0.24475444853305817, -0.06593596935272217, -0.205520361661911, -0.06422454118728638, -0.23529553413391113, -0.07331079244613647, 0.15935871005058289, -0.17814143002033234, 0.09661445021629333, -0.008739825338125229, -0.0568917840719223, 0.14693109691143036, -0.32532361149787903, -0.005068281665444374, 0.19984132051467896, -0.03098643198609352, 0.5961567759513855, -0.067476287484169, -0.10163161158561707, -0.020260926336050034, -0.08047869801521301, 0.2328169047832489, -0.10700056701898575, 0.022178582847118378, 0.05551070347428322, 0.10260957479476929, 0.06909338384866714, 0.020471658557653427, 0.1812010407447815, -0.05112580955028534, -0.06011022627353668, -0.11829929798841476, -0.20441202819347382, 0.01321274135261774, -0.002565407194197178, -0.09609947353601456, 0.06216618791222572, -0.06233842670917511, -0.17662794888019562, 0.015285306610167027, -0.08876541256904602, -0.03281906247138977, 0.012494787573814392, -0.04522540047764778, -0.03335950896143913, 0.03184078261256218, -0.1269286721944809, 
0.02083008363842964, 0.15761429071426392, -0.09231042861938477, 0.21183836460113525, -0.09038146585226059, 0.11980026215314865, -0.1715039312839508, -0.07381453365087509, -0.0892910361289978, -0.0739288181066513, 0.022808637470006943, -0.05498562753200531, 0.039651405066251755, 0.1229073703289032, -0.06014255806803703, 0.13346922397613525, 0.04274857044219971, -0.07362601161003113, -0.009332284331321716, 0.14846959710121155, -0.19871611893177032, -0.2910851538181305, -0.11566967517137527, 0.05923045426607132, 0.2028619349002838, 0.007538134697824717, 0.0777674987912178, 0.11478970944881439, -0.016507035121321678, 0.015079355798661709, 0.010267493315041065, -0.11431148648262024, -0.10444751381874084, 0.06896253675222397, 0.0007569619920104742, -0.08558188378810883, 0.10230734944343567, 0.019022267311811447, -0.18386487662792206, -0.11988023668527603, 0.20762711763381958, -0.0200219564139843, -0.09439916163682938, -0.0994015783071518, 0.1823630928993225, -0.060141727328300476, -0.026388145983219147, 0.060486629605293274, -0.0971314087510109, -0.027692077681422234, 0.17358030378818512, 0.05215360224246979, 0.07331566512584686, 0.043052736669778824, -0.019896553829312325, 0.19111433625221252, -0.07610252499580383, -0.08666720986366272, -0.11411479860544205, -0.1113981381058693, -0.09765390306711197, -0.022090891376137733, 0.17170315980911255, -0.1043042540550232, -0.1465844213962555, -0.23367111384868622, 0.08566625416278839, -0.07618606090545654, -0.14835943281650543, -0.12351330369710922, -0.09960491955280304, 0.07022807002067566, -0.005836328491568565, -0.025358503684401512, -0.09784673154354095, -0.1479180008172989, 0.10302628576755524, 0.09353149682283401, 0.02215663343667984, 0.03276374191045761, 0.06490205228328705, 0.16425716876983643, 0.006264934781938791, 0.11560661345720291, 0.09335105121135712, 0.004334130324423313, 0.12737876176834106, -0.24313320219516754, -0.03612852096557617, 0.061543241143226624, -0.02008756995201111, 0.03869541361927986, 0.1556338667869568, -0.07101669907569885, -0.008599703200161457, 0.07346312701702118, 0.05884246155619621, -0.06158248707652092, -0.07029277831315994, -0.020444681867957115, 0.15146341919898987, -0.21854759752750397, -0.010464908555150032, -0.13543701171875, 0.08618341386318207, -0.06382738798856735, 0.026516791433095932, 0.07620655745267868, 0.08659784495830536, 0.003671627026051283, 0.051998503506183624, 0.02891702763736248, -0.10376659035682678, 0.11479650437831879, -0.1011483371257782, -0.010525095276534557, -0.04059837758541107, 0.3260350227355957, 0.008407027460634708, 0.01702333241701126, 0.04124368727207184, 0.1525338888168335, 0.036274950951337814, 0.002469088416546583, 0.11112259328365326, 0.1318541020154953, -0.05502143129706383, -0.1530759334564209, 0.1053222268819809, -0.03983991593122482, 0.017480194568634033, 0.12883120775222778, -0.017984678968787193, 0.05133776366710663, 0.0598396472632885, 0.03326093778014183, 0.06138930097222328, 0.08058228343725204, -0.2519816756248474, 0.05864633992314339, -0.008193781599402428, -0.10156036913394928, 0.14093659818172455, 0.12776172161102295, -0.04358195886015892, 0.03643115237355232, -0.08332061767578125, -0.017144199460744858, -0.13900910317897797, -0.08347687125205994, 0.011046548373997211, -0.0890955775976181, 0.024407757446169853, -0.02158970944583416, -0.01773250661790371, 0.20654316246509552, 0.00810841005295515, -0.08619078248739243, 0.024125345051288605, -0.03246486932039261, -0.1350407898426056, -0.028485149145126343, 0.004192930646240711, 0.07534855604171753, 
-0.10631464421749115, -0.009459893219172955, -0.1966288685798645, -0.03228876367211342, -0.10370618849992752, 0.031585320830345154, -0.13690978288650513, -0.056601304560899734, -0.1394949108362198, -0.046161238104104996, -0.07070588320493698, 0.0341620109975338, -0.10124900937080383, 0.14852306246757507, -0.02862531691789627, 0.04669342562556267, 0.001221607206389308, 0.21473883092403412, -0.0055917128920555115, 0.1111997663974762, -0.03376127779483795, 0.025253819301724434, -0.07697580754756927, 0.12172159552574158, -0.06678933650255203, -0.016789207234978676, -0.0435773991048336, 0.26328104734420776, 0.3666163384914398, -0.13708895444869995, -0.03541192784905434, -0.029750946909189224, 0.032959096133708954, 0.055593449622392654, 0.09368465840816498, -0.04153439402580261, 0.29192063212394714, -0.11245544999837875, 0.09509938210248947, 0.017580494284629822, 0.02136683464050293, 0.05382193997502327, 0.03408944979310036, 0.0992191731929779, 0.02480826899409294, -0.06201104819774628, 0.22126658260822296, -0.28489235043525696, 0.12929458916187286, -0.09419834613800049, -0.1977759450674057, -0.024422185495495796, -0.09278329461812973, 0.1075873076915741, 0.009720941074192524, 0.1459415853023529, -0.059647753834724426, -0.14828547835350037, -0.10750431567430496, 0.04621806740760803, -0.34202659130096436, -0.19561326503753662, 0.13985638320446014, 0.041433196514844894, 0.09628590941429138, -0.006398599129170179, 0.017934424802660942, 0.040109071880578995, 0.005008349195122719, 0.011764837428927422, 0.06299147754907608, 0.06230494752526283, -0.03965628892183304, -0.18101933598518372, 0.038948025554418564, 0.03522862121462822, -0.13118009269237518, 0.08088650554418564, -0.19435498118400574, 0.03574736788868904, 0.1191537082195282, -0.08429282158613205, 0.052640412002801895, 0.12771375477313995, -0.12840406596660614, 0.03900361806154251, 0.03826753795146942, 0.04374406486749649, -0.039075564593076706, 0.019395308569073677, 0.00933300144970417, -0.021677589043974876, -0.11719156801700592, -0.12927356362342834, 0.06891030073165894, -0.07255057990550995, 0.15803246200084686, -0.043637361377477646, -0.06318710744380951, 0.02716391533613205, -0.05997319892048836, 0.09305576235055923, -0.026282401755452156, 0.04433848708868027, 0.20267623662948608, 0.046981945633888245, 0.005474291741847992, -0.09724274277687073, 0.07344987988471985, 0.0035022953525185585, 0.002335904398933053, -0.08806760609149933 ]
null
null
null
# Francesco's Machine Learning Discord BOT
{"tags": ["conversational"]}
text-generation
NASABOI/MachineLearningAI
[ "conversational", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[]
TAGS #conversational #region-us
# Francesco's Machine Learning Discord BOT
[ "# Francesco's Machine Learning Discord BOT" ]
[ "TAGS\n#conversational #region-us \n", "# Francesco's Machine Learning Discord BOT" ]
[ 10, 10 ]
[ "passage: TAGS\n#conversational #region-us \n# Francesco's Machine Learning Discord BOT" ]
[ -0.02237258292734623, 0.09151487797498703, -0.005729248281568289, -0.015935763716697693, 0.10980719327926636, -0.01565719209611416, 0.16362838447093964, 0.033787090331315994, 0.26746779680252075, 0.056418340653181076, 0.0841614305973053, 0.17070388793945312, 0.01781010814011097, 0.15299001336097717, -0.024070963263511658, -0.17765162885189056, 0.11099771410226822, -0.05822264030575752, 0.009720378555357456, 0.040764279663562775, 0.05043935775756836, -0.1381555050611496, 0.06376020610332489, -0.07287361472845078, -0.16538888216018677, 0.04157915711402893, 0.017229264602065086, -0.06556908041238785, 0.1110314130783081, 0.05638689175248146, 0.10318604856729507, -0.0061187162064015865, 0.006286608520895243, -0.0185761209577322, 0.06108499690890312, -0.004630552139133215, -0.030832141637802124, 0.04692302271723747, 0.02062363736331463, -0.0858442559838295, 0.08564361184835434, 0.18910615146160126, 0.02522556483745575, 0.08822359889745712, -0.18571673333644867, 0.09561602026224136, 0.014468223787844181, -0.09382810443639755, 0.03502357378602028, 0.08809895068407059, -0.04072438180446625, 0.1676633358001709, -0.16705383360385895, 0.01283081341534853, 0.008612819015979767, -0.21200096607208252, -0.03890058025717735, 0.0689164400100708, 0.00754055380821228, 0.09217629581689835, -0.043222635984420776, 0.13043998181819916, 0.05607213079929352, -0.047744229435920715, -0.1532953530550003, -0.014322263188660145, 0.037611231207847595, -0.018551811575889587, -0.05201325938105583, -0.038989078253507614, 0.37709441781044006, 0.08528066426515579, 0.049433160573244095, 0.04622526839375496, -0.03742402046918869, 0.0963486060500145, -0.022484702989459038, -0.07316552847623825, -0.10628008842468262, 0.08340471982955933, -0.0018604969372972846, -0.09687631577253342, -0.05999242141842842, -0.0424564965069294, -0.1781044453382492, 0.2733513414859772, -0.02783406712114811, 0.059668004512786865, -0.14264385402202606, -0.059827763587236404, 0.06881418079137802, -0.0037095211446285248, 0.0645885244011879, -0.06163189187645912, -0.02268373779952526, -0.061272770166397095, -0.02755064330995083, -0.008144685998558998, 0.11537092179059982, 0.1380944401025772, -0.002326029120013118, -0.028191201388835907, -0.002509837271645665, 0.1123911514878273, 0.14199845492839813, 0.02177382819354534, -0.12970878183841705, 0.029107948765158653, 0.009437303990125656, -0.050037503242492676, -0.0063539971597492695, -0.11377700418233871, -0.16058744490146637, 0.0807688906788826, -0.09333989769220352, 0.08930298686027527, 0.07552333921194077, 0.03219866380095482, -0.12147247791290283, -0.017289716750383377, -0.017097629606723785, -0.005266851279884577, -0.010462897829711437, 0.05563954636454582, -0.13499833643436432, -0.006038641091436148, -0.18155892193317413, 0.012213405221700668, 0.01479922141879797, -0.06958964467048645, -0.09026650339365005, -0.05189599469304085, -0.06693917512893677, -0.03877626359462738, 0.07551535218954086, 0.1218116357922554, 0.0709930956363678, -0.16334104537963867, 0.06855365633964539, -0.004571528639644384, -0.05603119358420372, -0.10792168229818344, 0.057111334055662155, -0.08961399644613266, 0.06061903014779091, -0.05599161982536316, -0.029677487909793854, -0.049522608518600464, -0.0768401250243187, 0.0935090109705925, -0.09361952543258667, 0.13562238216400146, -0.19777780771255493, -0.003005407517775893, -0.05773645639419556, -0.03214746341109276, -0.15887875854969025, 0.04026840999722481, -0.07180836796760559, 0.137517049908638, -0.04578527435660362, 0.12744678556919098, -0.15298087894916534, 
0.06581225991249084, 0.018341680988669395, 0.18975888192653656, -0.17821715772151947, -0.06218009069561958, 0.11904511600732803, -0.03447791561484337, -0.07872498035430908, 0.18022584915161133, 0.001466569839976728, 0.08675339818000793, 0.12351187318563461, 0.24947254359722137, -0.29956892132759094, -0.021589813753962517, -0.020862197503447533, 0.08979177474975586, -0.18827848136425018, 0.057878997176885605, 0.02144557796418667, -0.01331817451864481, -0.12452664226293564, -0.016949379816651344, 0.1396745890378952, 0.059642817825078964, -0.1245414987206459, -0.06306184083223343, 0.0904187560081482, -0.10643360763788223, 0.046876806765794754, 0.023265773430466652, 0.05996258929371834, -0.07037873566150665, -0.027311110869050026, -0.1525157243013382, 0.05683017149567604, 0.052345383912324905, -0.05044218525290489, -0.2249429076910019, 0.22632800042629242, 0.024422893300652504, 0.07156047224998474, -0.17212910950183868, -0.0980997085571289, 0.020357050001621246, 0.08363181352615356, 0.1502123922109604, 0.12181925028562546, 0.006955283228307962, -0.06847770512104034, 0.04115142300724983, 0.050511885434389114, 0.06381469964981079, 0.05283402279019356, -0.022631758823990822, -0.1489512026309967, 0.08049404621124268, -0.08053722232580185, 0.03399557247757912, -0.04628932103514671, -0.03832721710205078, 0.16062307357788086, 0.07684411853551865, -0.0053809829987585545, 0.036253463476896286, 0.016286252066493034, -0.006121769547462463, -0.03303716331720352, -0.0045892633497715, 0.12684911489486694, -0.06422478705644608, -0.08650868386030197, 0.07020902633666992, -0.0928005650639534, 0.09568669646978378, 0.21675002574920654, -0.25175461173057556, -0.02751050889492035, 0.04245080426335335, 0.0011772706639021635, 0.02507679909467697, 0.053362879902124405, -0.11174038797616959, 0.08953963965177536, -0.028804009780287743, 0.02911691926419735, -0.02687176503241062, 0.08886343985795975, -0.019415445625782013, 0.0015813143691048026, -0.12989304959774017, 0.07724294811487198, 0.13119398057460785, -0.17665044963359833, 0.11955101042985916, 0.226864293217659, 0.07903286814689636, 0.24586577713489532, 0.049658969044685364, 0.02155725099146366, -0.009837821125984192, 0.036707665771245956, -0.05973726138472557, 0.1079622134566307, -0.3064652979373932, -0.021459778770804405, -0.016203036531805992, -0.04691709205508232, 0.08224958926439285, -0.08539015799760818, -0.07831314951181412, -0.10169079154729843, 0.004923159722238779, 0.012791979126632214, 0.0417979471385479, -0.039373598992824554, 0.16215546429157257, 0.08016174286603928, -0.21019291877746582, 0.04658594727516174, -0.009149154648184776, -0.002154714660719037, 0.0741344690322876, -0.12073671817779541, -0.27876710891723633, 0.015318368561565876, -0.031190359964966774, -0.04943171143531799, 0.09898704290390015, -0.0494050495326519, -0.1452825963497162, 0.006000373046845198, 0.06250505894422531, 0.13667063415050507, -0.018768370151519775, -0.04876148700714111, 0.09803225845098495, 0.07168128341436386, -0.07142277806997299, -0.06554866582155228, -0.03476836159825325, -0.13020922243595123, -0.10845830291509628, 0.011633053421974182, -0.1644788384437561, 0.1549340933561325, 0.19930367171764374, 0.054229408502578735, 0.07032231241464615, 0.013720150105655193, 0.21450923383235931, -0.13781534135341644, -0.03427007049322128, 0.04466773197054863, -0.009372644126415253, -0.029834697023034096, 0.17681299149990082, 0.05586720630526543, -0.1769135594367981, 0.003181794658303261, -0.004221020266413689, -0.0951564610004425, -0.14376354217529297, -0.19281047582626343, 
-0.04053448513150215, 0.17114560306072235, 0.04647913947701454, 0.034615833312273026, 0.051703453063964844, -0.03164863958954811, 0.028430307283997536, -0.17171627283096313, 0.07496967166662216, -0.011199962347745895, -0.032004255801439285, -0.16788046061992645, 0.072367362678051, -0.02464037947356701, -0.07404804974794388, 0.1461622565984726, 0.07516198605298996, -0.022619791328907013, 0.21003921329975128, 0.1504082828760147, 0.031400635838508606, 0.022562211379408836, 0.0926271453499794, 0.03909283131361008, 0.07918709516525269, -0.08840346336364746, -0.03395417705178261, -0.002661261009052396, -0.07433969527482986, 0.049775246530771255, 0.2159448117017746, -0.12936361134052277, -0.03558419644832611, -0.06788751482963562, 0.12938034534454346, 0.037772003561258316, 0.09236844629049301, -0.10598018020391464, -0.0637018010020256, 0.06929991394281387, 0.013624089770019054, -0.04483747482299805, 0.12719757854938507, 0.19288545846939087, -0.06094830110669136, -0.10718784481287003, 0.08165010064840317, 0.1104045882821083, -0.14251714944839478, 0.049578651785850525, -0.2190566062927246, -0.027641506865620613, 0.004167713224887848, 0.08237277716398239, -0.2073686569929123, 0.2221543788909912, 0.027623167261481285, -0.00722710182890296, -0.11485651135444641, -0.089356929063797, 0.04509986937046051, 0.09291215986013412, 0.14457465708255768, -0.00019848936062771827, -0.17598409950733185, -0.2368687242269516, -0.12611380219459534, 0.0008705767686478794, 0.0925622507929802, -0.07486822456121445, -0.12281591445207596, 0.027735328301787376, 0.030481934547424316, -0.008283711969852448, -0.023701004683971405, -0.07378555089235306, -0.12979213893413544, -0.006956772413104773, 0.11538157612085342, 0.07235042005777359, 0.044977542012929916, -0.017902329564094543, 0.05168748274445534, 0.06454843282699585, 0.011615234427154064, 0.10938432067632675, -0.028834959492087364, 0.10933423042297363, -0.02982020191848278, -0.03694632649421692, -0.0689166709780693, -0.05095071718096733, 0.043853748589754105, -0.09733355045318604, -0.08476609736680984, 0.15935948491096497, -0.1093941330909729, 0.029850421473383904, -0.08399488776922226, 0.10774131864309311, 0.11623046547174454, 0.04620368406176567, 0.08925185352563858, -0.061018358916044235, 0.02042251266539097, -0.05236886069178581, 0.11224677413702011, 0.010839268565177917, -0.006661420222371817, 0.09843767434358597, -0.0728936418890953, -0.18764102458953857, -0.06978709250688553, -0.0210912823677063, 0.2280774861574173, 0.09186363965272903, -0.038579028099775314, 0.049269407987594604, 0.24283987283706665, 0.021540865302085876, -0.35649585723876953, -0.027911914512515068, -0.17351746559143066, -0.06494546681642532, -0.054090384393930435, -0.17350459098815918, 0.06498795002698898, 0.024309949949383736, -0.017624512314796448, 0.02261994779109955, -0.28113725781440735, -0.06412392854690552, 0.2044854611158371, 0.03384951874613762, 0.5608956217765808, -0.016481252387166023, -0.05531802773475647, -0.03869791328907013, 0.09149504452943802, 0.2005818635225296, 0.025179753080010414, 0.0675162598490715, 0.12884922325611115, 0.04413476586341858, 0.03485649451613426, 0.07265114039182663, 0.14326342940330505, 0.021938247606158257, -0.03504728153347969, -0.09559136629104614, -0.27351635694503784, 0.03985864296555519, 0.09256664663553238, -0.08155694603919983, 0.12170831114053726, -0.08630004525184631, -0.062605120241642, -0.023504098877310753, -0.0649387463927269, 0.09441801905632019, 0.03560975566506386, 0.0005507469177246094, -0.02854032814502716, 0.04910953715443611, 
-0.08999960869550705, 0.1013282909989357, 0.1330609917640686, -0.09392549842596054, 0.08900206536054611, 0.04392484948039055, 0.12330600619316101, -0.227198526263237, 0.17383867502212524, 0.031149664893746376, -0.06579405814409256, 0.1284446269273758, 0.013574190437793732, 0.04814901947975159, 0.0683092549443245, -0.05564847216010094, 0.06094653531908989, 0.05918791890144348, -0.08242690563201904, 0.04590592905879021, 0.12002798169851303, -0.1878262758255005, -0.2755700647830963, 0.01595366932451725, 0.11906936764717102, 0.20321673154830933, 0.04798172041773796, 0.08270479738712311, 0.07989870756864548, -0.026773741468787193, 0.0073516834527254105, 0.01936202310025692, -0.05998026207089424, 0.052923426032066345, 0.004888451658189297, -0.019939018413424492, -0.1427353173494339, 0.13816902041435242, 0.07221052795648575, -0.20689110457897186, 0.025114789605140686, 0.01416968833655119, -0.06373593956232071, -0.11924466490745544, -0.19720125198364258, 0.12732669711112976, -0.013746832497417927, -0.03850217163562775, -0.035509686917066574, -0.0871269628405571, 0.031866345554590225, 0.11067835241556168, 0.05736386403441429, 0.045486513525247574, -0.04954991862177849, -0.013555505312979221, 0.057412561029195786, -0.005535043776035309, -0.018860889598727226, -0.1187785193324089, 0.06772086769342422, -0.03895369544625282, 0.047936588525772095, 0.08156058937311172, -0.10593857616186142, -0.160733163356781, -0.24120192229747772, 0.08947231620550156, -0.07792139053344727, -0.13812850415706635, -0.0899653211236, -0.05491049587726593, -0.018732966855168343, -0.08594384044408798, -0.06425078958272934, -0.07190381735563278, -0.11737188696861267, 0.04655320569872856, 0.07912122458219528, 0.07254355400800705, -0.04588519036769867, 0.03822450712323189, 0.05336388945579529, -0.0011679647723212838, 0.15844272077083588, 0.16362863779067993, -0.002452082931995392, 0.07385335862636566, -0.20291519165039062, -0.07974102348089218, 0.03480910137295723, 0.02293400466442108, 0.09524816274642944, 0.057298481464385986, -0.021601781249046326, 0.042239297181367874, 0.03933503106236458, 0.025378426536917686, 0.11867764592170715, -0.03628053143620491, 0.11269281059503555, 0.046861592680215836, -0.17811419069766998, -0.004704511258751154, 0.014089862816035748, 0.19533175230026245, -0.031348083168268204, 0.018306897953152657, 0.030347198247909546, 0.022858470678329468, 0.0493403822183609, 0.08308994024991989, 0.011342714540660381, -0.15472431480884552, 0.015698116272687912, -0.029664093628525734, 0.04342426732182503, -0.05765653774142265, 0.19397582113742828, 0.00033075548708438873, 0.013443063013255596, 0.06561527401208878, 0.14018695056438446, -0.09403214603662491, 0.11869411915540695, 0.011443190276622772, 0.05234379693865776, 0.005636864807456732, -0.11049722880125046, 0.07498007267713547, 0.07511833310127258, 0.1834428459405899, 0.06032097712159157, 0.06019672378897667, 0.046020835638046265, 0.004908725619316101, 0.10320913791656494, 0.015073577873408794, -0.06342733651399612, 0.034560028463602066, -0.1100265309214592, -0.009909874759614468, -0.07807445526123047, 0.012342394329607487, 0.06591705232858658, -0.006962534040212631, -0.005615696310997009, -0.09827137738466263, -0.04733474180102348, -0.11353764683008194, -0.19148778915405273, -0.031297605484724045, -0.10322264581918716, -0.004504622425884008, -0.06719077378511429, -0.01370229572057724, 0.05151377618312836, 0.030649298802018166, -0.11621224880218506, 0.1326170116662979, -0.12124812602996826, -0.030548719689249992, 0.06229892373085022, -0.007078591734170914, 
0.00547066843137145, -0.11811675876379013, -0.10203581303358078, -0.1269793063402176, 0.04966919496655464, -0.04982860013842583, 0.07117358595132828, -0.05248159170150757, -0.018751414492726326, -0.12272854894399643, -0.08332528918981552, -0.06268665194511414, -0.0020985181909054518, -0.11368580907583237, 0.12789936363697052, 0.038060303777456284, -0.01167001947760582, 0.04719705507159233, 0.22354207932949066, 0.03955942019820213, 0.13986819982528687, -0.10736482590436935, 0.02017102763056755, -0.13564057648181915, 0.15708808600902557, -0.05829246714711189, 0.025990575551986694, -0.1384415179491043, 0.27295050024986267, 0.23116670548915863, -0.21036027371883392, -0.013182899914681911, -0.05705374851822853, 0.06235663965344429, -0.05766797438263893, 0.13652341067790985, 0.02417466975748539, 0.1465684026479721, -0.014380973763763905, 0.00620598578825593, -0.016124386340379715, -0.002253339858725667, -0.018939293920993805, 0.055272120982408524, 0.06378570199012756, 0.009609839878976345, -0.13636918365955353, 0.11220882087945938, -0.18226410448551178, -0.06805215030908585, -0.030004074797034264, -0.23684024810791016, -0.08421676605939865, 0.005557464901357889, -0.008888696320354939, 0.03344545140862465, 0.19083857536315918, -0.06544937938451767, -0.07835543155670166, -0.12874503433704376, 0.005919034127146006, -0.24272064864635468, -0.1763111799955368, 0.06543254107236862, -0.057985514402389526, 0.18747873604297638, -0.0352916456758976, 0.04005386307835579, 0.06963127106428146, 0.023315533995628357, -0.03523619845509529, 0.045216482132673264, 0.06545688956975937, -0.024439336732029915, -0.061965953558683395, 0.08546488732099533, -0.027722975239157677, 0.20818603038787842, 0.15429569780826569, -0.16673727333545685, 0.04806853458285332, -0.14465691149234772, -0.0932149663567543, -0.024999767541885376, 0.15190206468105316, -0.128631591796875, 0.1166839599609375, 0.05154331028461456, -0.03053668700158596, -0.013866052031517029, 0.051725927740335464, 0.013900195248425007, 0.011175881139934063, -0.028757648542523384, -0.1687876135110855, -0.20226828753948212, -0.05906552076339722, -0.09593532234430313, -0.0072480752132833, -0.04050620272755623, 0.0202884953469038, -0.1376928687095642, 0.08399608731269836, -0.013865080662071705, 0.04733694717288017, 0.1351247876882553, -0.008929871022701263, -0.01105498243123293, -0.09547009319067001, 0.059717368334531784, 0.043219853192567825, -0.0012450801441445947, -0.08169151097536087 ]
null
null
transformers
## DeBERTa: Decoding-enhanced BERT with Disentangled Attention

[DeBERTa](https://arxiv.org/abs/2006.03654) improves the BERT and RoBERTa models using disentangled attention and an enhanced mask decoder. It outperforms BERT and RoBERTa on the majority of NLU tasks with 80GB of training data.

Please check the [official repository](https://github.com/microsoft/DeBERTa) for more details and updates.

This is the DeBERTa V2 xxlarge model with 48 layers and a hidden size of 1536. It has 1.5B parameters in total and was trained on 160GB of raw data.

### Fine-tuning on NLU tasks

We present the dev results on SQuAD 1.1/2.0 and several GLUE benchmark tasks.

| Model | SQuAD 1.1 | SQuAD 2.0 | MNLI-m/mm | SST-2 | QNLI | CoLA | RTE | MRPC | QQP | STS-B |
|---------------------------|-----------|-----------|-------------|-------|------|------|--------|-------|-------|------|
| | F1/EM | F1/EM | Acc | Acc | Acc | MCC | Acc | Acc/F1 | Acc/F1 | P/S |
| BERT-Large | 90.9/84.1 | 81.8/79.0 | 86.6/- | 93.2 | 92.3 | 60.6 | 70.4 | 88.0/- | 91.3/- | 90.0/- |
| RoBERTa-Large | 94.6/88.9 | 89.4/86.5 | 90.2/- | 96.4 | 93.9 | 68.0 | 86.6 | 90.9/- | 92.2/- | 92.4/- |
| XLNet-Large | 95.1/89.7 | 90.6/87.9 | 90.8/- | 97.0 | 94.9 | 69.0 | 85.9 | 90.8/- | 92.3/- | 92.5/- |
| [DeBERTa-Large](https://huggingface.co/microsoft/deberta-large)<sup>1</sup> | 95.5/90.1 | 90.7/88.0 | 91.3/91.1 | 96.5 | 95.3 | 69.5 | 91.0 | 92.6/94.6 | 92.3/- | 92.8/92.5 |
| [DeBERTa-XLarge](https://huggingface.co/microsoft/deberta-xlarge)<sup>1</sup> | -/- | -/- | 91.5/91.2 | 97.0 | - | - | 93.1 | 92.1/94.3 | - | 92.9/92.7 |
| [DeBERTa-V2-XLarge](https://huggingface.co/microsoft/deberta-v2-xlarge)<sup>1</sup> | 95.8/90.8 | 91.4/88.9 | 91.7/91.6 | **97.5** | 95.8 | 71.1 | **93.9** | 92.0/94.2 | 92.3/89.8 | 92.9/92.9 |
| **[DeBERTa-V2-XXLarge](https://huggingface.co/microsoft/deberta-v2-xxlarge)<sup>1,2</sup>** | **96.1/91.4** | **92.2/89.7** | **91.7/91.9** | 97.2 | **96.0** | **72.0** | 93.5 | **93.1/94.9** | **92.7/90.3** | **93.2/93.1** |

--------

#### Notes.

- <sup>1</sup> Following RoBERTa, for RTE, MRPC, and STS-B we fine-tune the tasks based on [DeBERTa-Large-MNLI](https://huggingface.co/microsoft/deberta-large-mnli), [DeBERTa-XLarge-MNLI](https://huggingface.co/microsoft/deberta-xlarge-mnli), [DeBERTa-V2-XLarge-MNLI](https://huggingface.co/microsoft/deberta-v2-xlarge-mnli), [DeBERTa-V2-XXLarge-MNLI](https://huggingface.co/microsoft/deberta-v2-xxlarge-mnli). The results of SST-2/QQP/QNLI/SQuADv2 will also be slightly improved when starting from MNLI fine-tuned models; however, we only report the numbers fine-tuned from pretrained base models for those 4 tasks.
- <sup>2</sup> To try the **XXLarge** model with **[HF transformers](https://huggingface.co/transformers/main_classes/trainer.html)**, we recommend using **deepspeed** as it is faster and saves memory.
Run with `Deepspeed`,

```bash
pip install datasets
pip install deepspeed

# Download the deepspeed config file
wget https://huggingface.co/microsoft/deberta-v2-xxlarge/resolve/main/ds_config.json -O ds_config.json

export TASK_NAME=mnli
output_dir="ds_results"
num_gpus=8
batch_size=8
python -m torch.distributed.launch --nproc_per_node=${num_gpus} \
  run_glue.py \
  --model_name_or_path microsoft/deberta-v2-xxlarge \
  --task_name $TASK_NAME \
  --do_train \
  --do_eval \
  --max_seq_length 256 \
  --per_device_train_batch_size ${batch_size} \
  --learning_rate 3e-6 \
  --num_train_epochs 3 \
  --output_dir $output_dir \
  --overwrite_output_dir \
  --logging_steps 10 \
  --logging_dir $output_dir \
  --deepspeed ds_config.json
```

You can also run with `--sharded_ddp`

```bash
cd transformers/examples/text-classification/
export TASK_NAME=mnli
python -m torch.distributed.launch --nproc_per_node=8 run_glue.py --model_name_or_path microsoft/deberta-v2-xxlarge \
  --task_name $TASK_NAME --do_train --do_eval --max_seq_length 256 --per_device_train_batch_size 8 \
  --learning_rate 3e-6 --num_train_epochs 3 --output_dir /tmp/$TASK_NAME/ --overwrite_output_dir --sharded_ddp --fp16
```

### Citation

If you find DeBERTa useful for your work, please cite the following paper:

```latex
@inproceedings{he2021deberta,
  title={DEBERTA: DECODING-ENHANCED BERT WITH DISENTANGLED ATTENTION},
  author={Pengcheng He and Xiaodong Liu and Jianfeng Gao and Weizhu Chen},
  booktitle={International Conference on Learning Representations},
  year={2021},
  url={https://openreview.net/forum?id=XPZIaotutsD}
}
```
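### Zero-shot inference (sketch)

This record is tagged for the zero-shot-classification pipeline, but the card above only covers fine-tuning. Below is a minimal, hedged sketch of how an MNLI-style DeBERTa checkpoint is typically queried through the `transformers` zero-shot pipeline. The model id is taken from this record (NDugar/1epochv3); it is assumed, not verified here, that the checkpoint carries the entailment-style label mapping the pipeline expects.

```python
from transformers import pipeline

# Assumed repo id from this record; the zero-shot pipeline relies on the
# checkpoint exposing NLI (entailment/neutral/contradiction) labels.
classifier = pipeline("zero-shot-classification", model="NDugar/1epochv3")

result = classifier(
    "one day I will see the world",
    candidate_labels=["travel", "cooking", "dancing"],
)
# The pipeline returns labels sorted by score; print the top prediction.
print(result["labels"][0], round(result["scores"][0], 3))
```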
{"language": "en", "license": "mit", "tags": ["deberta-v3", "deberta-v2`", "deberta-mnli"], "tasks": "mnli", "thumbnail": "https://huggingface.co/front/thumbnails/microsoft.png", "pipeline_tag": "zero-shot-classification"}
zero-shot-classification
NDugar/1epochv3
[ "transformers", "pytorch", "deberta-v2", "text-classification", "deberta-v3", "deberta-v2`", "deberta-mnli", "zero-shot-classification", "en", "arxiv:2006.03654", "license:mit", "autotrain_compatible", "endpoints_compatible", "region:us" ]
2022-03-02T23:29:04+00:00
[ "2006.03654" ]
[ "en" ]
TAGS #transformers #pytorch #deberta-v2 #text-classification #deberta-v3 #deberta-v2` #deberta-mnli #zero-shot-classification #en #arxiv-2006.03654 #license-mit #autotrain_compatible #endpoints_compatible #region-us
DeBERTa: Decoding-enhanced BERT with Disentangled Attention ----------------------------------------------------------- DeBERTa improves the BERT and RoBERTa models using disentangled attention and an enhanced mask decoder. It outperforms BERT and RoBERTa on the majority of NLU tasks with 80GB of training data. Please check the official repository for more details and updates. This is the DeBERTa V2 xxlarge model with 48 layers, 1536 hidden size. The total parameters are 1.5B and it is trained with 160GB raw data. ### Fine-tuning on NLU tasks We present the dev results on SQuAD 1.1/2.0 and several GLUE benchmark tasks. --- #### Notes. * 1 Following RoBERTa, for RTE, MRPC, STS-B, we fine-tune the tasks based on DeBERTa-Large-MNLI, DeBERTa-XLarge-MNLI, DeBERTa-V2-XLarge-MNLI, DeBERTa-V2-XXLarge-MNLI. The results of SST-2/QQP/QNLI/SQuADv2 will also be slightly improved when starting from MNLI fine-tuned models; however, we only report the numbers fine-tuned from pretrained base models for those 4 tasks. * 2 To try the XXLarge model with HF transformers, we recommend using deepspeed as it's faster and saves memory. Run with 'Deepspeed', You can also run with '--sharded\_ddp' If you find DeBERTa useful for your work, please cite the following paper:
[ "### Fine-tuning on NLU tasks\n\n\nWe present the dev results on SQuAD 1.1/2.0 and several GLUE benchmark tasks.\n\n\n\n\n\n---", "#### Notes.\n\n\n* 1 Following RoBERTa, for RTE, MRPC, STS-B, we fine-tune the tasks based on DeBERTa-Large-MNLI, DeBERTa-XLarge-MNLI, DeBERTa-V2-XLarge-MNLI, DeBERTa-V2-XXLarge-MNLI. The results of SST-2/QQP/QNLI/SQuADv2 will also be slightly improved when start from MNLI fine-tuned models, however, we only report the numbers fine-tuned from pretrained base models for those 4 tasks.\n* 2 To try the XXLarge model with HF transformers, we recommand using deepspeed as it's faster and saves memory.\n\n\nRun with 'Deepspeed',\n\n\nYou can also run with '--sharded\\_ddp'\n\n\nIf you find DeBERTa useful for your work, please cite the following paper:" ]
[ "TAGS\n#transformers #pytorch #deberta-v2 #text-classification #deberta-v3 #deberta-v2` #deberta-mnli #zero-shot-classification #en #arxiv-2006.03654 #license-mit #autotrain_compatible #endpoints_compatible #region-us \n", "### Fine-tuning on NLU tasks\n\n\nWe present the dev results on SQuAD 1.1/2.0 and several GLUE benchmark tasks.\n\n\n\n\n\n---", "#### Notes.\n\n\n* 1 Following RoBERTa, for RTE, MRPC, STS-B, we fine-tune the tasks based on DeBERTa-Large-MNLI, DeBERTa-XLarge-MNLI, DeBERTa-V2-XLarge-MNLI, DeBERTa-V2-XXLarge-MNLI. The results of SST-2/QQP/QNLI/SQuADv2 will also be slightly improved when start from MNLI fine-tuned models, however, we only report the numbers fine-tuned from pretrained base models for those 4 tasks.\n* 2 To try the XXLarge model with HF transformers, we recommand using deepspeed as it's faster and saves memory.\n\n\nRun with 'Deepspeed',\n\n\nYou can also run with '--sharded\\_ddp'\n\n\nIf you find DeBERTa useful for your work, please cite the following paper:" ]
[ 87, 32, 215 ]
[ "passage: TAGS\n#transformers #pytorch #deberta-v2 #text-classification #deberta-v3 #deberta-v2` #deberta-mnli #zero-shot-classification #en #arxiv-2006.03654 #license-mit #autotrain_compatible #endpoints_compatible #region-us \n### Fine-tuning on NLU tasks\n\n\nWe present the dev results on SQuAD 1.1/2.0 and several GLUE benchmark tasks.\n\n\n\n\n\n---#### Notes.\n\n\n* 1 Following RoBERTa, for RTE, MRPC, STS-B, we fine-tune the tasks based on DeBERTa-Large-MNLI, DeBERTa-XLarge-MNLI, DeBERTa-V2-XLarge-MNLI, DeBERTa-V2-XXLarge-MNLI. The results of SST-2/QQP/QNLI/SQuADv2 will also be slightly improved when start from MNLI fine-tuned models, however, we only report the numbers fine-tuned from pretrained base models for those 4 tasks.\n* 2 To try the XXLarge model with HF transformers, we recommand using deepspeed as it's faster and saves memory.\n\n\nRun with 'Deepspeed',\n\n\nYou can also run with '--sharded\\_ddp'\n\n\nIf you find DeBERTa useful for your work, please cite the following paper:" ]
[ -0.05241292342543602, 0.11625625193119049, -0.004290661308914423, 0.03704634681344032, 0.10461783409118652, 0.04311855509877205, 0.02440350502729416, 0.14597083628177643, -0.0326407290995121, 0.07119511812925339, 0.041633110493421555, 0.054706066846847534, 0.043093517422676086, 0.08738726377487183, 0.003954117186367512, -0.25242874026298523, 0.07505936175584793, -0.029287222772836685, -0.11313261091709137, 0.03122064843773842, 0.09910337626934052, -0.11940285563468933, 0.06363672018051147, -0.022568227723240852, -0.04831758141517639, 0.04378461092710495, 0.0006135303410701454, -0.000733526365365833, 0.10929683595895767, 0.08638615906238556, 0.04970165714621544, 0.12570685148239136, 0.06911695003509521, -0.18629753589630127, 0.035372961312532425, 0.08342937380075455, 0.024787431582808495, 0.06046098843216896, 0.04166822135448456, 0.034362152218818665, 0.082123763859272, -0.12192820012569427, -0.0012567057274281979, 0.04668266698718071, -0.014703585766255856, -0.18785467743873596, -0.08851271122694016, 0.010912719182670116, 0.06809866428375244, 0.026018789038062096, -0.007647804915904999, 0.04403293877840042, -0.0634857788681984, 0.04492665082216263, 0.2033262997865677, -0.24006621539592743, 0.005089050158858299, 0.09652864933013916, -0.05766617879271507, -0.0021894872188568115, -0.05889365077018738, 0.06054612621665001, -0.03787076845765114, -0.004404023289680481, 0.0953960046172142, -0.007863149046897888, -0.027227632701396942, 0.0461537167429924, -0.09998410940170288, -0.028319157660007477, -0.017032254487276077, -0.021815486252307892, -0.019361726939678192, -0.12552063167095184, -0.13545437157154083, -0.09394071251153946, -0.09466221928596497, -0.10201138257980347, 0.0481635145843029, -0.034098174422979355, 0.0027645486406981945, -0.12905673682689667, -0.0781635195016861, -0.040702249854803085, -0.012745361775159836, 0.05975765734910965, 0.09084657579660416, -0.01573737896978855, 0.01888619363307953, 0.12221407145261765, -0.05866345763206482, -0.09633360803127289, -0.06893252581357956, -0.07345526665449142, -0.10827293992042542, 0.029612606391310692, -0.06550871580839157, -0.13967692852020264, 0.036477793008089066, 0.1664232611656189, 0.10732275247573853, 0.11743797361850739, 0.004422088153660297, -0.026239631697535515, -0.052678272128105164, 0.17081505060195923, -0.09168440103530884, -0.04946804791688919, 0.13098061084747314, 0.014130712486803532, 0.006833386141806841, -0.028710070997476578, -0.056183066219091415, -0.06179269403219223, 0.08802732825279236, 0.09180831909179688, -0.04942389577627182, 0.0682508647441864, 0.04268469288945198, -0.043976616114377975, 0.11267323791980743, -0.15951232612133026, -0.036047544330358505, 0.011395269073545933, -0.04261685162782669, 0.026174025610089302, 0.029328111559152603, -0.03301100805401802, -0.08342672884464264, 0.08510751277208328, -0.04280565679073334, -0.07014468312263489, -0.07203925400972366, -0.08885645121335983, -0.002707223640754819, 0.026051171123981476, -0.04565635696053505, -0.14187489449977875, -0.17441733181476593, 0.030235065147280693, 0.028298823162913322, -0.014657210558652878, 0.04990647733211517, 0.026168597862124443, -0.010564020834863186, -0.018691079691052437, -0.02134881541132927, 0.027627741917967796, -0.024837426841259003, 0.0335579439997673, 0.03671235218644142, 0.07314646989107132, -0.02218751236796379, 0.02800191007554531, -0.034974511712789536, 0.030307423323392868, -0.23950177431106567, 0.08978857100009918, -0.100782111287117, -0.08487722277641296, -0.08426533639431, -0.05481399595737457, -0.03245139867067337, 
0.0033765919506549835, 0.028284341096878052, 0.08222424238920212, -0.1274300068616867, -0.06718092411756516, 0.14027553796768188, -0.11157736927270889, -0.02299368381500244, 0.09707306325435638, -0.0022017641458660364, -0.055710747838020325, 0.0570623055100441, 0.09115156531333923, 0.25128817558288574, -0.14172594249248505, -0.0486244335770607, 0.013722315430641174, -0.008094100281596184, -0.013274040073156357, 0.023913394659757614, 0.0699276551604271, 0.10617539286613464, 0.059531647711992264, -0.11182842403650284, -0.020532988011837006, -0.022604839876294136, -0.0500359833240509, -0.03825552761554718, -0.014430549927055836, 0.0602373443543911, -0.035922009497880936, 0.048536550253629684, 0.04468540474772453, -0.08056991547346115, 0.06431268900632858, 0.14005854725837708, -0.08971033990383148, 0.04085931554436684, -0.13266025483608246, 0.029658399522304535, -0.010808725841343403, 0.05502389371395111, -0.14949937164783478, -0.17315195500850677, 0.028714269399642944, -0.20101559162139893, -0.014984328299760818, 0.16706711053848267, 0.06520620733499527, 0.0769575759768486, -0.07139018177986145, 0.03544917330145836, -0.0465189628303051, -0.026657596230506897, -0.012762212194502354, -0.039144936949014664, -0.0904446691274643, -0.003422896610572934, 0.1434527039527893, -0.07098592817783356, 0.01785021647810936, 0.12045729160308838, 0.15926221013069153, 0.04081423953175545, -0.030288774520158768, -0.021402545273303986, -0.0352913998067379, 0.035939160734415054, -0.07606267929077148, -0.0013733342057093978, -0.0005944598815403879, -0.050455402582883835, 0.08635364472866058, -0.15149633586406708, 0.13202467560768127, 0.08588232100009918, 0.12846827507019043, 0.041794151067733765, 0.010709304362535477, -0.03575313836336136, -0.029423724859952927, -0.04201208055019379, -0.06960401684045792, 0.051100172102451324, 0.07274112105369568, 0.14544865489006042, -0.06951829791069031, -0.05518389493227005, 0.00011524007277330384, 0.05315240100026131, -0.013236815109848976, 0.00699988566339016, 0.024189092218875885, -0.08823025971651077, 0.024960024282336235, 0.024626167491078377, -0.005524864420294762, 0.13287346065044403, -0.029232831671833992, -0.09008967876434326, -0.001253121648915112, -0.046175241470336914, -0.013912647031247616, 0.039902493357658386, 0.07383694499731064, 0.0639260783791542, 0.04250863939523697, 0.023993737995624542, 0.08118695765733719, -0.09342873096466064, 0.07322639971971512, 0.04650180786848068, -0.08427714556455612, 0.030391348525881767, 0.08757985383272171, 0.025929659605026245, 0.03123314678668976, 0.00714842090383172, 0.07555853575468063, -0.00331905041821301, -0.026500098407268524, -0.03688663989305496, 0.12997634708881378, -0.0822547972202301, -0.23660308122634888, -0.18573130667209625, -0.03791981562972069, -0.08769375085830688, -0.005737293511629105, 0.04672124981880188, -0.04999380186200142, -0.14164356887340546, -0.03944524750113487, 0.10956963896751404, 0.04122653976082802, 0.003083166666328907, -0.0214821957051754, 0.004241902381181717, 0.07114008814096451, -0.14301002025604248, 0.008071999996900558, -0.001016158377751708, -0.052957240492105484, 0.025252217426896095, 0.0748182088136673, 0.04050320014357567, 0.07052327692508698, -0.03639766946434975, 0.02550515905022621, -0.004302534740418196, 0.1658281534910202, -0.012941831722855568, 0.04620959237217903, 0.21517708897590637, -0.007216479163616896, 0.06316639482975006, 0.05751804634928703, 0.014120690524578094, -0.07226750999689102, 0.021344417706131935, 0.09521719068288803, -0.019487185403704643, -0.19240689277648926, 
-0.06886369735002518, -0.06324194371700287, -0.054118406027555466, 0.02421768195927143, 0.044022656977176666, -0.004982439801096916, -0.0016912405844777822, -0.06946996599435806, 0.040119536221027374, 0.017158862203359604, 0.024831630289554596, 0.22673776745796204, 0.006654110737144947, 0.07586584240198135, -0.0724959746003151, -0.030478915199637413, 0.05292810499668121, 0.006225659511983395, 0.10337235033512115, -0.07113487273454666, 0.17548975348472595, 0.042972128838300705, 0.10130257904529572, 0.039969056844711304, 0.08176793158054352, 0.004960310645401478, -0.022856149822473526, -0.0067467233166098595, -0.07481648027896881, -0.11259543895721436, -0.006386597640812397, 0.023220716044306755, -0.08257874101400375, -0.05477657541632652, 0.13395044207572937, 0.02214772254228592, 0.20002344250679016, -0.028818124905228615, -0.1599300056695938, -0.0223435927182436, 0.019692517817020416, -0.04360513761639595, -0.04057908430695534, 0.0003680957015603781, 0.07890064269304276, -0.06522704660892487, -0.013948838226497173, -0.09038207679986954, 0.059641867876052856, -0.11011307686567307, 0.06691239774227142, 0.032354652881622314, 0.143149271607399, 0.05443072319030762, 0.06908782571554184, -0.136577770113945, 0.0795135572552681, 0.04900651425123215, 0.06754480302333832, -0.05291585996747017, 0.04053719341754913, 0.05957832559943199, -0.005011576693505049, 0.07362199574708939, -0.03655879572033882, -0.1409459412097931, -0.11101935803890228, -0.11368174850940704, 0.0739123672246933, 0.15117238461971283, -0.04177320748567581, 0.09509071707725525, -0.09223560988903046, -0.05499028414487839, -0.008637280203402042, 0.10573383420705795, -0.060759831219911575, -0.16425339877605438, 0.12261752784252167, -0.021293271332979202, 0.06473468244075775, -0.07867137342691422, -0.03397493064403534, -0.048652321100234985, 0.1408054679632187, -0.06580232828855515, -0.055161621421575546, -0.15515515208244324, 0.06835756450891495, 0.12346778810024261, -0.11046431213617325, 0.09829434007406235, 0.034361906349658966, 0.20692701637744904, 0.0005366243422031403, -0.17279279232025146, -0.012875732034444809, -0.05714242160320282, -0.12331905215978622, 0.0713687613606453, 0.09545672684907913, -0.06955087929964066, 0.01881030946969986, 0.0003694920160342008, 0.0038044890388846397, 0.0648389384150505, -0.08347233384847641, -0.0572601854801178, 0.0902041345834732, -0.0015367961023002863, -0.03171086311340332, -0.08969924598932266, 0.08469782769680023, -0.013516637496650219, 0.06795331090688705, 0.12755215167999268, 0.2332714945077896, -0.06980971246957779, 0.10268174111843109, 0.11847653985023499, 0.0072484505362808704, -0.30070143938064575, -0.07622402906417847, 0.01508452370762825, 0.034728746861219406, -0.007006390020251274, -0.18646465241909027, 0.10465225577354431, 0.11567004024982452, -0.02884756773710251, -0.05301745980978012, -0.2699161171913147, -0.14559048414230347, 0.11126522719860077, 0.06479524075984955, 0.006330265663564205, -0.026210790500044823, -0.021084869280457497, -0.11830145865678787, -0.14790286123752594, 0.01954120397567749, -0.10122183710336685, 0.12580469250679016, -0.02215210720896721, -0.024180641397833824, 0.02689736895263195, -0.0581243634223938, 0.10030766576528549, -0.039426300674676895, 0.0923372209072113, -0.020662806928157806, 0.1322280317544937, 0.050655264407396317, -0.07646145671606064, 0.1026589423418045, 0.004817130975425243, 0.07517361640930176, -0.04937495291233063, -0.048940420150756836, -0.011841831728816032, 0.002896582707762718, 0.001836896175518632, -0.05926266685128212, 
-0.029681045562028885, 0.03017173893749714, 0.05639512836933136, -0.048700716346502304, 0.000969441665802151, -0.012328814715147018, -0.060070183128118515, 0.1339639276266098, 0.036595191806554794, -0.05175512656569481, -0.10908308625221252, -0.02379962056875229, 0.01199538353830576, 0.06731636822223663, -0.05083142966032028, 0.1288711279630661, 0.08653184026479721, -0.019205482676625252, 0.07109231501817703, 0.02301650121808052, -0.031241459771990776, -0.04017029330134392, 0.06411214172840118, -0.13149906694889069, -0.22445173561573029, -0.04035969823598862, -0.003964333329349756, -0.04578391835093498, 0.022841298952698708, 0.15573525428771973, -0.04311463236808777, -0.01083876471966505, 0.05421200767159462, 0.005467413924634457, 0.03126690536737442, 0.15760089457035065, 0.041560910642147064, 0.06835300475358963, -0.13382463157176971, 0.12485238164663315, 0.033840361982584, -0.09547188133001328, -0.0032187507022172213, 0.0637090802192688, -0.10742011666297913, -0.06447898596525192, -0.04932399466633797, 0.06657829880714417, 0.023493371903896332, -0.013045123778283596, -0.08447141200304031, -0.058050308376550674, 0.018555717542767525, 0.10908051580190659, 0.029986441135406494, 0.09514385461807251, -0.07890836894512177, 0.016175953671336174, -0.12613677978515625, 0.08410951495170593, 0.017345881089568138, 0.06484183669090271, -0.13420170545578003, 0.07065223157405853, -0.03845332935452461, 0.03517135977745056, -0.05709301680326462, 0.005170234479010105, -0.04392032325267792, -0.048343803733587265, -0.1765640527009964, -0.006047061178833246, 0.004548920318484306, -0.043268267065286636, 0.003539072582498193, 0.024005118757486343, -0.03276915103197098, 0.03686981648206711, -0.08423015475273132, -0.061444416642189026, -0.06776043027639389, 0.015145798213779926, -0.1298491507768631, 0.028867676854133606, 0.025676202028989792, -0.10317113995552063, 0.10611970722675323, 0.027018163353204727, 0.022449053823947906, 0.13244663178920746, 0.039490919560194016, -0.11081844568252563, 0.03625766932964325, 0.08276601135730743, 0.042760513722896576, -0.04711465537548065, -0.0017788584809750319, -0.03314288705587387, -0.04047403857111931, -0.02824929729104042, 0.04140515998005867, -0.12595486640930176, 0.033654842525720596, -0.0985659658908844, 0.059719864279031754, -0.05175250396132469, -0.010809225961565971, 0.06528006494045258, 0.0651686042547226, 0.08419101685285568, -0.0616983100771904, -0.009090735577046871, -0.1340108960866928, -0.005983277689665556, 0.010448146611452103, -0.06587451696395874, -0.06545503437519073, -0.07444304972887039, 0.046268317848443985, -0.040648605674505234, 0.10357742756605148, -0.05032479390501976, -0.04449968785047531, 0.063878133893013, -0.03028818778693676, -0.05879247561097145, 0.02909315750002861, 0.10988312214612961, 0.09321517497301102, 0.009683806449174881, 0.03270789980888367, 0.004037970677018166, 0.005692401435226202, 0.02461782470345497, 0.1947879046201706, 0.18114711344242096, 0.047160565853118896, 0.1009051501750946, 0.0073593854904174805, -0.06058067828416824, -0.09827041625976562, 0.12655390799045563, -0.13597998023033142, 0.10119486600160599, -0.0578971765935421, 0.03981122002005577, 0.11826001852750778, -0.1467830389738083, 0.08328340947628021, -0.006882031448185444, -0.04854379594326019, -0.14936299622058868, -0.02267826534807682, -0.09319871664047241, -0.09190364927053452, -0.0050718337297439575, -0.09833969175815582, 0.021932389587163925, 0.05805574730038643, 0.040705613791942596, -0.002747556194663048, 0.20599813759326935, -0.04898061230778694, 
-0.061674512922763824, 0.007680319249629974, 0.012954996898770332, -0.05927680805325508, 0.08522646874189377, -0.05753780156373978, 0.028485259041190147, 0.05660206452012062, 0.0518532358109951, 0.0010675085941329598, 0.06393739581108093, 0.028470823541283607, -0.06369183957576752, -0.0653093010187149, -0.0034019954036921263, 0.02331715077161789, 0.033972177654504776, 0.06322687119245529, -0.022653568536043167, -0.0592658706009388, -0.03528359532356262, 0.23925799131393433, -0.04668236896395683, -0.12855541706085205, -0.09132948517799377, 0.19149088859558105, 0.15593498945236206, 0.08112460374832153, 0.020433591678738594, -0.10061638802289963, -0.04003565385937691, 0.15064677596092224, 0.12990884482860565, -0.04684830456972122, 0.003852730616927147, 0.03406898304820061, -0.004812746308743954, -0.05363235995173454, 0.15303771197795868, 0.05627741664648056, 0.1625811606645584, -0.020495248958468437, -0.00043971932609565556, 0.03595009446144104, -0.06956958025693893, -0.042895857244729996, 0.21299278736114502, -0.0496150478720665, 0.0058876690454781055, -0.018968680873513222, -0.0211721733212471, -0.004630656447261572, -0.17833876609802246, -0.0374411903321743, -0.06402450054883957, -0.13354623317718506, -0.01777898147702217, -0.009796665981411934, -0.0016883540665730834, 0.05714863911271095, -0.013333170674741268, -0.011401773430407047, 0.1794513463973999, 0.013484508730471134, -0.07516146451234818, -0.020686641335487366, 0.09928201884031296, 0.079424649477005, 0.1758870929479599, 0.02221892774105072, -0.008008033968508244, 0.06656724959611893, -0.023357057943940163, -0.1028418242931366, 0.046502113342285156, 0.03252603858709335, -0.20225293934345245, 0.007131351623684168, 0.18690389394760132, -0.023438340052962303, 0.044114526361227036, -0.004088737536221743, -0.1597050279378891, -0.012284711003303528, 0.028708454221487045, -0.04290628060698509, -0.0068760523572564125, 0.021430158987641335, -0.035139646381139755, 0.08563287556171417, 0.17241866886615753, 0.021064264699816704, 0.058720678091049194, -0.07776331156492233, 0.04090698063373566, 0.03163593262434006, 0.07538473606109619, -0.014276460744440556, -0.1452966332435608, 0.026692945510149002, -0.0302111953496933, -0.02999967709183693, -0.19675295054912567, -0.10110118985176086, 0.03192555531859398, -0.0006416713004000485, -0.030724622309207916, 0.15183618664741516, 0.048725489526987076, 0.04474559798836708, 0.009140466339886189, -0.12921077013015747, -0.04423608258366585, 0.025643398985266685, -0.16402772068977356, -0.0631222277879715 ]
null
null
transformers
## DeBERTa: Decoding-enhanced BERT with Disentangled Attention

[DeBERTa](https://arxiv.org/abs/2006.03654) improves the BERT and RoBERTa models using disentangled attention and an enhanced mask decoder. It outperforms BERT and RoBERTa on the majority of NLU tasks with 80GB training data.

Please check the [official repository](https://github.com/microsoft/DeBERTa) for more details and updates.

This is the DeBERTa V2 xxlarge model with 48 layers and a hidden size of 1536. It has 1.5B parameters in total and is trained with 160GB of raw data.

### Fine-tuning on NLU tasks

We present the dev results on SQuAD 1.1/2.0 and several GLUE benchmark tasks.

| Model | SQuAD 1.1 | SQuAD 2.0 | MNLI-m/mm | SST-2 | QNLI | CoLA | RTE | MRPC | QQP | STS-B |
|---------------------------|-----------|-----------|-----------|-------|------|------|------|-----------|-----------|-----------|
| | F1/EM | F1/EM | Acc | Acc | Acc | MCC | Acc | Acc/F1 | Acc/F1 | P/S |
| BERT-Large | 90.9/84.1 | 81.8/79.0 | 86.6/- | 93.2 | 92.3 | 60.6 | 70.4 | 88.0/- | 91.3/- | 90.0/- |
| RoBERTa-Large | 94.6/88.9 | 89.4/86.5 | 90.2/- | 96.4 | 93.9 | 68.0 | 86.6 | 90.9/- | 92.2/- | 92.4/- |
| XLNet-Large | 95.1/89.7 | 90.6/87.9 | 90.8/- | 97.0 | 94.9 | 69.0 | 85.9 | 90.8/- | 92.3/- | 92.5/- |
| [DeBERTa-Large](https://huggingface.co/microsoft/deberta-large)<sup>1</sup> | 95.5/90.1 | 90.7/88.0 | 91.3/91.1 | 96.5 | 95.3 | 69.5 | 91.0 | 92.6/94.6 | 92.3/- | 92.8/92.5 |
| [DeBERTa-XLarge](https://huggingface.co/microsoft/deberta-xlarge)<sup>1</sup> | -/- | -/- | 91.5/91.2 | 97.0 | - | - | 93.1 | 92.1/94.3 | - | 92.9/92.7 |
| [DeBERTa-V2-XLarge](https://huggingface.co/microsoft/deberta-v2-xlarge)<sup>1</sup> | 95.8/90.8 | 91.4/88.9 | 91.7/91.6 | **97.5** | 95.8 | 71.1 | **93.9** | 92.0/94.2 | 92.3/89.8 | 92.9/92.9 |
| **[DeBERTa-V2-XXLarge](https://huggingface.co/microsoft/deberta-v2-xxlarge)<sup>1,2</sup>** | **96.1/91.4** | **92.2/89.7** | **91.7/91.9** | 97.2 | **96.0** | **72.0** | 93.5 | **93.1/94.9** | **92.7/90.3** | **93.2/93.1** |

--------

#### Notes.

- <sup>1</sup> Following RoBERTa, for RTE, MRPC, and STS-B, we fine-tune the tasks based on [DeBERTa-Large-MNLI](https://huggingface.co/microsoft/deberta-large-mnli), [DeBERTa-XLarge-MNLI](https://huggingface.co/microsoft/deberta-xlarge-mnli), [DeBERTa-V2-XLarge-MNLI](https://huggingface.co/microsoft/deberta-v2-xlarge-mnli), and [DeBERTa-V2-XXLarge-MNLI](https://huggingface.co/microsoft/deberta-v2-xxlarge-mnli). The results of SST-2/QQP/QNLI/SQuADv2 are also slightly improved when starting from MNLI fine-tuned models; however, we only report the numbers fine-tuned from pretrained base models for those 4 tasks.
- <sup>2</sup> To try the **XXLarge** model with **[HF transformers](https://huggingface.co/transformers/main_classes/trainer.html)**, we recommend using **deepspeed**, as it is faster and saves memory.
Run with `Deepspeed`,

```bash
pip install datasets
pip install deepspeed

# Download the deepspeed config file
wget https://huggingface.co/microsoft/deberta-v2-xxlarge/resolve/main/ds_config.json -O ds_config.json

export TASK_NAME=mnli
output_dir="ds_results"
num_gpus=8
batch_size=8
python -m torch.distributed.launch --nproc_per_node=${num_gpus} \
  run_glue.py \
  --model_name_or_path microsoft/deberta-v2-xxlarge \
  --task_name $TASK_NAME \
  --do_train \
  --do_eval \
  --max_seq_length 256 \
  --per_device_train_batch_size ${batch_size} \
  --learning_rate 3e-6 \
  --num_train_epochs 3 \
  --output_dir $output_dir \
  --overwrite_output_dir \
  --logging_steps 10 \
  --logging_dir $output_dir \
  --deepspeed ds_config.json
```

You can also run with `--sharded_ddp`

```bash
cd transformers/examples/text-classification/
export TASK_NAME=mnli
python -m torch.distributed.launch --nproc_per_node=8 run_glue.py --model_name_or_path microsoft/deberta-v2-xxlarge \
  --task_name $TASK_NAME --do_train --do_eval --max_seq_length 256 --per_device_train_batch_size 8 \
  --learning_rate 3e-6 --num_train_epochs 3 --output_dir /tmp/$TASK_NAME/ --overwrite_output_dir --sharded_ddp --fp16
```

### Citation

If you find DeBERTa useful for your work, please cite the following paper:

```latex
@inproceedings{he2021deberta,
  title={DEBERTA: DECODING-ENHANCED BERT WITH DISENTANGLED ATTENTION},
  author={Pengcheng He and Xiaodong Liu and Jianfeng Gao and Weizhu Chen},
  booktitle={International Conference on Learning Representations},
  year={2021},
  url={https://openreview.net/forum?id=XPZIaotutsD}
}
```
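The commands above fetch a ready-made `ds_config.json`. As orientation only, the sketch below writes out a typical ZeRO stage-2 style config of the kind the HF Trainer's DeepSpeed integration accepts; the keys and values here are assumptions for illustration, not the contents of the downloaded file.

```python
import json

# Illustrative ZeRO stage-2 DeepSpeed config (assumed typical settings, NOT the
# ds_config.json referenced above). "auto" lets the HF Trainer fill in values
# that must match its own command-line arguments.
ds_config = {
    "fp16": {"enabled": True},
    "zero_optimization": {
        "stage": 2,
        "overlap_comm": True,
        "contiguous_gradients": True,
    },
    "train_micro_batch_size_per_gpu": "auto",
    "gradient_accumulation_steps": "auto",
    "gradient_clipping": "auto",
}

with open("ds_config.json", "w") as f:
    json.dump(ds_config, f, indent=2)
```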
{"language": "en", "license": "mit", "tags": ["deberta-v3", "deberta-v2`", "deberta-mnli"], "tasks": "mnli", "thumbnail": "https://huggingface.co/front/thumbnails/microsoft.png", "pipeline_tag": "zero-shot-classification"}
zero-shot-classification
NDugar/2epochv3mlni
[ "transformers", "pytorch", "deberta-v2", "text-classification", "deberta-v3", "deberta-v2`", "deberta-mnli", "zero-shot-classification", "en", "arxiv:2006.03654", "license:mit", "autotrain_compatible", "endpoints_compatible", "region:us" ]
2022-03-02T23:29:04+00:00
[ "2006.03654" ]
[ "en" ]
TAGS #transformers #pytorch #deberta-v2 #text-classification #deberta-v3 #deberta-v2` #deberta-mnli #zero-shot-classification #en #arxiv-2006.03654 #license-mit #autotrain_compatible #endpoints_compatible #region-us
DeBERTa: Decoding-enhanced BERT with Disentangled Attention ----------------------------------------------------------- DeBERTa improves the BERT and RoBERTa models using disentangled attention and an enhanced mask decoder. It outperforms BERT and RoBERTa on the majority of NLU tasks with 80GB training data. Please check the official repository for more details and updates. This is the DeBERTa V2 xxlarge model with 48 layers and a hidden size of 1536. It has 1.5B parameters in total and is trained with 160GB of raw data. ### Fine-tuning on NLU tasks We present the dev results on SQuAD 1.1/2.0 and several GLUE benchmark tasks. --- #### Notes. * 1 Following RoBERTa, for RTE, MRPC, and STS-B, we fine-tune the tasks based on DeBERTa-Large-MNLI, DeBERTa-XLarge-MNLI, DeBERTa-V2-XLarge-MNLI, and DeBERTa-V2-XXLarge-MNLI. The results of SST-2/QQP/QNLI/SQuADv2 are also slightly improved when starting from MNLI fine-tuned models; however, we only report the numbers fine-tuned from pretrained base models for those 4 tasks. * 2 To try the XXLarge model with HF transformers, we recommend using deepspeed, as it is faster and saves memory. Run with 'Deepspeed', You can also run with '--sharded\_ddp' If you find DeBERTa useful for your work, please cite the following paper:
[ "### Fine-tuning on NLU tasks\n\n\nWe present the dev results on SQuAD 1.1/2.0 and several GLUE benchmark tasks.\n\n\n\n\n\n---", "#### Notes.\n\n\n* 1 Following RoBERTa, for RTE, MRPC, STS-B, we fine-tune the tasks based on DeBERTa-Large-MNLI, DeBERTa-XLarge-MNLI, DeBERTa-V2-XLarge-MNLI, DeBERTa-V2-XXLarge-MNLI. The results of SST-2/QQP/QNLI/SQuADv2 will also be slightly improved when start from MNLI fine-tuned models, however, we only report the numbers fine-tuned from pretrained base models for those 4 tasks.\n* 2 To try the XXLarge model with HF transformers, we recommand using deepspeed as it's faster and saves memory.\n\n\nRun with 'Deepspeed',\n\n\nYou can also run with '--sharded\\_ddp'\n\n\nIf you find DeBERTa useful for your work, please cite the following paper:" ]
[ "TAGS\n#transformers #pytorch #deberta-v2 #text-classification #deberta-v3 #deberta-v2` #deberta-mnli #zero-shot-classification #en #arxiv-2006.03654 #license-mit #autotrain_compatible #endpoints_compatible #region-us \n", "### Fine-tuning on NLU tasks\n\n\nWe present the dev results on SQuAD 1.1/2.0 and several GLUE benchmark tasks.\n\n\n\n\n\n---", "#### Notes.\n\n\n* 1 Following RoBERTa, for RTE, MRPC, STS-B, we fine-tune the tasks based on DeBERTa-Large-MNLI, DeBERTa-XLarge-MNLI, DeBERTa-V2-XLarge-MNLI, DeBERTa-V2-XXLarge-MNLI. The results of SST-2/QQP/QNLI/SQuADv2 will also be slightly improved when start from MNLI fine-tuned models, however, we only report the numbers fine-tuned from pretrained base models for those 4 tasks.\n* 2 To try the XXLarge model with HF transformers, we recommand using deepspeed as it's faster and saves memory.\n\n\nRun with 'Deepspeed',\n\n\nYou can also run with '--sharded\\_ddp'\n\n\nIf you find DeBERTa useful for your work, please cite the following paper:" ]
[ 87, 32, 215 ]
[ "passage: TAGS\n#transformers #pytorch #deberta-v2 #text-classification #deberta-v3 #deberta-v2` #deberta-mnli #zero-shot-classification #en #arxiv-2006.03654 #license-mit #autotrain_compatible #endpoints_compatible #region-us \n### Fine-tuning on NLU tasks\n\n\nWe present the dev results on SQuAD 1.1/2.0 and several GLUE benchmark tasks.\n\n\n\n\n\n---#### Notes.\n\n\n* 1 Following RoBERTa, for RTE, MRPC, STS-B, we fine-tune the tasks based on DeBERTa-Large-MNLI, DeBERTa-XLarge-MNLI, DeBERTa-V2-XLarge-MNLI, DeBERTa-V2-XXLarge-MNLI. The results of SST-2/QQP/QNLI/SQuADv2 will also be slightly improved when start from MNLI fine-tuned models, however, we only report the numbers fine-tuned from pretrained base models for those 4 tasks.\n* 2 To try the XXLarge model with HF transformers, we recommand using deepspeed as it's faster and saves memory.\n\n\nRun with 'Deepspeed',\n\n\nYou can also run with '--sharded\\_ddp'\n\n\nIf you find DeBERTa useful for your work, please cite the following paper:" ]
[ -0.05241292342543602, 0.11625625193119049, -0.004290661308914423, 0.03704634681344032, 0.10461783409118652, 0.04311855509877205, 0.02440350502729416, 0.14597083628177643, -0.0326407290995121, 0.07119511812925339, 0.041633110493421555, 0.054706066846847534, 0.043093517422676086, 0.08738726377487183, 0.003954117186367512, -0.25242874026298523, 0.07505936175584793, -0.029287222772836685, -0.11313261091709137, 0.03122064843773842, 0.09910337626934052, -0.11940285563468933, 0.06363672018051147, -0.022568227723240852, -0.04831758141517639, 0.04378461092710495, 0.0006135303410701454, -0.000733526365365833, 0.10929683595895767, 0.08638615906238556, 0.04970165714621544, 0.12570685148239136, 0.06911695003509521, -0.18629753589630127, 0.035372961312532425, 0.08342937380075455, 0.024787431582808495, 0.06046098843216896, 0.04166822135448456, 0.034362152218818665, 0.082123763859272, -0.12192820012569427, -0.0012567057274281979, 0.04668266698718071, -0.014703585766255856, -0.18785467743873596, -0.08851271122694016, 0.010912719182670116, 0.06809866428375244, 0.026018789038062096, -0.007647804915904999, 0.04403293877840042, -0.0634857788681984, 0.04492665082216263, 0.2033262997865677, -0.24006621539592743, 0.005089050158858299, 0.09652864933013916, -0.05766617879271507, -0.0021894872188568115, -0.05889365077018738, 0.06054612621665001, -0.03787076845765114, -0.004404023289680481, 0.0953960046172142, -0.007863149046897888, -0.027227632701396942, 0.0461537167429924, -0.09998410940170288, -0.028319157660007477, -0.017032254487276077, -0.021815486252307892, -0.019361726939678192, -0.12552063167095184, -0.13545437157154083, -0.09394071251153946, -0.09466221928596497, -0.10201138257980347, 0.0481635145843029, -0.034098174422979355, 0.0027645486406981945, -0.12905673682689667, -0.0781635195016861, -0.040702249854803085, -0.012745361775159836, 0.05975765734910965, 0.09084657579660416, -0.01573737896978855, 0.01888619363307953, 0.12221407145261765, -0.05866345763206482, -0.09633360803127289, -0.06893252581357956, -0.07345526665449142, -0.10827293992042542, 0.029612606391310692, -0.06550871580839157, -0.13967692852020264, 0.036477793008089066, 0.1664232611656189, 0.10732275247573853, 0.11743797361850739, 0.004422088153660297, -0.026239631697535515, -0.052678272128105164, 0.17081505060195923, -0.09168440103530884, -0.04946804791688919, 0.13098061084747314, 0.014130712486803532, 0.006833386141806841, -0.028710070997476578, -0.056183066219091415, -0.06179269403219223, 0.08802732825279236, 0.09180831909179688, -0.04942389577627182, 0.0682508647441864, 0.04268469288945198, -0.043976616114377975, 0.11267323791980743, -0.15951232612133026, -0.036047544330358505, 0.011395269073545933, -0.04261685162782669, 0.026174025610089302, 0.029328111559152603, -0.03301100805401802, -0.08342672884464264, 0.08510751277208328, -0.04280565679073334, -0.07014468312263489, -0.07203925400972366, -0.08885645121335983, -0.002707223640754819, 0.026051171123981476, -0.04565635696053505, -0.14187489449977875, -0.17441733181476593, 0.030235065147280693, 0.028298823162913322, -0.014657210558652878, 0.04990647733211517, 0.026168597862124443, -0.010564020834863186, -0.018691079691052437, -0.02134881541132927, 0.027627741917967796, -0.024837426841259003, 0.0335579439997673, 0.03671235218644142, 0.07314646989107132, -0.02218751236796379, 0.02800191007554531, -0.034974511712789536, 0.030307423323392868, -0.23950177431106567, 0.08978857100009918, -0.100782111287117, -0.08487722277641296, -0.08426533639431, -0.05481399595737457, -0.03245139867067337, 
0.0033765919506549835, 0.028284341096878052, 0.08222424238920212, -0.1274300068616867, -0.06718092411756516, 0.14027553796768188, -0.11157736927270889, -0.02299368381500244, 0.09707306325435638, -0.0022017641458660364, -0.055710747838020325, 0.0570623055100441, 0.09115156531333923, 0.25128817558288574, -0.14172594249248505, -0.0486244335770607, 0.013722315430641174, -0.008094100281596184, -0.013274040073156357, 0.023913394659757614, 0.0699276551604271, 0.10617539286613464, 0.059531647711992264, -0.11182842403650284, -0.020532988011837006, -0.022604839876294136, -0.0500359833240509, -0.03825552761554718, -0.014430549927055836, 0.0602373443543911, -0.035922009497880936, 0.048536550253629684, 0.04468540474772453, -0.08056991547346115, 0.06431268900632858, 0.14005854725837708, -0.08971033990383148, 0.04085931554436684, -0.13266025483608246, 0.029658399522304535, -0.010808725841343403, 0.05502389371395111, -0.14949937164783478, -0.17315195500850677, 0.028714269399642944, -0.20101559162139893, -0.014984328299760818, 0.16706711053848267, 0.06520620733499527, 0.0769575759768486, -0.07139018177986145, 0.03544917330145836, -0.0465189628303051, -0.026657596230506897, -0.012762212194502354, -0.039144936949014664, -0.0904446691274643, -0.003422896610572934, 0.1434527039527893, -0.07098592817783356, 0.01785021647810936, 0.12045729160308838, 0.15926221013069153, 0.04081423953175545, -0.030288774520158768, -0.021402545273303986, -0.0352913998067379, 0.035939160734415054, -0.07606267929077148, -0.0013733342057093978, -0.0005944598815403879, -0.050455402582883835, 0.08635364472866058, -0.15149633586406708, 0.13202467560768127, 0.08588232100009918, 0.12846827507019043, 0.041794151067733765, 0.010709304362535477, -0.03575313836336136, -0.029423724859952927, -0.04201208055019379, -0.06960401684045792, 0.051100172102451324, 0.07274112105369568, 0.14544865489006042, -0.06951829791069031, -0.05518389493227005, 0.00011524007277330384, 0.05315240100026131, -0.013236815109848976, 0.00699988566339016, 0.024189092218875885, -0.08823025971651077, 0.024960024282336235, 0.024626167491078377, -0.005524864420294762, 0.13287346065044403, -0.029232831671833992, -0.09008967876434326, -0.001253121648915112, -0.046175241470336914, -0.013912647031247616, 0.039902493357658386, 0.07383694499731064, 0.0639260783791542, 0.04250863939523697, 0.023993737995624542, 0.08118695765733719, -0.09342873096466064, 0.07322639971971512, 0.04650180786848068, -0.08427714556455612, 0.030391348525881767, 0.08757985383272171, 0.025929659605026245, 0.03123314678668976, 0.00714842090383172, 0.07555853575468063, -0.00331905041821301, -0.026500098407268524, -0.03688663989305496, 0.12997634708881378, -0.0822547972202301, -0.23660308122634888, -0.18573130667209625, -0.03791981562972069, -0.08769375085830688, -0.005737293511629105, 0.04672124981880188, -0.04999380186200142, -0.14164356887340546, -0.03944524750113487, 0.10956963896751404, 0.04122653976082802, 0.003083166666328907, -0.0214821957051754, 0.004241902381181717, 0.07114008814096451, -0.14301002025604248, 0.008071999996900558, -0.001016158377751708, -0.052957240492105484, 0.025252217426896095, 0.0748182088136673, 0.04050320014357567, 0.07052327692508698, -0.03639766946434975, 0.02550515905022621, -0.004302534740418196, 0.1658281534910202, -0.012941831722855568, 0.04620959237217903, 0.21517708897590637, -0.007216479163616896, 0.06316639482975006, 0.05751804634928703, 0.014120690524578094, -0.07226750999689102, 0.021344417706131935, 0.09521719068288803, -0.019487185403704643, -0.19240689277648926, 
-0.06886369735002518, -0.06324194371700287, -0.054118406027555466, 0.02421768195927143, 0.044022656977176666, -0.004982439801096916, -0.0016912405844777822, -0.06946996599435806, 0.040119536221027374, 0.017158862203359604, 0.024831630289554596, 0.22673776745796204, 0.006654110737144947, 0.07586584240198135, -0.0724959746003151, -0.030478915199637413, 0.05292810499668121, 0.006225659511983395, 0.10337235033512115, -0.07113487273454666, 0.17548975348472595, 0.042972128838300705, 0.10130257904529572, 0.039969056844711304, 0.08176793158054352, 0.004960310645401478, -0.022856149822473526, -0.0067467233166098595, -0.07481648027896881, -0.11259543895721436, -0.006386597640812397, 0.023220716044306755, -0.08257874101400375, -0.05477657541632652, 0.13395044207572937, 0.02214772254228592, 0.20002344250679016, -0.028818124905228615, -0.1599300056695938, -0.0223435927182436, 0.019692517817020416, -0.04360513761639595, -0.04057908430695534, 0.0003680957015603781, 0.07890064269304276, -0.06522704660892487, -0.013948838226497173, -0.09038207679986954, 0.059641867876052856, -0.11011307686567307, 0.06691239774227142, 0.032354652881622314, 0.143149271607399, 0.05443072319030762, 0.06908782571554184, -0.136577770113945, 0.0795135572552681, 0.04900651425123215, 0.06754480302333832, -0.05291585996747017, 0.04053719341754913, 0.05957832559943199, -0.005011576693505049, 0.07362199574708939, -0.03655879572033882, -0.1409459412097931, -0.11101935803890228, -0.11368174850940704, 0.0739123672246933, 0.15117238461971283, -0.04177320748567581, 0.09509071707725525, -0.09223560988903046, -0.05499028414487839, -0.008637280203402042, 0.10573383420705795, -0.060759831219911575, -0.16425339877605438, 0.12261752784252167, -0.021293271332979202, 0.06473468244075775, -0.07867137342691422, -0.03397493064403534, -0.048652321100234985, 0.1408054679632187, -0.06580232828855515, -0.055161621421575546, -0.15515515208244324, 0.06835756450891495, 0.12346778810024261, -0.11046431213617325, 0.09829434007406235, 0.034361906349658966, 0.20692701637744904, 0.0005366243422031403, -0.17279279232025146, -0.012875732034444809, -0.05714242160320282, -0.12331905215978622, 0.0713687613606453, 0.09545672684907913, -0.06955087929964066, 0.01881030946969986, 0.0003694920160342008, 0.0038044890388846397, 0.0648389384150505, -0.08347233384847641, -0.0572601854801178, 0.0902041345834732, -0.0015367961023002863, -0.03171086311340332, -0.08969924598932266, 0.08469782769680023, -0.013516637496650219, 0.06795331090688705, 0.12755215167999268, 0.2332714945077896, -0.06980971246957779, 0.10268174111843109, 0.11847653985023499, 0.0072484505362808704, -0.30070143938064575, -0.07622402906417847, 0.01508452370762825, 0.034728746861219406, -0.007006390020251274, -0.18646465241909027, 0.10465225577354431, 0.11567004024982452, -0.02884756773710251, -0.05301745980978012, -0.2699161171913147, -0.14559048414230347, 0.11126522719860077, 0.06479524075984955, 0.006330265663564205, -0.026210790500044823, -0.021084869280457497, -0.11830145865678787, -0.14790286123752594, 0.01954120397567749, -0.10122183710336685, 0.12580469250679016, -0.02215210720896721, -0.024180641397833824, 0.02689736895263195, -0.0581243634223938, 0.10030766576528549, -0.039426300674676895, 0.0923372209072113, -0.020662806928157806, 0.1322280317544937, 0.050655264407396317, -0.07646145671606064, 0.1026589423418045, 0.004817130975425243, 0.07517361640930176, -0.04937495291233063, -0.048940420150756836, -0.011841831728816032, 0.002896582707762718, 0.001836896175518632, -0.05926266685128212, 
-0.029681045562028885, 0.03017173893749714, 0.05639512836933136, -0.048700716346502304, 0.000969441665802151, -0.012328814715147018, -0.060070183128118515, 0.1339639276266098, 0.036595191806554794, -0.05175512656569481, -0.10908308625221252, -0.02379962056875229, 0.01199538353830576, 0.06731636822223663, -0.05083142966032028, 0.1288711279630661, 0.08653184026479721, -0.019205482676625252, 0.07109231501817703, 0.02301650121808052, -0.031241459771990776, -0.04017029330134392, 0.06411214172840118, -0.13149906694889069, -0.22445173561573029, -0.04035969823598862, -0.003964333329349756, -0.04578391835093498, 0.022841298952698708, 0.15573525428771973, -0.04311463236808777, -0.01083876471966505, 0.05421200767159462, 0.005467413924634457, 0.03126690536737442, 0.15760089457035065, 0.041560910642147064, 0.06835300475358963, -0.13382463157176971, 0.12485238164663315, 0.033840361982584, -0.09547188133001328, -0.0032187507022172213, 0.0637090802192688, -0.10742011666297913, -0.06447898596525192, -0.04932399466633797, 0.06657829880714417, 0.023493371903896332, -0.013045123778283596, -0.08447141200304031, -0.058050308376550674, 0.018555717542767525, 0.10908051580190659, 0.029986441135406494, 0.09514385461807251, -0.07890836894512177, 0.016175953671336174, -0.12613677978515625, 0.08410951495170593, 0.017345881089568138, 0.06484183669090271, -0.13420170545578003, 0.07065223157405853, -0.03845332935452461, 0.03517135977745056, -0.05709301680326462, 0.005170234479010105, -0.04392032325267792, -0.048343803733587265, -0.1765640527009964, -0.006047061178833246, 0.004548920318484306, -0.043268267065286636, 0.003539072582498193, 0.024005118757486343, -0.03276915103197098, 0.03686981648206711, -0.08423015475273132, -0.061444416642189026, -0.06776043027639389, 0.015145798213779926, -0.1298491507768631, 0.028867676854133606, 0.025676202028989792, -0.10317113995552063, 0.10611970722675323, 0.027018163353204727, 0.022449053823947906, 0.13244663178920746, 0.039490919560194016, -0.11081844568252563, 0.03625766932964325, 0.08276601135730743, 0.042760513722896576, -0.04711465537548065, -0.0017788584809750319, -0.03314288705587387, -0.04047403857111931, -0.02824929729104042, 0.04140515998005867, -0.12595486640930176, 0.033654842525720596, -0.0985659658908844, 0.059719864279031754, -0.05175250396132469, -0.010809225961565971, 0.06528006494045258, 0.0651686042547226, 0.08419101685285568, -0.0616983100771904, -0.009090735577046871, -0.1340108960866928, -0.005983277689665556, 0.010448146611452103, -0.06587451696395874, -0.06545503437519073, -0.07444304972887039, 0.046268317848443985, -0.040648605674505234, 0.10357742756605148, -0.05032479390501976, -0.04449968785047531, 0.063878133893013, -0.03028818778693676, -0.05879247561097145, 0.02909315750002861, 0.10988312214612961, 0.09321517497301102, 0.009683806449174881, 0.03270789980888367, 0.004037970677018166, 0.005692401435226202, 0.02461782470345497, 0.1947879046201706, 0.18114711344242096, 0.047160565853118896, 0.1009051501750946, 0.0073593854904174805, -0.06058067828416824, -0.09827041625976562, 0.12655390799045563, -0.13597998023033142, 0.10119486600160599, -0.0578971765935421, 0.03981122002005577, 0.11826001852750778, -0.1467830389738083, 0.08328340947628021, -0.006882031448185444, -0.04854379594326019, -0.14936299622058868, -0.02267826534807682, -0.09319871664047241, -0.09190364927053452, -0.0050718337297439575, -0.09833969175815582, 0.021932389587163925, 0.05805574730038643, 0.040705613791942596, -0.002747556194663048, 0.20599813759326935, -0.04898061230778694, 
-0.061674512922763824, 0.007680319249629974, 0.012954996898770332, -0.05927680805325508, 0.08522646874189377, -0.05753780156373978, 0.028485259041190147, 0.05660206452012062, 0.0518532358109951, 0.0010675085941329598, 0.06393739581108093, 0.028470823541283607, -0.06369183957576752, -0.0653093010187149, -0.0034019954036921263, 0.02331715077161789, 0.033972177654504776, 0.06322687119245529, -0.022653568536043167, -0.0592658706009388, -0.03528359532356262, 0.23925799131393433, -0.04668236896395683, -0.12855541706085205, -0.09132948517799377, 0.19149088859558105, 0.15593498945236206, 0.08112460374832153, 0.020433591678738594, -0.10061638802289963, -0.04003565385937691, 0.15064677596092224, 0.12990884482860565, -0.04684830456972122, 0.003852730616927147, 0.03406898304820061, -0.004812746308743954, -0.05363235995173454, 0.15303771197795868, 0.05627741664648056, 0.1625811606645584, -0.020495248958468437, -0.00043971932609565556, 0.03595009446144104, -0.06956958025693893, -0.042895857244729996, 0.21299278736114502, -0.0496150478720665, 0.0058876690454781055, -0.018968680873513222, -0.0211721733212471, -0.004630656447261572, -0.17833876609802246, -0.0374411903321743, -0.06402450054883957, -0.13354623317718506, -0.01777898147702217, -0.009796665981411934, -0.0016883540665730834, 0.05714863911271095, -0.013333170674741268, -0.011401773430407047, 0.1794513463973999, 0.013484508730471134, -0.07516146451234818, -0.020686641335487366, 0.09928201884031296, 0.079424649477005, 0.1758870929479599, 0.02221892774105072, -0.008008033968508244, 0.06656724959611893, -0.023357057943940163, -0.1028418242931366, 0.046502113342285156, 0.03252603858709335, -0.20225293934345245, 0.007131351623684168, 0.18690389394760132, -0.023438340052962303, 0.044114526361227036, -0.004088737536221743, -0.1597050279378891, -0.012284711003303528, 0.028708454221487045, -0.04290628060698509, -0.0068760523572564125, 0.021430158987641335, -0.035139646381139755, 0.08563287556171417, 0.17241866886615753, 0.021064264699816704, 0.058720678091049194, -0.07776331156492233, 0.04090698063373566, 0.03163593262434006, 0.07538473606109619, -0.014276460744440556, -0.1452966332435608, 0.026692945510149002, -0.0302111953496933, -0.02999967709183693, -0.19675295054912567, -0.10110118985176086, 0.03192555531859398, -0.0006416713004000485, -0.030724622309207916, 0.15183618664741516, 0.048725489526987076, 0.04474559798836708, 0.009140466339886189, -0.12921077013015747, -0.04423608258366585, 0.025643398985266685, -0.16402772068977356, -0.0631222277879715 ]
null
null
transformers
## DeBERTa: Decoding-enhanced BERT with Disentangled Attention

[DeBERTa](https://arxiv.org/abs/2006.03654) improves the BERT and RoBERTa models using disentangled attention and an enhanced mask decoder. It outperforms BERT and RoBERTa on the majority of NLU tasks with 80GB training data.

Please check the [official repository](https://github.com/microsoft/DeBERTa) for more details and updates.

This is the DeBERTa V2 xxlarge model with 48 layers and a hidden size of 1536. It has 1.5B parameters in total and is trained with 160GB of raw data.

### Fine-tuning on NLU tasks

We present the dev results on SQuAD 1.1/2.0 and several GLUE benchmark tasks.

| Model | SQuAD 1.1 | SQuAD 2.0 | MNLI-m/mm | SST-2 | QNLI | CoLA | RTE | MRPC | QQP | STS-B |
|---------------------------|-----------|-----------|-----------|-------|------|------|------|-----------|-----------|-----------|
| | F1/EM | F1/EM | Acc | Acc | Acc | MCC | Acc | Acc/F1 | Acc/F1 | P/S |
| BERT-Large | 90.9/84.1 | 81.8/79.0 | 86.6/- | 93.2 | 92.3 | 60.6 | 70.4 | 88.0/- | 91.3/- | 90.0/- |
| RoBERTa-Large | 94.6/88.9 | 89.4/86.5 | 90.2/- | 96.4 | 93.9 | 68.0 | 86.6 | 90.9/- | 92.2/- | 92.4/- |
| XLNet-Large | 95.1/89.7 | 90.6/87.9 | 90.8/- | 97.0 | 94.9 | 69.0 | 85.9 | 90.8/- | 92.3/- | 92.5/- |
| [DeBERTa-Large](https://huggingface.co/microsoft/deberta-large)<sup>1</sup> | 95.5/90.1 | 90.7/88.0 | 91.3/91.1 | 96.5 | 95.3 | 69.5 | 91.0 | 92.6/94.6 | 92.3/- | 92.8/92.5 |
| [DeBERTa-XLarge](https://huggingface.co/microsoft/deberta-xlarge)<sup>1</sup> | -/- | -/- | 91.5/91.2 | 97.0 | - | - | 93.1 | 92.1/94.3 | - | 92.9/92.7 |
| [DeBERTa-V2-XLarge](https://huggingface.co/microsoft/deberta-v2-xlarge)<sup>1</sup> | 95.8/90.8 | 91.4/88.9 | 91.7/91.6 | **97.5** | 95.8 | 71.1 | **93.9** | 92.0/94.2 | 92.3/89.8 | 92.9/92.9 |
| **[DeBERTa-V2-XXLarge](https://huggingface.co/microsoft/deberta-v2-xxlarge)<sup>1,2</sup>** | **96.1/91.4** | **92.2/89.7** | **91.7/91.9** | 97.2 | **96.0** | **72.0** | 93.5 | **93.1/94.9** | **92.7/90.3** | **93.2/93.1** |

--------

#### Notes.

- <sup>1</sup> Following RoBERTa, for RTE, MRPC, and STS-B, we fine-tune the tasks based on [DeBERTa-Large-MNLI](https://huggingface.co/microsoft/deberta-large-mnli), [DeBERTa-XLarge-MNLI](https://huggingface.co/microsoft/deberta-xlarge-mnli), [DeBERTa-V2-XLarge-MNLI](https://huggingface.co/microsoft/deberta-v2-xlarge-mnli), and [DeBERTa-V2-XXLarge-MNLI](https://huggingface.co/microsoft/deberta-v2-xxlarge-mnli). The results of SST-2/QQP/QNLI/SQuADv2 are also slightly improved when starting from MNLI fine-tuned models; however, we only report the numbers fine-tuned from pretrained base models for those 4 tasks.
- <sup>2</sup> To try the **XXLarge** model with **[HF transformers](https://huggingface.co/transformers/main_classes/trainer.html)**, we recommend using **deepspeed**, as it is faster and saves memory.
Run with `Deepspeed`,

```bash
pip install datasets
pip install deepspeed

# Download the deepspeed config file
wget https://huggingface.co/microsoft/deberta-v2-xxlarge/resolve/main/ds_config.json -O ds_config.json

export TASK_NAME=mnli
output_dir="ds_results"
num_gpus=8
batch_size=8
python -m torch.distributed.launch --nproc_per_node=${num_gpus} \
  run_glue.py \
  --model_name_or_path microsoft/deberta-v2-xxlarge \
  --task_name $TASK_NAME \
  --do_train \
  --do_eval \
  --max_seq_length 256 \
  --per_device_train_batch_size ${batch_size} \
  --learning_rate 3e-6 \
  --num_train_epochs 3 \
  --output_dir $output_dir \
  --overwrite_output_dir \
  --logging_steps 10 \
  --logging_dir $output_dir \
  --deepspeed ds_config.json
```

You can also run with `--sharded_ddp`

```bash
cd transformers/examples/text-classification/
export TASK_NAME=mnli
python -m torch.distributed.launch --nproc_per_node=8 run_glue.py --model_name_or_path microsoft/deberta-v2-xxlarge \
  --task_name $TASK_NAME --do_train --do_eval --max_seq_length 256 --per_device_train_batch_size 8 \
  --learning_rate 3e-6 --num_train_epochs 3 --output_dir /tmp/$TASK_NAME/ --overwrite_output_dir --sharded_ddp --fp16
```

### Citation

If you find DeBERTa useful for your work, please cite the following paper:

```latex
@inproceedings{he2021deberta,
  title={DEBERTA: DECODING-ENHANCED BERT WITH DISENTANGLED ATTENTION},
  author={Pengcheng He and Xiaodong Liu and Jianfeng Gao and Weizhu Chen},
  booktitle={International Conference on Learning Representations},
  year={2021},
  url={https://openreview.net/forum?id=XPZIaotutsD}
}
```
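Zero-shot classification with this model ultimately reduces to MNLI-style entailment scoring. The sketch below shows that step directly, assuming the `NDugar/3epoch-3large` checkpoint from this record carries an MNLI classification head; the premise/hypothesis pair is illustrative, and label names are read from the checkpoint's config rather than hard-coded.

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

# Minimal NLI-style inference sketch; assumes an MNLI head
# (premise/hypothesis pairs -> entailment/neutral/contradiction).
model_name = "NDugar/3epoch-3large"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSequenceClassification.from_pretrained(model_name)

premise = "A soccer game with multiple males playing."
hypothesis = "Some men are playing a sport."

inputs = tokenizer(premise, hypothesis, return_tensors="pt", truncation=True)
with torch.no_grad():
    logits = model(**inputs).logits

probs = logits.softmax(dim=-1)[0]
for idx, p in enumerate(probs.tolist()):
    # Label names come from the checkpoint's config, since MNLI label order varies.
    print(model.config.id2label[idx], round(p, 3))
```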
{"language": "en", "license": "mit", "tags": ["deberta-v3", "deberta-v2`", "deberta-mnli"], "tasks": "mnli", "thumbnail": "https://huggingface.co/front/thumbnails/microsoft.png", "pipeline_tag": "zero-shot-classification"}
zero-shot-classification
NDugar/3epoch-3large
[ "transformers", "pytorch", "deberta-v2", "text-classification", "deberta-v3", "deberta-v2`", "deberta-mnli", "zero-shot-classification", "en", "arxiv:2006.03654", "license:mit", "autotrain_compatible", "endpoints_compatible", "has_space", "region:us" ]
2022-03-02T23:29:04+00:00
[ "2006.03654" ]
[ "en" ]
TAGS #transformers #pytorch #deberta-v2 #text-classification #deberta-v3 #deberta-v2` #deberta-mnli #zero-shot-classification #en #arxiv-2006.03654 #license-mit #autotrain_compatible #endpoints_compatible #has_space #region-us
DeBERTa: Decoding-enhanced BERT with Disentangled Attention ----------------------------------------------------------- DeBERTa improves the BERT and RoBERTa models using disentangled attention and an enhanced mask decoder. It outperforms BERT and RoBERTa on the majority of NLU tasks with 80GB training data. Please check the official repository for more details and updates. This is the DeBERTa V2 xxlarge model with 48 layers and a hidden size of 1536. It has 1.5B parameters in total and is trained with 160GB of raw data. ### Fine-tuning on NLU tasks We present the dev results on SQuAD 1.1/2.0 and several GLUE benchmark tasks. --- #### Notes. * 1 Following RoBERTa, for RTE, MRPC, and STS-B, we fine-tune the tasks based on DeBERTa-Large-MNLI, DeBERTa-XLarge-MNLI, DeBERTa-V2-XLarge-MNLI, and DeBERTa-V2-XXLarge-MNLI. The results of SST-2/QQP/QNLI/SQuADv2 are also slightly improved when starting from MNLI fine-tuned models; however, we only report the numbers fine-tuned from pretrained base models for those 4 tasks. * 2 To try the XXLarge model with HF transformers, we recommend using deepspeed, as it is faster and saves memory. Run with 'Deepspeed', You can also run with '--sharded\_ddp' If you find DeBERTa useful for your work, please cite the following paper:
[ "### Fine-tuning on NLU tasks\n\n\nWe present the dev results on SQuAD 1.1/2.0 and several GLUE benchmark tasks.\n\n\n\n\n\n---", "#### Notes.\n\n\n* 1 Following RoBERTa, for RTE, MRPC, STS-B, we fine-tune the tasks based on DeBERTa-Large-MNLI, DeBERTa-XLarge-MNLI, DeBERTa-V2-XLarge-MNLI, DeBERTa-V2-XXLarge-MNLI. The results of SST-2/QQP/QNLI/SQuADv2 will also be slightly improved when start from MNLI fine-tuned models, however, we only report the numbers fine-tuned from pretrained base models for those 4 tasks.\n* 2 To try the XXLarge model with HF transformers, we recommand using deepspeed as it's faster and saves memory.\n\n\nRun with 'Deepspeed',\n\n\nYou can also run with '--sharded\\_ddp'\n\n\nIf you find DeBERTa useful for your work, please cite the following paper:" ]
[ "TAGS\n#transformers #pytorch #deberta-v2 #text-classification #deberta-v3 #deberta-v2` #deberta-mnli #zero-shot-classification #en #arxiv-2006.03654 #license-mit #autotrain_compatible #endpoints_compatible #has_space #region-us \n", "### Fine-tuning on NLU tasks\n\n\nWe present the dev results on SQuAD 1.1/2.0 and several GLUE benchmark tasks.\n\n\n\n\n\n---", "#### Notes.\n\n\n* 1 Following RoBERTa, for RTE, MRPC, STS-B, we fine-tune the tasks based on DeBERTa-Large-MNLI, DeBERTa-XLarge-MNLI, DeBERTa-V2-XLarge-MNLI, DeBERTa-V2-XXLarge-MNLI. The results of SST-2/QQP/QNLI/SQuADv2 will also be slightly improved when start from MNLI fine-tuned models, however, we only report the numbers fine-tuned from pretrained base models for those 4 tasks.\n* 2 To try the XXLarge model with HF transformers, we recommand using deepspeed as it's faster and saves memory.\n\n\nRun with 'Deepspeed',\n\n\nYou can also run with '--sharded\\_ddp'\n\n\nIf you find DeBERTa useful for your work, please cite the following paper:" ]
[ 91, 32, 215 ]
[ "passage: TAGS\n#transformers #pytorch #deberta-v2 #text-classification #deberta-v3 #deberta-v2` #deberta-mnli #zero-shot-classification #en #arxiv-2006.03654 #license-mit #autotrain_compatible #endpoints_compatible #has_space #region-us \n### Fine-tuning on NLU tasks\n\n\nWe present the dev results on SQuAD 1.1/2.0 and several GLUE benchmark tasks.\n\n\n\n\n\n---#### Notes.\n\n\n* 1 Following RoBERTa, for RTE, MRPC, STS-B, we fine-tune the tasks based on DeBERTa-Large-MNLI, DeBERTa-XLarge-MNLI, DeBERTa-V2-XLarge-MNLI, DeBERTa-V2-XXLarge-MNLI. The results of SST-2/QQP/QNLI/SQuADv2 will also be slightly improved when start from MNLI fine-tuned models, however, we only report the numbers fine-tuned from pretrained base models for those 4 tasks.\n* 2 To try the XXLarge model with HF transformers, we recommand using deepspeed as it's faster and saves memory.\n\n\nRun with 'Deepspeed',\n\n\nYou can also run with '--sharded\\_ddp'\n\n\nIf you find DeBERTa useful for your work, please cite the following paper:" ]
[ -0.04712529852986336, 0.11632681638002396, -0.005513798911124468, 0.04034528508782387, 0.09261268377304077, 0.041880253702402115, 0.007833678275346756, 0.1437971293926239, -0.036538708955049515, 0.06895140558481216, 0.052906185388565063, 0.048526328057050705, 0.048395950347185135, 0.09214141964912415, 0.003681971225887537, -0.2341788411140442, 0.07397912442684174, -0.026379063725471497, -0.09742267429828644, 0.05265279486775398, 0.1080227792263031, -0.11553676426410675, 0.05288761481642723, -0.00786243099719286, -0.03502046316862106, 0.03428156301379204, -0.008052940480411053, -0.01313028670847416, 0.0982774868607521, 0.08160832524299622, 0.05526924133300781, 0.1318749487400055, 0.06007041782140732, -0.16747678816318512, 0.035320356488227844, 0.085148386657238, 0.0369623526930809, 0.0718550756573677, 0.05858064442873001, 0.05762948468327522, 0.053467705845832825, -0.1128077283501625, -0.012466798536479473, 0.04233165830373764, -0.02168363705277443, -0.18253515660762787, -0.08752509951591492, 0.0032855768222361803, 0.056699562817811966, 0.02536107785999775, -0.01608235575258732, 0.08578988909721375, -0.08466160297393799, 0.034347981214523315, 0.20722617208957672, -0.25843122601509094, -0.019391722977161407, 0.0665048137307167, -0.02392335794866085, -0.0017972111236304045, -0.04952795058488846, 0.047683145850896835, -0.03549984097480774, 0.0038408644031733274, 0.0896705612540245, -0.005910615436732769, -0.0030211363919079304, 0.04856908693909645, -0.11140447109937668, -0.02381145767867565, -0.031572651118040085, -0.020290199667215347, -0.02176397480070591, -0.12430358678102493, -0.1258654147386551, -0.055981166660785675, -0.08734901249408722, -0.1192668229341507, 0.040016770362854004, -0.036247946321964264, -0.0015630091074854136, -0.10677099972963333, -0.07956281304359436, -0.04071968421339989, -0.01231368537992239, 0.06038827449083328, 0.08628429472446442, -0.008030709810554981, 0.035654790699481964, 0.1254246085882187, -0.040830936282873154, -0.10346140712499619, -0.07905876636505127, -0.09622544795274734, -0.11999285221099854, 0.016309374943375587, -0.0586562342941761, -0.11902649700641632, 0.056023359298706055, 0.1382610946893692, 0.12154877185821533, 0.1086641475558281, 0.02701127529144287, -0.03268757462501526, -0.04372057318687439, 0.17504048347473145, -0.08834930509328842, -0.050708502531051636, 0.10476969927549362, 0.04202458634972572, -0.012580826878547668, -0.018144939094781876, -0.037915803492069244, -0.07275217026472092, 0.09375783056020737, 0.08932757377624512, -0.04464203119277954, 0.06844504922628403, 0.026609309017658234, -0.05321897938847542, 0.1063835546374321, -0.15370501577854156, -0.0272221677005291, 0.021987788379192352, -0.05728312209248543, 0.04639561474323273, 0.004566253162920475, -0.03661295771598816, -0.08886884152889252, 0.06560073792934418, -0.0344834178686142, -0.06434406340122223, -0.08199803531169891, -0.08750474452972412, -0.007289708126336336, 0.03695927560329437, -0.05005808174610138, -0.13684795796871185, -0.15055090188980103, 0.015570350922644138, 0.02507244609296322, -0.023581724613904953, 0.0244109146296978, 0.03542438894510269, -0.01288549229502678, -0.010061499662697315, -0.019843734800815582, 0.05235252529382706, -0.028908394277095795, 0.057138655334711075, 0.04124312475323677, 0.07569045573472977, 0.02610844001173973, 0.0212369654327631, -0.04998933896422386, 0.025996489450335503, -0.2344748079776764, 0.10486934334039688, -0.10977257788181305, -0.08997494727373123, -0.09863027930259705, -0.05938425287604332, -0.028496278449892998, 
0.003119619330391288, 0.027742110192775726, 0.09143280237913132, -0.15124602615833282, -0.04659153148531914, 0.11987509578466415, -0.09744323790073395, -0.02055317535996437, 0.11510057002305984, -0.006204533856362104, -0.045592501759529114, 0.05413203686475754, 0.07100527733564377, 0.26308247447013855, -0.15530814230442047, -0.05809803307056427, 0.024983322247862816, -0.019533343613147736, 0.017638258635997772, 0.04335278645157814, 0.08663983643054962, 0.07228252291679382, 0.05760876089334488, -0.12364064157009125, -0.021429941058158875, -0.029822377488017082, -0.05480824410915375, -0.03715866059064865, -0.020292017608880997, 0.07740022987127304, -0.02739362232387066, 0.045818034559488297, 0.04535282030701637, -0.09720968455076218, 0.039491940289735794, 0.1409142017364502, -0.060463424772024155, 0.007282084319740534, -0.13976508378982544, 0.026785386726260185, -0.011296042241156101, 0.052760329097509384, -0.13898706436157227, -0.1592322736978531, 0.045100871473550797, -0.2092106193304062, -0.0037731134798377752, 0.15607589483261108, 0.07330156862735748, 0.060745734721422195, -0.06014931946992874, 0.025895269587635994, -0.04257611557841301, -0.027583016082644463, -0.0215979665517807, -0.02184593677520752, -0.09004822373390198, 0.001925273914821446, 0.14303579926490784, -0.08857439458370209, 0.013482823967933655, 0.09195464104413986, 0.15589258074760437, 0.027469633147120476, -0.016556264832615852, -0.01768515072762966, -0.03846433013677597, 0.040238168090581894, -0.06988503038883209, 0.009472021833062172, 0.0023472425527870655, -0.040176570415496826, 0.08592034131288528, -0.11531557887792587, 0.1001487523317337, 0.06882693618535995, 0.133205384016037, 0.015582000836730003, 0.016597921028733253, -0.04178610071539879, -0.02671061083674431, -0.047552064061164856, -0.0689375102519989, 0.04052545130252838, 0.07996892929077148, 0.12558765709400177, -0.08470228314399719, -0.059981461614370346, -0.003275923663750291, 0.037008676677942276, -0.0005927811725996435, 0.03732413426041603, 0.043999917805194855, -0.08973007649183273, 0.010178475640714169, 0.04905495047569275, -0.011261597275733948, 0.11293817311525345, -0.027395376935601234, -0.0985155925154686, -0.006571685895323753, -0.031193070113658905, -0.005638659466058016, 0.02256038784980774, 0.09809628129005432, 0.06964413821697235, 0.05023844912648201, 0.004805447533726692, 0.07084488123655319, -0.09726882725954056, 0.06604769825935364, 0.03337583690881729, -0.08001543581485748, 0.03326991945505142, 0.08352072536945343, 0.039942167699337006, 0.039122048765420914, 0.013843844644725323, 0.06491959095001221, 0.001866188831627369, -0.022079864516854286, -0.02842782437801361, 0.12429262697696686, -0.0712706670165062, -0.2507046163082123, -0.18305496871471405, -0.02263958752155304, -0.06360174715518951, -0.017641447484493256, 0.03348453715443611, -0.061336204409599304, -0.12295027822256088, -0.033605460077524185, 0.09850643575191498, 0.04853428155183792, -0.0006152824498713017, -0.011457228101789951, 0.01613658294081688, 0.08047007769346237, -0.13231484591960907, 0.0166722871363163, 0.004733786452561617, -0.055528830736875534, 0.033364187926054, 0.062468066811561584, 0.05384020134806633, 0.07386218756437302, -0.04183025658130646, 0.010931301862001419, 0.002135220915079117, 0.1991342455148697, -0.03144242614507675, 0.05783737078309059, 0.2312772423028946, -0.0028442542534321547, 0.07325225323438644, 0.06404800713062286, 0.0220558512955904, -0.0476984940469265, 0.02106388844549656, 0.09120727330446243, -0.008005927316844463, -0.2007640153169632, 
-0.05440699681639671, -0.05155760049819946, -0.06821660697460175, 0.017467668280005455, 0.04731082543730736, -0.031414467841386795, 0.012845993041992188, -0.06333675980567932, 0.01886633224785328, 0.007401598151773214, 0.017819330096244812, 0.21401408314704895, 0.01825554110109806, 0.08221791684627533, -0.0627194344997406, -0.049706023186445236, 0.060577042400836945, 0.013487355783581734, 0.10507852584123611, -0.07808462530374527, 0.14320795238018036, 0.040839772671461105, 0.1064785048365593, 0.05820319429039955, 0.06294859200716019, 0.014762835577130318, -0.02494632638990879, -0.018285654485225677, -0.08331337571144104, -0.09409096837043762, -0.0034922605846077204, 0.03608551248908043, -0.0805211067199707, -0.03295808658003807, 0.14266735315322876, 0.02459770254790783, 0.21504561603069305, -0.02216898836195469, -0.15890178084373474, -0.026227189227938652, 0.004687800537794828, -0.05460355058312416, -0.04189256578683853, -0.0006756660295650363, 0.06135229021310806, -0.07322841882705688, -0.018816174939274788, -0.07940693944692612, 0.06125558167695999, -0.08662763983011246, 0.06604789197444916, 0.049912724643945694, 0.16169250011444092, 0.056589264422655106, 0.08593767881393433, -0.14346760511398315, 0.0662945955991745, 0.043685182929039, 0.08168550580739975, -0.050991080701351166, 0.041826725006103516, 0.06488432735204697, -0.027130113914608955, 0.07960788160562515, -0.03956495225429535, -0.11154335737228394, -0.11619788408279419, -0.12569980323314667, 0.07261581718921661, 0.14355085790157318, -0.015140322968363762, 0.1023520901799202, -0.08988821506500244, -0.04232247918844223, -0.014629242941737175, 0.0962153896689415, -0.07165642082691193, -0.1532876342535019, 0.11964912712574005, -0.03526897355914116, 0.046777524054050446, -0.08106109499931335, -0.04325233772397041, -0.06128987669944763, 0.1291637271642685, -0.07224220037460327, -0.061434708535671234, -0.1368471086025238, 0.038200054317712784, 0.15185873210430145, -0.10548936575651169, 0.06925797462463379, 0.01690029725432396, 0.20672576129436493, 0.015002579428255558, -0.15269583463668823, -0.028383765369653702, -0.053966592997312546, -0.13484114408493042, 0.05783824995160103, 0.10445321351289749, -0.062876395881176, 0.04185574874281883, 0.004160220734775066, 0.005373608320951462, 0.039439063519239426, -0.08326336741447449, -0.05197681114077568, 0.04111988842487335, -0.0389084666967392, -0.030667368322610855, -0.08351331204175949, 0.0681614875793457, -0.03289073333144188, 0.052775777876377106, 0.11996695399284363, 0.2470674067735672, -0.07720746845006943, 0.1016426533460617, 0.12678702175617218, 0.00460921972990036, -0.29028284549713135, -0.09256969392299652, 0.056885235011577606, 0.040231700986623764, -0.009548124857246876, -0.19083404541015625, 0.1121416687965393, 0.12573286890983582, -0.025122206658124924, -0.053508859127759933, -0.275797963142395, -0.13761332631111145, 0.10250279307365417, 0.03517608344554901, -0.00004900233761873096, -0.03359666466712952, -0.029478391632437706, -0.09468240290880203, -0.1341886967420578, 0.034771427512168884, -0.09420584887266159, 0.12973415851593018, -0.020929604768753052, -0.00906689465045929, 0.03192394599318504, -0.055187683552503586, 0.12087519466876984, -0.05919647216796875, 0.07356700301170349, -0.010260646231472492, 0.10109464079141617, 0.03097681887447834, -0.07598528265953064, 0.08718094229698181, 0.018100503832101822, 0.07343500107526779, -0.05708851292729378, -0.047939784824848175, -0.02997461147606373, 0.006448954809457064, -0.00324034970253706, -0.04717043787240982, 
-0.04024704173207283, 0.05082664638757706, 0.047591108828783035, -0.04929770529270172, -0.022138919681310654, -0.01799909770488739, -0.061791982501745224, 0.12481015920639038, 0.04319484904408455, -0.045732591301202774, -0.10755942761898041, -0.023193540051579475, 0.018269294872879982, 0.06744962185621262, -0.034309763461351395, 0.12114722281694412, 0.09976113587617874, -0.01831544190645218, 0.06698162853717804, 0.01663334108889103, -0.03615175932645798, -0.030095646157860756, 0.05792611837387085, -0.13420253992080688, -0.2318660467863083, -0.03968800604343414, 0.004541734233498573, -0.07943828403949738, -0.009598013013601303, 0.15013211965560913, -0.04041966795921326, -0.025537747889757156, 0.05789368227124214, 0.0006763296551071107, 0.034948572516441345, 0.18758541345596313, 0.046860817819833755, 0.07367727160453796, -0.12133876979351044, 0.11285144090652466, 0.05508557707071304, -0.06489677727222443, -0.018598351627588272, 0.08251488208770752, -0.095675028860569, -0.05968267843127251, -0.03829959034919739, 0.050671543926000595, 0.03185758367180824, -0.01161973550915718, -0.08872479200363159, -0.05686366185545921, 0.027949297800660133, 0.09979207813739777, 0.02608519047498703, 0.07927896827459335, -0.06774836778640747, 0.015277218073606491, -0.10499407351016998, 0.0895736962556839, 0.04806762933731079, 0.055218081921339035, -0.12818391621112823, 0.0830635130405426, -0.03612520545721054, 0.049161337316036224, -0.06100553646683693, 0.012031028047204018, -0.0568084716796875, -0.04745887219905853, -0.19730274379253387, 0.012806636281311512, 0.006840388756245375, -0.03347041457891464, 0.0041272458620369434, 0.025133339688181877, -0.03675929456949234, 0.0318957082927227, -0.08546794950962067, -0.050792377442121506, -0.061052240431308746, 0.030075062066316605, -0.12237923592329025, 0.02348695695400238, 0.03445716202259064, -0.09882906079292297, 0.09729526191949844, 0.013943708501756191, 0.015072016976773739, 0.10602162778377533, 0.023312238976359367, -0.12173450738191605, 0.028343597427010536, 0.09289661049842834, 0.032326437532901764, -0.05529801920056343, 0.016552012413740158, -0.028869668021798134, -0.03714461252093315, -0.02821842022240162, 0.046389613300561905, -0.1176861897110939, 0.042870137840509415, -0.08735401183366776, 0.05003002658486366, -0.055765558034181595, 0.007191925775259733, 0.0656496211886406, 0.07170109450817108, 0.10982408374547958, -0.07343965023756027, -0.0012141978368163109, -0.14741025865077972, -0.005700036883354187, -0.0005758258048444986, -0.06496996432542801, -0.06165224313735962, -0.06263523548841476, 0.06141069903969765, -0.03795089200139046, 0.09866271913051605, -0.04322953149676323, -0.012928162701427937, 0.06217862293124199, -0.03694451227784157, -0.07398893684148788, 0.02436528541147709, 0.07874860614538193, 0.10128672420978546, 0.01650676690042019, 0.03875061497092247, -0.008728332817554474, -0.0136333042755723, 0.01618269830942154, 0.17858950793743134, 0.1990099847316742, 0.06185358762741089, 0.09569542855024338, 0.01754162833094597, -0.03956770896911621, -0.10018140077590942, 0.14012478291988373, -0.12989915907382965, 0.09427891671657562, -0.06647820770740509, 0.07885831594467163, 0.10499788820743561, -0.12249782681465149, 0.07206771522760391, -0.04051346331834793, -0.05269749462604523, -0.14419227838516235, -0.01313803531229496, -0.09925451874732971, -0.08094489574432373, -0.0047582220286130905, -0.09402284026145935, 0.034386783838272095, 0.04704536870121956, 0.03875453397631645, -0.012288397178053856, 0.18553438782691956, -0.04476271942257881, 
-0.06676850467920303, 0.020345574244856834, 0.0016499999910593033, -0.035489272326231, 0.08583010733127594, -0.04693009704351425, 0.03992133215069771, 0.07448257505893707, 0.057408031076192856, 0.001840639510191977, 0.05006628483533859, 0.009841912426054478, -0.0752246156334877, -0.07529333233833313, -0.005153049249202013, 0.019119132310152054, 0.028945675119757652, 0.05589685216546059, -0.0137475049123168, -0.04721096530556679, -0.04389902949333191, 0.22604097425937653, -0.03814482316374779, -0.1449185013771057, -0.11467470973730087, 0.20334772765636444, 0.10743867605924606, 0.07459811121225357, 0.025783265009522438, -0.08959130197763443, -0.050493355840444565, 0.16710025072097778, 0.10856826603412628, -0.08427812159061432, -0.01245513278990984, 0.03729113191366196, -0.0029669441282749176, -0.05923290550708771, 0.16504593193531036, 0.055578164756298065, 0.1666671186685562, -0.03624936193227768, 0.010263602249324322, 0.01487015001475811, -0.05909431353211403, -0.05860644206404686, 0.21023398637771606, -0.03491247445344925, 0.025676002725958824, -0.033390581607818604, -0.01083461195230484, -0.007668070029467344, -0.17667502164840698, -0.08045147359371185, -0.05345446988940239, -0.146321102976799, -0.014497529715299606, -0.026277529075741768, -0.014498940669000149, 0.05717204883694649, -0.01655644364655018, -0.007124810945242643, 0.1563013643026352, 0.004475877154618502, -0.048499688506126404, -0.001127520459704101, 0.12577396631240845, 0.06500834971666336, 0.1745394915342331, 0.0282707791775465, 0.007817626930773258, 0.07740268856287003, -0.0352196991443634, -0.1172051876783371, 0.0354507751762867, 0.041353240609169006, -0.21872270107269287, 0.019322270527482033, 0.16662439703941345, -0.010241019539535046, 0.04140906408429146, 0.01697547920048237, -0.13125434517860413, -0.017496807500720024, 0.03026174195110798, -0.03434108570218086, -0.018001651391386986, 0.04003630578517914, -0.03186614438891411, 0.08486954867839813, 0.19600822031497955, 0.019524050876498222, 0.05300247296690941, -0.09280431270599365, 0.04831124469637871, 0.020712941884994507, 0.0817708671092987, -0.020404018461704254, -0.14461058378219604, 0.021560123190283775, -0.02696800045669079, -0.01572362333536148, -0.17996978759765625, -0.1005876362323761, 0.022549495100975037, 0.007071018218994141, -0.015032893978059292, 0.14601558446884155, 0.0398930162191391, 0.05101315304636955, 0.0182642862200737, -0.13139553368091583, -0.048455722630023956, 0.026611851528286934, -0.16213445365428925, -0.04903775081038475 ]
null
null
transformers
## DeBERTa: Decoding-enhanced BERT with Disentangled Attention

[DeBERTa](https://arxiv.org/abs/2006.03654) improves the BERT and RoBERTa models using disentangled attention and enhanced mask decoder. It outperforms BERT and RoBERTa on the majority of NLU tasks with 80GB training data.

Please check the [official repository](https://github.com/microsoft/DeBERTa) for more details and updates.

This is the DeBERTa large model fine-tuned on the MNLI task.

#### Fine-tuning on NLU tasks

We present the dev results on SQuAD 1.1/2.0 and several GLUE benchmark tasks.

| Model | SQuAD 1.1 | SQuAD 2.0 | MNLI-m/mm | SST-2 | QNLI | CoLA | RTE | MRPC | QQP |STS-B |
|---------------------------|-----------|-----------|-------------|-------|------|------|--------|-------|-------|------|
| | F1/EM | F1/EM | Acc | Acc | Acc | MCC | Acc |Acc/F1 |Acc/F1 |P/S |
| BERT-Large | 90.9/84.1 | 81.8/79.0 | 86.6/- | 93.2 | 92.3 | 60.6 | 70.4 | 88.0/- | 91.3/- |90.0/- |
| RoBERTa-Large | 94.6/88.9 | 89.4/86.5 | 90.2/- | 96.4 | 93.9 | 68.0 | 86.6 | 90.9/- | 92.2/- |92.4/- |
| XLNet-Large | 95.1/89.7 | 90.6/87.9 | 90.8/- | 97.0 | 94.9 | 69.0 | 85.9 | 90.8/- | 92.3/- |92.5/- |
| [DeBERTa-Large](https://huggingface.co/microsoft/deberta-large)<sup>1</sup> | 95.5/90.1 | 90.7/88.0 | 91.3/91.1| 96.5|95.3| 69.5| 91.0| 92.6/94.6| 92.3/- |92.8/92.5 |
| [DeBERTa-XLarge](https://huggingface.co/microsoft/deberta-xlarge)<sup>1</sup> | -/- | -/- | 91.5/91.2| 97.0 | - | - | 93.1 | 92.1/94.3 | - |92.9/92.7|
| [DeBERTa-V2-XLarge](https://huggingface.co/microsoft/deberta-v2-xlarge)<sup>1</sup>|95.8/90.8| 91.4/88.9|91.7/91.6| **97.5**| 95.8|71.1|**93.9**|92.0/94.2|92.3/89.8|92.9/92.9|
|**[DeBERTa-V2-XXLarge](https://huggingface.co/microsoft/deberta-v2-xxlarge)<sup>1,2</sup>**|**96.1/91.4**|**92.2/89.7**|**91.7/91.9**|97.2|**96.0**|**72.0**| 93.5| **93.1/94.9**|**92.7/90.3** |**93.2/93.1** |

--------

#### Notes.
 - <sup>1</sup> Following RoBERTa, for RTE, MRPC, STS-B, we fine-tune the tasks based on [DeBERTa-Large-MNLI](https://huggingface.co/microsoft/deberta-large-mnli), [DeBERTa-XLarge-MNLI](https://huggingface.co/microsoft/deberta-xlarge-mnli), [DeBERTa-V2-XLarge-MNLI](https://huggingface.co/microsoft/deberta-v2-xlarge-mnli), [DeBERTa-V2-XXLarge-MNLI](https://huggingface.co/microsoft/deberta-v2-xxlarge-mnli). The results of SST-2/QQP/QNLI/SQuADv2 will also be slightly improved when starting from MNLI fine-tuned models; however, we only report the numbers fine-tuned from pretrained base models for those 4 tasks.
 - <sup>2</sup> To try the **XXLarge** model with **[HF transformers](https://huggingface.co/transformers/main_classes/trainer.html)**, you need to specify **--sharded_ddp**

```bash
cd transformers/examples/text-classification/
export TASK_NAME=mrpc
python -m torch.distributed.launch --nproc_per_node=8 run_glue.py --model_name_or_path microsoft/deberta-v2-xxlarge \
--task_name $TASK_NAME --do_train --do_eval --max_seq_length 128 --per_device_train_batch_size 4 \
--learning_rate 3e-6 --num_train_epochs 3 --output_dir /tmp/$TASK_NAME/ --overwrite_output_dir --sharded_ddp --fp16
```

### Citation

If you find DeBERTa useful for your work, please cite the following paper:

```latex
@inproceedings{
he2021deberta,
title={DEBERTA: DECODING-ENHANCED BERT WITH DISENTANGLED ATTENTION},
author={Pengcheng He and Xiaodong Liu and Jianfeng Gao and Weizhu Chen},
booktitle={International Conference on Learning Representations},
year={2021},
url={https://openreview.net/forum?id=XPZIaotutsD}
}
```
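The card above does not include a usage snippet. As a minimal, hedged sketch of how this checkpoint could be queried, the example below uses the standard `transformers` zero-shot-classification pipeline; the repository id `NDugar/ZSD-microsoft-v2xxlmnli` and the `zero-shot-classification` pipeline tag come from this record's metadata, while the input text and candidate labels are purely illustrative.

```python
from transformers import pipeline

# Model id taken from this record's "id" field; the metadata's pipeline tag is
# "zero-shot-classification", so the standard pipeline is assumed to apply.
classifier = pipeline("zero-shot-classification", model="NDugar/ZSD-microsoft-v2xxlmnli")

result = classifier(
    "The new GPU cuts training time in half.",              # illustrative input
    candidate_labels=["hardware", "politics", "cooking"],   # illustrative labels
)
# The pipeline returns labels sorted by score; print the top one.
print(result["labels"][0], result["scores"][0])
```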
{"language": "en", "license": "mit", "tags": ["deberta-v1", "deberta-mnli"], "tasks": "mnli", "thumbnail": "https://huggingface.co/front/thumbnails/microsoft.png", "pipeline_tag": "zero-shot-classification"}
zero-shot-classification
NDugar/ZSD-microsoft-v2xxlmnli
[ "transformers", "pytorch", "safetensors", "deberta-v2", "text-classification", "deberta-v1", "deberta-mnli", "zero-shot-classification", "en", "arxiv:2006.03654", "license:mit", "autotrain_compatible", "endpoints_compatible", "has_space", "region:us" ]
2022-03-02T23:29:04+00:00
[ "2006.03654" ]
[ "en" ]
TAGS #transformers #pytorch #safetensors #deberta-v2 #text-classification #deberta-v1 #deberta-mnli #zero-shot-classification #en #arxiv-2006.03654 #license-mit #autotrain_compatible #endpoints_compatible #has_space #region-us
DeBERTa: Decoding-enhanced BERT with Disentangled Attention
-----------------------------------------------------------


DeBERTa improves the BERT and RoBERTa models using disentangled attention and enhanced mask decoder. It outperforms BERT and RoBERTa on the majority of NLU tasks with 80GB training data.


Please check the official repository for more details and updates.


This is the DeBERTa large model fine-tuned on the MNLI task.


#### Fine-tuning on NLU tasks


We present the dev results on SQuAD 1.1/2.0 and several GLUE benchmark tasks.




---


#### Notes.


* 1 Following RoBERTa, for RTE, MRPC, STS-B, we fine-tune the tasks based on DeBERTa-Large-MNLI, DeBERTa-XLarge-MNLI, DeBERTa-V2-XLarge-MNLI, DeBERTa-V2-XXLarge-MNLI. The results of SST-2/QQP/QNLI/SQuADv2 will also be slightly improved when starting from MNLI fine-tuned models; however, we only report the numbers fine-tuned from pretrained base models for those 4 tasks.
* 2 To try the XXLarge model with HF transformers, you need to specify --sharded\_ddp


If you find DeBERTa useful for your work, please cite the following paper:
[ "#### Fine-tuning on NLU tasks\n\n\nWe present the dev results on SQuAD 1.1/2.0 and several GLUE benchmark tasks.\n\n\n\n\n\n---", "#### Notes.\n\n\n* 1 Following RoBERTa, for RTE, MRPC, STS-B, we fine-tune the tasks based on DeBERTa-Large-MNLI, DeBERTa-XLarge-MNLI, DeBERTa-V2-XLarge-MNLI, DeBERTa-V2-XXLarge-MNLI. The results of SST-2/QQP/QNLI/SQuADv2 will also be slightly improved when start from MNLI fine-tuned models, however, we only report the numbers fine-tuned from pretrained base models for those 4 tasks.\n* 2 To try the XXLarge model with HF transformers, you need to specify --sharded\\_ddp\n\n\nIf you find DeBERTa useful for your work, please cite the following paper:" ]
[ "TAGS\n#transformers #pytorch #safetensors #deberta-v2 #text-classification #deberta-v1 #deberta-mnli #zero-shot-classification #en #arxiv-2006.03654 #license-mit #autotrain_compatible #endpoints_compatible #has_space #region-us \n", "#### Fine-tuning on NLU tasks\n\n\nWe present the dev results on SQuAD 1.1/2.0 and several GLUE benchmark tasks.\n\n\n\n\n\n---", "#### Notes.\n\n\n* 1 Following RoBERTa, for RTE, MRPC, STS-B, we fine-tune the tasks based on DeBERTa-Large-MNLI, DeBERTa-XLarge-MNLI, DeBERTa-V2-XLarge-MNLI, DeBERTa-V2-XXLarge-MNLI. The results of SST-2/QQP/QNLI/SQuADv2 will also be slightly improved when start from MNLI fine-tuned models, however, we only report the numbers fine-tuned from pretrained base models for those 4 tasks.\n* 2 To try the XXLarge model with HF transformers, you need to specify --sharded\\_ddp\n\n\nIf you find DeBERTa useful for your work, please cite the following paper:" ]
[ 88, 32, 186 ]
[ "passage: TAGS\n#transformers #pytorch #safetensors #deberta-v2 #text-classification #deberta-v1 #deberta-mnli #zero-shot-classification #en #arxiv-2006.03654 #license-mit #autotrain_compatible #endpoints_compatible #has_space #region-us \n#### Fine-tuning on NLU tasks\n\n\nWe present the dev results on SQuAD 1.1/2.0 and several GLUE benchmark tasks.\n\n\n\n\n\n---#### Notes.\n\n\n* 1 Following RoBERTa, for RTE, MRPC, STS-B, we fine-tune the tasks based on DeBERTa-Large-MNLI, DeBERTa-XLarge-MNLI, DeBERTa-V2-XLarge-MNLI, DeBERTa-V2-XXLarge-MNLI. The results of SST-2/QQP/QNLI/SQuADv2 will also be slightly improved when start from MNLI fine-tuned models, however, we only report the numbers fine-tuned from pretrained base models for those 4 tasks.\n* 2 To try the XXLarge model with HF transformers, you need to specify --sharded\\_ddp\n\n\nIf you find DeBERTa useful for your work, please cite the following paper:" ]
[ -0.08160653710365295, 0.14178912341594696, -0.0056040650233626366, -0.013423237018287182, 0.08656241744756699, 0.011307350359857082, 0.06037154048681259, 0.10219544172286987, 0.015383425168693066, 0.047933515161275864, 0.07864893972873688, 0.1046132892370224, 0.031780194491147995, 0.11529602855443954, -0.002768120728433132, -0.22420603036880493, 0.06498926877975464, 0.04464029148221016, -0.11771257221698761, 0.06544594466686249, 0.12184781581163406, -0.11165592819452286, 0.02907121740281582, 0.011705787852406502, -0.07542774826288223, 0.04426250979304314, 0.01852283626794815, -0.005527423694729805, 0.1369284689426422, 0.09269001334905624, 0.08893055468797684, 0.15177379548549652, 0.054398685693740845, -0.20322923362255096, 0.02140158601105213, 0.0621478371322155, 0.017716875299811363, 0.07103460282087326, 0.06635694205760956, 0.02675876021385193, 0.14291192591190338, -0.15794000029563904, 0.003967816010117531, 0.0561823807656765, -0.03499487787485123, -0.21797731518745422, -0.11210424453020096, 0.08947276324033737, 0.11135106533765793, 0.0407126285135746, -0.025807734578847885, 0.1601598858833313, -0.04295102506875992, 0.04252456873655319, 0.1891786903142929, -0.2350613921880722, 0.0005704922368749976, 0.10642095655202866, 0.005789491347968578, -0.02826349250972271, -0.040928784757852554, 0.061173148453235626, 0.020678643137216568, 0.0024466384202241898, 0.10017938911914825, -0.03263202682137489, 0.040153902024030685, 0.016847411170601845, -0.12024550884962082, -0.038662396371364594, 0.04728226363658905, -0.007957703433930874, -0.048629164695739746, -0.1595999002456665, -0.06820505112409592, 0.02686356194317341, -0.045855239033699036, -0.17200501263141632, 0.019469082355499268, -0.0691492110490799, 0.036504894495010376, -0.12704943120479584, -0.09622469544410706, -0.04356483370065689, 0.00197408115491271, 0.11831889301538467, 0.04501929134130478, -0.037661004811525345, 0.005065434146672487, 0.11936044692993164, -0.11181123554706573, -0.09117569029331207, -0.053641628473997116, -0.07701080292463303, -0.10730992257595062, -0.014853529632091522, -0.05135431885719299, -0.0964014083147049, 0.03576992079615593, 0.1896948665380478, -0.007664924953132868, 0.047176457941532135, 0.01986202970147133, -0.037481486797332764, -0.005780387669801712, 0.11966805160045624, -0.08598252385854721, -0.009177492931485176, 0.07965323328971863, 0.013776342384517193, -0.006434925366193056, -0.02273590862751007, -0.027484223246574402, -0.09798875451087952, 0.09076455980539322, 0.11362531036138535, 0.019987579435110092, 0.054780617356300354, 0.02337680757045746, -0.0561351478099823, 0.198507621884346, -0.15918509662151337, -0.03313213586807251, -0.010559780523180962, 0.012761302292346954, -0.005680941976606846, 0.01267812680453062, 0.010955161415040493, -0.07390818744897842, 0.0998852550983429, -0.04961174353957176, -0.07120416313409805, -0.01627998985350132, -0.0762929692864418, 0.012095464393496513, -0.002612548414617777, -0.026230448856949806, -0.14601121842861176, -0.1089138388633728, 0.007614564150571823, 0.00496984226629138, 0.026184801012277603, -0.015173312276601791, 0.0636715218424797, -0.018866734579205513, -0.02171405404806137, -0.03569866344332695, 0.0193755142390728, 0.00048170011723414063, 0.0504077710211277, 0.046565450727939606, 0.033998556435108185, -0.04513467475771904, 0.022245122119784355, -0.035572949796915054, 0.03538617864251137, -0.21529264748096466, 0.06626616418361664, -0.10758128017187119, -0.06451474130153656, -0.1306312084197998, -0.05363081768155098, 0.01711989939212799, 
0.021647678688168526, 0.021599318832159042, 0.11113359779119492, -0.0935121476650238, -0.058465685695409775, 0.12921276688575745, -0.1404924988746643, -0.03790799900889397, 0.07084591686725616, -0.011475350707769394, -0.02133295312523842, 0.0928165391087532, 0.10802341252565384, 0.2277587652206421, -0.14862558245658875, -0.04048605263233185, 0.005505486391484737, -0.025120923295617104, 0.021870680153369904, 0.04137115925550461, 0.025770671665668488, 0.01705951616168022, 0.013805740512907505, -0.2064971923828125, -0.061611395329236984, -0.026378348469734192, -0.06898774951696396, -0.03284318000078201, -0.013517213985323906, 0.08136085420846939, -0.0012443686136975884, 0.03289567679166794, 0.04469570145010948, -0.09867995232343674, 0.05187782645225525, 0.1293966919183731, -0.06297466158866882, 0.010097806341946125, -0.13371506333351135, 0.020340578630566597, -0.015964778140187263, 0.03096306137740612, -0.1610749065876007, -0.15099698305130005, 0.0071940976195037365, -0.15944860875606537, 0.012295784428715706, 0.14754898846149445, 0.04143049195408821, 0.05307655781507492, -0.06712380796670914, 0.02870098501443863, -0.045539479702711105, -0.00039923988515511155, -0.027089672163128853, -0.07244604080915451, -0.09911560267210007, -0.03791547194123268, 0.11132322996854782, -0.16329500079154968, 0.025279764086008072, 0.09538431465625763, 0.15210014581680298, 0.054577942937612534, -0.003709516255185008, -0.01060759462416172, 0.015489187091588974, 0.025311866775155067, -0.047950685024261475, 0.0050738537684082985, -0.012202558107674122, -0.09356354922056198, 0.06564074009656906, -0.10393603146076202, 0.08310439437627792, 0.0814533457159996, 0.11155901104211807, -0.031164003536105156, -0.012628942728042603, -0.04804903641343117, -0.0077484105713665485, -0.04228585585951805, -0.09439963847398758, 0.028194040060043335, 0.056627869606018066, 0.08283963054418564, -0.07597717642784119, -0.07594755291938782, -0.010019932873547077, 0.025927918031811714, -0.03080003522336483, 0.03990436717867851, 0.017169859260320663, -0.1024240255355835, 0.022485313937067986, 0.033273130655288696, 0.00860675796866417, 0.06434403359889984, -0.027110807597637177, -0.06370100378990173, -0.005576167721301317, -0.009916968643665314, 0.009414510801434517, 0.06071294844150543, 0.024089058861136436, 0.033818863332271576, 0.06270156055688858, -0.011620870791375637, 0.06488049775362015, -0.11560789495706558, 0.016318248584866524, 0.025575712323188782, -0.07836537808179855, -0.05648380517959595, 0.09084253013134003, 0.0012466732878237963, 0.05332290381193161, 0.003979323897510767, 0.06518197804689407, -0.004973859991878271, -0.00570883322507143, -0.09486440569162369, 0.14143072068691254, -0.05168217048048973, -0.22403642535209656, -0.15618808567523956, 0.05550404265522957, -0.056874245405197144, 0.0051866271533071995, 0.06220658868551254, -0.07929310947656631, -0.11693651974201202, -0.017836667597293854, 0.1120942011475563, 0.006604895927011967, -0.02268698625266552, -0.006846132688224316, -0.0024638287723064423, 0.07244151830673218, -0.1339103877544403, 0.0069170244969427586, 0.0009487596107646823, -0.01988747902214527, 0.041905973106622696, 0.031159447506070137, 0.10459784418344498, 0.0952056422829628, -0.0458441786468029, 0.007552784401923418, -0.009833288379013538, 0.2083800584077835, -0.07151075452566147, 0.03475428372621536, 0.22668513655662537, 0.013618804514408112, 0.05910368263721466, 0.11132290959358215, 0.04282897710800171, -0.06880658119916916, 0.010989844799041748, 0.10037342458963394, -0.013732667081058025, 
-0.23524293303489685, -0.07305196672677994, -0.03810906410217285, -0.10327988862991333, 0.03113875910639763, 0.047957491129636765, 0.05827917903661728, 0.022047709673643112, -0.099458709359169, 0.0021672870498150587, 0.02166026271879673, 0.051463447511196136, 0.20071947574615479, 0.013481281697750092, 0.08970703184604645, -0.052717067301273346, -0.07174846529960632, 0.03577125817537308, 0.007216214202344418, 0.13400700688362122, -0.019431447610259056, 0.11417323350906372, 0.10189750790596008, 0.06060382351279259, 0.013393952511250973, 0.05852852016687393, 0.02841850183904171, -0.04006583243608475, 0.003205095184966922, -0.09314249455928802, -0.06267182528972626, 0.02093234471976757, 0.028892947360873222, -0.10994314402341843, -0.07540535926818848, 0.03542992100119591, -0.024039460346102715, 0.1641886681318283, -0.004375311080366373, -0.19443358480930328, -0.03255603462457657, 0.017089756205677986, -0.034861501306295395, -0.023058012127876282, -0.03947528079152107, 0.0255754292011261, -0.08151824027299881, 0.05868010222911835, -0.06919314712285995, 0.0677775964140892, -0.04352066293358803, 0.04347887635231018, -0.027256475761532784, 0.06410394608974457, 0.02783750183880329, 0.09405051916837692, -0.1753825545310974, 0.17116008698940277, 0.047731462866067886, 0.0916689783334732, -0.06066219508647919, 0.03582591563463211, 0.04267604649066925, 0.061324913054704666, 0.1477600485086441, -0.024703875184059143, -0.07747065275907516, -0.1610550880432129, -0.054458022117614746, 0.06296416372060776, 0.09113038331270218, -0.08626727759838104, 0.09043699502944946, -0.03695153445005417, -0.021348724141716957, 0.0054128123447299, 0.09167326241731644, -0.1948348432779312, -0.14642393589019775, 0.13143931329250336, -0.011411992833018303, 0.04034408926963806, -0.08096643537282944, -0.05399926006793976, -0.04212027043104172, 0.14271993935108185, -0.054240066558122635, -0.08907035738229752, -0.14101684093475342, 0.032373443245887756, 0.13445240259170532, -0.10431680083274841, 0.04736282303929329, 0.03341824188828468, 0.19319793581962585, -0.03573053330183029, -0.11114708334207535, -0.04084552079439163, -0.059372540563344955, -0.1315397024154663, 0.054708171635866165, 0.13825666904449463, 0.0300039853900671, 0.01887615956366062, 0.05199485272169113, 0.001142927329055965, 0.09106799215078354, -0.09190697222948074, -0.015394515357911587, 0.057070422917604446, 0.03014625795185566, -0.03266812115907669, -0.05118076130747795, 0.013934704475104809, -0.08601129800081253, 0.010768849402666092, 0.08952241390943527, 0.2462126910686493, -0.0813620537519455, 0.10104819387197495, 0.15990234911441803, -0.012379936873912811, -0.24011951684951782, -0.08146753907203674, 0.048251282423734665, 0.026722129434347153, 0.013496551662683487, -0.1724839210510254, 0.10287104547023773, 0.06891142576932907, -0.033859316259622574, -0.049716606736183167, -0.2702748477458954, -0.1423129141330719, 0.12090852856636047, 0.07268518209457397, -0.04612258821725845, -0.07648469507694244, -0.06881705671548843, -0.08023828268051147, -0.15473052859306335, 0.10381804406642914, -0.09601953625679016, 0.07101456075906754, -0.008500033058226109, 0.009556787088513374, 0.037529777735471725, -0.04104849323630333, 0.16786491870880127, 0.03084103949368, 0.06912200897932053, -0.0148247629404068, 0.06533872336149216, 0.026515064761042595, -0.06195903941988945, 0.09489145874977112, 0.031299080699682236, 0.05424198508262634, -0.07576873898506165, -0.07868283241987228, -0.055563151836395264, 0.07145749777555466, -0.03021949902176857, -0.07343827188014984, 
-0.034154511988162994, 0.03861018642783165, 0.03952914476394653, -0.05149285867810249, 0.02221056818962097, -0.07053294032812119, 0.04418836534023285, 0.090408556163311, 0.12184911221265793, -0.05973958224058151, -0.07948634773492813, -0.003954967949539423, -0.019192036241292953, 0.09733092039823532, -0.05894863232970238, 0.11668264120817184, 0.09808213263750076, 0.0528763122856617, 0.0721914991736412, 0.0030854917131364346, -0.06342528015375137, -0.046257928013801575, 0.06923367828130722, -0.11226415634155273, -0.21081168949604034, -0.03013959713280201, -0.014626632444560528, -0.043487947434186935, 0.062255699187517166, 0.17036165297031403, -0.04153512045741081, -0.04475501552224159, 0.05233530327677727, -0.0008341233478859067, 0.009678902104496956, 0.1787683218717575, 0.06110328063368797, 0.09598924964666367, -0.09992440044879913, 0.04927163943648338, 0.059181470423936844, -0.10736251622438431, -0.015966106206178665, 0.011035731993615627, -0.09424694627523422, -0.07465774565935135, -0.06476055830717087, 0.0076393974013626575, -0.059903692454099655, -0.0556851401925087, -0.018344450742006302, -0.06419110298156738, 0.014496002346277237, 0.09287233650684357, 0.04190625250339508, 0.058903202414512634, -0.03294697031378746, 0.0036270481068640947, -0.11566454917192459, 0.04759246110916138, 0.05369000509381294, 0.07209133356809616, -0.16567842662334442, 0.09190633893013, -0.009018595330417156, 0.05634927377104759, -0.045596349984407425, 0.02230946719646454, -0.052190423011779785, -0.040340110659599304, -0.1118953600525856, 0.039972275495529175, -0.07842245697975159, -0.02374245598912239, -0.002315458608791232, -0.015144855715334415, -0.022415766492486, 0.0339081697165966, -0.08237145841121674, -0.043552618473768234, -0.05119616910815239, 0.0556471049785614, -0.12825524806976318, 0.021032139658927917, 0.030572829768061638, -0.09502727538347244, 0.0859450027346611, -0.04626094177365303, -0.0024330338928848505, 0.09674381464719772, -0.03556700423359871, -0.06396659463644028, 0.03383834660053253, 0.061897069215774536, 0.016172783449292183, -0.11881888657808304, -0.01550603099167347, 0.003000999568030238, -0.013778245076537132, 0.00973546039313078, 0.050310567021369934, -0.12829042971134186, -0.025220392271876335, -0.07640021294355392, 0.0007021488854661584, -0.05179854482412338, 0.03197694942355156, 0.0940643772482872, 0.04205813258886337, 0.10205308347940445, -0.06422258913516998, 0.009192085824906826, -0.1386733502149582, -0.0007026935345493257, -0.02908308058977127, -0.009205609560012817, -0.01591632328927517, -0.04385780915617943, 0.04518977180123329, -0.037981484085321426, 0.10640622675418854, -0.0844423919916153, 0.042918071150779724, 0.05865596607327461, -0.07441777735948563, -0.018099527806043625, 0.005791722796857357, 0.19496145844459534, 0.06153441220521927, 0.013776099309325218, 0.01918633282184601, 0.009347913786768913, -0.049480997025966644, 0.031870923936367035, 0.2178768515586853, 0.2012457251548767, 0.015756387263536453, 0.05216837674379349, 0.01791265606880188, -0.07214652746915817, -0.097731813788414, 0.046666231006383896, -0.055755626410245895, 0.05569403991103172, -0.032608725130558014, 0.050760384649038315, 0.1570126861333847, -0.14521148800849915, 0.07689467072486877, -0.003091878956183791, -0.05864732712507248, -0.128555029630661, -0.07028468698263168, -0.07819995284080505, -0.07818193733692169, -0.013769919984042645, -0.10084125399589539, -0.007700812537223101, 0.037010423839092255, 0.008306012488901615, -0.011854332871735096, 0.1628100872039795, -0.039476178586483, 
-0.0823218896985054, 0.010480494238436222, 0.01024257205426693, -0.04795769974589348, 0.06531643122434616, -0.00230181822553277, 0.06126108393073082, 0.06199413910508156, 0.03557093068957329, 0.0012608597753569484, 0.02011856995522976, 0.01355111412703991, -0.059825349599123, -0.05218062922358513, -0.008566370233893394, 0.050017207860946655, 0.07963655889034271, 0.10498830676078796, -0.005701862741261721, -0.05556056275963783, -0.03397329896688461, 0.21198706328868866, -0.012864128686487675, -0.12811309099197388, -0.09428943693637848, 0.2937461733818054, 0.10172509402036667, 0.05399717390537262, 0.03863723948597908, -0.10939420014619827, 0.022120511159300804, 0.1347457468509674, 0.12427373230457306, 0.021715745329856873, 0.007347635924816132, -0.010783549398183823, 0.006893089506775141, -0.05237812548875809, 0.13207170367240906, 0.021334735676646233, 0.23464399576187134, -0.02975933998823166, 0.003879181109368801, -0.018496941775083542, -0.024956731125712395, -0.05083700641989708, 0.1717323213815689, 0.008524416014552116, -0.03876085579395294, -0.01837807707488537, 0.07568498700857162, 0.03910036385059357, -0.18395070731639862, -0.0199106615036726, -0.03747711330652237, -0.09852325171232224, -0.0021154291462153196, -0.0469711497426033, -0.030610160902142525, 0.0625569075345993, -0.029734253883361816, -0.02439121901988983, 0.1708112359046936, 0.01846802979707718, -0.06181648001074791, 0.012359271757304668, 0.12750421464443207, 0.06773979961872101, 0.19100691378116608, -0.0000341544218827039, 0.11598970741033554, 0.08934729546308517, -0.026136208325624466, -0.09883444756269455, 0.048939768224954605, 0.045573197305202484, -0.11872010678052902, 0.06647395342588425, 0.16419893503189087, 0.007616997696459293, 0.004156799521297216, 0.024238193407654762, -0.20184825360774994, -0.0001898870395962149, 0.06746426969766617, -0.04193885996937752, -0.049313269555568695, 0.07433053106069565, -0.08752736449241638, 0.08706557005643845, 0.12326104193925858, -0.0014857708010822535, 0.05227139964699745, -0.06415144354104996, 0.06938480585813522, 0.03671780228614807, 0.09673194587230682, -0.013422155752778053, -0.17711393535137177, 0.048703618347644806, -0.02931978553533554, -0.00114069867413491, -0.21577562391757965, -0.07375990599393845, 0.02711177058517933, 0.0024196221493184566, 0.006313161924481392, 0.11838051676750183, 0.018823744729161263, 0.034261465072631836, 0.011059467680752277, -0.1803378164768219, -0.06135566532611847, 0.06604054570198059, -0.16192929446697235, -0.045906879007816315 ]
null
null
transformers
I tried to train v3 XL on MNLI using my own training code and got this result.
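The card itself is only the sentence above, so as a hedged illustration of how an MNLI-style checkpoint like this one is typically queried, the sketch below runs a single premise/hypothesis pair through the sequence-classification head. The repository id `NDugar/deberta-v2-xlarge-mnli` comes from this record; the premise and hypothesis are made up for illustration, and label names are read from the checkpoint config rather than hard-coded, since MNLI label order can differ between checkpoints.

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_id = "NDugar/deberta-v2-xlarge-mnli"  # id taken from this record
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)

premise = "A soccer game with multiple males playing."  # illustrative premise
hypothesis = "Some men are playing a sport."            # illustrative hypothesis

inputs = tokenizer(premise, hypothesis, return_tensors="pt", truncation=True)
with torch.no_grad():
    logits = model(**inputs).logits

# Read label names from the config instead of assuming a fixed order.
probs = logits.softmax(dim=-1)[0]
for idx, prob in enumerate(probs.tolist()):
    print(model.config.id2label[idx], round(prob, 3))
```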
{"language": "en", "license": "mit", "tags": ["deberta-v3", "deberta-v2`", "deberta-mnli"], "tasks": "mnli", "thumbnail": "https://huggingface.co/front/thumbnails/microsoft.png", "pipeline_tag": "zero-shot-classification"}
zero-shot-classification
NDugar/deberta-v2-xlarge-mnli
[ "transformers", "pytorch", "deberta-v2", "text-classification", "deberta-v3", "deberta-v2`", "deberta-mnli", "zero-shot-classification", "en", "license:mit", "autotrain_compatible", "endpoints_compatible", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #deberta-v2 #text-classification #deberta-v3 #deberta-v2` #deberta-mnli #zero-shot-classification #en #license-mit #autotrain_compatible #endpoints_compatible #region-us
I tried to train v3 XL on MNLI using my own training code and got this result.
[]
[ "TAGS\n#transformers #pytorch #deberta-v2 #text-classification #deberta-v3 #deberta-v2` #deberta-mnli #zero-shot-classification #en #license-mit #autotrain_compatible #endpoints_compatible #region-us \n" ]
[ 77 ]
[ "passage: TAGS\n#transformers #pytorch #deberta-v2 #text-classification #deberta-v3 #deberta-v2` #deberta-mnli #zero-shot-classification #en #license-mit #autotrain_compatible #endpoints_compatible #region-us \n" ]
[ -0.030489450320601463, 0.13613605499267578, -0.007837893441319466, 0.06102415919303894, 0.14656929671764374, 0.044838715344667435, 0.14066645503044128, 0.09064040333032608, 0.01845572516322136, -0.026682717725634575, 0.12996937334537506, 0.28713318705558777, -0.0020199287682771683, 0.05915484577417374, -0.11843162029981613, -0.2732817530632019, 0.0677759125828743, 0.06212734803557396, -0.03655935451388359, 0.07401275634765625, 0.12710489332675934, -0.04016168415546417, 0.06134062632918358, -0.042602937668561935, -0.06256377696990967, 0.040368933230638504, 0.05346330627799034, -0.11709904670715332, 0.10906554758548737, 0.04674572870135307, 0.08583148568868637, 0.08186456561088562, -0.002954564057290554, -0.14607536792755127, 0.009929942898452282, 0.017768314108252525, -0.10119599103927612, 0.01850617304444313, 0.024809295311570168, -0.0425967313349247, 0.05897662788629532, -0.05244836211204529, 0.05471833422780037, 0.03539225459098816, -0.11098302155733109, -0.19931910932064056, -0.04173562303185463, 0.023815473541617393, 0.08490899205207825, 0.02159351482987404, 0.020513618364930153, 0.08486868441104889, -0.09588231891393661, 0.057353194802999496, 0.05311378464102745, -0.2660377025604248, 0.012164209969341755, 0.09040616452693939, 0.02860182523727417, 0.04248177632689476, 0.0014212856767699122, 0.06821414083242416, 0.05005709454417229, 0.006410993169993162, -0.011442315764725208, -0.05089354142546654, 0.06415434926748276, -0.0042521217837929726, -0.05471793934702873, -0.02628156915307045, 0.1575486660003662, -0.03717253357172012, 0.011225695721805096, -0.04603078216314316, -0.04322686046361923, -0.046473775058984756, -0.013765973970293999, -0.08675408363342285, 0.035096779465675354, 0.04195353388786316, 0.07848881930112839, 0.027804238721728325, -0.09486187994480133, -0.028421787545084953, -0.19041264057159424, 0.1194671094417572, 0.03281358629465103, 0.02716384455561638, -0.15176019072532654, 0.05076846480369568, -0.026132604107260704, -0.10950569808483124, -0.01743704453110695, -0.05975038558244705, 0.05522952228784561, -0.01755848154425621, -0.04770801216363907, 0.006319362670183182, 0.11841966211795807, 0.21515528857707977, 0.05610744282603264, 0.04245743155479431, -0.06839007139205933, 0.06812424957752228, 0.01769762858748436, 0.09971502423286438, 0.023044565692543983, -0.04462670162320137, 0.06019751355051994, -0.09543272107839584, 0.06052669882774353, -0.03433863818645477, -0.1691543459892273, -0.09847482293844223, 0.10608835518360138, 0.1517985314130783, 0.021412527188658714, 0.058910105377435684, -0.05465640127658844, -0.03689884394407272, 0.12561573088169098, -0.054881077259778976, 0.011775446124374866, -0.004478121642023325, 0.042262524366378784, 0.09671283513307571, -0.054378263652324677, -0.0011414472246542573, -0.04002881050109863, 0.1263977587223053, -0.04762391373515129, -0.04671996459364891, -0.010511386208236217, -0.07412830740213394, 0.06528768688440323, -0.15156950056552887, 0.03700551763176918, -0.18847912549972534, -0.06827755272388458, 0.04083457961678505, -0.0066201407462358475, -0.04758772253990173, -0.034317512065172195, -0.025563132017850876, -0.007818453945219517, 0.017650974914431572, -0.0697580948472023, -0.07583171129226685, -0.07606522738933563, 0.07896139472723007, -0.03902563452720642, 0.050642773509025574, -0.17972946166992188, 0.04793018102645874, -0.06110835447907448, 0.02203945815563202, -0.12766604125499725, 0.029722066596150398, -0.07672681659460068, 0.10554857552051544, -0.0470832884311676, -0.048052165657281876, 0.0684245228767395, 
0.03177229315042496, -0.038952525705099106, 0.12274127453565598, -0.15858487784862518, -0.07502308487892151, 0.10451994091272354, -0.1467827558517456, -0.08974561095237732, 0.053984735161066055, -0.004277581814676523, -0.016165168955922127, 0.11717317998409271, 0.18445371091365814, 0.08735761791467667, -0.08034861832857132, 0.10105664283037186, 0.058341339230537415, -0.127229705452919, -0.09941016137599945, 0.05593632906675339, -0.035763662308454514, -0.08076556771993637, 0.045971959829330444, -0.05603266507387161, 0.0026136513333767653, -0.023833991959691048, -0.03708389773964882, 0.01031839195638895, -0.012459184043109417, 0.03209608793258667, 0.019057221710681915, 0.04684551805257797, -0.09353554248809814, -0.004252946935594082, 0.05953902378678322, 0.02307227998971939, 0.03273656219244003, -0.00023715911083854735, -0.07960887253284454, 0.08838909864425659, 0.008574020117521286, 0.014645452611148357, -0.09776708483695984, -0.0638432651758194, 0.009310990571975708, -0.00396699970588088, 0.002974085509777069, 0.11837989836931229, 0.005856781732290983, -0.021842047572135925, -0.03753240033984184, 0.004209826234728098, 0.12222212553024292, 0.07953362166881561, 0.003591601038351655, -0.14201822876930237, 0.04586091637611389, -0.036683257669210434, -0.005606959108263254, -0.050706926733255386, 0.01415371336042881, 0.10731393098831177, 0.10014645010232925, -0.00896814838051796, 0.09904050081968307, -0.06956246495246887, 0.05358085408806801, -0.03723567724227905, -0.014211507514119148, 0.10546387732028961, -0.006856259889900684, -0.056104447692632675, 0.1117437407374382, -0.08914873749017715, 0.2861228883266449, 0.17273826897144318, -0.15441544353961945, -0.020481042563915253, -0.0011749546974897385, 0.01679580844938755, 0.014048994518816471, 0.0605175644159317, -0.004569482058286667, 0.019661318510770798, -0.00832275953143835, 0.15251237154006958, -0.05066594481468201, -0.0345582515001297, -0.00045973650412634015, -0.016146067529916763, -0.11323758959770203, 0.07509169727563858, 0.13349592685699463, -0.2914721965789795, 0.16677649319171906, 0.26310715079307556, 0.04260389134287834, 0.12162204831838608, -0.018472906202077866, 0.027529431506991386, -0.021675975993275642, -0.06449995934963226, 0.019250942394137383, 0.02528202161192894, -0.13751433789730072, -0.002424210775643587, 0.04512215033173561, 0.06179877743124962, 0.03553367778658867, -0.09435860812664032, -0.061947084963321686, 0.007138111628592014, -0.006196425296366215, -0.023098178207874298, 0.09426769614219666, 0.017465941607952118, 0.13003267347812653, 0.020293571054935455, -0.13025401532649994, 0.1241665855050087, -0.004999241791665554, -0.0656229555606842, 0.16846966743469238, -0.15343792736530304, -0.2734238803386688, -0.09238090366125107, -0.10972398519515991, -0.04269081726670265, 0.06400604546070099, 0.0902603417634964, -0.08441087603569031, -0.03353826329112053, 0.041821740567684174, 0.023273373022675514, -0.14389219880104065, 0.02211865969002247, -0.09737230837345123, 0.09537501633167267, 0.008474210277199745, -0.11233577132225037, -0.08617590367794037, -0.01370926946401596, -0.03176617994904518, 0.0972808226943016, -0.07324795424938202, 0.06090778857469559, 0.12941350042819977, -0.01853867806494236, 0.017100157216191292, -0.046051912009716034, 0.16967569291591644, -0.06418094784021378, -0.021246563643217087, 0.16522572934627533, -0.021836677566170692, 0.043343111872673035, 0.18546351790428162, 0.021002141758799553, -0.09568257629871368, -0.028942417353391647, -0.009512275457382202, -0.05657058209180832, -0.2085111439228058, 
-0.09302665293216705, -0.05771684646606445, 0.04391368106007576, 0.051167670637369156, 0.0691317692399025, 0.15103186666965485, 0.036885809153318405, 0.025714149698615074, 0.020938927307724953, 0.038602691143751144, 0.09596838057041168, 0.3784296214580536, 0.0017143243458122015, 0.13333898782730103, -0.05704918131232262, -0.09503918886184692, 0.08927648514509201, 0.05244080349802971, 0.04722326993942261, 0.13938923180103302, 0.0008286855882033706, 0.0773208737373352, 0.008319344371557236, 0.11224586516618729, 0.02444320172071457, 0.08172104507684708, 0.003861179342493415, -0.06532976776361465, -0.00970936007797718, -0.031185301020741463, 0.006086735986173153, 0.019343890249729156, -0.1772124022245407, -0.0771956667304039, -0.12456411123275757, 0.054057661443948746, 0.06390003859996796, 0.1096683219075203, -0.17488747835159302, 0.01767779514193535, 0.09456492215394974, -0.007516887504607439, -0.07825697958469391, 0.07253710180521011, -0.1208767220377922, -0.08230305463075638, 0.13730740547180176, 0.0086485231295228, 0.09937812387943268, -0.1392081379890442, 0.08185940980911255, -0.0007735286490060389, -0.08910199999809265, 0.03641725704073906, 0.08146928995847702, -0.36731112003326416, 0.16388222575187683, 0.017110612243413925, -0.029849553480744362, -0.07352951169013977, -0.040350548923015594, 0.007367231417447329, 0.19247505068778992, 0.1084759309887886, -0.013007293455302715, -0.08450226485729218, -0.11975689977407455, 0.0029394763987511396, 0.011898383498191833, 0.07816596329212189, -0.028688551858067513, -0.034306079149246216, -0.0381142795085907, 0.013804500922560692, 0.0077930958941578865, 0.05200915411114693, -0.08620226383209229, -0.12149795144796371, 0.07603231817483902, 0.04342174530029297, 0.03420735150575638, -0.00923191662877798, -0.05964451655745506, -0.052746955305337906, 0.12504272162914276, -0.15242299437522888, -0.08383331447839737, -0.08697837591171265, -0.046896953135728836, -0.006913826800882816, -0.046854108572006226, 0.05028530955314636, -0.07511437684297562, 0.03564872220158577, -0.05264519527554512, -0.17960681021213531, 0.12341468781232834, -0.0665309801697731, -0.08323544263839722, -0.06998489797115326, 0.12343837320804596, -0.04761052876710892, -0.0016510151326656342, 0.054425470530986786, 0.01726723276078701, 0.03767615929245949, -0.06386533379554749, 0.0378703847527504, 0.011882475577294827, 0.03185126930475235, -0.034926749765872955, -0.08003143221139908, -0.09033877402544022, -0.015729857608675957, -0.011660581454634666, 0.1838380992412567, 0.2818240523338318, -0.05791845545172691, 0.132155179977417, 0.13770471513271332, -0.10977085679769516, -0.25812140107154846, -0.08884436637163162, -0.12537313997745514, -0.05421333387494087, -0.01375600416213274, -0.14571762084960938, 0.044808194041252136, 0.11142008751630783, -0.04907812550663948, 0.13843123614788055, -0.20571184158325195, -0.10240662097930908, 0.19170311093330383, 0.005361754447221756, 0.35298076272010803, -0.12563325464725494, -0.09835455566644669, -0.12325112521648407, -0.11806138604879379, 0.1786314994096756, 0.0561009980738163, 0.08539050072431564, -0.05380081385374069, -0.013252446427941322, 0.016641641035676003, -0.013545241206884384, 0.15879444777965546, 0.034515220671892166, 0.08004996925592422, -0.12376800924539566, -0.11168443411588669, 0.09493657201528549, -0.009999880567193031, -0.005364468786865473, -0.061973173171281815, -0.011477350257337093, -0.07047148793935776, -0.06997350603342056, -0.05030845105648041, 0.027569308876991272, -0.032018404453992844, -0.06946702301502228, 
-0.043949078768491745, 0.04453287646174431, -0.014153680764138699, -0.059730660170316696, 0.1660897582769394, -0.05243990197777748, 0.05332076549530029, 0.036204468458890915, 0.08040961623191833, -0.05265795812010765, 0.03401355817914009, -0.05993267521262169, -0.07847066223621368, 0.08833333849906921, -0.044208914041519165, 0.04920386150479317, 0.11079197376966476, -0.07269410043954849, 0.0768711268901825, 0.09198565781116486, 0.02661258541047573, -0.03843683749437332, 0.12063635140657425, -0.11363577097654343, -0.028929289430379868, 0.027324294671416283, -0.047346070408821106, 0.1578947752714157, 0.12853513658046722, 0.1364414095878601, 0.03906889632344246, -0.028416546061635017, 0.023275502026081085, 0.0007729048375040293, -0.052172642201185226, 0.04585444927215576, 0.05104503408074379, 0.012877325527369976, -0.09948078542947769, 0.10604728013277054, 0.06258666515350342, -0.10320805758237839, 0.01051172986626625, 0.030938001349568367, -0.11385422945022583, -0.1063123270869255, -0.04262938350439072, 0.11019666492938995, -0.1366327852010727, -0.07152677327394485, -0.07149321585893631, -0.1506718099117279, 0.08627497404813766, 0.13926756381988525, 0.08832916617393494, 0.09417404234409332, -0.024216094985604286, -0.05954848229885101, -0.008304855786263943, -0.0039979927241802216, -0.09555377066135406, 0.027710894122719765, -0.13111627101898193, -0.0076707499101758, -0.013091254979372025, 0.05018806457519531, -0.05902046710252762, -0.0202660970389843, -0.16396009922027588, 0.019306177273392677, -0.18278031051158905, -0.0164916031062603, -0.06075684726238251, -0.004470640327781439, 0.033280447125434875, -0.03223961591720581, -0.028351513668894768, -0.035489004105329514, -0.06954347342252731, 0.004198000766336918, 0.004559298511594534, 0.09645187854766846, -0.08830076456069946, -0.08479475975036621, 0.01460924744606018, -0.04054383933544159, 0.10871363431215286, 0.01224131602793932, -0.03384951502084732, 0.04182732105255127, -0.09759773313999176, -0.058380499482154846, 0.0901356041431427, 0.007259315811097622, 0.040872082114219666, -0.03937405347824097, 0.02983992174267769, 0.08897111564874649, -0.05012279003858566, 0.07159469276666641, 0.040535446256399155, -0.10743177682161331, -0.003190329298377037, 0.01922348141670227, -0.16892756521701813, -0.011475726030766964, -0.055121805518865585, 0.1427476406097412, -0.013616925105452538, 0.1430884301662445, -0.05694746598601341, 0.03787323459982872, -0.028542067855596542, -0.013353165239095688, -0.0033170999959111214, -0.1388968825340271, -0.04679809510707855, -0.05165873095393181, 0.0017382482765242457, -0.007758545223623514, 0.2534356415271759, 0.0538330003619194, -0.007823776453733444, 0.08914704620838165, 0.0399448536336422, -0.014639628119766712, 0.04940750449895859, 0.21528707444667816, 0.05913696810603142, -0.015547242946922779, -0.09988079220056534, 0.03223872557282448, 0.0108071593567729, 0.025590257719159126, 0.09362875670194626, 0.15164677798748016, -0.02924271486699581, 0.024743618443608284, 0.06087740510702133, -0.01355359423905611, -0.14774079620838165, -0.03769572079181671, 0.06353892385959625, 0.08087711781263351, 0.0343981571495533, 0.002812720835208893, 0.10686696320772171, -0.06647709757089615, 0.016624705865979195, -0.07279927283525467, -0.03231840580701828, -0.17094606161117554, -0.17434711754322052, -0.09514650702476501, -0.0647180899977684, 0.010565163567662239, -0.0404963418841362, -0.009272841736674309, 0.027621455490589142, 0.05138856917619705, -0.09877511113882065, -0.009742483496665955, 0.004974220413714647, 
-0.0539657287299633, 0.0842907577753067, 0.0034737582318484783, -0.00358772580511868, -0.042039450258016586, -0.006408408749848604, -0.08648520708084106, -0.006050921510905027, -0.06381504237651825, -0.00015631341375410557, -0.04526326432824135, -0.030675234273076057, -0.09563110768795013, -0.07329311221837997, 0.008500553667545319, 0.05619942024350166, -0.030151396989822388, 0.22360704839229584, 0.00991296861320734, -0.009911458007991314, 0.08287856727838516, 0.07581549882888794, 0.03302402421832085, -0.11675475537776947, -0.003664375049993396, 0.22843682765960693, 0.06785254180431366, 0.12963181734085083, 0.0167103111743927, -0.016086066141724586, -0.047959115356206894, 0.1550607979297638, 0.30103936791419983, -0.051211096346378326, 0.014901475980877876, 0.011415480636060238, 0.01069639902561903, 0.12259377539157867, 0.10557955503463745, 0.024855639785528183, 0.20044784247875214, -0.02923576347529888, -0.005013561341911554, -0.04516750946640968, 0.031175054609775543, -0.06531098484992981, 0.1242622658610344, 0.04093068465590477, -0.05090545117855072, -0.04626080393791199, 0.11647696793079376, -0.1134980171918869, 0.06400139629840851, 0.11309637874364853, -0.12702301144599915, -0.08724094182252884, -0.0014432960888370872, 0.08051655441522598, -0.0328778401017189, 0.048749566078186035, -0.04067404195666313, -0.060460250824689865, 0.0538603775203228, -0.004581002052873373, -0.16099809110164642, -0.05910507217049599, 0.0716339722275734, 0.1563292145729065, 0.08289781957864761, -0.021406762301921844, 0.09848065674304962, 0.10126946121454239, 0.08910540491342545, -0.050665874034166336, 0.11127041280269623, 0.015416644513607025, 0.0025416070129722357, 0.025901444256305695, -0.08094479143619537, -0.023267699405550957, -0.020815931260585785, 0.09595664590597153, -0.12805290520191193, 0.034394100308418274, 0.021899858489632607, -0.14301742613315582, -0.057431526482105255, 0.08678609877824783, -0.08775011450052261, 0.02284449338912964, 0.04959868639707565, 0.0031219283118844032, -0.03775029256939888, -0.057603731751441956, 0.0025605885311961174, 0.06140110269188881, -0.1287619024515152, -0.08986642956733704, -0.04267442226409912, -0.010001959279179573, 0.08664575964212418, 0.047499608248472214, -0.14716209471225739, -0.028317168354988098, -0.10090107470750809, 0.02061091922223568, -0.14226222038269043, 0.06407824903726578, -0.024016544222831726, -0.00008537252142559737, -0.03625372797250748, -0.07888752222061157, -0.003663160838186741, 0.0532461553812027, -0.11981402337551117, -0.08380892872810364 ]
null
null
transformers
## DeBERTa: Decoding-enhanced BERT with Disentangled Attention

[DeBERTa](https://arxiv.org/abs/2006.03654) improves the BERT and RoBERTa models using disentangled attention and enhanced mask decoder. It outperforms BERT and RoBERTa on the majority of NLU tasks with 80GB training data.

Please check the [official repository](https://github.com/microsoft/DeBERTa) for more details and updates.

This is the DeBERTa V2 xxlarge model with 48 layers and a hidden size of 1536. It has 1.5B parameters in total and was trained on 160GB of raw data.

### Fine-tuning on NLU tasks

We present the dev results on SQuAD 1.1/2.0 and several GLUE benchmark tasks.

| Model | SQuAD 1.1 | SQuAD 2.0 | MNLI-m/mm | SST-2 | QNLI | CoLA | RTE | MRPC | QQP |STS-B |
|---------------------------|-----------|-----------|-------------|-------|------|------|--------|-------|-------|------|
| | F1/EM | F1/EM | Acc | Acc | Acc | MCC | Acc |Acc/F1 |Acc/F1 |P/S |
| BERT-Large | 90.9/84.1 | 81.8/79.0 | 86.6/- | 93.2 | 92.3 | 60.6 | 70.4 | 88.0/- | 91.3/- |90.0/- |
| RoBERTa-Large | 94.6/88.9 | 89.4/86.5 | 90.2/- | 96.4 | 93.9 | 68.0 | 86.6 | 90.9/- | 92.2/- |92.4/- |
| XLNet-Large | 95.1/89.7 | 90.6/87.9 | 90.8/- | 97.0 | 94.9 | 69.0 | 85.9 | 90.8/- | 92.3/- |92.5/- |
| [DeBERTa-Large](https://huggingface.co/microsoft/deberta-large)<sup>1</sup> | 95.5/90.1 | 90.7/88.0 | 91.3/91.1| 96.5|95.3| 69.5| 91.0| 92.6/94.6| 92.3/- |92.8/92.5 |
| [DeBERTa-XLarge](https://huggingface.co/microsoft/deberta-xlarge)<sup>1</sup> | -/- | -/- | 91.5/91.2| 97.0 | - | - | 93.1 | 92.1/94.3 | - |92.9/92.7|
| [DeBERTa-V2-XLarge](https://huggingface.co/microsoft/deberta-v2-xlarge)<sup>1</sup>|95.8/90.8| 91.4/88.9|91.7/91.6| **97.5**| 95.8|71.1|**93.9**|92.0/94.2|92.3/89.8|92.9/92.9|
|**[DeBERTa-V2-XXLarge](https://huggingface.co/microsoft/deberta-v2-xxlarge)<sup>1,2</sup>**|**96.1/91.4**|**92.2/89.7**|**91.7/91.9**|97.2|**96.0**|**72.0**| 93.5| **93.1/94.9**|**92.7/90.3** |**93.2/93.1** |

--------

#### Notes.
 - <sup>1</sup> Following RoBERTa, for RTE, MRPC, STS-B, we fine-tune the tasks based on [DeBERTa-Large-MNLI](https://huggingface.co/microsoft/deberta-large-mnli), [DeBERTa-XLarge-MNLI](https://huggingface.co/microsoft/deberta-xlarge-mnli), [DeBERTa-V2-XLarge-MNLI](https://huggingface.co/microsoft/deberta-v2-xlarge-mnli), [DeBERTa-V2-XXLarge-MNLI](https://huggingface.co/microsoft/deberta-v2-xxlarge-mnli). The results of SST-2/QQP/QNLI/SQuADv2 will also be slightly improved when starting from MNLI fine-tuned models; however, we only report the numbers fine-tuned from pretrained base models for those 4 tasks.
 - <sup>2</sup> To try the **XXLarge** model with **[HF transformers](https://huggingface.co/transformers/main_classes/trainer.html)**, we recommend using **deepspeed** as it's faster and saves memory.
Run with `Deepspeed`,

```bash
pip install datasets
pip install deepspeed

# Download the deepspeed config file
wget https://huggingface.co/microsoft/deberta-v2-xxlarge/resolve/main/ds_config.json -O ds_config.json

export TASK_NAME=mnli
output_dir="ds_results"
num_gpus=8
batch_size=8
python -m torch.distributed.launch --nproc_per_node=${num_gpus} \
  run_glue.py \
  --model_name_or_path microsoft/deberta-v2-xxlarge \
  --task_name $TASK_NAME \
  --do_train \
  --do_eval \
  --max_seq_length 256 \
  --per_device_train_batch_size ${batch_size} \
  --learning_rate 3e-6 \
  --num_train_epochs 3 \
  --output_dir $output_dir \
  --overwrite_output_dir \
  --logging_steps 10 \
  --logging_dir $output_dir \
  --deepspeed ds_config.json
```

You can also run with `--sharded_ddp`

```bash
cd transformers/examples/text-classification/
export TASK_NAME=mnli
python -m torch.distributed.launch --nproc_per_node=8 run_glue.py \
  --model_name_or_path microsoft/deberta-v2-xxlarge \
  --task_name $TASK_NAME --do_train --do_eval --max_seq_length 256 --per_device_train_batch_size 8 \
  --learning_rate 3e-6 --num_train_epochs 3 --output_dir /tmp/$TASK_NAME/ --overwrite_output_dir --sharded_ddp --fp16
```

### Citation

If you find DeBERTa useful for your work, please cite the following paper:

```latex
@inproceedings{
he2021deberta,
title={DEBERTA: DECODING-ENHANCED BERT WITH DISENTANGLED ATTENTION},
author={Pengcheng He and Xiaodong Liu and Jianfeng Gao and Weizhu Chen},
booktitle={International Conference on Learning Representations},
year={2021},
url={https://openreview.net/forum?id=XPZIaotutsD}
}
```
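The card above focuses on GLUE/SQuAD fine-tuning. Since this particular checkpoint is published under the `zero-shot-classification` pipeline tag, here is a minimal usage sketch that is not part of the original card; the input sentence and candidate labels are illustrative assumptions.

```python
from transformers import pipeline

# Minimal sketch: query an MNLI/SNLI/ANLI-style DeBERTa checkpoint through the
# zero-shot-classification pipeline. The labels below are illustrative only.
classifier = pipeline(
    "zero-shot-classification",
    model="NDugar/debertav3-mnli-snli-anli",
)

result = classifier(
    "The new GPU driver cut our training time in half.",
    candidate_labels=["hardware", "software", "sports"],
)
print(result["labels"][0], result["scores"][0])
```

Under the hood, the pipeline phrases each candidate label as an NLI hypothesis and ranks the labels by their entailment scores.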
{"language": "en", "tags": ["deberta-v3", "deberta-mnli", "deberta", "deberta-v2"], "tasks": "mnli", "thumbnail": "https://huggingface.co/front/thumbnails/microsoft.png", "pipeline_tag": "zero-shot-classification"}
zero-shot-classification
NDugar/debertav3-mnli-snli-anli
[ "transformers", "pytorch", "safetensors", "deberta-v2", "text-classification", "deberta-v3", "deberta-mnli", "deberta", "zero-shot-classification", "en", "arxiv:2006.03654", "autotrain_compatible", "endpoints_compatible", "has_space", "region:us" ]
2022-03-02T23:29:04+00:00
[ "2006.03654" ]
[ "en" ]
TAGS #transformers #pytorch #safetensors #deberta-v2 #text-classification #deberta-v3 #deberta-mnli #deberta #zero-shot-classification #en #arxiv-2006.03654 #autotrain_compatible #endpoints_compatible #has_space #region-us
DeBERTa: Decoding-enhanced BERT with Disentangled Attention ----------------------------------------------------------- DeBERTa improves the BERT and RoBERTa models using disentangled attention and an enhanced mask decoder. It outperforms BERT and RoBERTa on the majority of NLU tasks with 80GB of training data. Please check the official repository for more details and updates. This is the DeBERTa V2 xxlarge model with 48 layers and a hidden size of 1536. The total parameters are 1.5B and it is trained with 160GB of raw data. ### Fine-tuning on NLU tasks We present the dev results on SQuAD 1.1/2.0 and several GLUE benchmark tasks. --- #### Notes. * 1 Following RoBERTa, for RTE, MRPC, STS-B, we fine-tune the tasks based on DeBERTa-Large-MNLI, DeBERTa-XLarge-MNLI, DeBERTa-V2-XLarge-MNLI, DeBERTa-V2-XXLarge-MNLI. The results of SST-2/QQP/QNLI/SQuADv2 will also be slightly improved when starting from MNLI fine-tuned models; however, we only report the numbers fine-tuned from pretrained base models for those 4 tasks. * 2 To try the XXLarge model with HF transformers, we recommend using deepspeed as it is faster and saves memory. Run with 'Deepspeed', You can also run with '--sharded\_ddp' If you find DeBERTa useful for your work, please cite the following paper:
[ "### Fine-tuning on NLU tasks\n\n\nWe present the dev results on SQuAD 1.1/2.0 and several GLUE benchmark tasks.\n\n\n\n\n\n---", "#### Notes.\n\n\n* 1 Following RoBERTa, for RTE, MRPC, STS-B, we fine-tune the tasks based on DeBERTa-Large-MNLI, DeBERTa-XLarge-MNLI, DeBERTa-V2-XLarge-MNLI, DeBERTa-V2-XXLarge-MNLI. The results of SST-2/QQP/QNLI/SQuADv2 will also be slightly improved when start from MNLI fine-tuned models, however, we only report the numbers fine-tuned from pretrained base models for those 4 tasks.\n* 2 To try the XXLarge model with HF transformers, we recommand using deepspeed as it's faster and saves memory.\n\n\nRun with 'Deepspeed',\n\n\nYou can also run with '--sharded\\_ddp'\n\n\nIf you find DeBERTa useful for your work, please cite the following paper:" ]
[ "TAGS\n#transformers #pytorch #safetensors #deberta-v2 #text-classification #deberta-v3 #deberta-mnli #deberta #zero-shot-classification #en #arxiv-2006.03654 #autotrain_compatible #endpoints_compatible #has_space #region-us \n", "### Fine-tuning on NLU tasks\n\n\nWe present the dev results on SQuAD 1.1/2.0 and several GLUE benchmark tasks.\n\n\n\n\n\n---", "#### Notes.\n\n\n* 1 Following RoBERTa, for RTE, MRPC, STS-B, we fine-tune the tasks based on DeBERTa-Large-MNLI, DeBERTa-XLarge-MNLI, DeBERTa-V2-XLarge-MNLI, DeBERTa-V2-XXLarge-MNLI. The results of SST-2/QQP/QNLI/SQuADv2 will also be slightly improved when start from MNLI fine-tuned models, however, we only report the numbers fine-tuned from pretrained base models for those 4 tasks.\n* 2 To try the XXLarge model with HF transformers, we recommand using deepspeed as it's faster and saves memory.\n\n\nRun with 'Deepspeed',\n\n\nYou can also run with '--sharded\\_ddp'\n\n\nIf you find DeBERTa useful for your work, please cite the following paper:" ]
[ 87, 32, 215 ]
[ "passage: TAGS\n#transformers #pytorch #safetensors #deberta-v2 #text-classification #deberta-v3 #deberta-mnli #deberta #zero-shot-classification #en #arxiv-2006.03654 #autotrain_compatible #endpoints_compatible #has_space #region-us \n### Fine-tuning on NLU tasks\n\n\nWe present the dev results on SQuAD 1.1/2.0 and several GLUE benchmark tasks.\n\n\n\n\n\n---#### Notes.\n\n\n* 1 Following RoBERTa, for RTE, MRPC, STS-B, we fine-tune the tasks based on DeBERTa-Large-MNLI, DeBERTa-XLarge-MNLI, DeBERTa-V2-XLarge-MNLI, DeBERTa-V2-XXLarge-MNLI. The results of SST-2/QQP/QNLI/SQuADv2 will also be slightly improved when start from MNLI fine-tuned models, however, we only report the numbers fine-tuned from pretrained base models for those 4 tasks.\n* 2 To try the XXLarge model with HF transformers, we recommand using deepspeed as it's faster and saves memory.\n\n\nRun with 'Deepspeed',\n\n\nYou can also run with '--sharded\\_ddp'\n\n\nIf you find DeBERTa useful for your work, please cite the following paper:" ]
[ -0.0531579852104187, 0.12238097190856934, -0.003731853561475873, 0.037357356399297714, 0.09856607019901276, 0.039573248475790024, 0.02605469338595867, 0.1396854966878891, -0.0375838577747345, 0.07658404111862183, 0.042348939925432205, 0.041899506002664566, 0.04344138130545616, 0.09591490030288696, 0.0015623753424733877, -0.24471411108970642, 0.07810239493846893, -0.04062385857105255, -0.11217951029539108, 0.034912001341581345, 0.09706113487482071, -0.12412545084953308, 0.060540854930877686, -0.02119535021483898, -0.046884018927812576, 0.04613982513546944, 0.006519824732095003, -0.0026683916803449392, 0.109640933573246, 0.07739875465631485, 0.05488615855574608, 0.12692362070083618, 0.07285439968109131, -0.1820584535598755, 0.03724479302763939, 0.08789084851741791, 0.025876620784401894, 0.06343519687652588, 0.04806952923536301, 0.036415234208106995, 0.07524998486042023, -0.11907435208559036, 0.0033468734472990036, 0.044933464378118515, -0.013082418590784073, -0.19584888219833374, -0.09504514932632446, 0.017807813361287117, 0.06419284641742706, 0.02949296496808529, -0.007357797585427761, 0.05930616706609726, -0.07440325617790222, 0.04360652714967728, 0.2033897191286087, -0.24377666413784027, 0.0022572653833776712, 0.09473873674869537, -0.05232845991849899, 0.008502698503434658, -0.06826362013816833, 0.058456044644117355, -0.033667292445898056, -0.0061949873343110085, 0.1024385467171669, -0.004593211226165295, -0.03275998681783676, 0.046913523226976395, -0.10363567620515823, -0.02665909379720688, -0.016462117433547974, -0.020801257342100143, -0.0155685655772686, -0.13056422770023346, -0.14102274179458618, -0.09042204916477203, -0.08618195354938507, -0.10769745707511902, 0.04231403395533562, -0.03917752951383591, 0.007650673855096102, -0.12475204467773438, -0.08084189146757126, -0.04422392696142197, -0.014698206447064877, 0.07564939558506012, 0.09003040939569473, -0.013258377090096474, 0.03029458224773407, 0.12536866962909698, -0.053770046681165695, -0.10096269845962524, -0.061575692147016525, -0.07709316164255142, -0.11152885109186172, 0.031883012503385544, -0.06226830929517746, -0.1377260535955429, 0.033588703721761703, 0.1635979413986206, 0.0912121906876564, 0.11479535698890686, 0.0028161988593637943, -0.02413306199014187, -0.051651064306497574, 0.16410750150680542, -0.09729354083538055, -0.048649128526449203, 0.1282610148191452, 0.031027717515826225, 0.013778015039861202, -0.03200056403875351, -0.04801080748438835, -0.059386223554611206, 0.08849915862083435, 0.09309111535549164, -0.04893495887517929, 0.06879021227359772, 0.040931615978479385, -0.04061964154243469, 0.11862410604953766, -0.16677942872047424, -0.03181155025959015, 0.015807149931788445, -0.040126897394657135, 0.012112461030483246, 0.030916744843125343, -0.02969430945813656, -0.0722576379776001, 0.086533322930336, -0.04505782946944237, -0.06976617127656937, -0.07542432844638824, -0.08825609087944031, -0.0029809693805873394, 0.03543324023485184, -0.05376238375902176, -0.14428386092185974, -0.1716853678226471, 0.03093525394797325, 0.024528710171580315, -0.008789412677288055, 0.049276724457740784, 0.034494027495384216, -0.009065707214176655, -0.018040675669908524, -0.018776550889015198, 0.023017393425107002, -0.02801879309117794, 0.034922417253255844, 0.03105960600078106, 0.07520538568496704, -0.01271333359181881, 0.021594872698187828, -0.039844807237386703, 0.031684424728155136, -0.24630439281463623, 0.07673357427120209, -0.1060819923877716, -0.08376388996839523, -0.07618776708841324, -0.050233300775289536, 
-0.040231358259916306, 0.0028449860401451588, 0.023548876866698265, 0.09108790755271912, -0.12483630329370499, -0.06306611001491547, 0.1418856382369995, -0.12226173281669617, -0.025428706780076027, 0.10336556285619736, 0.000881168816704303, -0.06127870827913284, 0.050446782261133194, 0.103116974234581, 0.24806736409664154, -0.14754463732242584, -0.05079783871769905, 0.012956862337887287, -0.008670097216963768, -0.0030217452440410852, 0.029781989753246307, 0.06835439056158066, 0.0968456044793129, 0.05567479506134987, -0.10392820835113525, -0.019356032833456993, -0.025159062817692757, -0.05377160757780075, -0.043536361306905746, -0.017996378242969513, 0.06701838225126266, -0.03264429420232773, 0.045758772641420364, 0.037900522351264954, -0.08469168841838837, 0.04451089724898338, 0.14334388077259064, -0.08220451325178146, 0.03759817034006119, -0.13186317682266235, 0.033678703010082245, -0.013379406183958054, 0.05084914714097977, -0.14804209768772125, -0.18063397705554962, 0.030958931893110275, -0.2052663266658783, -0.018859367817640305, 0.1635412573814392, 0.06798716634511948, 0.08479217439889908, -0.0751265436410904, 0.02695721574127674, -0.05320541560649872, -0.020068148151040077, -0.012697123922407627, -0.04695165157318115, -0.08508017659187317, -0.008393603377044201, 0.15579693019390106, -0.08318027853965759, 0.020313024520874023, 0.12005773186683655, 0.16772186756134033, 0.04713594913482666, -0.03212665393948555, -0.01901765912771225, -0.03746156394481659, 0.0331382118165493, -0.0767710879445076, -0.008813032880425453, -0.0017509389435872436, -0.0483025424182415, 0.09114667028188705, -0.1636669635772705, 0.14297962188720703, 0.08749782294034958, 0.14094659686088562, 0.03247269615530968, 0.013433053158223629, -0.04284866154193878, -0.030694151297211647, -0.04933229833841324, -0.07096155732870102, 0.03098388761281967, 0.06798738986253738, 0.14202754199504852, -0.07273603230714798, -0.05378326028585434, -0.0015725698322057724, 0.04964352771639824, -0.018369410187005997, 0.004728503059595823, 0.01356502529233694, -0.08594819903373718, 0.03345122188329697, 0.04298021271824837, -0.00604361342266202, 0.13570551574230194, -0.03064708597958088, -0.09143909811973572, -0.001223270664922893, -0.045752983540296555, -0.013908457942306995, 0.050385694950819016, 0.0910487174987793, 0.0679110437631607, 0.042595840990543365, 0.022128352895379066, 0.08153600245714188, -0.09736915677785873, 0.06857714056968689, 0.044754501432180405, -0.07900018990039825, 0.03886233642697334, 0.08883017301559448, 0.028986580669879913, 0.03690187633037567, 0.0007327082566916943, 0.0740908682346344, -0.00406712805852294, -0.025221601128578186, -0.034910764545202255, 0.12732140719890594, -0.08631441742181778, -0.23709048330783844, -0.17970708012580872, -0.018589423969388008, -0.0867920070886612, -0.007731183897703886, 0.04230700805783272, -0.04877995699644089, -0.1427520513534546, -0.05395976081490517, 0.10003344714641571, 0.05084458738565445, 0.002769577084109187, -0.021246040239930153, 0.002147671300917864, 0.0748501569032669, -0.1412629634141922, 0.010179480537772179, 0.0022085749078541994, -0.05951710045337677, 0.02458888478577137, 0.08135601133108139, 0.044762153178453445, 0.07710766792297363, -0.03537280112504959, 0.022956375032663345, -0.002626969013363123, 0.15604263544082642, -0.01419601496309042, 0.04855586960911751, 0.2107180505990982, -0.005747013725340366, 0.06510882079601288, 0.06960940361022949, 0.013840017840266228, -0.0645093098282814, 0.024016210809350014, 0.09747149795293808, -0.015629742294549942, 
-0.18938559293746948, -0.07435017824172974, -0.05731416866183281, -0.05358128994703293, 0.016239028424024582, 0.04908033460378647, -0.005997279658913612, -0.0001554167247377336, -0.07313608378171921, 0.01811997778713703, 0.015378148294985294, 0.02023312635719776, 0.2043921798467636, 0.003113602753728628, 0.07516146451234818, -0.07149753719568253, -0.03304211050271988, 0.055946577340364456, 0.000015270426956703886, 0.0984690859913826, -0.07365374267101288, 0.1562298834323883, 0.04365483298897743, 0.11807147413492203, 0.03836681693792343, 0.08478795737028122, 0.010539091192185879, -0.027375347912311554, -0.008649598807096481, -0.07994478940963745, -0.10786759853363037, -0.0029914164915680885, 0.020584747195243835, -0.06772205978631973, -0.05439789220690727, 0.13126464188098907, 0.028183812275528908, 0.19639292359352112, -0.029672978445887566, -0.16280382871627808, -0.01771152764558792, 0.020569192245602608, -0.04258255660533905, -0.03455134481191635, 0.005579780321568251, 0.09812237322330475, -0.06482615321874619, -0.01279519870877266, -0.09052689373493195, 0.05073414742946625, -0.09205419570207596, 0.06496208161115646, 0.02300209365785122, 0.13935914635658264, 0.04718988016247749, 0.07123410701751709, -0.1242097020149231, 0.08636490255594254, 0.05170326679944992, 0.06487730890512466, -0.051520783454179764, 0.03892550617456436, 0.07017014920711517, 0.01244769711047411, 0.07687536627054214, -0.036676522344350815, -0.1492743343114853, -0.11219900101423264, -0.11995552480220795, 0.0682961568236351, 0.151626855134964, -0.04448269307613373, 0.0958946943283081, -0.09153547883033752, -0.0519859604537487, -0.006334108766168356, 0.1150929182767868, -0.07299306988716125, -0.164371520280838, 0.12041903287172318, -0.018243728205561638, 0.06264952570199966, -0.09069381654262543, -0.042530905455350876, -0.054746877402067184, 0.14195623993873596, -0.057474713772535324, -0.04590172320604324, -0.15867486596107483, 0.05588650703430176, 0.12249533832073212, -0.10882332175970078, 0.09413020312786102, 0.03098711371421814, 0.21339355409145355, -0.0009872198570519686, -0.16876524686813354, -0.011127883568406105, -0.06005840003490448, -0.13187144696712494, 0.06751731783151627, 0.09877165406942368, -0.06680253148078918, 0.021448684856295586, -0.0007986593991518021, 0.007368997670710087, 0.06405461579561234, -0.08287100493907928, -0.05794943869113922, 0.09384267032146454, 0.0012370389886200428, -0.021562788635492325, -0.09187362343072891, 0.0648634284734726, -0.006657326593995094, 0.0620950423181057, 0.1219155341386795, 0.23553378880023956, -0.07392499595880508, 0.10191365331411362, 0.12414024025201797, 0.014129126444458961, -0.30893588066101074, -0.07449936121702194, 0.010830186307430267, 0.03477853164076805, 0.0073554543778300285, -0.16745850443840027, 0.10663390159606934, 0.11533712595701218, -0.03834351897239685, -0.05185769498348236, -0.266035258769989, -0.14473894238471985, 0.10981612652540207, 0.06647397577762604, -0.002218901878222823, -0.029964536428451538, -0.019533934071660042, -0.11147705465555191, -0.1311049610376358, 0.02068900689482689, -0.12615667283535004, 0.12245013564825058, -0.017654288560152054, -0.029907310381531715, 0.028173381462693214, -0.06569570302963257, 0.10493986308574677, -0.05227162316441536, 0.10085155814886093, -0.02578149363398552, 0.1317373514175415, 0.05593070015311241, -0.0784246027469635, 0.10208677500486374, 0.0006694853655062616, 0.07599057257175446, -0.04041409119963646, -0.04394375905394554, -0.01743600144982338, 0.011638218536973, -0.00044769226224161685, 
-0.06053844839334488, -0.036031272262334824, 0.037752747535705566, 0.053107138723134995, -0.05274324119091034, 0.0032560720574110746, -0.013119987212121487, -0.038450196385383606, 0.14748899638652802, 0.036121662706136703, -0.052457358688116074, -0.09603425860404968, -0.02235664799809456, 0.009172778576612473, 0.07257376611232758, -0.044153422117233276, 0.12944048643112183, 0.08441207557916641, -0.017360201105475426, 0.0679539367556572, 0.020657572895288467, -0.03742922097444534, -0.04245388135313988, 0.07166598737239838, -0.1470104604959488, -0.2307918220758438, -0.03972722217440605, 0.005847736727446318, -0.04983849078416824, 0.021075991913676262, 0.15090994536876678, -0.04427339881658554, -0.007372257299721241, 0.05076499655842781, 0.008682950399816036, 0.03441411256790161, 0.15817365050315857, 0.047571126371622086, 0.06869383156299591, -0.12819598615169525, 0.12880679965019226, 0.036999672651290894, -0.10686348378658295, -0.007337887771427631, 0.06975480914115906, -0.10765097290277481, -0.06532489508390427, -0.04666874185204506, 0.07098075747489929, 0.04561832919716835, -0.018705714493989944, -0.0859711766242981, -0.06002221256494522, 0.011144079267978668, 0.11561890691518784, 0.03066100925207138, 0.09500020742416382, -0.0677771344780922, 0.016267968341708183, -0.12574701011180878, 0.08993613719940186, 0.016530238091945648, 0.060986362397670746, -0.14336182177066803, 0.07723032683134079, -0.03330561891198158, 0.03304516524076462, -0.06017261743545532, 0.009223992936313152, -0.04661569371819496, -0.052449628710746765, -0.16305385529994965, -0.007915595546364784, 0.0030100373551249504, -0.04351722449064255, 0.005047547165304422, 0.019542807713150978, -0.03577551618218422, 0.03623173385858536, -0.08406370878219604, -0.055339694023132324, -0.06353141367435455, 0.014136048965156078, -0.1343553364276886, 0.022185934707522392, 0.028988458216190338, -0.10204757750034332, 0.10471782088279724, 0.028714101761579514, 0.0236973874270916, 0.12923042476177216, 0.02722785249352455, -0.10800357162952423, 0.03293995186686516, 0.08216111361980438, 0.0394895076751709, -0.048473257571458817, -0.005060212686657906, -0.03347386047244072, -0.04016522318124771, -0.02741718851029873, 0.04392727464437485, -0.12479983270168304, 0.03637552633881569, -0.0938849225640297, 0.05103323608636856, -0.050807446241378784, -0.01449490711092949, 0.06536342948675156, 0.05601687729358673, 0.0880284532904625, -0.06625767797231674, -0.009060860611498356, -0.13996954262256622, -0.006424358114600182, 0.008231655694544315, -0.06717022508382797, -0.07389543950557709, -0.06282874196767807, 0.047730788588523865, -0.04256102815270424, 0.10828597098588943, -0.049896471202373505, -0.046167973428964615, 0.06394375115633011, -0.03636761009693146, -0.0445467047393322, 0.028766905888915062, 0.10218324512243271, 0.09008678793907166, 0.005697821732610464, 0.035410258919000626, -0.005619940347969532, 0.0024949468206614256, 0.005022809375077486, 0.19289657473564148, 0.18564382195472717, 0.04738464578986168, 0.09423407912254333, 0.008079199120402336, -0.06922560930252075, -0.09350758045911789, 0.11471103131771088, -0.1408555805683136, 0.09426464885473251, -0.061115507036447525, 0.038260430097579956, 0.1278662532567978, -0.14122308790683746, 0.08134249597787857, -0.010495913214981556, -0.046173520386219025, -0.15031957626342773, -0.018985874950885773, -0.09487411379814148, -0.09403616189956665, -0.005118037573993206, -0.09301658719778061, 0.022176459431648254, 0.06942915171384811, 0.039490438997745514, -0.0010251201456412673, 0.22513620555400848, 
-0.053053464740514755, -0.05593230947852135, 0.0023436753544956446, 0.015502219088375568, -0.05823638662695885, 0.08283991366624832, -0.06404419988393784, 0.028094692155718803, 0.05484578013420105, 0.04937804117798805, -0.0002410806919215247, 0.05665689334273338, 0.027136823162436485, -0.06283904612064362, -0.064330093562603, -0.0018711262382566929, 0.01409889291971922, 0.03529589623212814, 0.0553293451666832, -0.02401023730635643, -0.05429662764072418, -0.037775710225105286, 0.24750956892967224, -0.052134670317173004, -0.12781977653503418, -0.09072858840227127, 0.19784943759441376, 0.14640185236930847, 0.08137080073356628, 0.01878986693918705, -0.10473689436912537, -0.04134942963719368, 0.1445578783750534, 0.13374407589435577, -0.052095457911491394, 0.010932260192930698, 0.026501048356294632, -0.005291878245770931, -0.06490956991910934, 0.14852437376976013, 0.05598638206720352, 0.1569398045539856, -0.018470406532287598, 0.0020785871893167496, 0.030188854783773422, -0.07051380723714828, -0.04938492923974991, 0.2099827229976654, -0.05139956250786781, 0.012000429444015026, -0.025067152455449104, -0.02263515442609787, 0.012994015589356422, -0.17686472833156586, -0.0407334640622139, -0.06704044342041016, -0.13390636444091797, -0.01406867429614067, 0.005526075605303049, -0.009419663809239864, 0.053994014859199524, -0.01053278986364603, -0.010344001464545727, 0.16555263102054596, 0.015010926872491837, -0.0663456916809082, -0.022731414064764977, 0.10041394084692001, 0.06884150207042694, 0.19838064908981323, 0.022393787279725075, -0.008008034899830818, 0.06726988404989243, -0.036770012229681015, -0.10793335735797882, 0.05319724977016449, 0.037555377930402756, -0.20829829573631287, 0.0005614408291876316, 0.18945664167404175, -0.017673268914222717, 0.04676719382405281, -0.003511653048917651, -0.15976907312870026, -0.010981238447129726, 0.02506965771317482, -0.0467086099088192, -0.005598748102784157, 0.02308565378189087, -0.03098107874393463, 0.09165952354669571, 0.16767843067646027, 0.019912051036953926, 0.06451953202486038, -0.07887314260005951, 0.038998574018478394, 0.029198016971349716, 0.07642243802547455, -0.0062494417652487755, -0.15387439727783203, 0.0334794744849205, -0.02928970195353031, -0.03069717064499855, -0.2062443345785141, -0.10374540835618973, 0.029778514057397842, -0.004903886932879686, -0.02864294871687889, 0.14539776742458344, 0.05079402029514313, 0.04559981822967529, 0.012093803845345974, -0.1410355418920517, -0.041488584131002426, 0.028209226205945015, -0.16517461836338043, -0.056739311665296555 ]
null
null
transformers
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. -->

# m2m100_418M-fr

This model is a fine-tuned version of [facebook/m2m100_418M](https://huggingface.co/facebook/m2m100_418M) on the kde4 dataset. It achieves the following results on the evaluation set:
- Loss: 0.7021
- Bleu: 51.1340

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 1
- mixed_precision_training: Native AMP

### Training results

| Training Loss | Epoch | Step  | Validation Loss | Bleu    |
|:-------------:|:-----:|:-----:|:---------------:|:-------:|
| 0.749         | 1.0   | 23645 | 0.7021          | 51.1344 |

### Framework versions

- Transformers 4.13.0.dev0
- Pytorch 1.10.0
- Datasets 1.15.2.dev0
- Tokenizers 0.10.3
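As a hedged usage sketch (not part of the auto-generated card above): the checkpoint is tagged for translation and fine-tuned on the en-fr configuration of kde4, so it should be drivable through the standard M2M100 interface. The example sentence is illustrative, and the sketch assumes the fine-tuned checkpoint keeps the base model's tokenizer and language codes.

```python
from transformers import M2M100ForConditionalGeneration, M2M100Tokenizer

# Minimal English->French sketch, assuming the fine-tune preserves the base
# m2m100_418M tokenizer and language codes.
model_name = "NDugar/m2m100_418M-fr"
tokenizer = M2M100Tokenizer.from_pretrained(model_name)
model = M2M100ForConditionalGeneration.from_pretrained(model_name)

tokenizer.src_lang = "en"
encoded = tokenizer("Open the file manager to browse your folders.", return_tensors="pt")
generated = model.generate(**encoded, forced_bos_token_id=tokenizer.get_lang_id("fr"))
print(tokenizer.batch_decode(generated, skip_special_tokens=True)[0])
```

Forcing the decoder to start with the target-language token (`fr` here) is how M2M100 selects its output language.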
{"license": "mit", "tags": ["translation", "generated_from_trainer"], "datasets": ["kde4"], "metrics": ["bleu"], "base_model": "facebook/m2m100_418M", "model-index": [{"name": "m2m100_418M-fr", "results": [{"task": {"type": "text2text-generation", "name": "Sequence-to-sequence Language Modeling"}, "dataset": {"name": "kde4", "type": "kde4", "args": "en-fr"}, "metrics": [{"type": "bleu", "value": 51.1339693938271, "name": "Bleu"}]}]}]}
translation
NDugar/m2m100_418M-fr
[ "transformers", "pytorch", "safetensors", "m2m_100", "text2text-generation", "translation", "generated_from_trainer", "dataset:kde4", "base_model:facebook/m2m100_418M", "license:mit", "model-index", "autotrain_compatible", "endpoints_compatible", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[]
TAGS #transformers #pytorch #safetensors #m2m_100 #text2text-generation #translation #generated_from_trainer #dataset-kde4 #base_model-facebook/m2m100_418M #license-mit #model-index #autotrain_compatible #endpoints_compatible #region-us
m2m100\_418M-fr =============== This model is a fine-tuned version of facebook/m2m100\_418M on the kde4 dataset. It achieves the following results on the evaluation set: * Loss: 0.7021 * Bleu: 51.1340 Model description ----------------- More information needed Intended uses & limitations --------------------------- More information needed Training and evaluation data ---------------------------- More information needed Training procedure ------------------ ### Training hyperparameters The following hyperparameters were used during training: * learning\_rate: 2e-05 * train\_batch\_size: 8 * eval\_batch\_size: 8 * seed: 42 * optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 * lr\_scheduler\_type: linear * num\_epochs: 1 * mixed\_precision\_training: Native AMP ### Training results ### Framework versions * Transformers 4.13.0.dev0 * Pytorch 1.10.0 * Datasets 1.15.2.dev0 * Tokenizers 0.10.3
[ "### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 8\n* eval\\_batch\\_size: 8\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 1\n* mixed\\_precision\\_training: Native AMP", "### Training results", "### Framework versions\n\n\n* Transformers 4.13.0.dev0\n* Pytorch 1.10.0\n* Datasets 1.15.2.dev0\n* Tokenizers 0.10.3" ]
[ "TAGS\n#transformers #pytorch #safetensors #m2m_100 #text2text-generation #translation #generated_from_trainer #dataset-kde4 #base_model-facebook/m2m100_418M #license-mit #model-index #autotrain_compatible #endpoints_compatible #region-us \n", "### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 8\n* eval\\_batch\\_size: 8\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 1\n* mixed\\_precision\\_training: Native AMP", "### Training results", "### Framework versions\n\n\n* Transformers 4.13.0.dev0\n* Pytorch 1.10.0\n* Datasets 1.15.2.dev0\n* Tokenizers 0.10.3" ]
[ 88, 113, 4, 36 ]
[ "passage: TAGS\n#transformers #pytorch #safetensors #m2m_100 #text2text-generation #translation #generated_from_trainer #dataset-kde4 #base_model-facebook/m2m100_418M #license-mit #model-index #autotrain_compatible #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 8\n* eval\\_batch\\_size: 8\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 1\n* mixed\\_precision\\_training: Native AMP### Training results### Framework versions\n\n\n* Transformers 4.13.0.dev0\n* Pytorch 1.10.0\n* Datasets 1.15.2.dev0\n* Tokenizers 0.10.3" ]
[ -0.15294378995895386, 0.13661065697669983, -0.0025238299276679754, 0.09148221462965012, 0.13545969128608704, 0.012738555669784546, 0.11036016047000885, 0.14408259093761444, -0.06755559146404266, 0.06081563979387283, 0.13478915393352509, 0.12158039212226868, 0.05546468123793602, 0.21700191497802734, -0.04513751342892647, -0.2444962114095688, 0.025266526266932487, 0.03376379236578941, -0.050994858145713806, 0.1411050409078598, 0.10243956744670868, -0.13554881513118744, 0.08096988499164581, 0.0025490291882306337, -0.13357989490032196, -0.023676354438066483, -0.0032918420620262623, -0.08309701830148697, 0.1210615411400795, 0.009738517925143242, 0.09639445692300797, 0.044528618454933167, 0.07364199310541153, -0.1551579385995865, 0.008311872370541096, 0.06201343238353729, 0.016139525920152664, 0.08682651817798615, 0.07946842908859253, -0.019362177699804306, 0.08881621807813644, -0.10594134032726288, 0.04341106116771698, 0.015523447655141354, -0.11413802951574326, -0.3084818422794342, -0.09591678529977798, 0.08240364491939545, 0.07902345806360245, 0.09385465830564499, -0.01108979620039463, 0.12027186155319214, -0.0335894376039505, 0.09788235276937485, 0.2385028600692749, -0.2742161750793457, -0.04575731232762337, -0.001373810926452279, 0.01849151775240898, 0.06825913488864899, -0.08620966225862503, -0.0332200787961483, 0.03887397423386574, 0.046553198248147964, 0.15249410271644592, -0.009273185394704342, 0.010021079331636429, -0.03858324885368347, -0.14676962792873383, -0.07991746068000793, 0.20586618781089783, 0.054475873708724976, -0.04722366854548454, -0.07648598402738571, -0.07387140393257141, -0.14560192823410034, -0.0254674032330513, -0.001244922517798841, 0.01675363816320896, -0.04972779378294945, -0.06939171999692917, 0.003996393643319607, -0.09220084547996521, -0.056326352059841156, -0.01110288966447115, 0.13695141673088074, 0.06802467256784439, 0.017822030931711197, -0.01811876893043518, 0.11416604369878769, 0.02587355673313141, -0.16805772483348846, 0.006216497626155615, -0.014289654791355133, 0.041899219155311584, 0.0020023623947054148, -0.034696340560913086, -0.029477572068572044, 0.018436575308442116, 0.11172041296958923, -0.11553696542978287, 0.05094948410987854, 0.032371360808610916, 0.02718871459364891, -0.06512302905321121, 0.1322871297597885, -0.06315867602825165, -0.04470996931195259, 0.033038586378097534, 0.11680001020431519, 0.03345897048711777, -0.0037726557347923517, -0.0720718502998352, -0.01956983283162117, 0.1590188890695572, 0.03918200731277466, -0.011285294778645039, 0.06446587294340134, -0.05587783083319664, -0.04844158515334129, 0.07042687386274338, -0.10896090418100357, -0.011739355511963367, 0.019598035141825676, -0.09212423115968704, -0.04601290449500084, 0.0285468939691782, -0.0056401207111775875, -0.03133568912744522, 0.06492014229297638, -0.07415354996919632, -0.03155294433236122, -0.07308531552553177, -0.11643757671117783, 0.009094854816794395, -0.053526587784290314, 0.006525668781250715, -0.11681778728961945, -0.19174058735370636, -0.037740398198366165, 0.018245983868837357, -0.011826244182884693, -0.07585856318473816, -0.044939491897821426, -0.07761089503765106, 0.02650139480829239, -0.04328538849949837, 0.10528583079576492, -0.0639185681939125, 0.12041828781366348, 0.032456353306770325, 0.06418006122112274, 0.005740280728787184, 0.0613086111843586, -0.09210169315338135, 0.04212810844182968, -0.15706056356430054, 0.0702572762966156, -0.06784357875585556, 0.01949823461472988, -0.10481226444244385, -0.1255015730857849, 0.004268243443220854, 
-0.027843836694955826, 0.10434302687644958, 0.13257458806037903, -0.13110962510108948, -0.07715544104576111, 0.21104378998279572, -0.0942397341132164, -0.1359669417142868, 0.10247126966714859, -0.026874316856265068, -0.012897210195660591, 0.0678028091788292, 0.17217767238616943, 0.10753300786018372, -0.0630359798669815, -0.02670901082456112, -0.015711350366473198, 0.06994984298944473, -0.06831479072570801, 0.1070963591337204, -0.012956224381923676, -0.01105565670877695, 0.029754875227808952, -0.07564012706279755, 0.07499553263187408, -0.10975434631109238, -0.07353415340185165, -0.03376753255724907, -0.0888671800494194, 0.07326828688383102, 0.05917489156126976, 0.05283267796039581, -0.10935389250516891, -0.08740293979644775, 0.017875822260975838, 0.12004153430461884, -0.08665862679481506, 0.014025614596903324, -0.052208155393600464, 0.11621668934822083, -0.0641021579504013, -0.014918250031769276, -0.17820315062999725, -0.05437096208333969, 0.036663495004177094, -0.030704108998179436, -0.04490097984671593, -0.029887907207012177, 0.06182343512773514, 0.08440445363521576, -0.05599026754498482, -0.09043476730585098, -0.062308404594659805, 0.002442200668156147, -0.10414954274892807, -0.18086721003055573, -0.05744292959570885, -0.023334216326475143, 0.11552183330059052, -0.16647356748580933, 0.037822820246219635, -0.004901417065411806, 0.12313631922006607, 0.012221322394907475, -0.02358153834939003, -0.005028108134865761, 0.06268308311700821, -0.017733512446284294, -0.07177162170410156, 0.08066301047801971, 0.012679205276072025, -0.07546498626470566, 0.029140090569853783, -0.1251601278781891, 0.1465817540884018, 0.12315652519464493, -0.004418469034135342, -0.05390454828739166, -0.01511376816779375, -0.07199041545391083, -0.03724639490246773, -0.023098787292838097, 0.0011740566696971655, 0.1184849962592125, 0.01591365411877632, 0.15045315027236938, -0.09795426577329636, -0.040462177246809006, 0.03186009079217911, -0.014654903672635555, -0.012235534377396107, 0.13150939345359802, 0.012365437112748623, -0.09904628992080688, 0.14993470907211304, 0.15718874335289001, -0.020191499963402748, 0.13930292427539825, -0.06741318851709366, -0.07817886769771576, -0.029170826077461243, 0.001367298187687993, -0.006312709301710129, 0.10637611150741577, -0.09480904042720795, 0.0007594022899866104, 0.046854063868522644, 0.03578278049826622, 0.02447402849793434, -0.1783938854932785, -0.0145374471321702, 0.032501768320798874, -0.05445853993296623, -0.007584963459521532, 0.0206924919039011, 0.01479070819914341, 0.10301680117845535, 0.0035669703502207994, -0.04447080194950104, 0.029223432764410973, -0.0071461689658463, -0.08556491136550903, 0.22712351381778717, -0.11157883703708649, -0.19714613258838654, -0.09855291247367859, -0.0029675995465368032, -0.06243168190121651, -0.004635503049939871, 0.067149318754673, -0.06731676310300827, -0.028778444975614548, -0.07470287382602692, 0.01716405712068081, -0.01444597914814949, 0.02153925783932209, 0.013219517655670643, -0.01908041164278984, 0.07572557777166367, -0.11473773419857025, -0.023691803216934204, -0.016413146629929543, -0.03118690475821495, 0.0572010837495327, 0.018038317561149597, 0.08666139841079712, 0.10822867602109909, -0.014903786592185497, 0.026163984090089798, -0.03049592487514019, 0.22591230273246765, -0.04112039878964424, -0.025815917178988457, 0.1705259382724762, 0.029590634629130363, 0.07661870121955872, 0.10514581948518753, 0.022992268204689026, -0.0715334489941597, 0.012330739758908749, 0.006048956885933876, -0.019913693889975548, -0.21528451144695282, 
-0.06350336223840714, -0.03618746995925903, 0.01100084651261568, 0.1076938807964325, 0.020278243348002434, -0.0011887430446222425, 0.07830120623111725, -0.027098029851913452, 0.02384103089570999, -0.016965506598353386, 0.0735192745923996, 0.11308174580335617, 0.032278671860694885, 0.13409173488616943, -0.018311981111764908, -0.011902617290616035, 0.04687097668647766, -0.023483073338866234, 0.2317143678665161, -0.06860052794218063, 0.10934237390756607, 0.07275165617465973, 0.22174155712127686, 0.0068336413241922855, 0.08075610548257828, -0.006851338781416416, 0.012655134312808514, -0.00974732544273138, -0.05045035108923912, -0.07661312073469162, -0.006359935738146305, -0.028618881478905678, 0.06258738040924072, -0.1651606261730194, 0.0233346875756979, 0.029685908928513527, 0.27664732933044434, 0.0666259229183197, -0.36291930079460144, -0.1093887984752655, -0.00041904719546437263, 0.0014014355838298798, -0.05447997897863388, 0.017603177577257156, 0.11252404749393463, -0.12299297004938126, 0.05249161645770073, -0.07562205940485, 0.07701161503791809, -0.06130437180399895, 0.015665290877223015, 0.041612401604652405, 0.08752559870481491, 0.000086321568232961, 0.07038553059101105, -0.27367928624153137, 0.2699205279350281, -0.009320605546236038, 0.0635150820016861, -0.06265100836753845, -0.0030864786822348833, 0.03142741322517395, 0.03792252019047737, 0.07932265847921371, 0.006923677399754524, -0.04636745527386665, -0.19290755689144135, -0.12122160196304321, 0.026354366913437843, 0.06825289875268936, -0.051369551569223404, 0.10887142270803452, -0.0198415145277977, 0.005974940489977598, 0.03711067885160446, 0.01032779086381197, -0.11402764171361923, -0.06477730721235275, 0.00888838805258274, 0.03275362402200699, 0.029309215024113655, -0.10669026523828506, -0.12143959105014801, -0.043983981013298035, 0.11989492923021317, -0.026184547692537308, -0.1038956418633461, -0.1211320161819458, 0.08871682733297348, 0.11318530887365341, -0.11496273428201675, 0.05380275472998619, -0.0023894752375781536, 0.09596244245767593, 0.021441198885440826, -0.07969768345355988, 0.08301651477813721, -0.08222486078739166, -0.21423155069351196, -0.03266284242272377, 0.15027186274528503, 0.03822861611843109, 0.04025021195411682, -0.00911853089928627, 0.01916288584470749, -0.025330733507871628, -0.07984748482704163, 0.03083699755370617, 0.039264459162950516, 0.12375561147928238, 0.037693608552217484, -0.03552364185452461, -0.033795226365327835, -0.05914918705821037, -0.030574027448892593, 0.15797023475170135, 0.26166626811027527, -0.08522377163171768, 0.030625149607658386, 0.0826697051525116, -0.049061719328165054, -0.18258069455623627, -0.020067015662789345, 0.09645556658506393, 0.01665973849594593, 0.02842196822166443, -0.1660565435886383, 0.03900465369224548, 0.09257225692272186, -0.027559105306863785, 0.07755055278539658, -0.32671093940734863, -0.13384754955768585, 0.12198410928249359, 0.15331178903579712, 0.11657654494047165, -0.1502126008272171, -0.05364622548222542, -0.007650016341358423, -0.14243188500404358, 0.11443089693784714, -0.07441911846399307, 0.10650714486837387, -0.05185593664646149, 0.0414576455950737, 0.014504970982670784, -0.061919551342725754, 0.1275058388710022, -0.001682619214989245, 0.06373897194862366, -0.05539054423570633, 0.01922593079507351, 0.08133009076118469, -0.06980889290571213, 0.0470217689871788, -0.06351762264966965, 0.08334973454475403, -0.14221714437007904, -0.016600267961621284, -0.12761732935905457, 0.04821569845080376, -0.0399751141667366, -0.0564461313188076, -0.020042063668370247, 
0.05133645981550217, 0.07525571435689926, -0.013720606453716755, 0.09869711846113205, 0.034236352890729904, 0.13788242638111115, 0.13333866000175476, 0.06155133247375488, -0.01265646144747734, -0.06370285153388977, -0.043800968676805496, -0.02888183295726776, 0.04927212744951248, -0.11390324681997299, 0.022398866713047028, 0.13163025677204132, 0.01753215305507183, 0.11497769504785538, 0.05812963843345642, -0.06061078608036041, 0.017244188115000725, 0.06248769164085388, -0.16225874423980713, -0.10454702377319336, -0.03255941718816757, -0.01531032845377922, -0.12426178902387619, 0.04808301851153374, 0.10924496501684189, -0.07274984568357468, -0.033130258321762085, -0.039291247725486755, 0.024196289479732513, -0.012463738210499287, 0.21092729270458221, 0.07827622443437576, 0.0638657957315445, -0.1011940985918045, 0.11146476119756699, 0.03181520476937294, -0.09214106947183609, 0.032003872096538544, 0.05998595058917999, -0.11749277263879776, -0.034885942935943604, 0.05944299325346947, 0.11778533458709717, -0.037808071821928024, -0.0655774176120758, -0.12313881516456604, -0.11323042958974838, 0.07929538935422897, 0.08252888917922974, 0.06929724663496017, 0.03216105327010155, -0.007688574492931366, -0.014641908928751945, -0.09877485781908035, 0.13113199174404144, 0.08760039508342743, 0.060834091156721115, -0.14924588799476624, 0.1479201763868332, 0.0015014685923233628, 0.055805504322052, -0.013515269383788109, 0.017554454505443573, -0.08090956509113312, -0.005577944219112396, -0.1397433876991272, -0.003597756614908576, -0.05182011052966118, -0.0166936032474041, -0.04010814428329468, -0.06000940129160881, -0.04182815924286842, 0.04783928021788597, -0.09641909599304199, -0.050002988427877426, -0.0010349865769967437, 0.050847847014665604, -0.12406019866466522, -0.04947616904973984, 0.009285146370530128, -0.08811338245868683, 0.09088105708360672, 0.06843209266662598, 0.026723943650722504, 0.03123491071164608, -0.03480244427919388, 0.0043693953193724155, 0.036455683410167694, 0.006060573272407055, 0.03887515142560005, -0.12799806892871857, 0.005864739418029785, 0.0014601658331230283, -0.00035700638545677066, 0.010982968844473362, 0.09272497147321701, -0.13670186698436737, -0.005311612505465746, -0.0019776534754782915, -0.006635873578488827, -0.07327041774988174, 0.047661393880844116, 0.0636037290096283, 0.03919272869825363, 0.18021534383296967, -0.09217961132526398, 0.038810644298791885, -0.23102967441082, -0.011993749998509884, -0.018532203510403633, -0.11581587791442871, -0.0820733979344368, -0.012913723476231098, 0.08805304765701294, -0.04470194876194, 0.09376948326826096, -0.00985286757349968, 0.04370075464248657, 0.01846281811594963, -0.046944476664066315, 0.029244860634207726, 0.01746280863881111, 0.21185815334320068, 0.002316694473847747, -0.03439987823367119, 0.06843824684619904, 0.043812159448862076, 0.07388746738433838, 0.0804341584444046, 0.17654402554035187, 0.16784799098968506, 0.051770538091659546, 0.06509388238191605, 0.036291539669036865, -0.06182188168168068, -0.17811575531959534, -0.02010762318968773, -0.02422131784260273, 0.11096913367509842, 0.003594575449824333, 0.1847461611032486, 0.07996828854084015, -0.18881572782993317, 0.046216391026973724, -0.0384429469704628, -0.0760108232498169, -0.10112927854061127, -0.09074633568525314, -0.07762852311134338, -0.14363501965999603, -0.004060219507664442, -0.13322053849697113, 0.0024491038639098406, 0.08496230840682983, 0.01230314839631319, -0.027952302247285843, 0.13828718662261963, 0.05154775083065033, -0.02685810811817646, 
0.07824284583330154, 0.01480371505022049, -0.016620179638266563, -0.05111917480826378, -0.09061519056558609, 0.020694680511951447, -0.0024320161901414394, 0.04692443832755089, -0.0441490039229393, -0.08413609117269516, 0.04428717494010925, 0.01364832278341055, -0.10920380055904388, 0.004851618316024542, 0.013366000726819038, 0.07274346798658371, 0.0837680771946907, 0.024957068264484406, 0.006845733616501093, -0.01652815379202366, 0.22161485254764557, -0.051877908408641815, -0.054001402109861374, -0.09035448729991913, 0.2154417634010315, 0.042222075164318085, -0.033464424312114716, 0.07348062843084335, -0.0895707905292511, -0.00044156084186397493, 0.16227500140666962, 0.17621268332004547, -0.036004360765218735, -0.004457777366042137, 0.0003618896589614451, -0.019499074667692184, -0.01802225410938263, 0.11455986648797989, 0.13526971638202667, 0.05317681282758713, -0.0842764750123024, 0.014283287338912487, -0.05460396409034729, 0.004300777800381184, -0.05183985456824303, 0.07081031799316406, 0.0021238187327980995, -0.011019284836947918, -0.03565932437777519, 0.05544360727071762, -0.02133350446820259, -0.10383784025907516, 0.017328305169939995, -0.1829744577407837, -0.16845932602882385, -0.0414416566491127, 0.04150470346212387, 0.045603834092617035, 0.057812079787254333, -0.007916830480098724, 0.03881102055311203, 0.10626229643821716, -0.011778319254517555, -0.0639408752322197, -0.10194501280784607, 0.08917238563299179, -0.07989128679037094, 0.212749183177948, -0.03250594809651375, 0.06166379153728485, 0.12227579951286316, 0.034264370799064636, -0.13126669824123383, 0.0557272732257843, 0.05584998428821564, -0.007204100023955107, 0.03821299597620964, 0.10915204882621765, -0.017018241807818413, 0.05670441687107086, 0.03080866113305092, -0.11483746767044067, -0.009298897348344326, -0.006339904386550188, -0.030342863872647285, -0.05296989902853966, -0.006373157259076834, -0.0402073934674263, 0.13756056129932404, 0.1807718276977539, -0.07039240002632141, -0.02087211422622204, -0.06637344509363174, 0.015132089145481586, 0.05708564445376396, 0.05243794247508049, -0.007521912455558777, -0.2341819554567337, 0.009601658210158348, 0.059430450201034546, 0.013619769364595413, -0.25153690576553345, -0.07412899285554886, -0.0071881674230098724, -0.0789119154214859, -0.0943128690123558, 0.09998071193695068, 0.013176021166145802, 0.04725835844874382, -0.04548920691013336, 0.009521689265966415, -0.09542753547430038, 0.17085111141204834, -0.17328843474388123, -0.08598515391349792 ]
null
null
transformers
## DeBERTa: Decoding-enhanced BERT with Disentangled Attention

[DeBERTa](https://arxiv.org/abs/2006.03654) improves the BERT and RoBERTa models using disentangled attention and an enhanced mask decoder. It outperforms BERT and RoBERTa on the majority of NLU tasks with 80GB of training data.

Please check the [official repository](https://github.com/microsoft/DeBERTa) for more details and updates.

This is the DeBERTa large model fine-tuned on the MNLI task.

#### Fine-tuning on NLU tasks

We present the dev results on SQuAD 1.1/2.0 and several GLUE benchmark tasks.

| Model | SQuAD 1.1 | SQuAD 2.0 | MNLI-m/mm | SST-2 | QNLI | CoLA | RTE | MRPC | QQP | STS-B |
|---------------------------|-----------|-----------|-----------|-------|------|------|------|--------|--------|-----------|
| | F1/EM | F1/EM | Acc | Acc | Acc | MCC | Acc | Acc/F1 | Acc/F1 | P/S |
| BERT-Large | 90.9/84.1 | 81.8/79.0 | 86.6/- | 93.2 | 92.3 | 60.6 | 70.4 | 88.0/- | 91.3/- | 90.0/- |
| RoBERTa-Large | 94.6/88.9 | 89.4/86.5 | 90.2/- | 96.4 | 93.9 | 68.0 | 86.6 | 90.9/- | 92.2/- | 92.4/- |
| XLNet-Large | 95.1/89.7 | 90.6/87.9 | 90.8/- | 97.0 | 94.9 | 69.0 | 85.9 | 90.8/- | 92.3/- | 92.5/- |
| [DeBERTa-Large](https://huggingface.co/microsoft/deberta-large)<sup>1</sup> | 95.5/90.1 | 90.7/88.0 | 91.3/91.1 | 96.5 | 95.3 | 69.5 | 91.0 | 92.6/94.6 | 92.3/- | 92.8/92.5 |
| [DeBERTa-XLarge](https://huggingface.co/microsoft/deberta-xlarge)<sup>1</sup> | -/- | -/- | 91.5/91.2 | 97.0 | - | - | 93.1 | 92.1/94.3 | - | 92.9/92.7 |
| [DeBERTa-V2-XLarge](https://huggingface.co/microsoft/deberta-v2-xlarge)<sup>1</sup> | 95.8/90.8 | 91.4/88.9 | 91.7/91.6 | **97.5** | 95.8 | 71.1 | **93.9** | 92.0/94.2 | 92.3/89.8 | 92.9/92.9 |
| **[DeBERTa-V2-XXLarge](https://huggingface.co/microsoft/deberta-v2-xxlarge)<sup>1,2</sup>** | **96.1/91.4** | **92.2/89.7** | **91.7/91.9** | 97.2 | **96.0** | **72.0** | 93.5 | **93.1/94.9** | **92.7/90.3** | **93.2/93.1** |

--------

#### Notes.

- <sup>1</sup> Following RoBERTa, for RTE, MRPC, STS-B, we fine-tune the tasks based on [DeBERTa-Large-MNLI](https://huggingface.co/microsoft/deberta-large-mnli), [DeBERTa-XLarge-MNLI](https://huggingface.co/microsoft/deberta-xlarge-mnli), [DeBERTa-V2-XLarge-MNLI](https://huggingface.co/microsoft/deberta-v2-xlarge-mnli), [DeBERTa-V2-XXLarge-MNLI](https://huggingface.co/microsoft/deberta-v2-xxlarge-mnli). The results of SST-2/QQP/QNLI/SQuADv2 will also be slightly improved when starting from MNLI fine-tuned models; however, we only report the numbers fine-tuned from pretrained base models for those 4 tasks.
- <sup>2</sup> To try the **XXLarge** model with **[HF transformers](https://huggingface.co/transformers/main_classes/trainer.html)**, you need to specify **--sharded_ddp**

```bash
cd transformers/examples/text-classification/
export TASK_NAME=mrpc
python -m torch.distributed.launch --nproc_per_node=8 run_glue.py --model_name_or_path microsoft/deberta-v2-xxlarge \
  --task_name $TASK_NAME --do_train --do_eval --max_seq_length 128 --per_device_train_batch_size 4 \
  --learning_rate 3e-6 --num_train_epochs 3 --output_dir /tmp/$TASK_NAME/ --overwrite_output_dir --sharded_ddp --fp16
```

### Citation

If you find DeBERTa useful for your work, please cite the following paper:

```latex
@inproceedings{
he2021deberta,
title={DEBERTA: DECODING-ENHANCED BERT WITH DISENTANGLED ATTENTION},
author={Pengcheng He and Xiaodong Liu and Jianfeng Gao and Weizhu Chen},
booktitle={International Conference on Learning Representations},
year={2021},
url={https://openreview.net/forum?id=XPZIaotutsD}
}
```
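As with the other DeBERTa cards, a hedged usage sketch may help: the model is described as fine-tuned on MNLI, so it can be scored as a premise/hypothesis entailment classifier. The sentence pair below is illustrative, and the label names are read from the checkpoint's config rather than assumed.

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

# Minimal sketch of MNLI-style entailment scoring; the premise/hypothesis pair
# is an illustrative example, not taken from the card.
model_name = "NDugar/v2xl-again-mnli"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSequenceClassification.from_pretrained(model_name)

premise = "A man is playing a guitar on stage."
hypothesis = "Someone is performing music."
inputs = tokenizer(premise, hypothesis, return_tensors="pt")
with torch.no_grad():
    probs = model(**inputs).logits.softmax(dim=-1)[0]

# Report every class with its probability, using the checkpoint's own label map.
for label_id, label in model.config.id2label.items():
    print(f"{label}: {probs[label_id].item():.3f}")
```

The same entailment head is what the `zero-shot-classification` pipeline relies on when it turns candidate labels into NLI hypotheses.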
{"language": "en", "license": "mit", "tags": ["deberta-v1", "deberta-mnli"], "tasks": "mnli", "thumbnail": "https://huggingface.co/front/thumbnails/microsoft.png", "pipeline_tag": "zero-shot-classification"}
zero-shot-classification
NDugar/v2xl-again-mnli
[ "transformers", "pytorch", "deberta-v2", "text-classification", "deberta-v1", "deberta-mnli", "zero-shot-classification", "en", "arxiv:2006.03654", "license:mit", "autotrain_compatible", "endpoints_compatible", "region:us" ]
2022-03-02T23:29:04+00:00
[ "2006.03654" ]
[ "en" ]
TAGS #transformers #pytorch #deberta-v2 #text-classification #deberta-v1 #deberta-mnli #zero-shot-classification #en #arxiv-2006.03654 #license-mit #autotrain_compatible #endpoints_compatible #region-us
DeBERTa: Decoding-enhanced BERT with Disentangled Attention ----------------------------------------------------------- DeBERTa improves the BERT and RoBERTa models using disentangled attention and an enhanced mask decoder. It outperforms BERT and RoBERTa on the majority of NLU tasks with 80GB of training data. Please check the official repository for more details and updates. This is the DeBERTa large model fine-tuned on the MNLI task. #### Fine-tuning on NLU tasks We present the dev results on SQuAD 1.1/2.0 and several GLUE benchmark tasks. --- #### Notes. * 1 Following RoBERTa, for RTE, MRPC, STS-B, we fine-tune the tasks based on DeBERTa-Large-MNLI, DeBERTa-XLarge-MNLI, DeBERTa-V2-XLarge-MNLI, DeBERTa-V2-XXLarge-MNLI. The results of SST-2/QQP/QNLI/SQuADv2 will also be slightly improved when starting from MNLI fine-tuned models; however, we only report the numbers fine-tuned from pretrained base models for those 4 tasks. * 2 To try the XXLarge model with HF transformers, you need to specify --sharded\_ddp If you find DeBERTa useful for your work, please cite the following paper:
[ "#### Fine-tuning on NLU tasks\n\n\nWe present the dev results on SQuAD 1.1/2.0 and several GLUE benchmark tasks.\n\n\n\n\n\n---", "#### Notes.\n\n\n* 1 Following RoBERTa, for RTE, MRPC, STS-B, we fine-tune the tasks based on DeBERTa-Large-MNLI, DeBERTa-XLarge-MNLI, DeBERTa-V2-XLarge-MNLI, DeBERTa-V2-XXLarge-MNLI. The results of SST-2/QQP/QNLI/SQuADv2 will also be slightly improved when start from MNLI fine-tuned models, however, we only report the numbers fine-tuned from pretrained base models for those 4 tasks.\n* 2 To try the XXLarge model with HF transformers, you need to specify --sharded\\_ddp\n\n\nIf you find DeBERTa useful for your work, please cite the following paper:" ]
[ "TAGS\n#transformers #pytorch #deberta-v2 #text-classification #deberta-v1 #deberta-mnli #zero-shot-classification #en #arxiv-2006.03654 #license-mit #autotrain_compatible #endpoints_compatible #region-us \n", "#### Fine-tuning on NLU tasks\n\n\nWe present the dev results on SQuAD 1.1/2.0 and several GLUE benchmark tasks.\n\n\n\n\n\n---", "#### Notes.\n\n\n* 1 Following RoBERTa, for RTE, MRPC, STS-B, we fine-tune the tasks based on DeBERTa-Large-MNLI, DeBERTa-XLarge-MNLI, DeBERTa-V2-XLarge-MNLI, DeBERTa-V2-XXLarge-MNLI. The results of SST-2/QQP/QNLI/SQuADv2 will also be slightly improved when start from MNLI fine-tuned models, however, we only report the numbers fine-tuned from pretrained base models for those 4 tasks.\n* 2 To try the XXLarge model with HF transformers, you need to specify --sharded\\_ddp\n\n\nIf you find DeBERTa useful for your work, please cite the following paper:" ]
[ 79, 32, 186 ]
[ "passage: TAGS\n#transformers #pytorch #deberta-v2 #text-classification #deberta-v1 #deberta-mnli #zero-shot-classification #en #arxiv-2006.03654 #license-mit #autotrain_compatible #endpoints_compatible #region-us \n#### Fine-tuning on NLU tasks\n\n\nWe present the dev results on SQuAD 1.1/2.0 and several GLUE benchmark tasks.\n\n\n\n\n\n---#### Notes.\n\n\n* 1 Following RoBERTa, for RTE, MRPC, STS-B, we fine-tune the tasks based on DeBERTa-Large-MNLI, DeBERTa-XLarge-MNLI, DeBERTa-V2-XLarge-MNLI, DeBERTa-V2-XXLarge-MNLI. The results of SST-2/QQP/QNLI/SQuADv2 will also be slightly improved when start from MNLI fine-tuned models, however, we only report the numbers fine-tuned from pretrained base models for those 4 tasks.\n* 2 To try the XXLarge model with HF transformers, you need to specify --sharded\\_ddp\n\n\nIf you find DeBERTa useful for your work, please cite the following paper:" ]
[ -0.07816742360591888, 0.13616801798343658, -0.005215546581894159, 0.0071556949988007545, 0.08393720537424088, 0.011373884975910187, 0.03641301020979881, 0.10123459994792938, -0.030207747593522072, 0.0708547905087471, 0.07803372293710709, 0.13947169482707977, 0.05464140698313713, 0.1091979518532753, 0.00023583954316563904, -0.2802789807319641, 0.05288274958729744, 0.04083477333188057, -0.03686793893575668, 0.06251747906208038, 0.1316070258617401, -0.10212480276823044, 0.03527116775512695, -0.018747922033071518, -0.07652037590742111, 0.03291360288858414, -0.0039020732510834932, 0.019112804904580116, 0.1331808865070343, 0.08768901228904724, 0.06488601118326187, 0.11793217808008194, 0.04800994694232941, -0.23978623747825623, 0.018941089510917664, 0.046576257795095444, 0.03342972323298454, 0.058214422315359116, 0.06801700592041016, 0.049832187592983246, 0.19702966511249542, -0.20105789601802826, -0.00510593643411994, 0.05854136496782303, -0.062221184372901917, -0.14233876764774323, -0.11939948052167892, 0.08237028121948242, 0.13738642632961273, 0.0383499376475811, -0.04231730103492737, 0.15867482125759125, -0.0045034377835690975, 0.05834275856614113, 0.1478152722120285, -0.2481803297996521, -0.028837254270911217, 0.133414164185524, 0.03583810105919838, -0.01817409135401249, -0.046194374561309814, 0.07110543549060822, 0.02634836733341217, 0.0044885617680847645, 0.04058896750211716, -0.03857816010713577, 0.05504139885306358, 0.005362899042665958, -0.1208261102437973, -0.012803957797586918, 0.054932139813899994, 0.0182994082570076, -0.048719968646764755, -0.20462486147880554, -0.044308342039585114, 0.009305155836045742, -0.0496959462761879, -0.20617040991783142, 0.029491325840353966, -0.06651154160499573, 0.07749693095684052, -0.1349421888589859, -0.08108820021152496, -0.02238370105624199, 0.010486787185072899, 0.0347861722111702, 0.03352845832705498, -0.06661131232976913, 0.029374809935688972, 0.08386952430009842, -0.15861743688583374, -0.06853585690259933, -0.04946030676364899, -0.07951714098453522, -0.0953909158706665, -0.06040596589446068, -0.046103060245513916, -0.08574742078781128, 0.024662116542458534, 0.22334828972816467, 0.055367302149534225, 0.03863299638032913, -0.0024325114209204912, -0.03316411375999451, 0.011741714552044868, 0.10527527332305908, -0.0777830183506012, -0.0560147799551487, 0.07861199229955673, 0.004684958141297102, -0.001053660991601646, -0.02154862880706787, -0.03369193151593208, -0.11040155589580536, 0.06895207613706589, 0.14460615813732147, 0.0339583121240139, 0.04076356068253517, 0.00028786982875317335, -0.057649556547403336, 0.21576736867427826, -0.16123145818710327, -0.02508273348212242, -0.047605566680431366, -0.016730673611164093, -0.0018535488052293658, -0.03614942729473114, 0.035231951624155045, -0.08899889886379242, 0.08178076148033142, -0.04827699065208435, -0.06950927525758743, -0.007193543016910553, -0.08590387552976608, 0.035207223147153854, -0.07453466206789017, -0.01795821078121662, -0.1134466901421547, -0.10876738280057907, -0.023123687133193016, 0.015487135387957096, -0.01300401147454977, -0.049021001905202866, 0.029920456930994987, -0.00004405238723848015, -0.03198300302028656, -0.037953056395053864, 0.028774436563253403, -0.00736414548009634, 0.030479244887828827, 0.03356894478201866, 0.026243358850479126, -0.06388941407203674, 0.03235277161002159, -0.0325847752392292, 0.027448993176221848, -0.19645744562149048, 0.07126873731613159, -0.0778174176812172, -0.043861426413059235, -0.16936011612415314, -0.04987245053052902, 0.005145900882780552, 
0.018605859950184822, 0.027880225330591202, 0.10582258552312851, -0.11941984295845032, -0.028446361422538757, 0.06964242458343506, -0.11671076714992523, -0.02363702468574047, 0.06490586698055267, -0.0026029644068330526, 0.0015396041562780738, 0.12122882157564163, 0.16298097372055054, 0.17676879465579987, -0.1445063352584839, -0.02721201255917549, -0.017691368237137794, -0.016176369041204453, 0.01731831766664982, 0.04093489795923233, 0.0020361451897770166, 0.05610142648220062, 0.004117683973163366, -0.15765666961669922, -0.07621901482343674, -0.03172188997268677, -0.07013331353664398, -0.015073340386152267, -0.01408948004245758, 0.03331751748919487, 0.004202980548143387, 0.015219245105981827, 0.03085341677069664, -0.08414862304925919, 0.046216003596782684, 0.1183469295501709, -0.04249882698059082, 0.014991598203778267, -0.11121708154678345, 0.020017480477690697, -0.03758276626467705, 0.013530373573303223, -0.16114072501659393, -0.16320012509822845, 0.014929902739822865, -0.11138530820608139, 0.04343773424625397, 0.12737081944942474, 0.03042936511337757, 0.043727532029151917, -0.0662824735045433, 0.06407368928194046, -0.03393932804465294, -0.013337144628167152, -0.05600970610976219, -0.06379201263189316, -0.10373956710100174, -0.043828584253787994, 0.07558136433362961, -0.22625914216041565, 0.017505943775177002, 0.09691057354211807, 0.12156511098146439, 0.06452520936727524, -0.027351384982466698, 0.009399144910275936, 0.03498097509145737, 0.024931147694587708, -0.047089483588933945, 0.03398237004876137, -0.028057320043444633, -0.0919029638171196, 0.04925863444805145, -0.09734181314706802, 0.03580231964588165, 0.06017051264643669, 0.09004312008619308, -0.04822656884789467, -0.04210413992404938, -0.05239314213395119, -0.018085306510329247, -0.04031173884868622, -0.06644248962402344, 0.03442789614200592, 0.06214165315032005, 0.07310034334659576, -0.05891253054141998, -0.06521671265363693, -0.028127964586019516, 0.05003596469759941, -0.05912385880947113, 0.07033614814281464, 0.032368846237659454, -0.09351906925439835, 0.024815401062369347, 0.000767381105106324, 0.04479321837425232, 0.04808598384261131, 0.00622972147539258, -0.07703584432601929, -0.004018906969577074, -0.005689640063792467, 0.0028267931193113327, 0.037304505705833435, -0.00865353923290968, 0.031969573348760605, 0.06161539629101753, 0.015047564171254635, 0.08226732164621353, -0.08991711586713791, 0.0004030505660921335, 0.030835453420877457, -0.07959160208702087, -0.05341038107872009, 0.09202812612056732, -0.014590635895729065, 0.06657595932483673, 0.0320165790617466, 0.07796189188957214, -0.014559701085090637, -0.004702285397797823, -0.09919271618127823, 0.11099740117788315, -0.0160345658659935, -0.20030002295970917, -0.1619589477777481, 0.012691743671894073, -0.07996334135532379, 0.011563929729163647, 0.06325443834066391, -0.07078257948160172, -0.11053401976823807, -0.03146292641758919, 0.12845800817012787, -0.013890444301068783, -0.03473995253443718, -0.023304341360926628, -0.017707670107483864, 0.037749066948890686, -0.13134628534317017, 0.013009889051318169, 0.003297889605164528, -0.01174726989120245, 0.010051783174276352, 0.0022108182311058044, 0.13146498799324036, 0.08679123967885971, -0.03995761275291443, 0.003654377767816186, -0.026034055277705193, 0.23455116152763367, -0.08419326692819595, 0.047869179397821426, 0.22742241621017456, 0.009886852465569973, 0.04074491187930107, 0.12159223109483719, 0.025418560951948166, -0.10109885036945343, 0.011210368946194649, 0.1193646565079689, -0.003970560152083635, 
-0.24357333779335022, -0.07141634821891785, -0.044308289885520935, -0.07941010594367981, 0.06523166596889496, 0.056784871965646744, 0.061785321682691574, 0.021359432488679886, -0.08345697075128555, -0.0032369273249059916, 0.04732038453221321, 0.06295816600322723, 0.19281518459320068, 0.003713029669597745, 0.08885037153959274, -0.052368760108947754, -0.06588307023048401, 0.056095320731401443, 0.02995341457426548, 0.13404059410095215, 0.02534003183245659, 0.14574651420116425, 0.08353506773710251, 0.022693462669849396, 0.0003535931755322963, 0.05424746498465538, 0.001535677583888173, -0.028892846778035164, 0.0015078630531206727, -0.08773784339427948, -0.0523267537355423, 0.025614382699131966, 0.06403001397848129, -0.1242300495505333, -0.07469604164361954, 0.005249510519206524, -0.008004390634596348, 0.16475819051265717, 0.02544478513300419, -0.20154686272144318, -0.03094266727566719, 0.02348238043487072, -0.05668806657195091, -0.03527284413576126, -0.04258313030004501, 0.01556860376149416, -0.0657706931233406, 0.06481710821390152, -0.057692285627126694, 0.09213360399007797, -0.06980405002832413, 0.03148242458701134, -0.026154538616538048, 0.0538255050778389, 0.02134540118277073, 0.10217180848121643, -0.15647578239440918, 0.163034126162529, 0.05685671418905258, 0.10971257090568542, -0.06786197423934937, 0.030375059694051743, 0.02231699973344803, 0.04375500604510307, 0.18485672771930695, -0.027394723147153854, -0.07169634848833084, -0.15131229162216187, -0.08716030418872833, 0.0498008206486702, 0.1040131226181984, -0.08801895380020142, 0.07664655894041061, -0.028813466429710388, -0.008386108092963696, 0.0027821934781968594, 0.05406738817691803, -0.22139909863471985, -0.15291398763656616, 0.12282709032297134, 0.003736568847671151, 0.03329021483659744, -0.031383588910102844, -0.05082013085484505, -0.04038179665803909, 0.2155141532421112, -0.029526516795158386, -0.0640145018696785, -0.14057739078998566, 0.0679314061999321, 0.13393135368824005, -0.08336445689201355, 0.0382259264588356, 0.02226773090660572, 0.17265358567237854, -0.04458742216229439, -0.0730687826871872, -0.030226003378629684, -0.026797376573085785, -0.0943727120757103, 0.06038763374090195, 0.12728655338287354, 0.034229762852191925, 0.013449190184473991, 0.057481519877910614, 0.020823845639824867, 0.09346705675125122, -0.09496384859085083, 0.001878999057225883, 0.033470503985881805, 0.059980399906635284, -0.04215817153453827, -0.09874676913022995, 0.010371445678174496, -0.11347053945064545, -0.021068690344691277, 0.12896312773227692, 0.2565489709377289, -0.06490616500377655, 0.07950887829065323, 0.15470126271247864, -0.047691021114587784, -0.22992290556430817, -0.03575846925377846, 0.016459988430142403, 0.02606399916112423, -0.0029456631746143103, -0.18646591901779175, 0.09894019365310669, 0.12051600962877274, -0.015097873285412788, -0.008530300110578537, -0.29912346601486206, -0.1492730975151062, 0.08724796026945114, 0.0748368427157402, -0.07466179877519608, -0.0932573676109314, -0.07883531600236893, -0.0883622020483017, -0.11769890040159225, 0.07907488197088242, -0.05641748383641243, 0.07832107692956924, -0.006845884490758181, -0.012643324211239815, 0.03799055889248848, -0.034739527851343155, 0.18364863097667694, 0.07710009068250656, 0.06787195801734924, -0.034732967615127563, 0.019910085946321487, -0.012938219122588634, -0.059860337525606155, 0.12259820103645325, 0.055930063128471375, 0.033970966935157776, -0.136023610830307, -0.09661278128623962, -0.013132840394973755, 0.07025613635778427, -0.02857259288430214, 
-0.0756300613284111, -0.041560981422662735, 0.06177297234535217, 0.039381932467222214, -0.034366559237241745, 0.022111302241683006, -0.12009777128696442, 0.02435537427663803, 0.03317273408174515, 0.13408277928829193, -0.05492720007896423, -0.026148132979869843, 0.0128767229616642, -0.020375540480017662, 0.07566852867603302, -0.02430056408047676, 0.08402358740568161, 0.13897249102592468, 0.020012574270367622, 0.08395352214574814, -0.01166172418743372, -0.02684430591762066, -0.04003237932920456, 0.0620947927236557, -0.09926170855760574, -0.20940272510051727, -0.012515174224972725, -0.06646980345249176, -0.039416879415512085, 0.034398067742586136, 0.15665294229984283, -0.02609032765030861, -0.0436481349170208, 0.05717146396636963, 0.02760249935090542, 0.002591713098809123, 0.15424036979675293, 0.048339203000068665, 0.06877190619707108, -0.0707097202539444, 0.05152580142021179, 0.08564925938844681, -0.15967844426631927, -0.01231309212744236, 0.040541380643844604, -0.06880120187997818, -0.08224519342184067, -0.05729388818144798, 0.0027702085208147764, -0.03430269658565521, -0.0469493605196476, 0.027608338743448257, -0.04199492558836937, 0.034271158277988434, 0.07880709320306778, 0.028298400342464447, 0.0449444018304348, -0.03541319817304611, 0.010555780492722988, -0.13960598409175873, 0.02361406572163105, 0.06573878228664398, 0.07073309272527695, -0.13750512897968292, 0.0700286254286766, -0.023112408816814423, 0.04130464792251587, -0.03641018643975258, 0.02356204204261303, -0.0447976179420948, -0.04043422266840935, -0.021174350753426552, 0.04613317921757698, -0.0946553573012352, -0.004977969918400049, -0.017449703067541122, -0.0006516044959425926, -0.026689833030104637, 0.0005521932616829872, -0.07149725407361984, -0.05317950248718262, -0.06427282094955444, 0.06331231445074081, -0.13161592185497284, 0.020773526281118393, 0.017362741753458977, -0.08465482294559479, 0.08232927322387695, -0.07723517715930939, 0.000284650013782084, 0.05567842721939087, -0.0781976729631424, -0.034030161798000336, 0.02443874627351761, 0.05138568580150604, -0.0034973903093487024, -0.1342732161283493, -0.011745240539312363, 0.021495074033737183, -0.0031556347385048866, 0.017812738195061684, 0.055512867867946625, -0.14673559367656708, -0.053349416702985764, -0.07749941200017929, -0.04484790563583374, -0.04512617364525795, 0.03738098591566086, 0.09454120695590973, 0.03196733072400093, 0.107289157807827, -0.05172770842909813, 0.022323841229081154, -0.12687845528125763, 0.01729756034910679, -0.021861866116523743, -0.011602995917201042, 0.04574275761842728, -0.067192442715168, 0.04690347611904144, -0.042266685515642166, 0.12690035998821259, -0.060046445578336716, 0.015318820253014565, 0.06750260293483734, -0.08502145111560822, -0.05424267053604126, -0.0009951837128028274, 0.21000224351882935, 0.07595351338386536, 0.016469458118081093, 0.025430943816900253, 0.02465573512017727, -0.0443863607943058, 0.10166937857866287, 0.2347017228603363, 0.23324744403362274, 0.023740539327263832, 0.06244739145040512, 0.006514877546578646, -0.09508246928453445, -0.07618185877799988, 0.08360598236322403, -0.022207951173186302, 0.059115104377269745, -0.02640300802886486, 0.06230170652270317, 0.15464255213737488, -0.15822824835777283, 0.0751209482550621, -0.0065008485689759254, -0.07270929962396622, -0.1314680278301239, -0.09418877214193344, -0.09002982825040817, -0.019615693017840385, -0.005765438545495272, -0.1123010665178299, 0.008569372817873955, 0.026421643793582916, 0.012501486577093601, -0.015950007364153862, 0.1447780281305313, 
-0.12144596874713898, -0.057273875921964645, 0.03287611901760101, -0.006277555599808693, -0.04582049325108528, 0.028804223984479904, -0.002318800427019596, 0.036177776753902435, 0.04904578626155853, 0.06572879105806351, 0.0043024844489991665, 0.0363834984600544, -0.0015784046845510602, -0.08342017978429794, -0.04798147827386856, -0.01721656322479248, 0.06671852618455887, 0.08387338370084763, 0.13555577397346497, 0.005828519351780415, -0.07308220118284225, -0.026012182235717773, 0.17996299266815186, 0.01616932824254036, -0.1299978643655777, -0.08969159424304962, 0.23534078896045685, 0.112303227186203, 0.04780730605125427, 0.013103138655424118, -0.10947148501873016, 0.06918281316757202, 0.13363437354564667, 0.16133452951908112, 0.03350267931818962, -0.012291948311030865, 0.007313128560781479, 0.017088953405618668, -0.019243689253926277, 0.1088935136795044, 0.03207132965326309, 0.19928888976573944, -0.03952968493103981, -0.0015639086486771703, -0.02266795188188553, -0.01282348670065403, -0.03446894511580467, 0.18904438614845276, 0.04166930168867111, -0.022001808509230614, -0.03285093605518341, 0.11586762964725494, 0.02223379723727703, -0.1558983027935028, 0.022417062893509865, -0.0452524796128273, -0.11863887310028076, 0.002192480955272913, -0.0750562995672226, -0.04547184333205223, 0.06116325408220291, -0.036015987396240234, -0.02861499786376953, 0.15162420272827148, 0.021086469292640686, -0.07434172183275223, 0.0027584019117057323, 0.12268725782632828, 0.12098818272352219, 0.19024328887462616, -0.0026493724435567856, 0.11523439735174179, 0.09235416352748871, -0.029434340074658394, -0.10144612938165665, 0.06318221986293793, 0.034680213779211044, -0.049247149378061295, 0.0950779840350151, 0.1542167067527771, -0.0040142168290913105, 0.025065718218684196, 0.02677156776189804, -0.17786312103271484, 0.0049175056628882885, 0.06269215792417526, -0.05482038855552673, -0.053160130977630615, 0.09205134958028793, -0.10686492919921875, 0.09909277409315109, 0.12210041284561157, 0.010823915712535381, 0.05388697609305382, -0.07131131738424301, 0.0580594465136528, 0.030465396121144295, 0.1213141679763794, -0.021949931979179382, -0.1694798320531845, 0.030794434249401093, -0.03442162275314331, 0.020430754870176315, -0.2380400002002716, -0.07406137883663177, 0.007932137697935104, 0.007474489975720644, -0.015521571040153503, 0.13007010519504547, 0.015496011823415756, 0.002817243104800582, -0.007683678064495325, -0.23730985820293427, -0.024654226377606392, 0.06889636069536209, -0.15273414552211761, -0.05173753947019577 ]
null
null
transformers
This model is a fine-tuned version of [microsoft/deberta-v3-large](https://huggingface.co/microsoft/deberta-v3-large) on the GLUE MNLI dataset. It achieves the following results on the evaluation set: - Loss: 0.4103 - Accuracy: 0.9175 ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 6e-06 - train_batch_size: 8 - eval_batch_size: 8 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_steps: 50 - num_epochs: 2.0 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | |:-------------:|:-----:|:-----:|:---------------:|:--------:| | 0.3631 | 1.0 | 49088 | 0.3129 | 0.9130 | | 0.2267 | 2.0 | 98176 | 0.4157 | 0.9153 | ### Framework versions - Transformers 4.13.0.dev0 - Pytorch 1.10.0 - Datasets 1.15.2.dev0 - Tokenizers 0.10.3
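The hyperparameters listed above map fairly directly onto the `transformers` training API. The snippet below is not part of the original card; it is a minimal sketch of how that configuration could be expressed as `TrainingArguments`, with `output_dir` as a placeholder and the MNLI data loading and `Trainer` wiring omitted.

```python
from transformers import TrainingArguments

# Sketch of the reported setup; "v3-large-mnli-repro" is a placeholder path.
# The Adam settings listed in the card (betas=(0.9, 0.999), epsilon=1e-08)
# match the library defaults, so they are not spelled out here.
args = TrainingArguments(
    output_dir="v3-large-mnli-repro",
    learning_rate=6e-6,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    warmup_steps=50,
    num_train_epochs=2.0,
)
```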
{"language": "en", "license": "mit", "tags": ["deberta-v1", "deberta-mnli"], "tasks": "mnli", "thumbnail": "https://huggingface.co/front/thumbnails/microsoft.png", "pipeline_tag": "zero-shot-classification", "base_model": "microsoft/deberta-v3-large"}
zero-shot-classification
NDugar/v3-Large-mnli
[ "transformers", "pytorch", "safetensors", "deberta-v2", "text-classification", "deberta-v1", "deberta-mnli", "zero-shot-classification", "en", "base_model:microsoft/deberta-v3-large", "license:mit", "autotrain_compatible", "endpoints_compatible", "has_space", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #safetensors #deberta-v2 #text-classification #deberta-v1 #deberta-mnli #zero-shot-classification #en #base_model-microsoft/deberta-v3-large #license-mit #autotrain_compatible #endpoints_compatible #has_space #region-us
This model is a fine-tuned version of microsoft/deberta-v3-large on the GLUE MNLI dataset. It achieves the following results on the evaluation set: * Loss: 0.4103 * Accuracy: 0.9175 ### Training hyperparameters The following hyperparameters were used during training: * learning\_rate: 6e-06 * train\_batch\_size: 8 * eval\_batch\_size: 8 * seed: 42 * optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 * lr\_scheduler\_type: linear * lr\_scheduler\_warmup\_steps: 50 * num\_epochs: 2.0 ### Training results ### Framework versions * Transformers 4.13.0.dev0 * Pytorch 1.10.0 * Datasets 1.15.2.dev0 * Tokenizers 0.10.3
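Given the `zero-shot-classification` pipeline tag and the repository id `NDugar/v3-Large-mnli` recorded for this row, a minimal usage sketch follows; the input sentence and candidate labels are invented purely for illustration.

```python
from transformers import pipeline

# An MNLI-style classifier can back the zero-shot-classification pipeline.
# The text and candidate labels below are illustrative only.
classifier = pipeline("zero-shot-classification", model="NDugar/v3-Large-mnli")

out = classifier(
    "The meeting has been moved to next Tuesday afternoon.",
    candidate_labels=["scheduling", "finance", "sports"],
)
print(out["labels"][0], round(out["scores"][0], 3))
```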
[ "### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 6e-06\n* train\\_batch\\_size: 8\n* eval\\_batch\\_size: 8\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 50\n* num\\_epochs: 2.0", "### Training results", "### Framework versions\n\n\n* Transformers 4.13.0.dev0\n* Pytorch 1.10.0\n* Datasets 1.15.2.dev0\n* Tokenizers 0.10.3" ]
[ "TAGS\n#transformers #pytorch #safetensors #deberta-v2 #text-classification #deberta-v1 #deberta-mnli #zero-shot-classification #en #base_model-microsoft/deberta-v3-large #license-mit #autotrain_compatible #endpoints_compatible #has_space #region-us \n", "### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 6e-06\n* train\\_batch\\_size: 8\n* eval\\_batch\\_size: 8\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 50\n* num\\_epochs: 2.0", "### Training results", "### Framework versions\n\n\n* Transformers 4.13.0.dev0\n* Pytorch 1.10.0\n* Datasets 1.15.2.dev0\n* Tokenizers 0.10.3" ]
[ 93, 116, 4, 36 ]
[ "passage: TAGS\n#transformers #pytorch #safetensors #deberta-v2 #text-classification #deberta-v1 #deberta-mnli #zero-shot-classification #en #base_model-microsoft/deberta-v3-large #license-mit #autotrain_compatible #endpoints_compatible #has_space #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 6e-06\n* train\\_batch\\_size: 8\n* eval\\_batch\\_size: 8\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 50\n* num\\_epochs: 2.0### Training results### Framework versions\n\n\n* Transformers 4.13.0.dev0\n* Pytorch 1.10.0\n* Datasets 1.15.2.dev0\n* Tokenizers 0.10.3" ]
[ -0.14522477984428406, 0.11980360746383667, -0.003271405352279544, 0.07499635964632034, 0.12223662436008453, 0.027837593108415604, 0.12825588881969452, 0.11533161252737045, -0.097632996737957, 0.07902070134878159, 0.13557009398937225, 0.11741150170564651, 0.03511925786733627, 0.13933609426021576, -0.05656364560127258, -0.2864876985549927, 0.03662019222974777, 0.04611179977655411, -0.12062612175941467, 0.12097122520208359, 0.11115232110023499, -0.12487815320491791, 0.067132867872715, 0.015752609819173813, -0.11314968764781952, -0.01673153042793274, -0.009894944727420807, -0.10392370820045471, 0.11752206832170486, 0.013278673402965069, 0.10940995812416077, 0.04214181378483772, 0.08055230230093002, -0.14367729425430298, 0.000884869194123894, 0.08039430528879166, 0.014341160655021667, 0.09274562448263168, 0.07075608521699905, -0.007112679071724415, 0.10636939853429794, -0.12689705193042755, 0.08918013423681259, 0.030295947566628456, -0.10576529055833817, -0.32223379611968994, -0.07481846958398819, 0.09196627885103226, 0.09458129853010178, 0.0793725773692131, 0.0028753618244081736, 0.12724460661411285, -0.049925997853279114, 0.08270470052957535, 0.22623804211616516, -0.25119665265083313, -0.06552844494581223, 0.00985582172870636, 0.059382010251283646, 0.03300349786877632, -0.09699410200119019, -0.025727655738592148, 0.03688417747616768, 0.03633536025881767, 0.13642673194408417, 0.009464725852012634, 0.07528947293758392, -0.01910649985074997, -0.13953684270381927, -0.0710061565041542, 0.15656132996082306, 0.03271254524588585, -0.06181729957461357, -0.09624102711677551, -0.07688082754611969, -0.14960837364196777, -0.024823442101478577, -0.01525737252086401, 0.029888488352298737, -0.0683041587471962, -0.06263657659292221, 0.0069625223986804485, -0.07839599251747131, -0.09071695804595947, 0.016742639243602753, 0.24710096418857574, 0.06261856853961945, 0.009369690902531147, -0.02764653041958809, 0.13701219856739044, 0.06794522702693939, -0.1775219589471817, -0.030816230922937393, -0.0032521167304366827, 0.020666275173425674, 0.006716629024595022, -0.04745049029588699, 0.008503587916493416, 0.02276560850441456, 0.1879802942276001, -0.11930246651172638, 0.04652749001979828, 0.06326697766780853, 0.01080180611461401, -0.08717907965183258, 0.1938091516494751, -0.05357780307531357, -0.02962159365415573, 0.034808795899152756, 0.11894907057285309, 0.07842954993247986, -0.013702241703867912, -0.07298608124256134, -0.034099798649549484, 0.1574147790670395, 0.0630839616060257, -0.005823955871164799, 0.049150269478559494, -0.03999171778559685, -0.04174555093050003, 0.1126321479678154, -0.10566584020853043, 0.010500623844563961, 0.03009813092648983, -0.0711568221449852, -0.01781480200588703, 0.025589382275938988, -0.027791133150458336, -0.02736721932888031, 0.12768657505512238, -0.08148430287837982, -0.05615846812725067, -0.09621120989322662, -0.10774713009595871, 0.015776794403791428, -0.054738599807024, 0.015004835091531277, -0.11886093765497208, -0.157185360789299, 0.00032187809119932353, 0.02502300776541233, -0.020122086629271507, -0.0674993172287941, -0.006463746074587107, -0.11674967408180237, 0.0368657223880291, -0.026121364906430244, 0.10051856189966202, -0.05741655081510544, 0.11710497736930847, 0.07556341588497162, 0.05534741282463074, 0.0004924153327010572, 0.03975604102015495, -0.07332028448581696, 0.05403228476643562, -0.18455806374549866, 0.04704364761710167, -0.06792320311069489, 0.01817859709262848, -0.0815679207444191, -0.1465306431055069, 0.04849115014076233, -0.014070390723645687, 
0.09015344828367233, 0.13851575553417206, -0.08991510421037674, -0.08267831057310104, 0.11818952113389969, -0.11337609589099884, -0.12390986829996109, 0.07979729026556015, -0.006673578172922134, -0.05825906619429588, 0.0432780496776104, 0.0956956297159195, 0.1273152381181717, -0.09581530839204788, -0.03339502960443497, -0.01919378526508808, 0.06786360591650009, -0.02389391139149666, 0.10379525274038315, 0.012146350927650928, 0.00042309300624765456, 0.02502094767987728, -0.12704786658287048, 0.028107602149248123, -0.09001089632511139, -0.07328007370233536, -0.051598623394966125, -0.07955574989318848, 0.06735511869192123, 0.05876690894365311, 0.025176413357257843, -0.10367199033498764, -0.13266223669052124, 0.07313194125890732, 0.1270843744277954, -0.07364536821842194, 0.011988459154963493, -0.07662832736968994, 0.07642636448144913, -0.037907179445028305, -0.005681424867361784, -0.17309047281742096, -0.08904382586479187, 0.025350462645292282, -0.06649800390005112, -0.0561845488846302, -0.05068525671958923, 0.06551363319158554, 0.10671296715736389, -0.06825943291187286, -0.10213660448789597, -0.10555876046419144, 0.006944252643734217, -0.06891375035047531, -0.1909574270248413, -0.09775283932685852, -0.023700563237071037, 0.11617105454206467, -0.15703414380550385, 0.050193969160318375, 0.03316519781947136, 0.13400380313396454, 0.026194153353571892, -0.010823660530149937, -0.014353315345942974, 0.06390504539012909, -0.037020087242126465, -0.08711935579776764, 0.03687986359000206, 0.009863042272627354, -0.09657347947359085, -0.02912227436900139, -0.13912667334079742, 0.18871620297431946, 0.12558089196681976, 0.012087504379451275, -0.05817588418722153, 0.03182191029191017, -0.08364338427782059, -0.035327330231666565, -0.018933063372969627, -0.018450574949383736, 0.08248437196016312, 0.015274276956915855, 0.13148058950901031, -0.09274791926145554, -0.06187957525253296, 0.02579304203391075, 0.010816225782036781, -0.029707426205277443, 0.10103849321603775, 0.020763671025633812, -0.050052858889102936, 0.1355038732290268, 0.1426127403974533, -0.08486085385084152, 0.12340345233678818, -0.07144144177436829, -0.06639781594276428, -0.02864838019013405, -0.010666310787200928, 0.03964683786034584, 0.14253400266170502, -0.09049436450004578, 0.008901680819690228, 0.029198387637734413, 0.00646611163392663, 0.005411664489656687, -0.1922304481267929, -0.006950286217033863, 0.024642013013362885, -0.049126867204904556, -0.022848062217235565, 0.01224031113088131, 0.016396723687648773, 0.09807097166776657, 0.014723606407642365, -0.0441802479326725, 0.035392966121435165, 0.0030874954536557198, -0.06672731786966324, 0.2289004772901535, -0.08672627806663513, -0.18501268327236176, -0.1375941038131714, 0.046506546437740326, -0.08796176314353943, 0.00921096932142973, 0.053137049078941345, -0.07315732538700104, -0.05090347304940224, -0.05090338736772537, 0.03358893841505051, -0.01169179379940033, 0.04516838863492012, -0.002759420545771718, 0.009717712178826332, 0.10637419670820236, -0.1304238736629486, -0.005527518689632416, -0.009227040223777294, -0.03912178799510002, 0.03279950097203255, 0.06028827279806137, 0.08871301263570786, 0.10992784798145294, -0.012602309696376324, 0.0014745367225259542, -0.015610476955771446, 0.20370037853717804, -0.08110469579696655, -0.021735163405537605, 0.19684135913848877, 0.010813139379024506, 0.05211153253912926, 0.08177401125431061, 0.039397597312927246, -0.07472869753837585, 0.0014485693536698818, 0.008553271181881428, -0.01947551965713501, -0.21332579851150513, -0.04031185805797577, 
-0.034396905452013016, 0.011392028070986271, 0.09820518642663956, 0.041863616555929184, 0.018991569057106972, 0.05160205066204071, -0.05865698680281639, 0.02803732454776764, 0.00545423524454236, 0.10993415117263794, 0.14564712345600128, 0.04643988236784935, 0.13363346457481384, -0.028602534905076027, -0.03742077574133873, 0.028686394914984703, -0.018780052661895752, 0.1630525439977646, -0.04437622055411339, 0.09724510461091995, 0.04079844430088997, 0.1416083723306656, 0.0030110434163361788, 0.09657927602529526, 0.02450023591518402, -0.0022593364119529724, 0.010805235244333744, -0.06098398566246033, -0.051362328231334686, 0.009410673752427101, -0.04829178750514984, 0.09111528098583221, -0.14809146523475647, 0.007211694028228521, 0.03477780148386955, 0.2929527759552002, 0.060011908411979675, -0.36889779567718506, -0.1426604986190796, 0.011083045974373817, -0.016443954780697823, -0.06747956573963165, 0.01197486650198698, 0.08031944930553436, -0.08965805917978287, 0.08905704319477081, -0.0786585733294487, 0.06614670157432556, -0.058062002062797546, 0.030345065519213676, 0.07628180831670761, 0.1262907087802887, 0.01086727250367403, 0.0551898218691349, -0.2633070945739746, 0.21507099270820618, 0.0034053099807351828, 0.09027311205863953, -0.052810560911893845, 0.02538982592523098, 0.03387696295976639, 0.07517874240875244, 0.045978616923093796, -0.02249976433813572, -0.018738947808742523, -0.165763720870018, -0.09877818077802658, 0.026645779609680176, 0.0935819000005722, -0.05802258104085922, 0.12299561500549316, -0.03209364786744118, -0.019091304391622543, 0.05555417761206627, -0.047568682581186295, -0.10037326067686081, -0.06383918970823288, 0.0380159467458725, 0.003137576626613736, 0.04735303297638893, -0.12578830122947693, -0.12311957031488419, 0.0028100223280489445, 0.12533263862133026, -0.1270235776901245, -0.09766966849565506, -0.13333269953727722, 0.06054186448454857, 0.1260322481393814, -0.08130136132240295, 0.06719683855772018, -0.013135318644344807, 0.14537376165390015, 0.019687972962856293, -0.06190509721636772, 0.07614545524120331, -0.09685944020748138, -0.2600916028022766, -0.024485258385539055, 0.14678969979286194, 0.005110487807542086, 0.032650288194417953, -0.027022510766983032, 0.030407212674617767, -0.017846664413809776, -0.08585763722658157, 0.027942651882767677, 0.02933739684522152, 0.07945635169744492, 0.014362768270075321, -0.022312603890895844, 0.004175199661403894, -0.030045054852962494, 0.0008617165731266141, 0.08608614653348923, 0.2716325521469116, -0.08288826793432236, -0.008846339769661427, 0.06045421585440636, -0.027481399476528168, -0.18362367153167725, -0.00810093805193901, 0.08224526047706604, 0.012270993553102016, -0.01462058536708355, -0.16708172857761383, 0.05269563943147659, 0.09866083413362503, -0.03454874828457832, 0.12852726876735687, -0.2840149998664856, -0.1375885307788849, 0.11763414740562439, 0.14018964767456055, 0.0582391694188118, -0.1694301813840866, -0.04813433066010475, -0.036308012902736664, -0.10647932440042496, 0.10842304676771164, -0.04319452494382858, 0.1014157235622406, -0.06035865098237991, 0.0174593236297369, 0.014557803981006145, -0.06421330571174622, 0.16383396089076996, -0.039705727249383926, 0.08226162195205688, -0.03131217882037163, 0.012632704339921474, 0.09941154718399048, -0.07576481252908707, 0.04943980276584625, -0.08094938099384308, 0.06870236247777939, -0.08204551786184311, -0.030650729313492775, -0.10311632603406906, 0.0565766915678978, -0.06918399035930634, -0.05342366546392441, -0.01838763803243637, 0.0344993956387043, 
0.012599636800587177, -0.03148002550005913, 0.15327981114387512, 0.039258699864149094, 0.1346341371536255, 0.129060760140419, 0.0871908888220787, -0.0029319541063159704, -0.0792851597070694, -0.009555325843393803, -0.02862364985048771, 0.0963711142539978, -0.13141748309135437, 0.01769328862428665, 0.10590042173862457, 0.04031577333807945, 0.08767303079366684, 0.08145107328891754, -0.055831946432590485, -0.006707014050334692, 0.07253778725862503, -0.15898381173610687, -0.10843100398778915, -0.03532790765166283, 0.0016656406223773956, -0.127463236451149, 0.09657131135463715, 0.10866696387529373, -0.092061348259449, -0.019424214959144592, 0.0025667883455753326, 0.015593630261719227, -0.009356098249554634, 0.1905503273010254, 0.08420120924711227, 0.09018655121326447, -0.09227181226015091, 0.1092882826924324, 0.04030955210328102, -0.10388272255659103, 0.010141248814761639, 0.07120466232299805, -0.08514045923948288, -0.023093784227967262, 0.0063370829448103905, 0.059281282126903534, -0.06086369603872299, -0.07080435752868652, -0.16493213176727295, -0.14530444145202637, 0.0812503769993782, 0.10718406736850739, 0.0711405947804451, 0.0501532144844532, -0.01696914993226528, 0.049413006752729416, -0.12314824759960175, 0.1297401785850525, 0.040015943348407745, 0.09839672595262527, -0.18024751543998718, 0.12840324640274048, 0.01918458752334118, 0.027731288224458694, -0.010560838505625725, 0.025628628209233284, -0.11074836552143097, -0.01054518111050129, -0.15984663367271423, -0.019880175590515137, -0.028386374935507774, 0.00284383911639452, -0.02222231775522232, -0.053611695766448975, -0.04810661822557449, 0.05466149374842644, -0.07081449031829834, -0.05832955613732338, 0.000377564545487985, 0.03342314437031746, -0.1350288838148117, -0.029843207448720932, 0.015280324034392834, -0.13943029940128326, 0.09453975409269333, 0.047032084316015244, 0.037474166601896286, 0.02733752317726612, -0.014691098593175411, 0.006108996458351612, 0.048445772379636765, 0.00020604018936865032, 0.04063636064529419, -0.1503196507692337, 0.006756239105015993, -0.01922203041613102, -0.012349718250334263, 0.0015741150127723813, 0.08586061000823975, -0.13943836092948914, 0.015490073710680008, 0.004688003566116095, -0.03258852660655975, -0.07176413387060165, 0.027510695159435272, 0.10031116008758545, 0.004309434909373522, 0.1823621243238449, -0.08174590766429901, 0.02518283575773239, -0.21670353412628174, -0.004424778278917074, -0.02990502491593361, -0.1130540519952774, -0.10960687696933746, 0.011685369536280632, 0.08790166676044464, -0.04316290467977524, 0.07790065556764603, -0.03651732578873634, 0.08271664381027222, 0.050731487572193146, -0.029680196195840836, 0.030324047431349754, 0.05317290127277374, 0.2051430493593216, 0.020082058385014534, -0.015253744088113308, 0.0699370950460434, 0.01972026191651821, 0.08423694968223572, 0.04930763319134712, 0.16932985186576843, 0.1455530971288681, -0.0042822048999369144, 0.10412170737981796, 0.054796453565359116, -0.063443124294281, -0.20685142278671265, 0.0008001754758879542, 0.011779195629060268, 0.11013121902942657, -0.00029845957760699093, 0.13311946392059326, 0.12652409076690674, -0.17364634573459625, 0.023598864674568176, -0.0415545292198658, -0.06428813189268112, -0.10197915136814117, -0.0260409414768219, -0.06757209450006485, -0.14004504680633545, -0.004867188166826963, -0.12439432740211487, -0.013627364300191402, 0.06059815362095833, 0.004712786991149187, -0.02143963985145092, 0.1631702333688736, 0.08614397794008255, 0.014246158301830292, 0.08089756965637207, 0.04467101767659187, 
-0.015511110424995422, 0.0028829898219555616, -0.09376140683889389, 0.0053953975439071655, -0.01144855935126543, 0.03780995309352875, -0.04591987654566765, -0.09605325013399124, 0.05490259826183319, 0.031604498624801636, -0.10375440120697021, 0.03939656540751457, 0.01756931282579899, 0.05640292167663574, 0.10049653053283691, 0.00100495177321136, 0.003454858437180519, -0.014983716420829296, 0.2142522633075714, -0.07946758717298508, -0.07348298281431198, -0.08351273089647293, 0.27542710304260254, 0.033627670258283615, -0.02547888085246086, 0.06897518038749695, -0.09659609943628311, -0.04085623100399971, 0.11585201323032379, 0.15216690301895142, -0.012130276300013065, -0.008264019154012203, 0.008095746859908104, -0.022341661155223846, -0.032694846391677856, 0.10328894853591919, 0.13337263464927673, 0.10201236605644226, -0.05666821449995041, -0.00511543033644557, -0.042038608342409134, -0.006141582038253546, -0.05001392588019371, 0.09076821804046631, 0.0005502870189957321, -0.03227633237838745, -0.03883277252316475, 0.04720992222428322, -0.0125273447483778, -0.11556307226419449, 0.08840882033109665, -0.1749495267868042, -0.17396384477615356, -0.0329577699303627, 0.06920633465051651, -0.009385297074913979, 0.0770101249217987, 0.008398479782044888, -0.012401865795254707, 0.10427819192409515, -0.00959744118154049, -0.04450279846787453, -0.09930357336997986, 0.09126147627830505, -0.012768573127686977, 0.22030115127563477, -0.040043916553258896, 0.06192352622747421, 0.12262513488531113, 0.03485797718167305, -0.14825564622879028, 0.04277568310499191, 0.07463732361793518, -0.1086152046918869, 0.039325498044490814, 0.11982683837413788, -0.033673908561468124, 0.07500723749399185, 0.02217881754040718, -0.1482878178358078, -0.04020008072257042, 0.0157748032361269, -0.06663438677787781, -0.0570661686360836, -0.007500556297600269, -0.036723505705595016, 0.11063554137945175, 0.19355030357837677, -0.06147731840610504, -0.03139026463031769, -0.0608990378677845, 0.04417894035577774, 0.07545654475688934, 0.042170483618974686, 0.0019340954022482038, -0.2431352585554123, 0.034711021929979324, 0.03569735214114189, 0.011497176252305508, -0.24080824851989746, -0.079709492623806, 0.02273273840546608, -0.05993231385946274, -0.08855603635311127, 0.07486582547426224, 0.009227770380675793, 0.03586157411336899, -0.05103355646133423, 0.0074785612523555756, -0.09953037649393082, 0.1663469523191452, -0.18589964509010315, -0.08719271421432495 ]
null
null
transformers
## DeBERTa: Decoding-enhanced BERT with Disentangled Attention

[DeBERTa](https://arxiv.org/abs/2006.03654) improves the BERT and RoBERTa models using disentangled attention and an enhanced mask decoder. It outperforms BERT and RoBERTa on the majority of NLU tasks with 80GB training data.

Please check the [official repository](https://github.com/microsoft/DeBERTa) for more details and updates.

This is the DeBERTa V2 xxlarge model with 48 layers and a hidden size of 1536. The total parameters are 1.5B and it is trained with 160GB raw data.

### Fine-tuning on NLU tasks

We present the dev results on SQuAD 1.1/2.0 and several GLUE benchmark tasks.

| Model | SQuAD 1.1 | SQuAD 2.0 | MNLI-m/mm | SST-2 | QNLI | CoLA | RTE | MRPC | QQP |STS-B |
|---------------------------|-----------|-----------|-------------|-------|------|------|--------|-------|-------|------|
| | F1/EM | F1/EM | Acc | Acc | Acc | MCC | Acc |Acc/F1 |Acc/F1 |P/S |
| BERT-Large | 90.9/84.1 | 81.8/79.0 | 86.6/- | 93.2 | 92.3 | 60.6 | 70.4 | 88.0/- | 91.3/- |90.0/- |
| RoBERTa-Large | 94.6/88.9 | 89.4/86.5 | 90.2/- | 96.4 | 93.9 | 68.0 | 86.6 | 90.9/- | 92.2/- |92.4/- |
| XLNet-Large | 95.1/89.7 | 90.6/87.9 | 90.8/- | 97.0 | 94.9 | 69.0 | 85.9 | 90.8/- | 92.3/- |92.5/- |
| [DeBERTa-Large](https://huggingface.co/microsoft/deberta-large)<sup>1</sup> | 95.5/90.1 | 90.7/88.0 | 91.3/91.1| 96.5|95.3| 69.5| 91.0| 92.6/94.6| 92.3/- |92.8/92.5 |
| [DeBERTa-XLarge](https://huggingface.co/microsoft/deberta-xlarge)<sup>1</sup> | -/- | -/- | 91.5/91.2| 97.0 | - | - | 93.1 | 92.1/94.3 | - |92.9/92.7|
| [DeBERTa-V2-XLarge](https://huggingface.co/microsoft/deberta-v2-xlarge)<sup>1</sup>|95.8/90.8| 91.4/88.9|91.7/91.6| **97.5**| 95.8|71.1|**93.9**|92.0/94.2|92.3/89.8|92.9/92.9|
|**[DeBERTa-V2-XXLarge](https://huggingface.co/microsoft/deberta-v2-xxlarge)<sup>1,2</sup>**|**96.1/91.4**|**92.2/89.7**|**91.7/91.9**|97.2|**96.0**|**72.0**| 93.5| **93.1/94.9**|**92.7/90.3** |**93.2/93.1** |

--------

#### Notes.
- <sup>1</sup> Following RoBERTa, for RTE, MRPC, STS-B, we fine-tune the tasks based on [DeBERTa-Large-MNLI](https://huggingface.co/microsoft/deberta-large-mnli), [DeBERTa-XLarge-MNLI](https://huggingface.co/microsoft/deberta-xlarge-mnli), [DeBERTa-V2-XLarge-MNLI](https://huggingface.co/microsoft/deberta-v2-xlarge-mnli), [DeBERTa-V2-XXLarge-MNLI](https://huggingface.co/microsoft/deberta-v2-xxlarge-mnli). The results of SST-2/QQP/QNLI/SQuADv2 will also be slightly improved when starting from MNLI fine-tuned models; however, we only report the numbers fine-tuned from pretrained base models for those 4 tasks.
- <sup>2</sup> To try the **XXLarge** model with **[HF transformers](https://huggingface.co/transformers/main_classes/trainer.html)**, we recommend using **deepspeed** as it's faster and saves memory.
Run with `Deepspeed`, ```bash pip install datasets pip install deepspeed # Download the deepspeed config file wget https://huggingface.co/microsoft/deberta-v2-xxlarge/resolve/main/ds_config.json -O ds_config.json export TASK_NAME=mnli output_dir="ds_results" num_gpus=8 batch_size=8 python -m torch.distributed.launch --nproc_per_node=${num_gpus} \\ run_glue.py \\ --model_name_or_path microsoft/deberta-v2-xxlarge \\ --task_name $TASK_NAME \\ --do_train \\ --do_eval \\ --max_seq_length 256 \\ --per_device_train_batch_size ${batch_size} \\ --learning_rate 3e-6 \\ --num_train_epochs 3 \\ --output_dir $output_dir \\ --overwrite_output_dir \\ --logging_steps 10 \\ --logging_dir $output_dir \\ --deepspeed ds_config.json ``` You can also run with `--sharded_ddp` ```bash cd transformers/examples/text-classification/ export TASK_NAME=mnli python -m torch.distributed.launch --nproc_per_node=8 run_glue.py --model_name_or_path microsoft/deberta-v2-xxlarge \\ --task_name $TASK_NAME --do_train --do_eval --max_seq_length 256 --per_device_train_batch_size 8 \\ --learning_rate 3e-6 --num_train_epochs 3 --output_dir /tmp/$TASK_NAME/ --overwrite_output_dir --sharded_ddp --fp16 ``` ### Citation If you find DeBERTa useful for your work, please cite the following paper: ``` latex @inproceedings{ he2021deberta, title={DEBERTA: DECODING-ENHANCED BERT WITH DISENTANGLED ATTENTION}, author={Pengcheng He and Xiaodong Liu and Jianfeng Gao and Weizhu Chen}, booktitle={International Conference on Learning Representations}, year={2021}, url={https://openreview.net/forum?id=XPZIaotutsD} } ```
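For readers who want to inspect the MNLI head of this checkpoint (`NDugar/v3large-1epoch`, the repository id recorded for this row) directly rather than through a pipeline, the following is a hedged sketch; the premise/hypothesis pair is invented, and the label names are read from the model's own config rather than assumed.

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

name = "NDugar/v3large-1epoch"  # repository id recorded for this row
tokenizer = AutoTokenizer.from_pretrained(name)
model = AutoModelForSequenceClassification.from_pretrained(name)

premise = "A soccer game with multiple males playing."
hypothesis = "Some men are playing a sport."
inputs = tokenizer(premise, hypothesis, return_tensors="pt", truncation=True)

with torch.no_grad():
    probs = model(**inputs).logits.softmax(dim=-1)[0]

# Label order is taken from the checkpoint's config, not hard-coded.
for idx, p in enumerate(probs.tolist()):
    print(model.config.id2label[idx], round(p, 3))
```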
{"language": "en", "license": "mit", "tags": ["deberta-v3", "deberta-v2`", "deberta-mnli"], "tasks": "mnli", "thumbnail": "https://huggingface.co/front/thumbnails/microsoft.png", "pipeline_tag": "zero-shot-classification"}
zero-shot-classification
NDugar/v3large-1epoch
[ "transformers", "pytorch", "safetensors", "deberta-v2", "text-classification", "deberta-v3", "deberta-v2`", "deberta-mnli", "zero-shot-classification", "en", "arxiv:2006.03654", "license:mit", "autotrain_compatible", "endpoints_compatible", "region:us" ]
2022-03-02T23:29:04+00:00
[ "2006.03654" ]
[ "en" ]
TAGS #transformers #pytorch #safetensors #deberta-v2 #text-classification #deberta-v3 #deberta-v2` #deberta-mnli #zero-shot-classification #en #arxiv-2006.03654 #license-mit #autotrain_compatible #endpoints_compatible #region-us
DeBERTa: Decoding-enhanced BERT with Disentangled Attention
-----------------------------------------------------------

DeBERTa improves the BERT and RoBERTa models using disentangled attention and an enhanced mask decoder. It outperforms BERT and RoBERTa on the majority of NLU tasks with 80GB training data. Please check the official repository for more details and updates.

This is the DeBERTa V2 xxlarge model with 48 layers and a hidden size of 1536. The total parameters are 1.5B and it is trained with 160GB raw data.

### Fine-tuning on NLU tasks

We present the dev results on SQuAD 1.1/2.0 and several GLUE benchmark tasks.

---

#### Notes.

* 1 Following RoBERTa, for RTE, MRPC, STS-B, we fine-tune the tasks based on DeBERTa-Large-MNLI, DeBERTa-XLarge-MNLI, DeBERTa-V2-XLarge-MNLI, DeBERTa-V2-XXLarge-MNLI. The results of SST-2/QQP/QNLI/SQuADv2 will also be slightly improved when starting from MNLI fine-tuned models; however, we only report the numbers fine-tuned from pretrained base models for those 4 tasks.
* 2 To try the XXLarge model with HF transformers, we recommend using deepspeed as it's faster and saves memory.

Run with 'Deepspeed',

You can also run with '--sharded\_ddp'

If you find DeBERTa useful for your work, please cite the following paper:
[ "### Fine-tuning on NLU tasks\n\n\nWe present the dev results on SQuAD 1.1/2.0 and several GLUE benchmark tasks.\n\n\n\n\n\n---", "#### Notes.\n\n\n* 1 Following RoBERTa, for RTE, MRPC, STS-B, we fine-tune the tasks based on DeBERTa-Large-MNLI, DeBERTa-XLarge-MNLI, DeBERTa-V2-XLarge-MNLI, DeBERTa-V2-XXLarge-MNLI. The results of SST-2/QQP/QNLI/SQuADv2 will also be slightly improved when start from MNLI fine-tuned models, however, we only report the numbers fine-tuned from pretrained base models for those 4 tasks.\n* 2 To try the XXLarge model with HF transformers, we recommand using deepspeed as it's faster and saves memory.\n\n\nRun with 'Deepspeed',\n\n\nYou can also run with '--sharded\\_ddp'\n\n\nIf you find DeBERTa useful for your work, please cite the following paper:" ]
[ "TAGS\n#transformers #pytorch #safetensors #deberta-v2 #text-classification #deberta-v3 #deberta-v2` #deberta-mnli #zero-shot-classification #en #arxiv-2006.03654 #license-mit #autotrain_compatible #endpoints_compatible #region-us \n", "### Fine-tuning on NLU tasks\n\n\nWe present the dev results on SQuAD 1.1/2.0 and several GLUE benchmark tasks.\n\n\n\n\n\n---", "#### Notes.\n\n\n* 1 Following RoBERTa, for RTE, MRPC, STS-B, we fine-tune the tasks based on DeBERTa-Large-MNLI, DeBERTa-XLarge-MNLI, DeBERTa-V2-XLarge-MNLI, DeBERTa-V2-XXLarge-MNLI. The results of SST-2/QQP/QNLI/SQuADv2 will also be slightly improved when start from MNLI fine-tuned models, however, we only report the numbers fine-tuned from pretrained base models for those 4 tasks.\n* 2 To try the XXLarge model with HF transformers, we recommand using deepspeed as it's faster and saves memory.\n\n\nRun with 'Deepspeed',\n\n\nYou can also run with '--sharded\\_ddp'\n\n\nIf you find DeBERTa useful for your work, please cite the following paper:" ]
[ 92, 32, 215 ]
[ "passage: TAGS\n#transformers #pytorch #safetensors #deberta-v2 #text-classification #deberta-v3 #deberta-v2` #deberta-mnli #zero-shot-classification #en #arxiv-2006.03654 #license-mit #autotrain_compatible #endpoints_compatible #region-us \n### Fine-tuning on NLU tasks\n\n\nWe present the dev results on SQuAD 1.1/2.0 and several GLUE benchmark tasks.\n\n\n\n\n\n---#### Notes.\n\n\n* 1 Following RoBERTa, for RTE, MRPC, STS-B, we fine-tune the tasks based on DeBERTa-Large-MNLI, DeBERTa-XLarge-MNLI, DeBERTa-V2-XLarge-MNLI, DeBERTa-V2-XXLarge-MNLI. The results of SST-2/QQP/QNLI/SQuADv2 will also be slightly improved when start from MNLI fine-tuned models, however, we only report the numbers fine-tuned from pretrained base models for those 4 tasks.\n* 2 To try the XXLarge model with HF transformers, we recommand using deepspeed as it's faster and saves memory.\n\n\nRun with 'Deepspeed',\n\n\nYou can also run with '--sharded\\_ddp'\n\n\nIf you find DeBERTa useful for your work, please cite the following paper:" ]
[ -0.05248786509037018, 0.10994835942983627, -0.005467510782182217, 0.04471215978264809, 0.09340216964483261, 0.04317917674779892, 0.019159981980919838, 0.13753889501094818, -0.021467166021466255, 0.06453929841518402, 0.049614597111940384, 0.04965709149837494, 0.04941791296005249, 0.08628464490175247, 0.002091106493026018, -0.21190305054187775, 0.06878571957349777, -0.03285076841711998, -0.09602037072181702, 0.04823007434606552, 0.11009711027145386, -0.11148811876773834, 0.05665912479162216, -0.007223171181976795, -0.037485551089048386, 0.030948560684919357, -0.001939448993653059, -0.006325188558548689, 0.10173650830984116, 0.0890059843659401, 0.055929794907569885, 0.12018140405416489, 0.06449942290782928, -0.1728953868150711, 0.03449626639485359, 0.08411046117544174, 0.03662298619747162, 0.06345472484827042, 0.07317227870225906, 0.053459346294403076, 0.03994544968008995, -0.11860650032758713, -0.013163705356419086, 0.04170243814587593, -0.022797267884016037, -0.17511224746704102, -0.09047289937734604, 0.006148539949208498, 0.0653214156627655, 0.03214886784553528, -0.013773745857179165, 0.09361980110406876, -0.08202794939279556, 0.040785059332847595, 0.19829455018043518, -0.2606689929962158, -0.027256496250629425, 0.06657399982213974, -0.017625262960791588, -0.0026975125074386597, -0.048790596425533295, 0.04051545634865761, -0.032310646027326584, 0.0077842650935053825, 0.07620234787464142, -0.013087994419038296, 0.00013908474647905678, 0.03725731372833252, -0.11067276448011398, -0.016884751617908478, -0.027969568967819214, -0.021628787741065025, -0.024520695209503174, -0.12658067047595978, -0.11892341822385788, -0.03543556481599808, -0.08263849467039108, -0.11719171702861786, 0.03533704951405525, -0.030529486015439034, 0.00935217086225748, -0.09885779023170471, -0.08550583571195602, -0.04299522563815117, -0.004897931590676308, 0.07045207917690277, 0.08120732009410858, -0.005597282666712999, 0.027444377541542053, 0.11959850788116455, -0.04764223098754883, -0.10345450043678284, -0.08126498013734818, -0.10083092004060745, -0.12172450125217438, 0.006879835855215788, -0.05531016364693642, -0.1157587468624115, 0.071403868496418, 0.1510118544101715, 0.11119783669710159, 0.11262045055627823, 0.026551665738224983, -0.027646029368042946, -0.037310343235731125, 0.16481496393680573, -0.08345787972211838, -0.054155103862285614, 0.10928547382354736, 0.0353289358317852, -0.003256253432482481, -0.013772360980510712, -0.03007045015692711, -0.07050500065088272, 0.09307830035686493, 0.08559049665927887, -0.03394328057765961, 0.06243322044610977, 0.021599024534225464, -0.048851024359464645, 0.11316288262605667, -0.15826861560344696, -0.024500321596860886, 0.020350584760308266, -0.048306889832019806, 0.035138726234436035, 0.0012692899908870459, -0.03387856483459473, -0.0789613425731659, 0.06624576449394226, -0.03179024159908295, -0.06369300186634064, -0.07520440965890884, -0.08526137471199036, -0.006864764727652073, 0.0464736633002758, -0.03995409235358238, -0.1344885230064392, -0.16391143202781677, 0.0059351311065256596, 0.027171818539500237, -0.027258755639195442, 0.03292093053460121, 0.0364459790289402, -0.015845920890569687, -0.0136746596544981, -0.011638876050710678, 0.039490312337875366, -0.02795335277915001, 0.0531499870121479, 0.0492517463862896, 0.07137174904346466, 0.02152988500893116, 0.012016968801617622, -0.05834314599633217, 0.023680612444877625, -0.23617972433567047, 0.11041712760925293, -0.11168105900287628, -0.07843003422021866, -0.09913486242294312, -0.05038474500179291, 
-0.028555143624544144, 0.004398890770971775, 0.029520373791456223, 0.0980367437005043, -0.15326951444149017, -0.042638346552848816, 0.132512167096138, -0.10517658293247223, -0.03401520103216171, 0.12262032181024551, -0.005840891040861607, -0.039171576499938965, 0.057403564453125, 0.0699680745601654, 0.2642686665058136, -0.15451079607009888, -0.058331143110990524, 0.03448306769132614, -0.028762033209204674, 0.022280052304267883, 0.03471627086400986, 0.07840503752231598, 0.07179949432611465, 0.05245451256632805, -0.13173367083072662, -0.014567672275006771, -0.029741134494543076, -0.053795505315065384, -0.03912070021033287, -0.02277691289782524, 0.062261808663606644, -0.027926919981837273, 0.037596527487039566, 0.04468581825494766, -0.09262204170227051, 0.02143244631588459, 0.13683822751045227, -0.05640053749084473, 0.005285004619508982, -0.13528308272361755, 0.02695525996387005, -0.009925128892064095, 0.05139392614364624, -0.13466285169124603, -0.16400644183158875, 0.04721270129084587, -0.20991376042366028, -0.0016785846091806889, 0.14498445391654968, 0.07670548558235168, 0.05799094960093498, -0.057894036173820496, 0.02159368060529232, -0.03444049879908562, -0.024430962279438972, -0.022354163229465485, -0.03988219425082207, -0.08141976594924927, -0.0012596710585057735, 0.14720800518989563, -0.10657649487257004, 0.013742748647928238, 0.08398186415433884, 0.14900992810726166, 0.025587650015950203, -0.015193620696663857, -0.019582772627472878, -0.03853331878781319, 0.04094058275222778, -0.07080685347318649, 0.0007934097666293383, 0.01058710552752018, -0.044090189039707184, 0.06802945584058762, -0.11370473355054855, 0.07483518868684769, 0.06561322510242462, 0.12442725151777267, 0.0015725580742582679, 0.02464107610285282, -0.032818105071783066, -0.03609941899776459, -0.04524993523955345, -0.06886724382638931, 0.026152141392230988, 0.07893149554729462, 0.11724889278411865, -0.08105127513408661, -0.05388174206018448, -0.0033652526326477528, 0.02398104965686798, 0.0077776541002094746, 0.04027590528130531, 0.03142917901277542, -0.10126753151416779, 0.008144508115947247, 0.03952757269144058, -0.024691684171557426, 0.10158303380012512, -0.028517084196209908, -0.0935220867395401, -0.002102983184158802, -0.01647019386291504, -0.004117236472666264, 0.02396971732378006, 0.10021873563528061, 0.07118392735719681, 0.04731997475028038, 0.0031030243262648582, 0.06354942917823792, -0.0956677570939064, 0.06526966392993927, 0.028451357036828995, -0.07633986324071884, 0.03165639191865921, 0.07824761420488358, 0.034009091556072235, 0.03349834308028221, 0.012114671990275383, 0.05815312638878822, 0.0126205924898386, -0.029167447239160538, -0.03784281015396118, 0.13161076605319977, -0.07466860860586166, -0.23739922046661377, -0.18560710549354553, -0.020509371533989906, -0.06202395632863045, -0.020378297194838524, 0.036769673228263855, -0.05554155260324478, -0.11999281495809555, -0.029191819950938225, 0.08500637114048004, 0.0551152229309082, -0.007675317581743002, -0.008838021196424961, 0.01482202298939228, 0.0767645314335823, -0.13429218530654907, 0.016754312440752983, 0.006903713569045067, -0.05598214641213417, 0.04259858652949333, 0.059820860624313354, 0.05197055637836456, 0.07039818167686462, -0.03718186542391777, 0.010996921919286251, 0.0061325025744736195, 0.21486301720142365, -0.03441797196865082, 0.059645939618349075, 0.22849783301353455, -0.002838515443727374, 0.07406511902809143, 0.07057229429483414, 0.028315775096416473, -0.04708164557814598, 0.02515488676726818, 0.0865270122885704, -0.007152179721742868, 
-0.2053491771221161, -0.0523148812353611, -0.04808277264237404, -0.06752238422632217, 0.0025697918608784676, 0.05295886471867561, -0.02004297450184822, 0.012696322984993458, -0.05761154741048813, 0.011503773741424084, 0.014804588630795479, 0.023530403152108192, 0.2057102769613266, 0.02799692004919052, 0.0746997743844986, -0.06259556114673615, -0.03975057974457741, 0.05752290040254593, 0.005493388045579195, 0.10943197458982468, -0.07146921753883362, 0.13935622572898865, 0.04467201977968216, 0.10450833290815353, 0.05114821717143059, 0.06168511137366295, 0.015564057976007462, -0.03046846203505993, -0.007380894385278225, -0.0877595916390419, -0.09284864366054535, 0.005342492368072271, 0.012680776417255402, -0.07192347943782806, -0.02180839329957962, 0.16562466323375702, 0.025622453540563583, 0.2184903621673584, -0.023376774042844772, -0.15911461412906647, -0.0217476524412632, -0.0019195827189832926, -0.048296231776475906, -0.04126065969467163, -0.005157396197319031, 0.0571865476667881, -0.07673674076795578, -0.016997020691633224, -0.07628607749938965, 0.05704515427350998, -0.07825256139039993, 0.06118202582001686, 0.05898342654109001, 0.15965154767036438, 0.05286767706274986, 0.08352065831422806, -0.15170086920261383, 0.06644449383020401, 0.046357207000255585, 0.0979679748415947, -0.04999963194131851, 0.046936556696891785, 0.06495996564626694, -0.023974411189556122, 0.09356680512428284, -0.039156340062618256, -0.10310575366020203, -0.11532759666442871, -0.1157265231013298, 0.07230009138584137, 0.14092355966567993, -0.0011153907980769873, 0.09969937056303024, -0.08713936060667038, -0.03793153539299965, -0.00916187185794115, 0.08623848110437393, -0.0856265053153038, -0.1486063301563263, 0.11341067403554916, -0.0356278270483017, 0.04122563451528549, -0.07965865731239319, -0.03276865556836128, -0.07004372030496597, 0.13218143582344055, -0.09084780514240265, -0.07014522701501846, -0.12851998209953308, 0.025070512667298317, 0.14649488031864166, -0.10682293772697449, 0.0694500133395195, 0.01810644567012787, 0.192337304353714, 0.020692704245448112, -0.14891543984413147, -0.02793753519654274, -0.06852144002914429, -0.1334713101387024, 0.053375471383333206, 0.10083025693893433, -0.04522741958498955, 0.03457140177488327, 0.016633057966828346, 0.003461731132119894, 0.03812361881136894, -0.0871719941496849, -0.053291842341423035, 0.031642813235521317, -0.04741973057389259, -0.028490344062447548, -0.0760255679488182, 0.056536704301834106, -0.0375211164355278, 0.03856245055794716, 0.1079038754105568, 0.2538522481918335, -0.07664060592651367, 0.08494670689105988, 0.14096105098724365, -0.00045794108882546425, -0.2816721200942993, -0.09079640358686447, 0.05522730201482773, 0.03392026200890541, -0.004510658327490091, -0.18727391958236694, 0.13304854929447174, 0.1299201101064682, -0.028250761330127716, -0.05205953121185303, -0.28024742007255554, -0.13687776029109955, 0.10703638195991516, 0.026750817894935608, -0.0096009885892272, -0.04904387146234512, -0.03379761055111885, -0.08941569924354553, -0.13432638347148895, 0.030866077169775963, -0.103683240711689, 0.1194305494427681, -0.01932978630065918, -0.00847755279392004, 0.027148155495524406, -0.05625760555267334, 0.1177000030875206, -0.04954642429947853, 0.07984868437051773, -0.007457729894667864, 0.10690005123615265, 0.02935246005654335, -0.07160631567239761, 0.08407698571681976, 0.022145647555589676, 0.07168105244636536, -0.04636939615011215, -0.041990119963884354, -0.037669964134693146, 0.013483725488185883, -0.0036553058307617903, -0.051872368901968, 
-0.035508863627910614, 0.05151776596903801, 0.043052129447460175, -0.04712517932057381, -0.02618650160729885, -0.017274418845772743, -0.06008946895599365, 0.12942689657211304, 0.05467183142900467, -0.04869813099503517, -0.09927435219287872, -0.020192112773656845, 0.012116116471588612, 0.0631769672036171, -0.010645243339240551, 0.12130718678236008, 0.09778590500354767, -0.005686087533831596, 0.06400512903928757, 0.0174678023904562, -0.036258455365896225, -0.03368772193789482, 0.058192551136016846, -0.1342407912015915, -0.23887553811073303, -0.03365613892674446, 0.0033499363344162703, -0.0824025422334671, -0.008862907066941261, 0.1475677490234375, -0.035012777894735336, -0.026465512812137604, 0.054771967232227325, 0.005821890663355589, 0.02404177375137806, 0.1899624764919281, 0.045333560556173325, 0.07043519616127014, -0.11974886059761047, 0.1125115379691124, 0.051046088337898254, -0.056233443319797516, -0.01958274096250534, 0.08773767203092575, -0.09649501740932465, -0.05990760773420334, -0.03373216837644577, 0.054946575313806534, 0.022463973611593246, -0.02342107892036438, -0.07997974008321762, -0.05560607090592384, 0.01349085196852684, 0.10498231649398804, 0.03161397948861122, 0.07918791472911835, -0.057261642068624496, 0.012957372702658176, -0.10870305448770523, 0.0915551409125328, 0.0546000637114048, 0.054642703384160995, -0.1312563419342041, 0.07490182667970657, -0.040271297097206116, 0.04325323551893234, -0.05855598300695419, 0.018446533009409904, -0.06295198202133179, -0.04241223633289337, -0.19051052629947662, 0.026791071519255638, -0.002484687604010105, -0.03227226808667183, 0.005562789272516966, 0.027616767212748528, -0.028694186359643936, 0.034056589007377625, -0.07711833715438843, -0.04772215709090233, -0.052961889654397964, 0.03711063414812088, -0.13134793937206268, 0.013920022174715996, 0.030782021582126617, -0.09536928683519363, 0.0935196727514267, 0.015912605449557304, 0.014630092307925224, 0.10151588171720505, -0.0014111794298514724, -0.11602471023797989, 0.0279350895434618, 0.08665609359741211, 0.026108605787158012, -0.060855619609355927, 0.008410155773162842, -0.025302859023213387, -0.039376724511384964, -0.028250757604837418, 0.04287896305322647, -0.11909769475460052, 0.05288703367114067, -0.0908200815320015, 0.049233585596084595, -0.06038779765367508, 0.008487991988658905, 0.07544833421707153, 0.07238227874040604, 0.11592923104763031, -0.08440879732370377, -0.002295872662216425, -0.14113588631153107, -0.005996012594550848, 0.0029736917931586504, -0.05111832171678543, -0.06929165124893188, -0.05692100524902344, 0.06236356124281883, -0.0374278761446476, 0.0928216278553009, -0.04955766722559929, -0.012813023291528225, 0.05942251905798912, -0.044647589325904846, -0.07193353772163391, 0.02002868987619877, 0.0810893177986145, 0.09356385469436646, 0.012770130299031734, 0.03364058583974838, -0.009982559829950333, -0.017038246616721153, 0.00570025946944952, 0.1734001189470291, 0.20598065853118896, 0.05310660973191261, 0.09354518353939056, 0.01848558895289898, -0.04549426585435867, -0.0879007875919342, 0.12807253003120422, -0.12106625735759735, 0.08555696159601212, -0.0651877224445343, 0.08981695771217346, 0.10637377947568893, -0.11087625473737717, 0.06504566222429276, -0.04535943642258644, -0.056134458631277084, -0.14657913148403168, -0.011908870190382004, -0.09575273841619492, -0.07153306156396866, -0.006902842782437801, -0.0933193638920784, 0.035647906363010406, 0.051662761718034744, 0.041779566556215286, -0.018097981810569763, 0.18661251664161682, -0.04402817040681839, 
-0.06023552641272545, 0.012558347545564175, -0.0015414515510201454, -0.032215557992458344, 0.09013161808252335, -0.044651567935943604, 0.03691178187727928, 0.06949005275964737, 0.05469094216823578, 0.006606907118111849, 0.04984856769442558, 0.013365286402404308, -0.07340014725923538, -0.07678894698619843, -0.005060891155153513, 0.03026263229548931, 0.02459152415394783, 0.057619836181402206, -0.008795030415058136, -0.04754363000392914, -0.038990963250398636, 0.2243104726076126, -0.043232716619968414, -0.15351690351963043, -0.11050417274236679, 0.20992766320705414, 0.09460540115833282, 0.07465030252933502, 0.020717481151223183, -0.10164400190114975, -0.035813771188259125, 0.16164207458496094, 0.10229302197694778, -0.0820680484175682, -0.011812807992100716, 0.033906951546669006, -0.001267618965357542, -0.0699521079659462, 0.1539791375398636, 0.05177602916955948, 0.17261435091495514, -0.039768241345882416, 0.026755133643746376, 0.009156722575426102, -0.052469197660684586, -0.06932633370161057, 0.19067862629890442, -0.03324173018336296, 0.026992594823241234, -0.030103102326393127, 0.0008431829628534615, -0.015995625406503677, -0.17456240952014923, -0.07776188105344772, -0.03505165874958038, -0.1397048532962799, -0.010166577063500881, -0.03467061370611191, -0.009251022711396217, 0.0576566644012928, -0.018798671662807465, -0.0060266125947237015, 0.15200944244861603, -0.0008376374607905746, -0.043094802647829056, -0.0009960895404219627, 0.12062244862318039, 0.06621508300304413, 0.17722442746162415, 0.031548481434583664, 0.027270294725894928, 0.07912461459636688, -0.030707918107509613, -0.12157728523015976, 0.03210127353668213, 0.04177055135369301, -0.21426956355571747, 0.021677173674106598, 0.15935364365577698, -0.007145410403609276, 0.039672497659921646, 0.020146293565630913, -0.12697573006153107, -0.017480501905083656, 0.039941493421792984, -0.033559828996658325, -0.032726459205150604, 0.03951573744416237, -0.03302349895238876, 0.09040691703557968, 0.19200623035430908, 0.01607685722410679, 0.056299492716789246, -0.08590008318424225, 0.04563035070896149, 0.03404022753238678, 0.08396018296480179, -0.025193924084305763, -0.14727675914764404, 0.03707307204604149, -0.019828327000141144, -0.013500611297786236, -0.17609667778015137, -0.09780727326869965, 0.014040857553482056, 0.008958694525063038, -0.009530560113489628, 0.13986051082611084, 0.04514285922050476, 0.05398614704608917, 0.01974545046687126, -0.13092140853405, -0.048084113746881485, 0.02893124148249626, -0.1576765477657318, -0.057555731385946274 ]
null
null
transformers
## DeBERTa: Decoding-enhanced BERT with Disentangled Attention

[DeBERTa](https://arxiv.org/abs/2006.03654) improves the BERT and RoBERTa models using disentangled attention and an enhanced mask decoder. It outperforms BERT and RoBERTa on the majority of NLU tasks with 80GB training data.

Please check the [official repository](https://github.com/microsoft/DeBERTa) for more details and updates.

This is the DeBERTa V2 xxlarge model with 48 layers and a hidden size of 1536. The total parameters are 1.5B and it is trained with 160GB raw data.

### Fine-tuning on NLU tasks

We present the dev results on SQuAD 1.1/2.0 and several GLUE benchmark tasks.

| Model | SQuAD 1.1 | SQuAD 2.0 | MNLI-m/mm | SST-2 | QNLI | CoLA | RTE | MRPC | QQP |STS-B |
|---------------------------|-----------|-----------|-------------|-------|------|------|--------|-------|-------|------|
| | F1/EM | F1/EM | Acc | Acc | Acc | MCC | Acc |Acc/F1 |Acc/F1 |P/S |
| BERT-Large | 90.9/84.1 | 81.8/79.0 | 86.6/- | 93.2 | 92.3 | 60.6 | 70.4 | 88.0/- | 91.3/- |90.0/- |
| RoBERTa-Large | 94.6/88.9 | 89.4/86.5 | 90.2/- | 96.4 | 93.9 | 68.0 | 86.6 | 90.9/- | 92.2/- |92.4/- |
| XLNet-Large | 95.1/89.7 | 90.6/87.9 | 90.8/- | 97.0 | 94.9 | 69.0 | 85.9 | 90.8/- | 92.3/- |92.5/- |
| [DeBERTa-Large](https://huggingface.co/microsoft/deberta-large)<sup>1</sup> | 95.5/90.1 | 90.7/88.0 | 91.3/91.1| 96.5|95.3| 69.5| 91.0| 92.6/94.6| 92.3/- |92.8/92.5 |
| [DeBERTa-XLarge](https://huggingface.co/microsoft/deberta-xlarge)<sup>1</sup> | -/- | -/- | 91.5/91.2| 97.0 | - | - | 93.1 | 92.1/94.3 | - |92.9/92.7|
| [DeBERTa-V2-XLarge](https://huggingface.co/microsoft/deberta-v2-xlarge)<sup>1</sup>|95.8/90.8| 91.4/88.9|91.7/91.6| **97.5**| 95.8|71.1|**93.9**|92.0/94.2|92.3/89.8|92.9/92.9|
|**[DeBERTa-V2-XXLarge](https://huggingface.co/microsoft/deberta-v2-xxlarge)<sup>1,2</sup>**|**96.1/91.4**|**92.2/89.7**|**91.7/91.9**|97.2|**96.0**|**72.0**| 93.5| **93.1/94.9**|**92.7/90.3** |**93.2/93.1** |

--------

#### Notes.
- <sup>1</sup> Following RoBERTa, for RTE, MRPC, STS-B, we fine-tune the tasks based on [DeBERTa-Large-MNLI](https://huggingface.co/microsoft/deberta-large-mnli), [DeBERTa-XLarge-MNLI](https://huggingface.co/microsoft/deberta-xlarge-mnli), [DeBERTa-V2-XLarge-MNLI](https://huggingface.co/microsoft/deberta-v2-xlarge-mnli), [DeBERTa-V2-XXLarge-MNLI](https://huggingface.co/microsoft/deberta-v2-xxlarge-mnli). The results of SST-2/QQP/QNLI/SQuADv2 will also be slightly improved when starting from MNLI fine-tuned models; however, we only report the numbers fine-tuned from pretrained base models for those 4 tasks.
- <sup>2</sup> To try the **XXLarge** model with **[HF transformers](https://huggingface.co/transformers/main_classes/trainer.html)**, we recommend using **deepspeed** as it's faster and saves memory.
Run with `Deepspeed`,

```bash
pip install datasets
pip install deepspeed

# Download the deepspeed config file
wget https://huggingface.co/microsoft/deberta-v2-xxlarge/resolve/main/ds_config.json -O ds_config.json

export TASK_NAME=mnli
output_dir="ds_results"
num_gpus=8
batch_size=8
python -m torch.distributed.launch --nproc_per_node=${num_gpus} \
  run_glue.py \
  --model_name_or_path microsoft/deberta-v2-xxlarge \
  --task_name $TASK_NAME \
  --do_train \
  --do_eval \
  --max_seq_length 256 \
  --per_device_train_batch_size ${batch_size} \
  --learning_rate 3e-6 \
  --num_train_epochs 3 \
  --output_dir $output_dir \
  --overwrite_output_dir \
  --logging_steps 10 \
  --logging_dir $output_dir \
  --deepspeed ds_config.json
```

You can also run with `--sharded_ddp`

```bash
cd transformers/examples/text-classification/
export TASK_NAME=mnli
python -m torch.distributed.launch --nproc_per_node=8 run_glue.py --model_name_or_path microsoft/deberta-v2-xxlarge \
  --task_name $TASK_NAME --do_train --do_eval --max_seq_length 256 --per_device_train_batch_size 8 \
  --learning_rate 3e-6 --num_train_epochs 3 --output_dir /tmp/$TASK_NAME/ --overwrite_output_dir --sharded_ddp --fp16
```

### Citation

If you find DeBERTa useful for your work, please cite the following paper:

```latex
@inproceedings{he2021deberta,
  title={DEBERTA: DECODING-ENHANCED BERT WITH DISENTANGLED ATTENTION},
  author={Pengcheng He and Xiaodong Liu and Jianfeng Gao and Weizhu Chen},
  booktitle={International Conference on Learning Representations},
  year={2021},
  url={https://openreview.net/forum?id=XPZIaotutsD}
}
```
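For a quick inference sanity check without Deepspeed, a minimal sketch is shown below. It is not part of the original card: the checkpoint id `NDugar/v3large-2epoch` and the `zero-shot-classification` task are taken from this record's metadata, and the sketch assumes the checkpoint exposes an MNLI-style NLI head, which is what the zero-shot pipeline expects.

```python
from transformers import pipeline

# Minimal zero-shot sketch (assumption: the checkpoint has an MNLI-style
# entailment head, as required by the zero-shot-classification pipeline).
classifier = pipeline("zero-shot-classification", model="NDugar/v3large-2epoch")

result = classifier(
    "DeBERTa improves BERT and RoBERTa with disentangled attention.",
    candidate_labels=["natural language processing", "computer vision", "finance"],
)
# The pipeline returns labels sorted by score; print the top prediction.
print(result["labels"][0], round(result["scores"][0], 3))
```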
{"language": "en", "license": "mit", "tags": ["deberta-v3", "deberta-v2`", "deberta-mnli"], "tasks": "mnli", "thumbnail": "https://huggingface.co/front/thumbnails/microsoft.png", "pipeline_tag": "zero-shot-classification"}
zero-shot-classification
NDugar/v3large-2epoch
[ "transformers", "pytorch", "safetensors", "deberta-v2", "text-classification", "deberta-v3", "deberta-v2`", "deberta-mnli", "zero-shot-classification", "en", "arxiv:2006.03654", "license:mit", "autotrain_compatible", "endpoints_compatible", "region:us" ]
2022-03-02T23:29:04+00:00
[ "2006.03654" ]
[ "en" ]
TAGS #transformers #pytorch #safetensors #deberta-v2 #text-classification #deberta-v3 #deberta-v2` #deberta-mnli #zero-shot-classification #en #arxiv-2006.03654 #license-mit #autotrain_compatible #endpoints_compatible #region-us
DeBERTa: Decoding-enhanced BERT with Disentangled Attention ----------------------------------------------------------- DeBERTa improves the BERT and RoBERTa models using disentangled attention and enhanced mask decoder. It outperforms BERT and RoBERTa on majority of NLU tasks with 80GB training data. Please check the official repository for more details and updates. This is the DeBERTa V2 xxlarge model with 48 layers, 1536 hidden size. The total parameters are 1.5B and it is trained with 160GB raw data. ### Fine-tuning on NLU tasks We present the dev results on SQuAD 1.1/2.0 and several GLUE benchmark tasks. --- #### Notes. * 1 Following RoBERTa, for RTE, MRPC, STS-B, we fine-tune the tasks based on DeBERTa-Large-MNLI, DeBERTa-XLarge-MNLI, DeBERTa-V2-XLarge-MNLI, DeBERTa-V2-XXLarge-MNLI. The results of SST-2/QQP/QNLI/SQuADv2 will also be slightly improved when start from MNLI fine-tuned models, however, we only report the numbers fine-tuned from pretrained base models for those 4 tasks. * 2 To try the XXLarge model with HF transformers, we recommand using deepspeed as it's faster and saves memory. Run with 'Deepspeed', You can also run with '--sharded\_ddp' If you find DeBERTa useful for your work, please cite the following paper:
[ "### Fine-tuning on NLU tasks\n\n\nWe present the dev results on SQuAD 1.1/2.0 and several GLUE benchmark tasks.\n\n\n\n\n\n---", "#### Notes.\n\n\n* 1 Following RoBERTa, for RTE, MRPC, STS-B, we fine-tune the tasks based on DeBERTa-Large-MNLI, DeBERTa-XLarge-MNLI, DeBERTa-V2-XLarge-MNLI, DeBERTa-V2-XXLarge-MNLI. The results of SST-2/QQP/QNLI/SQuADv2 will also be slightly improved when start from MNLI fine-tuned models, however, we only report the numbers fine-tuned from pretrained base models for those 4 tasks.\n* 2 To try the XXLarge model with HF transformers, we recommand using deepspeed as it's faster and saves memory.\n\n\nRun with 'Deepspeed',\n\n\nYou can also run with '--sharded\\_ddp'\n\n\nIf you find DeBERTa useful for your work, please cite the following paper:" ]
[ "TAGS\n#transformers #pytorch #safetensors #deberta-v2 #text-classification #deberta-v3 #deberta-v2` #deberta-mnli #zero-shot-classification #en #arxiv-2006.03654 #license-mit #autotrain_compatible #endpoints_compatible #region-us \n", "### Fine-tuning on NLU tasks\n\n\nWe present the dev results on SQuAD 1.1/2.0 and several GLUE benchmark tasks.\n\n\n\n\n\n---", "#### Notes.\n\n\n* 1 Following RoBERTa, for RTE, MRPC, STS-B, we fine-tune the tasks based on DeBERTa-Large-MNLI, DeBERTa-XLarge-MNLI, DeBERTa-V2-XLarge-MNLI, DeBERTa-V2-XXLarge-MNLI. The results of SST-2/QQP/QNLI/SQuADv2 will also be slightly improved when start from MNLI fine-tuned models, however, we only report the numbers fine-tuned from pretrained base models for those 4 tasks.\n* 2 To try the XXLarge model with HF transformers, we recommand using deepspeed as it's faster and saves memory.\n\n\nRun with 'Deepspeed',\n\n\nYou can also run with '--sharded\\_ddp'\n\n\nIf you find DeBERTa useful for your work, please cite the following paper:" ]
[ 92, 32, 215 ]
[ "passage: TAGS\n#transformers #pytorch #safetensors #deberta-v2 #text-classification #deberta-v3 #deberta-v2` #deberta-mnli #zero-shot-classification #en #arxiv-2006.03654 #license-mit #autotrain_compatible #endpoints_compatible #region-us \n### Fine-tuning on NLU tasks\n\n\nWe present the dev results on SQuAD 1.1/2.0 and several GLUE benchmark tasks.\n\n\n\n\n\n---#### Notes.\n\n\n* 1 Following RoBERTa, for RTE, MRPC, STS-B, we fine-tune the tasks based on DeBERTa-Large-MNLI, DeBERTa-XLarge-MNLI, DeBERTa-V2-XLarge-MNLI, DeBERTa-V2-XXLarge-MNLI. The results of SST-2/QQP/QNLI/SQuADv2 will also be slightly improved when start from MNLI fine-tuned models, however, we only report the numbers fine-tuned from pretrained base models for those 4 tasks.\n* 2 To try the XXLarge model with HF transformers, we recommand using deepspeed as it's faster and saves memory.\n\n\nRun with 'Deepspeed',\n\n\nYou can also run with '--sharded\\_ddp'\n\n\nIf you find DeBERTa useful for your work, please cite the following paper:" ]
[ -0.05248786509037018, 0.10994835942983627, -0.005467510782182217, 0.04471215978264809, 0.09340216964483261, 0.04317917674779892, 0.019159981980919838, 0.13753889501094818, -0.021467166021466255, 0.06453929841518402, 0.049614597111940384, 0.04965709149837494, 0.04941791296005249, 0.08628464490175247, 0.002091106493026018, -0.21190305054187775, 0.06878571957349777, -0.03285076841711998, -0.09602037072181702, 0.04823007434606552, 0.11009711027145386, -0.11148811876773834, 0.05665912479162216, -0.007223171181976795, -0.037485551089048386, 0.030948560684919357, -0.001939448993653059, -0.006325188558548689, 0.10173650830984116, 0.0890059843659401, 0.055929794907569885, 0.12018140405416489, 0.06449942290782928, -0.1728953868150711, 0.03449626639485359, 0.08411046117544174, 0.03662298619747162, 0.06345472484827042, 0.07317227870225906, 0.053459346294403076, 0.03994544968008995, -0.11860650032758713, -0.013163705356419086, 0.04170243814587593, -0.022797267884016037, -0.17511224746704102, -0.09047289937734604, 0.006148539949208498, 0.0653214156627655, 0.03214886784553528, -0.013773745857179165, 0.09361980110406876, -0.08202794939279556, 0.040785059332847595, 0.19829455018043518, -0.2606689929962158, -0.027256496250629425, 0.06657399982213974, -0.017625262960791588, -0.0026975125074386597, -0.048790596425533295, 0.04051545634865761, -0.032310646027326584, 0.0077842650935053825, 0.07620234787464142, -0.013087994419038296, 0.00013908474647905678, 0.03725731372833252, -0.11067276448011398, -0.016884751617908478, -0.027969568967819214, -0.021628787741065025, -0.024520695209503174, -0.12658067047595978, -0.11892341822385788, -0.03543556481599808, -0.08263849467039108, -0.11719171702861786, 0.03533704951405525, -0.030529486015439034, 0.00935217086225748, -0.09885779023170471, -0.08550583571195602, -0.04299522563815117, -0.004897931590676308, 0.07045207917690277, 0.08120732009410858, -0.005597282666712999, 0.027444377541542053, 0.11959850788116455, -0.04764223098754883, -0.10345450043678284, -0.08126498013734818, -0.10083092004060745, -0.12172450125217438, 0.006879835855215788, -0.05531016364693642, -0.1157587468624115, 0.071403868496418, 0.1510118544101715, 0.11119783669710159, 0.11262045055627823, 0.026551665738224983, -0.027646029368042946, -0.037310343235731125, 0.16481496393680573, -0.08345787972211838, -0.054155103862285614, 0.10928547382354736, 0.0353289358317852, -0.003256253432482481, -0.013772360980510712, -0.03007045015692711, -0.07050500065088272, 0.09307830035686493, 0.08559049665927887, -0.03394328057765961, 0.06243322044610977, 0.021599024534225464, -0.048851024359464645, 0.11316288262605667, -0.15826861560344696, -0.024500321596860886, 0.020350584760308266, -0.048306889832019806, 0.035138726234436035, 0.0012692899908870459, -0.03387856483459473, -0.0789613425731659, 0.06624576449394226, -0.03179024159908295, -0.06369300186634064, -0.07520440965890884, -0.08526137471199036, -0.006864764727652073, 0.0464736633002758, -0.03995409235358238, -0.1344885230064392, -0.16391143202781677, 0.0059351311065256596, 0.027171818539500237, -0.027258755639195442, 0.03292093053460121, 0.0364459790289402, -0.015845920890569687, -0.0136746596544981, -0.011638876050710678, 0.039490312337875366, -0.02795335277915001, 0.0531499870121479, 0.0492517463862896, 0.07137174904346466, 0.02152988500893116, 0.012016968801617622, -0.05834314599633217, 0.023680612444877625, -0.23617972433567047, 0.11041712760925293, -0.11168105900287628, -0.07843003422021866, -0.09913486242294312, -0.05038474500179291, 
-0.028555143624544144, 0.004398890770971775, 0.029520373791456223, 0.0980367437005043, -0.15326951444149017, -0.042638346552848816, 0.132512167096138, -0.10517658293247223, -0.03401520103216171, 0.12262032181024551, -0.005840891040861607, -0.039171576499938965, 0.057403564453125, 0.0699680745601654, 0.2642686665058136, -0.15451079607009888, -0.058331143110990524, 0.03448306769132614, -0.028762033209204674, 0.022280052304267883, 0.03471627086400986, 0.07840503752231598, 0.07179949432611465, 0.05245451256632805, -0.13173367083072662, -0.014567672275006771, -0.029741134494543076, -0.053795505315065384, -0.03912070021033287, -0.02277691289782524, 0.062261808663606644, -0.027926919981837273, 0.037596527487039566, 0.04468581825494766, -0.09262204170227051, 0.02143244631588459, 0.13683822751045227, -0.05640053749084473, 0.005285004619508982, -0.13528308272361755, 0.02695525996387005, -0.009925128892064095, 0.05139392614364624, -0.13466285169124603, -0.16400644183158875, 0.04721270129084587, -0.20991376042366028, -0.0016785846091806889, 0.14498445391654968, 0.07670548558235168, 0.05799094960093498, -0.057894036173820496, 0.02159368060529232, -0.03444049879908562, -0.024430962279438972, -0.022354163229465485, -0.03988219425082207, -0.08141976594924927, -0.0012596710585057735, 0.14720800518989563, -0.10657649487257004, 0.013742748647928238, 0.08398186415433884, 0.14900992810726166, 0.025587650015950203, -0.015193620696663857, -0.019582772627472878, -0.03853331878781319, 0.04094058275222778, -0.07080685347318649, 0.0007934097666293383, 0.01058710552752018, -0.044090189039707184, 0.06802945584058762, -0.11370473355054855, 0.07483518868684769, 0.06561322510242462, 0.12442725151777267, 0.0015725580742582679, 0.02464107610285282, -0.032818105071783066, -0.03609941899776459, -0.04524993523955345, -0.06886724382638931, 0.026152141392230988, 0.07893149554729462, 0.11724889278411865, -0.08105127513408661, -0.05388174206018448, -0.0033652526326477528, 0.02398104965686798, 0.0077776541002094746, 0.04027590528130531, 0.03142917901277542, -0.10126753151416779, 0.008144508115947247, 0.03952757269144058, -0.024691684171557426, 0.10158303380012512, -0.028517084196209908, -0.0935220867395401, -0.002102983184158802, -0.01647019386291504, -0.004117236472666264, 0.02396971732378006, 0.10021873563528061, 0.07118392735719681, 0.04731997475028038, 0.0031030243262648582, 0.06354942917823792, -0.0956677570939064, 0.06526966392993927, 0.028451357036828995, -0.07633986324071884, 0.03165639191865921, 0.07824761420488358, 0.034009091556072235, 0.03349834308028221, 0.012114671990275383, 0.05815312638878822, 0.0126205924898386, -0.029167447239160538, -0.03784281015396118, 0.13161076605319977, -0.07466860860586166, -0.23739922046661377, -0.18560710549354553, -0.020509371533989906, -0.06202395632863045, -0.020378297194838524, 0.036769673228263855, -0.05554155260324478, -0.11999281495809555, -0.029191819950938225, 0.08500637114048004, 0.0551152229309082, -0.007675317581743002, -0.008838021196424961, 0.01482202298939228, 0.0767645314335823, -0.13429218530654907, 0.016754312440752983, 0.006903713569045067, -0.05598214641213417, 0.04259858652949333, 0.059820860624313354, 0.05197055637836456, 0.07039818167686462, -0.03718186542391777, 0.010996921919286251, 0.0061325025744736195, 0.21486301720142365, -0.03441797196865082, 0.059645939618349075, 0.22849783301353455, -0.002838515443727374, 0.07406511902809143, 0.07057229429483414, 0.028315775096416473, -0.04708164557814598, 0.02515488676726818, 0.0865270122885704, -0.007152179721742868, 
-0.2053491771221161, -0.0523148812353611, -0.04808277264237404, -0.06752238422632217, 0.0025697918608784676, 0.05295886471867561, -0.02004297450184822, 0.012696322984993458, -0.05761154741048813, 0.011503773741424084, 0.014804588630795479, 0.023530403152108192, 0.2057102769613266, 0.02799692004919052, 0.0746997743844986, -0.06259556114673615, -0.03975057974457741, 0.05752290040254593, 0.005493388045579195, 0.10943197458982468, -0.07146921753883362, 0.13935622572898865, 0.04467201977968216, 0.10450833290815353, 0.05114821717143059, 0.06168511137366295, 0.015564057976007462, -0.03046846203505993, -0.007380894385278225, -0.0877595916390419, -0.09284864366054535, 0.005342492368072271, 0.012680776417255402, -0.07192347943782806, -0.02180839329957962, 0.16562466323375702, 0.025622453540563583, 0.2184903621673584, -0.023376774042844772, -0.15911461412906647, -0.0217476524412632, -0.0019195827189832926, -0.048296231776475906, -0.04126065969467163, -0.005157396197319031, 0.0571865476667881, -0.07673674076795578, -0.016997020691633224, -0.07628607749938965, 0.05704515427350998, -0.07825256139039993, 0.06118202582001686, 0.05898342654109001, 0.15965154767036438, 0.05286767706274986, 0.08352065831422806, -0.15170086920261383, 0.06644449383020401, 0.046357207000255585, 0.0979679748415947, -0.04999963194131851, 0.046936556696891785, 0.06495996564626694, -0.023974411189556122, 0.09356680512428284, -0.039156340062618256, -0.10310575366020203, -0.11532759666442871, -0.1157265231013298, 0.07230009138584137, 0.14092355966567993, -0.0011153907980769873, 0.09969937056303024, -0.08713936060667038, -0.03793153539299965, -0.00916187185794115, 0.08623848110437393, -0.0856265053153038, -0.1486063301563263, 0.11341067403554916, -0.0356278270483017, 0.04122563451528549, -0.07965865731239319, -0.03276865556836128, -0.07004372030496597, 0.13218143582344055, -0.09084780514240265, -0.07014522701501846, -0.12851998209953308, 0.025070512667298317, 0.14649488031864166, -0.10682293772697449, 0.0694500133395195, 0.01810644567012787, 0.192337304353714, 0.020692704245448112, -0.14891543984413147, -0.02793753519654274, -0.06852144002914429, -0.1334713101387024, 0.053375471383333206, 0.10083025693893433, -0.04522741958498955, 0.03457140177488327, 0.016633057966828346, 0.003461731132119894, 0.03812361881136894, -0.0871719941496849, -0.053291842341423035, 0.031642813235521317, -0.04741973057389259, -0.028490344062447548, -0.0760255679488182, 0.056536704301834106, -0.0375211164355278, 0.03856245055794716, 0.1079038754105568, 0.2538522481918335, -0.07664060592651367, 0.08494670689105988, 0.14096105098724365, -0.00045794108882546425, -0.2816721200942993, -0.09079640358686447, 0.05522730201482773, 0.03392026200890541, -0.004510658327490091, -0.18727391958236694, 0.13304854929447174, 0.1299201101064682, -0.028250761330127716, -0.05205953121185303, -0.28024742007255554, -0.13687776029109955, 0.10703638195991516, 0.026750817894935608, -0.0096009885892272, -0.04904387146234512, -0.03379761055111885, -0.08941569924354553, -0.13432638347148895, 0.030866077169775963, -0.103683240711689, 0.1194305494427681, -0.01932978630065918, -0.00847755279392004, 0.027148155495524406, -0.05625760555267334, 0.1177000030875206, -0.04954642429947853, 0.07984868437051773, -0.007457729894667864, 0.10690005123615265, 0.02935246005654335, -0.07160631567239761, 0.08407698571681976, 0.022145647555589676, 0.07168105244636536, -0.04636939615011215, -0.041990119963884354, -0.037669964134693146, 0.013483725488185883, -0.0036553058307617903, -0.051872368901968, 
-0.035508863627910614, 0.05151776596903801, 0.043052129447460175, -0.04712517932057381, -0.02618650160729885, -0.017274418845772743, -0.06008946895599365, 0.12942689657211304, 0.05467183142900467, -0.04869813099503517, -0.09927435219287872, -0.020192112773656845, 0.012116116471588612, 0.0631769672036171, -0.010645243339240551, 0.12130718678236008, 0.09778590500354767, -0.005686087533831596, 0.06400512903928757, 0.0174678023904562, -0.036258455365896225, -0.03368772193789482, 0.058192551136016846, -0.1342407912015915, -0.23887553811073303, -0.03365613892674446, 0.0033499363344162703, -0.0824025422334671, -0.008862907066941261, 0.1475677490234375, -0.035012777894735336, -0.026465512812137604, 0.054771967232227325, 0.005821890663355589, 0.02404177375137806, 0.1899624764919281, 0.045333560556173325, 0.07043519616127014, -0.11974886059761047, 0.1125115379691124, 0.051046088337898254, -0.056233443319797516, -0.01958274096250534, 0.08773767203092575, -0.09649501740932465, -0.05990760773420334, -0.03373216837644577, 0.054946575313806534, 0.022463973611593246, -0.02342107892036438, -0.07997974008321762, -0.05560607090592384, 0.01349085196852684, 0.10498231649398804, 0.03161397948861122, 0.07918791472911835, -0.057261642068624496, 0.012957372702658176, -0.10870305448770523, 0.0915551409125328, 0.0546000637114048, 0.054642703384160995, -0.1312563419342041, 0.07490182667970657, -0.040271297097206116, 0.04325323551893234, -0.05855598300695419, 0.018446533009409904, -0.06295198202133179, -0.04241223633289337, -0.19051052629947662, 0.026791071519255638, -0.002484687604010105, -0.03227226808667183, 0.005562789272516966, 0.027616767212748528, -0.028694186359643936, 0.034056589007377625, -0.07711833715438843, -0.04772215709090233, -0.052961889654397964, 0.03711063414812088, -0.13134793937206268, 0.013920022174715996, 0.030782021582126617, -0.09536928683519363, 0.0935196727514267, 0.015912605449557304, 0.014630092307925224, 0.10151588171720505, -0.0014111794298514724, -0.11602471023797989, 0.0279350895434618, 0.08665609359741211, 0.026108605787158012, -0.060855619609355927, 0.008410155773162842, -0.025302859023213387, -0.039376724511384964, -0.028250757604837418, 0.04287896305322647, -0.11909769475460052, 0.05288703367114067, -0.0908200815320015, 0.049233585596084595, -0.06038779765367508, 0.008487991988658905, 0.07544833421707153, 0.07238227874040604, 0.11592923104763031, -0.08440879732370377, -0.002295872662216425, -0.14113588631153107, -0.005996012594550848, 0.0029736917931586504, -0.05111832171678543, -0.06929165124893188, -0.05692100524902344, 0.06236356124281883, -0.0374278761446476, 0.0928216278553009, -0.04955766722559929, -0.012813023291528225, 0.05942251905798912, -0.044647589325904846, -0.07193353772163391, 0.02002868987619877, 0.0810893177986145, 0.09356385469436646, 0.012770130299031734, 0.03364058583974838, -0.009982559829950333, -0.017038246616721153, 0.00570025946944952, 0.1734001189470291, 0.20598065853118896, 0.05310660973191261, 0.09354518353939056, 0.01848558895289898, -0.04549426585435867, -0.0879007875919342, 0.12807253003120422, -0.12106625735759735, 0.08555696159601212, -0.0651877224445343, 0.08981695771217346, 0.10637377947568893, -0.11087625473737717, 0.06504566222429276, -0.04535943642258644, -0.056134458631277084, -0.14657913148403168, -0.011908870190382004, -0.09575273841619492, -0.07153306156396866, -0.006902842782437801, -0.0933193638920784, 0.035647906363010406, 0.051662761718034744, 0.041779566556215286, -0.018097981810569763, 0.18661251664161682, -0.04402817040681839, 
-0.06023552641272545, 0.012558347545564175, -0.0015414515510201454, -0.032215557992458344, 0.09013161808252335, -0.044651567935943604, 0.03691178187727928, 0.06949005275964737, 0.05469094216823578, 0.006606907118111849, 0.04984856769442558, 0.013365286402404308, -0.07340014725923538, -0.07678894698619843, -0.005060891155153513, 0.03026263229548931, 0.02459152415394783, 0.057619836181402206, -0.008795030415058136, -0.04754363000392914, -0.038990963250398636, 0.2243104726076126, -0.043232716619968414, -0.15351690351963043, -0.11050417274236679, 0.20992766320705414, 0.09460540115833282, 0.07465030252933502, 0.020717481151223183, -0.10164400190114975, -0.035813771188259125, 0.16164207458496094, 0.10229302197694778, -0.0820680484175682, -0.011812807992100716, 0.033906951546669006, -0.001267618965357542, -0.0699521079659462, 0.1539791375398636, 0.05177602916955948, 0.17261435091495514, -0.039768241345882416, 0.026755133643746376, 0.009156722575426102, -0.052469197660684586, -0.06932633370161057, 0.19067862629890442, -0.03324173018336296, 0.026992594823241234, -0.030103102326393127, 0.0008431829628534615, -0.015995625406503677, -0.17456240952014923, -0.07776188105344772, -0.03505165874958038, -0.1397048532962799, -0.010166577063500881, -0.03467061370611191, -0.009251022711396217, 0.0576566644012928, -0.018798671662807465, -0.0060266125947237015, 0.15200944244861603, -0.0008376374607905746, -0.043094802647829056, -0.0009960895404219627, 0.12062244862318039, 0.06621508300304413, 0.17722442746162415, 0.031548481434583664, 0.027270294725894928, 0.07912461459636688, -0.030707918107509613, -0.12157728523015976, 0.03210127353668213, 0.04177055135369301, -0.21426956355571747, 0.021677173674106598, 0.15935364365577698, -0.007145410403609276, 0.039672497659921646, 0.020146293565630913, -0.12697573006153107, -0.017480501905083656, 0.039941493421792984, -0.033559828996658325, -0.032726459205150604, 0.03951573744416237, -0.03302349895238876, 0.09040691703557968, 0.19200623035430908, 0.01607685722410679, 0.056299492716789246, -0.08590008318424225, 0.04563035070896149, 0.03404022753238678, 0.08396018296480179, -0.025193924084305763, -0.14727675914764404, 0.03707307204604149, -0.019828327000141144, -0.013500611297786236, -0.17609667778015137, -0.09780727326869965, 0.014040857553482056, 0.008958694525063038, -0.009530560113489628, 0.13986051082611084, 0.04514285922050476, 0.05398614704608917, 0.01974545046687126, -0.13092140853405, -0.048084113746881485, 0.02893124148249626, -0.1576765477657318, -0.057555731385946274 ]
null
null
transformers
# MS-BERT

## Introduction

This repository provides the code and models for MS-BERT. MS-BERT was pre-trained on notes from neurological examinations of Multiple Sclerosis (MS) patients at St. Michael's Hospital in Toronto, Canada.

## Data

The dataset contained approximately 75,000 clinical notes for about 5,000 patients, totaling over 35.7 million words. These notes were collected from patients who visited St. Michael's Hospital MS Clinic between 2015 and 2019. The notes contained a variety of information pertaining to a neurological exam. For example, a note can contain information on the patient's condition, their progress over time, and diagnosis. The gender split within the dataset was observed to be 72% female and 28% male ([which reflects the natural discrepancy seen in MS][1]). Further sections describe how MS-BERT was pre-trained through the use of these clinically relevant and rich neurological notes.

## Data pre-processing

The data was pre-processed to remove any identifying information. This includes information on: patient names, doctor names, hospital names, patient identification numbers, phone numbers, addresses, and time. In order to de-identify the information, we used a curated database that contained patient and doctor information. This curated database was paired with regular expressions to find and remove any identifying pieces of information. Each of these identifiers was replaced with a specific token. These tokens were chosen based on three criteria: (1) they belong to the current BERT vocab, (2) they have relatively the same semantic meaning as the word they are replacing, and (3) the token is not found in the original unprocessed dataset. The replacements that met the criteria above were as follows:

Female first names -> Lucie

Male first names -> Ezekiel

Last/family names -> Salamanca

Dates -> 2010s

Patient IDs -> 999

Phone numbers -> 1718

Addresses -> Silesia

Time -> 1610

Locations/Hospital/Clinic names -> Troy

## Pre-training

The starting point for our model is the already pre-trained and fine-tuned BLUE-BERT base. We further pre-train it using the masked language modelling task from the huggingface transformers [library](https://github.com/huggingface). The hyperparameters can be found in the config file in this repository or [here](https://s3.amazonaws.com/models.huggingface.co/bert/NLP4H/ms_bert/config.json).

## Acknowledgements

We would like to thank the researchers and staff at the Data Science and Advanced Analytics (DSAA) department, St. Michael’s Hospital, for providing consistent support and guidance throughout this project. We would also like to thank Dr. Marzyeh Ghassemi, Taylor Killan, Nathan Ng, and Haoran Zhang for providing us the opportunity to work on this exciting project.

## Disclaimer

MS-BERT shows the results of research conducted at the Data Science and Advanced Analytics (DSAA) department, St. Michael’s Hospital. The results produced by MS-BERT are not intended for direct diagnostic use or medical decision-making without review and oversight by a clinical professional. Individuals should not make decisions about their health solely on the basis of the results produced by MS-BERT. St. Michael’s Hospital does not independently verify the validity or utility of the results produced by MS-BERT. If you have questions about the results produced by MS-BERT, please consult a healthcare professional. If you would like more information about the research conducted at DSAA, please contact [Zhen Yang](mailto:[email protected]). If you would like more information on neurological examination notes, please contact [Dr. Tony Antoniou](mailto:[email protected]) or [Dr. Jiwon Oh](mailto:[email protected]) from the MS clinic at St. Michael's Hospital.

[1]: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3707353/
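As a usage illustration (not part of the original card), the checkpoint can be loaded with the standard `fill-mask` pipeline. The checkpoint id `NLP4H/ms_bert` is taken from this record, the example sentence is invented, and the outputs are for demonstration only, per the disclaimer above.

```python
from transformers import pipeline

# Minimal fill-mask sketch. MS-BERT is a BERT-style checkpoint, so the mask
# token is "[MASK]". The sentence below is an invented, de-identified example.
fill = pipeline("fill-mask", model="NLP4H/ms_bert")

note = "The patient reports increasing [MASK] in the left leg."
for pred in fill(note):
    # Each prediction carries the filled token and its probability.
    print(f"{pred['token_str']:>12}  {pred['score']:.3f}")
```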
{}
fill-mask
NLP4H/ms_bert
[ "transformers", "pytorch", "jax", "safetensors", "bert", "fill-mask", "autotrain_compatible", "endpoints_compatible", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[]
TAGS #transformers #pytorch #jax #safetensors #bert #fill-mask #autotrain_compatible #endpoints_compatible #region-us
# MS-BERT ## Introduction This repository provides codes and models of MS-BERT. MS-BERT was pre-trained on notes from neurological examination for Multiple Sclerosis (MS) patients at St. Michael's Hospital in Toronto, Canada. ## Data The dataset contained approximately 75,000 clinical notes, for about 5000 patients, totaling to over 35.7 million words. These notes were collected from patients who visited St. Michael's Hospital MS Clinic between 2015 to 2019. The notes contained a variety of information pertaining to a neurological exam. For example, a note can contain information on the patient's condition, their progress over time and diagnosis. The gender split within the dataset was observed to be 72% female and 28% male ([which reflects the natural discrepancy seen in MS][1]). Further sections will describe how MS-BERT was pre trained through the use of these clinically relevant and rich neurological notes. ## Data pre-processing The data was pre-processed to remove any identifying information. This includes information on: patient names, doctor names, hospital names, patient identification numbers, phone numbers, addresses, and time. In order to de-identify the information, we used a curated database that contained patient and doctor information. This curated database was paired with regular expressions to find and remove any identifying pieces of information. Each of these identifiers were replaced with a specific token. These tokens were chosen based on three criteria: (1) they belong to the current BERT vocab, (2), they have relatively the same semantic meaning as the word they are replacing, and (3), the token is not found in the original unprocessed dataset. The replacements that met the criteria above were as follows: Female first names -> Lucie Male first names -> Ezekiel Last/family names -> Salamanca. Dates -> 2010s Patient IDs -> 999 Phone numbers -> 1718 Addresses -> Silesia Time -> 1610 Locations/Hospital/Clinic names -> Troy ## Pre-training The starting point for our model is the already pre-trained and fine-tuned BLUE-BERT base. We further pre-train it using the masked language modelling task from the huggingface transformers library. The hyperparameters can be found in the config file in this repository or here ## Acknowledgements We would like to thank the researchers and staff at the Data Science and Advanced Analytics (DSAA) department, St. Michael’s Hospital, for providing consistent support and guidance throughout this project. We would also like to thank Dr. Marzyeh Ghassemi, Taylor Killan, Nathan Ng and Haoran Zhang for providing us the opportunity to work on this exciting project. ## Disclaimer MS-BERT shows the results of research conducted at the Data Science and Advanced Analytics (DSAA) department, St. Michael’s Hospital. The results produced by MS-BERT are not intended for direct diagnostic use or medical decision-making without review and oversight by a clinical professional. Individuals should not make decisions about their health solely on the basis of the results produced by MS-BERT. St. Michael’s Hospital does not independently verify the validity or utility of the results produced by MS-BERT. If you have questions about the results produced by MS-BERT please consult a healthcare professional. If you would like more information about the research conducted at DSAA please contact Zhen Yang. If you would like more information on neurological examination notes please contact Dr. Tony Antoniou or Dr. Jiwon Oh from the MS clinic at St. Michael's Hospital. 
[1]: URL
[ "# MS-BERT", "## Introduction\n\nThis repository provides codes and models of MS-BERT.\nMS-BERT was pre-trained on notes from neurological examination for Multiple Sclerosis (MS) patients at St. Michael's Hospital in Toronto, Canada.", "## Data\n\nThe dataset contained approximately 75,000 clinical notes, for about 5000 patients, totaling to over 35.7 million words.\nThese notes were collected from patients who visited St. Michael's Hospital MS Clinic between 2015 to 2019.\nThe notes contained a variety of information pertaining to a neurological exam.\nFor example, a note can contain information on the patient's condition, their progress over time and diagnosis.\nThe gender split within the dataset was observed to be 72% female and 28% male ([which reflects the natural discrepancy seen in MS][1]).\nFurther sections will describe how MS-BERT was pre trained through the use of these clinically relevant and rich neurological notes.", "## Data pre-processing\n\nThe data was pre-processed to remove any identifying information. This includes information on: patient names, doctor names, hospital names, patient identification numbers, phone numbers, addresses, and time. In order to de-identify the information, we used a curated database that contained patient and doctor information. This curated database was paired with regular expressions to find and remove any identifying pieces of information. Each of these identifiers were replaced with a specific token. These tokens were chosen based on three criteria: (1) they belong to the current BERT vocab, (2), they have relatively the same semantic meaning as the word they are replacing, and (3), the token is not found in the original unprocessed dataset. The replacements that met the criteria above were as follows: \n\nFemale first names -> Lucie\n\nMale first names -> Ezekiel\n\nLast/family names -> Salamanca.\n\nDates -> 2010s\n\nPatient IDs -> 999\n\nPhone numbers -> 1718\n\nAddresses -> Silesia\n\nTime -> 1610\n\nLocations/Hospital/Clinic names -> Troy", "## Pre-training\n\nThe starting point for our model is the already pre-trained and fine-tuned BLUE-BERT base. We further pre-train it using the masked language modelling task from the huggingface transformers library. \n\nThe hyperparameters can be found in the config file in this repository or here", "## Acknowledgements\n\nWe would like to thank the researchers and staff at the Data Science and Advanced Analytics (DSAA) department, St. Michael’s Hospital, for providing consistent support and guidance throughout this project.\nWe would also like to thank Dr. Marzyeh Ghassemi, Taylor Killan, Nathan Ng and Haoran Zhang for providing us the opportunity to work on this exciting project.", "## Disclaimer\n\nMS-BERT shows the results of research conducted at the Data Science and Advanced Analytics (DSAA) department, St. Michael’s Hospital. The results produced by MS-BERT are not intended for direct diagnostic use or medical decision-making without review and oversight by a clinical professional. Individuals should not make decisions about their health solely on the basis of the results produced by MS-BERT. St. Michael’s Hospital does not independently verify the validity or utility of the results produced by MS-BERT. If you have questions about the results produced by MS-BERT please consult a healthcare professional. If you would like more information about the research conducted at DSAA please contact Zhen Yang. If you would like more information on neurological examination notes please contact Dr. 
Tony Antoniou or Dr. Jiwon Oh from the MS clinic at St. Michael's Hospital.\n\n[1]: URL" ]
[ "TAGS\n#transformers #pytorch #jax #safetensors #bert #fill-mask #autotrain_compatible #endpoints_compatible #region-us \n", "# MS-BERT", "## Introduction\n\nThis repository provides codes and models of MS-BERT.\nMS-BERT was pre-trained on notes from neurological examination for Multiple Sclerosis (MS) patients at St. Michael's Hospital in Toronto, Canada.", "## Data\n\nThe dataset contained approximately 75,000 clinical notes, for about 5000 patients, totaling to over 35.7 million words.\nThese notes were collected from patients who visited St. Michael's Hospital MS Clinic between 2015 to 2019.\nThe notes contained a variety of information pertaining to a neurological exam.\nFor example, a note can contain information on the patient's condition, their progress over time and diagnosis.\nThe gender split within the dataset was observed to be 72% female and 28% male ([which reflects the natural discrepancy seen in MS][1]).\nFurther sections will describe how MS-BERT was pre trained through the use of these clinically relevant and rich neurological notes.", "## Data pre-processing\n\nThe data was pre-processed to remove any identifying information. This includes information on: patient names, doctor names, hospital names, patient identification numbers, phone numbers, addresses, and time. In order to de-identify the information, we used a curated database that contained patient and doctor information. This curated database was paired with regular expressions to find and remove any identifying pieces of information. Each of these identifiers were replaced with a specific token. These tokens were chosen based on three criteria: (1) they belong to the current BERT vocab, (2), they have relatively the same semantic meaning as the word they are replacing, and (3), the token is not found in the original unprocessed dataset. The replacements that met the criteria above were as follows: \n\nFemale first names -> Lucie\n\nMale first names -> Ezekiel\n\nLast/family names -> Salamanca.\n\nDates -> 2010s\n\nPatient IDs -> 999\n\nPhone numbers -> 1718\n\nAddresses -> Silesia\n\nTime -> 1610\n\nLocations/Hospital/Clinic names -> Troy", "## Pre-training\n\nThe starting point for our model is the already pre-trained and fine-tuned BLUE-BERT base. We further pre-train it using the masked language modelling task from the huggingface transformers library. \n\nThe hyperparameters can be found in the config file in this repository or here", "## Acknowledgements\n\nWe would like to thank the researchers and staff at the Data Science and Advanced Analytics (DSAA) department, St. Michael’s Hospital, for providing consistent support and guidance throughout this project.\nWe would also like to thank Dr. Marzyeh Ghassemi, Taylor Killan, Nathan Ng and Haoran Zhang for providing us the opportunity to work on this exciting project.", "## Disclaimer\n\nMS-BERT shows the results of research conducted at the Data Science and Advanced Analytics (DSAA) department, St. Michael’s Hospital. The results produced by MS-BERT are not intended for direct diagnostic use or medical decision-making without review and oversight by a clinical professional. Individuals should not make decisions about their health solely on the basis of the results produced by MS-BERT. St. Michael’s Hospital does not independently verify the validity or utility of the results produced by MS-BERT. If you have questions about the results produced by MS-BERT please consult a healthcare professional. 
If you would like more information about the research conducted at DSAA please contact Zhen Yang. If you would like more information on neurological examination notes please contact Dr. Tony Antoniou or Dr. Jiwon Oh from the MS clinic at St. Michael's Hospital.\n\n[1]: URL" ]
[ 44, 5, 57, 156, 241, 74, 83, 197 ]
[ "passage: TAGS\n#transformers #pytorch #jax #safetensors #bert #fill-mask #autotrain_compatible #endpoints_compatible #region-us \n# MS-BERT## Introduction\n\nThis repository provides codes and models of MS-BERT.\nMS-BERT was pre-trained on notes from neurological examination for Multiple Sclerosis (MS) patients at St. Michael's Hospital in Toronto, Canada.## Data\n\nThe dataset contained approximately 75,000 clinical notes, for about 5000 patients, totaling to over 35.7 million words.\nThese notes were collected from patients who visited St. Michael's Hospital MS Clinic between 2015 to 2019.\nThe notes contained a variety of information pertaining to a neurological exam.\nFor example, a note can contain information on the patient's condition, their progress over time and diagnosis.\nThe gender split within the dataset was observed to be 72% female and 28% male ([which reflects the natural discrepancy seen in MS][1]).\nFurther sections will describe how MS-BERT was pre trained through the use of these clinically relevant and rich neurological notes.## Data pre-processing\n\nThe data was pre-processed to remove any identifying information. This includes information on: patient names, doctor names, hospital names, patient identification numbers, phone numbers, addresses, and time. In order to de-identify the information, we used a curated database that contained patient and doctor information. This curated database was paired with regular expressions to find and remove any identifying pieces of information. Each of these identifiers were replaced with a specific token. These tokens were chosen based on three criteria: (1) they belong to the current BERT vocab, (2), they have relatively the same semantic meaning as the word they are replacing, and (3), the token is not found in the original unprocessed dataset. The replacements that met the criteria above were as follows: \n\nFemale first names -> Lucie\n\nMale first names -> Ezekiel\n\nLast/family names -> Salamanca.\n\nDates -> 2010s\n\nPatient IDs -> 999\n\nPhone numbers -> 1718\n\nAddresses -> Silesia\n\nTime -> 1610\n\nLocations/Hospital/Clinic names -> Troy" ]
[ -0.056577228009700775, 0.18573398888111115, -0.006034610327333212, -0.029948275536298752, 0.11465708911418915, 0.030131306499242783, -0.00518714589998126, 0.06790777295827866, -0.03116903267800808, 0.19022339582443237, -0.023643100634217262, -0.012575715780258179, 0.09405025094747543, 0.050780683755874634, 0.0754624754190445, -0.24353942275047302, 0.06661300361156464, 0.013554180040955544, 0.11774235218763351, 0.05347880348563194, 0.053798869252204895, -0.06552251428365707, 0.02896939404308796, 0.012870298698544502, 0.03928851708769798, -0.05223703011870384, 0.02218252420425415, 0.021570555865764618, 0.1084696426987648, 0.020831190049648285, 0.007107709068804979, 0.02008393220603466, 0.04921210929751396, -0.2124287486076355, 0.0015959821175783873, 0.06173821911215782, -0.015263823792338371, 0.032702937722206116, 0.05195603147149086, -0.020129846408963203, 0.13421235978603363, -0.18064039945602417, 0.07557422667741776, 0.010708436369895935, -0.08469633758068085, -0.26227501034736633, -0.175317645072937, 0.03699258714914322, 0.08287972956895828, 0.10144467651844025, -0.000024259674319182523, 0.13721337914466858, -0.04384506866335869, 0.028445061296224594, 0.14581185579299927, -0.1789907068014145, 0.037582479417324066, -0.028970954939723015, 0.0960758775472641, 0.04383033886551857, -0.077034592628479, 0.016864586621522903, -0.05956706032156944, 0.00020364335796330124, 0.06213489547371864, 0.003607087070122361, 0.019158996641635895, -0.03231918439269066, -0.10682181268930435, -0.04670322313904762, 0.15689030289649963, -0.044322870671749115, -0.044768888503313065, -0.0816611647605896, -0.0600753128528595, -0.012137426994740963, 0.014062219299376011, -0.07250121235847473, 0.0164367463439703, -0.05199296399950981, 0.03981222212314606, -0.02273327298462391, -0.06099133938550949, -0.02552943117916584, -0.0907249003648758, -0.005148735363036394, 0.03015691414475441, 0.027478860691189766, -0.00342554016970098, 0.0063546933233737946, -0.019363025203347206, 0.01125331874936819, -0.054776743054389954, -0.02121923863887787, -0.09975016117095947, -0.056771617382764816, -0.06294916570186615, -0.11984360963106155, -0.027950268238782883, 0.06950712949037552, -0.09113077074289322, 0.03605209290981293, -0.03929874300956726, -0.02981184795498848, 0.06260942667722702, 0.09631910175085068, -0.04094052314758301, -0.025392435491085052, -0.04380180686712265, -0.05590222030878067, 0.06182941421866417, 0.02366235852241516, 0.020939888432621956, 0.045134928077459335, 0.05574895441532135, 0.08025412261486053, -0.038121361285448074, 0.06051091104745865, -0.009104174561798573, -0.003946603275835514, 0.17127016186714172, -0.08844960480928421, 0.016214799135923386, 0.04375043511390686, 0.02321646735072136, 0.03267078101634979, 0.08897312730550766, -0.04812216758728027, -0.11443399637937546, 0.08469124883413315, -0.048005834221839905, -0.019505752250552177, -0.026656921952962875, -0.08466840535402298, 0.06552237272262573, -0.072528176009655, -0.06637781113386154, -0.1121121421456337, -0.10516522824764252, -0.08798830956220627, -0.07600084692239761, -0.014796612784266472, 0.08908401429653168, 0.02287161909043789, 0.06333097070455551, -0.008054269477725029, -0.04036320745944977, -0.05390249565243721, -0.03215963393449783, 0.016583867371082306, -0.1572897881269455, 0.012497236020863056, -0.028636673465371132, 0.052174802869558334, -0.03594721108675003, 0.013162885792553425, -0.07576365768909454, 0.010219261050224304, -0.042274460196495056, -0.051988810300827026, -0.12877154350280762, -0.043767157942056656, 
-0.07545577734708786, 0.0193878635764122, 0.026580171659588814, 0.0916743278503418, -0.14625506103038788, -0.03369235619902611, 0.15425953269004822, -0.11303851753473282, -0.0012576471781358123, 0.05591309443116188, 0.03715629503130913, 0.022547973319888115, 0.09977196902036667, 0.1451602429151535, 0.05863272398710251, -0.0516534149646759, -0.07356147468090057, -0.15514367818832397, -0.055067382752895355, 0.013058671727776527, 0.06209155172109604, -0.12129686772823334, 0.06422580778598785, 0.018482189625501633, -0.03824903815984726, -0.0895329937338829, 0.009477794170379639, -0.015875937417149544, 0.07575046271085739, -0.050526347011327744, -0.06238194555044174, 0.03185494244098663, -0.03559805825352669, 0.0022937424946576357, 0.001933743478730321, 0.028256267309188843, 0.06607972085475922, -0.04094589129090309, 0.06109045445919037, -0.062186889350414276, -0.06991785019636154, 0.0967949703335762, 0.014805284328758717, -0.1826191544532776, -0.10739700496196747, 0.03647471219301224, -0.08725675940513611, -0.0434521846473217, -0.025849727913737297, 0.001868165098130703, 0.031455155462026596, -0.05497559532523155, 0.028774257749319077, 0.12164555490016937, 0.01097209844738245, -0.10867089778184891, -0.07893376052379608, -0.0400005467236042, -0.04900262877345085, 0.0290055088698864, -0.02667142078280449, 0.002850314136594534, 0.007838409394025803, 0.11155533045530319, 0.04042435809969902, -0.10599274188280106, 0.021259471774101257, 0.040900714695453644, 0.06836064904928207, 0.0044041951186954975, 0.013667518272995949, -0.013028107583522797, 0.056158021092414856, 0.14741885662078857, -0.1994781643152237, -0.16071094572544098, 0.009152625687420368, 0.07122057676315308, -0.05782734602689743, -0.0032691792584955692, -0.027049999684095383, 0.010997248813509941, -0.12815319001674652, -0.14004665613174438, 0.07417649030685425, 0.01608193852007389, 0.03878564015030861, 0.028120363131165504, -0.03314942121505737, -0.008963601663708687, 0.03074433095753193, -0.1240372583270073, 0.030849505215883255, 0.0920967236161232, -0.16167408227920532, 0.020977184176445007, -0.06784413754940033, 0.09274419397115707, 0.048197295516729355, 0.018275244161486626, -0.08669295161962509, -0.0412108413875103, -0.0586080364882946, 0.05367179214954376, 0.05346197262406349, -0.09757090359926224, -0.04876931756734848, 0.051122117787599564, 0.034998178482055664, 0.05965517833828926, 0.018032923340797424, 0.05043864995241165, 0.014569677412509918, 0.022824527695775032, -0.06901249289512634, -0.014899197965860367, 0.008410251699388027, 0.15123142302036285, 0.035764724016189575, 0.09083868563175201, -0.03253054618835449, -0.030322669073939323, -0.19499318301677704, 0.1379319727420807, -0.14311397075653076, -0.2128145694732666, -0.11133427172899246, -0.0659479945898056, 0.05099795386195183, 0.02695438638329506, 0.023899657651782036, -0.039868082851171494, -0.09923699498176575, -0.13417652249336243, 0.07568544149398804, 0.05995961278676987, -0.03600955754518509, -0.06648901104927063, -0.03518137335777283, 0.03313226252794266, -0.06432631611824036, 0.03193841129541397, -0.04985915496945381, 0.0013747138436883688, 0.0652657300233841, -0.07816779613494873, 0.09157201647758484, 0.09923780709505081, 0.04165637493133545, -0.04760219529271126, -0.01693584769964218, 0.06237995624542236, -0.06532761454582214, 0.07571946829557419, 0.07351773977279663, -0.009219985455274582, 0.01128468383103609, 0.06909532845020294, -0.01240470726042986, -0.020130932331085205, 0.02939683012664318, 0.11059748381376266, -0.010702105239033699, 
-0.23271919786930084, -0.12707127630710602, -0.057821594178676605, -0.08926337212324142, 0.025153227150440216, 0.06387680768966675, -0.0006965791690163314, -0.04248008504509926, -0.10364198684692383, -0.06272394210100174, 0.033506400883197784, 0.06879935413599014, 0.1210545226931572, -0.011607058346271515, 0.08199537545442581, -0.018558518961071968, 0.025916069746017456, 0.05593946576118469, -0.07087640464305878, 0.1935892552137375, -0.00020333014253992587, 0.11891793459653854, 0.0707874521613121, 0.03853049874305725, 0.014691827818751335, 0.05864621698856354, 0.06729013472795486, 0.06229282543063164, -0.023092065006494522, -0.05320371687412262, -0.053743164986371994, 0.01752231828868389, 0.04278842359781265, -0.11974900960922241, -0.03278521075844765, -0.14226531982421875, 0.03736690431833267, 0.18440495431423187, 0.044103045016527176, -0.07124901562929153, -0.06825665384531021, 0.03878185525536537, -0.07116781175136566, -0.11305272579193115, -0.009211842902004719, 0.021964507177472115, -0.09778092056512833, 0.06665265560150146, -0.010803630575537682, 0.0824345126748085, -0.06935712695121765, 0.03187663480639458, 0.05158451944589615, -0.08616700023412704, -0.03262662515044212, 0.045395832508802414, -0.07630075514316559, 0.11857651174068451, 0.03674948588013649, 0.05347570404410362, -0.03646612912416458, -0.025311024859547615, -0.009549226611852646, 0.09516099095344543, 0.08650149405002594, 0.04760344326496124, -0.15872225165367126, -0.07091368734836578, -0.0006348124006763101, 0.016410401090979576, 0.12901155650615692, -0.08605075627565384, 0.09420136362314224, -0.021360941231250763, 0.015990210697054863, -0.03954797610640526, -0.06431509554386139, -0.05440158024430275, -0.09502778947353363, 0.1256508082151413, -0.07193922996520996, 0.10937156528234482, -0.013550030998885632, -0.025101233273744583, -0.1309565156698227, 0.03391953185200691, -0.14206010103225708, -0.03057928755879402, -0.16495493054389954, -0.01299060694873333, 0.10531634092330933, -0.07581555098295212, 0.029810428619384766, 0.04983402416110039, 0.10219983011484146, 0.006675169337540865, -0.10506229847669601, -0.0076414006762206554, -0.047456640750169754, -0.17139288783073425, -0.06784974038600922, 0.14715193212032318, 0.09180770814418793, 0.014161376282572746, -0.008152448572218418, 0.12703661620616913, 0.014373759739100933, -0.040155064314603806, 0.1392008513212204, 0.19007176160812378, 0.10163699090480804, 0.08404641598463058, -0.09926682710647583, -0.1316932588815689, -0.07747647911310196, -0.05242658779025078, 0.04103628545999527, 0.10325797647237778, -0.033545080572366714, 0.08160998672246933, 0.173047736287117, -0.08761364966630936, -0.2416541427373886, 0.0030755430925637484, -0.02056763879954815, -0.04636988788843155, 0.09640011191368103, -0.16506002843379974, 0.04194917902350426, 0.07710906118154526, 0.014516989700496197, -0.10309449583292007, -0.05529512092471123, -0.07590895146131516, 0.04035410284996033, 0.014701642096042633, 0.05623776465654373, -0.0862145647406578, -0.06202012673020363, 0.021394671872258186, -0.10813204944133759, 0.052158355712890625, -0.043236952275037766, 0.05252177640795708, -0.014357115142047405, 0.026343854144215584, 0.04946158826351166, -0.03865667060017586, 0.08465930819511414, 0.006455276161432266, 0.014163879677653313, -0.02754436992108822, -0.024218831211328506, 0.040960200130939484, -0.041892942041158676, 0.07591669261455536, 0.15065091848373413, 0.01307174563407898, -0.10950195044279099, -0.04569167271256447, -0.07488784939050674, -0.01563536748290062, -0.04089527577161789, 
0.018937848508358, -0.0830204114317894, 0.09551063925027847, 0.13468819856643677, -0.059187598526477814, 0.05290558934211731, -0.10258954763412476, 0.0371435210108757, 0.047876425087451935, 0.11927320063114166, 0.0763777494430542, -0.06406527012586594, 0.003642963944002986, -0.08552181720733643, 0.05696939677000046, 0.0056236907839775085, 0.08903780579566956, 0.114197738468647, 0.046240802854299545, 0.0850444883108139, 0.007873700000345707, -0.14123214781284332, -0.019618740305304527, 0.022117840126156807, -0.07954093813896179, -0.21211892366409302, 0.005739507265388966, 0.018655499443411827, -0.14449824392795563, 0.06939241290092468, 0.07666784524917603, 0.02250387892127037, -0.025019433349370956, -0.021501822397112846, 0.09828086197376251, 0.045671191066503525, 0.13599874079227448, -0.026923924684524536, 0.004549970850348473, -0.08200876414775848, 0.07408366352319717, 0.10954221338033676, -0.1096450686454773, 0.01582343503832817, 0.07496502250432968, -0.10934045165777206, -0.08055189251899719, -0.013281654566526413, 0.006372722797095776, -0.02767064794898033, -0.0779464989900589, 0.019305547699332237, -0.11688245832920074, 0.05954665690660477, 0.23245780169963837, -0.0963551253080368, 0.09709436446428299, -0.011696506291627884, 0.023733973503112793, -0.07883067429065704, 0.04349862039089203, -0.03928111866116524, 0.028070839121937752, 0.07202591001987457, 0.1498737931251526, -0.015571897849440575, 0.008999268524348736, -0.01630506105720997, -0.0017899922095239162, -0.05733932927250862, -0.022507021203637123, -0.08162504434585571, -0.05745408684015274, -0.06025538966059685, -0.031672295182943344, -0.03388763964176178, 0.007840021513402462, 0.014938685111701488, -0.0011950915213674307, 0.010421373881399632, -0.05971388518810272, 0.021588321775197983, 0.04485948011279106, -0.13103197515010834, 0.008440306410193443, 0.030211221426725388, -0.043599311262369156, 0.09041023999452591, 0.002429001731798053, 0.007792131509631872, 0.05883469805121422, 0.05065038427710533, 0.008376502431929111, -0.026414915919303894, 0.07312825322151184, 0.006105555687099695, -0.11620725691318512, -0.042335234582424164, -0.004803538788110018, -0.12394751608371735, 0.06404441595077515, 0.05085613206028938, 0.008260550908744335, 0.07807107269763947, -0.10058452934026718, 0.0143594890832901, -0.07433562725782394, 0.018259325996041298, -0.008235003799200058, 0.02957615628838539, 0.10068341344594955, -0.04456868767738342, 0.004271810874342918, -0.10717997699975967, -0.00021206784003879875, 0.00705476850271225, -0.0060704657807946205, -0.08351994305849075, 0.04735333472490311, 0.08392820507287979, 0.054345015436410904, 0.12366162240505219, -0.05161761865019798, 0.011079528369009495, 0.05377988517284393, 0.09020485728979111, -0.06784512102603912, -0.010131043381989002, -0.002985274652019143, -0.01078849844634533, -0.04257557541131973, 0.011227935552597046, 0.005635848734527826, -0.1083141416311264, 0.08879038691520691, 0.20074935257434845, 0.08185489475727081, 0.2438667118549347, -0.021344663575291634, -0.07424850761890411, -0.10356505960226059, -0.10156925022602081, -0.09171763062477112, -0.0023985090665519238, -0.049860503524541855, -0.012569510377943516, -0.09675297886133194, 0.13311083614826202, -0.12571823596954346, 0.1278257519006729, -0.07026831060647964, -0.05207524076104164, -0.094608373939991, -0.10296596586704254, 0.0037086205556988716, 0.01936887390911579, -0.06148430332541466, -0.09277279675006866, 0.02054998278617859, 0.07624104619026184, -0.02095613442361355, 0.013682075776159763, 0.022904321551322937, 
-0.17286615073680878, -0.058522190898656845, 0.021662741899490356, 0.02014424093067646, 0.045541778206825256, 0.03600377216935158, 0.06912058591842651, 0.11401313543319702, -0.00988749135285616, 0.06882265955209732, 0.058724623173475266, 0.08828762173652649, 0.03263286501169205, -0.08561456203460693, -0.08156117796897888, 0.0820946991443634, -0.005192173179239035, 0.027586981654167175, 0.24719031155109406, 0.09409486502408981, -0.006611525546759367, 0.0369349829852581, 0.14572769403457642, 0.00018620386254042387, 0.08411277085542679, -0.13680151104927063, 0.12456893175840378, 0.08087266236543655, -0.00825525727123022, -0.003094677347689867, -0.13005498051643372, 0.07945752888917923, 0.17511488497257233, 0.02249845303595066, 0.08825992047786713, 0.02994905784726143, 0.012375006452202797, 0.02262747474014759, 0.0912596806883812, 0.08193396776914597, 0.06968466192483902, 0.20026008784770966, -0.008979151956737041, 0.03146087005734444, -0.024258168414235115, 0.02403581701219082, -0.033330485224723816, 0.05225890502333641, 0.0010574274929240346, 0.00468531483784318, -0.030751174315810204, 0.08264097571372986, -0.05525151267647743, -0.22653299570083618, 0.041722193360328674, -0.05249951779842377, -0.031023891642689705, 0.013254846446216106, -0.02317030541598797, 0.0019277350511401892, -0.014333732426166534, 0.056442346423864365, 0.07390199601650238, 0.24642148613929749, 0.036208365112543106, -0.049629971385002136, -0.12233657389879227, 0.09339539706707001, 0.04958563297986984, 0.1605168730020523, -0.004106435459107161, -0.00427292101085186, 0.024938201531767845, -0.0029447467532008886, -0.08953230828046799, 0.1356683075428009, -0.03494155406951904, -0.06903592497110367, 0.056535616517066956, 0.14226745069026947, 0.01345449686050415, 0.15849293768405914, 0.05358331650495529, -0.030132731422781944, 0.06476256251335144, 0.042328204959630966, -0.04098924249410629, -0.042065296322107315, 0.17060363292694092, -0.06892508268356323, 0.11889669299125671, 0.11035461723804474, 0.023701922968029976, 0.04434872046113014, -0.049744751304388046, -0.05312293395400047, 0.0036088183987885714, 0.13307665288448334, 0.0241232942789793, -0.1631198525428772, 0.06220925226807594, -0.15955398976802826, 0.052142396569252014, -0.13890871405601501, -0.04594932869076729, 0.046836238354444504, -0.06950586289167404, 0.008348583243787289, 0.08353796601295471, -0.0349288284778595, -0.012005050666630268, -0.08028507977724075, -0.07976116240024567, 0.06386948376893997, 0.13230381906032562, -0.10058934986591339, -0.027898263186216354 ]
null
null
transformers
This is the SinBERT-large model. SinBERT models are pretrained on a large Sinhala monolingual corpus (sin-cc-15M) using RoBERTa. If you use this model, please cite *BERTifying Sinhala - A Comprehensive Analysis of Pre-trained Language Models for Sinhala Text Classification, LREC 2022*.
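A minimal fill-mask sketch for this checkpoint (not taken from the original card): it assumes the standard transformers pipeline API and the NLPC-UOM/SinBERT-large model id listed in this record; the Sinhala example sentence is only a placeholder.

```python
from transformers import pipeline

# Hypothetical usage sketch: load the fill-mask pipeline with the SinBERT-large checkpoint.
fill_mask = pipeline("fill-mask", model="NLPC-UOM/SinBERT-large")

# RoBERTa-style tokenizers expose the mask token (usually "<mask>").
mask = fill_mask.tokenizer.mask_token

# Placeholder Sinhala sentence containing exactly one mask token.
for pred in fill_mask(f"මම පොත {mask}"):
    print(pred["token_str"], round(pred["score"], 4))
```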
{"language": ["si"], "license": ["mit"]}
fill-mask
NLPC-UOM/SinBERT-large
[ "transformers", "pytorch", "roberta", "fill-mask", "si", "license:mit", "autotrain_compatible", "endpoints_compatible", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[ "si" ]
TAGS #transformers #pytorch #roberta #fill-mask #si #license-mit #autotrain_compatible #endpoints_compatible #region-us
This is the SinBERT-large model. SinBERT models are pretrained on a large Sinhala monolingual corpus (sin-cc-15M) using RoBERTa. If you use this model, please cite *BERTifying Sinhala - A Comprehensive Analysis of Pre-trained Language Models for Sinhala Text Classification, LREC 2022*.
[]
[ "TAGS\n#transformers #pytorch #roberta #fill-mask #si #license-mit #autotrain_compatible #endpoints_compatible #region-us \n" ]
[ 44 ]
[ "passage: TAGS\n#transformers #pytorch #roberta #fill-mask #si #license-mit #autotrain_compatible #endpoints_compatible #region-us \n" ]
[ -0.050591934472322464, 0.021755702793598175, -0.008006316609680653, 0.023109525442123413, 0.1173422560095787, 0.030626652762293816, 0.15884624421596527, 0.08412586897611618, 0.098439060151577, -0.018652938306331635, 0.15864907205104828, 0.23091988265514374, 0.004313276614993811, 0.15920700132846832, -0.04942096769809723, -0.24731194972991943, 0.06998438388109207, 0.024149348959326744, -0.03767886757850647, 0.12188687175512314, 0.0871223732829094, -0.061658408492803574, 0.06543832272291183, -0.010430450551211834, -0.08927353471517563, 0.021802326664328575, 0.06526345014572144, -0.08914679288864136, 0.1394282877445221, 0.044772084802389145, 0.15505777299404144, 0.044967249035835266, -0.0165758915245533, -0.12249351292848587, 0.04184199869632721, -0.027510613203048706, -0.08311492949724197, 0.03072301484644413, -0.020604049786925316, -0.05473697558045387, 0.03610158711671829, 0.07074851542711258, 0.0022161954548209906, 0.0635056272149086, -0.14531400799751282, -0.13316236436367035, -0.05332968011498451, 0.06765146553516388, 0.044183820486068726, 0.056753337383270264, 0.022199153900146484, 0.1933409571647644, -0.11488129943609238, 0.07669957727193832, 0.11882254481315613, -0.30006951093673706, 0.004747388884425163, 0.07490549236536026, 0.07425065338611603, -0.05330498144030571, -0.025558114051818848, 0.06374412029981613, 0.04689822718501091, 0.004546920768916607, 0.018395433202385902, -0.06654002517461777, 0.010261854156851768, 0.017764683812856674, -0.05409367382526398, -0.07574256509542465, 0.12863758206367493, -0.043126821517944336, 0.0049995966255664825, 0.002975303679704666, -0.09533488005399704, -0.02192321978509426, -0.022725427523255348, -0.011198594234883785, -0.010058467276394367, 0.0730784684419632, 0.009870496578514576, -0.02496490068733692, -0.1341504007577896, 0.011249128729104996, -0.24781598150730133, 0.21831387281417847, 0.042341817170381546, 0.07478608936071396, -0.14993970096111298, 0.032446179538965225, -0.01891428232192993, -0.10630571097135544, 0.010110284201800823, -0.07890218496322632, 0.03734961897134781, 0.008956363424658775, -0.030059240758419037, 0.026614850386977196, 0.0917462483048439, 0.23568211495876312, 0.04497336223721504, 0.0038724932819604874, 0.03900599107146263, 0.11643928289413452, -0.011927605606615543, 0.05406658351421356, 0.03905190899968147, -0.0032111809123307467, 0.030604436993598938, -0.14485523104667664, 0.05859575793147087, -0.037171632051467896, -0.1412450522184372, -0.05159803107380867, -0.052909642457962036, 0.08983313292264938, 0.02598431147634983, 0.05604388192296028, -0.08889389038085938, 0.037732746452093124, 0.08433718234300613, -0.053362682461738586, -0.013229918666183949, -0.02647816576063633, 0.06374271959066391, 0.0514199323952198, 0.0036933657247573137, 0.0029142771381884813, -0.0008524816366843879, 0.12478234618902206, -0.08103299885988235, -0.028966860845685005, -0.04079842194914818, -0.04158633574843407, 0.062124621123075485, -0.15139785408973694, 0.04073091223835945, -0.18377017974853516, -0.17878369987010956, 0.04900844022631645, 0.053083907812833786, -0.008530013263225555, -0.036814745515584946, 0.04713711142539978, 0.00935443677008152, 0.007943247444927692, -0.054770369082689285, -0.07461941242218018, -0.05357908084988594, 0.11354707926511765, -0.006160577293485403, 0.09178217500448227, -0.1368788629770279, 0.028155764564871788, -0.12271592020988464, 0.013020443730056286, -0.07976391166448593, -0.05923790857195854, -0.044977497309446335, 0.1653629094362259, -0.0035916378255933523, -0.030951455235481262, -0.10145357251167297, 
0.04371136799454689, -0.023898938670754433, 0.1820956915616989, -0.09497055411338806, -0.10888206958770752, 0.21805422008037567, -0.14110411703586578, -0.1686301976442337, 0.08516230434179306, 0.014378540217876434, 0.06350121647119522, 0.05394897237420082, 0.12251251190900803, 0.04685007035732269, -0.17240381240844727, 0.07732924073934555, 0.1135357990860939, -0.15753090381622314, -0.16576842963695526, 0.06579794734716415, -0.031998410820961, -0.07043245434761047, 0.04593932628631592, 0.03885846957564354, 0.11921340227127075, -0.05947595462203026, -0.07497801631689072, -0.017622267827391624, -0.036919381469488144, 0.0893460214138031, 0.03657286986708641, 0.08213216811418533, -0.09041652828454971, -0.01683838851749897, -0.06918618828058243, 0.010063487105071545, 0.09815219789743423, 0.026850534602999687, -0.11997922509908676, 0.12194175273180008, -0.0063081406988203526, -0.01322436798363924, -0.11142271757125854, -0.06017596647143364, -0.04011519253253937, 0.01259619276970625, -0.007643837947398424, 0.11753854900598526, 0.07976072281599045, -0.047422271221876144, -0.009534258395433426, -0.0023711302783340216, 0.11578580737113953, 0.029382361099123955, -0.013428855687379837, -0.10908713191747665, 0.03355276957154274, -0.055763330310583115, 0.010883855633437634, 0.006010865792632103, 0.01682361401617527, 0.010738704353570938, 0.1175321713089943, -0.035640884190797806, 0.05301870405673981, -0.07242736220359802, 0.01941610313951969, -0.04233291745185852, 0.013681739568710327, 0.10989829152822495, 0.03045118786394596, -0.052027132362127304, 0.21158362925052643, -0.1029428020119667, 0.32900407910346985, 0.1983247995376587, -0.21622496843338013, -0.020492779091000557, 0.027496790513396263, -0.03525153174996376, -0.01159580796957016, 0.03473326936364174, 0.007916728965938091, -0.021562887355685234, -0.006535433232784271, 0.14363828301429749, -0.01274896040558815, -0.008367200382053852, 0.034577153623104095, -0.08381704241037369, -0.05324168875813484, 0.026965441182255745, 0.16023260354995728, -0.1581190675497055, 0.19909481704235077, 0.25860342383384705, 0.027555078268051147, 0.15407633781433105, -0.023708157241344452, 0.003509500762447715, -0.019903946667909622, -0.03149259462952614, -0.017795279622077942, 0.08939072489738464, -0.14766524732112885, -0.01838412694633007, 0.06409000605344772, -0.05517591908574104, 0.03658156841993332, -0.12290366739034653, -0.08145111799240112, 0.016814490780234337, 0.022217722609639168, -0.07100408524274826, 0.14392095804214478, -0.012960716150701046, 0.06922166794538498, -0.011528313159942627, -0.10641634464263916, 0.11287028342485428, 0.011628503911197186, -0.0573701448738575, 0.1288471668958664, -0.10290449857711792, -0.2840704619884491, -0.16073419153690338, -0.16206155717372894, 0.039554890245199203, 0.041257765144109726, 0.08684303611516953, -0.050558313727378845, -0.05163794755935669, 0.08347249776124954, -0.023014752194285393, -0.022694705054163933, 0.045194972306489944, -0.08532359451055527, 0.028088992461562157, -0.04024394229054451, -0.07418694347143173, -0.08546998351812363, -0.015470930375158787, -0.020014377310872078, 0.1250854730606079, -0.0788731649518013, 0.09053310751914978, 0.08313828706741333, -0.004480581730604172, 0.05209716781973839, -0.030155370011925697, 0.17516584694385529, -0.07908747345209122, 0.004076098557561636, 0.19238100945949554, -0.0070248194970190525, 0.08760134130716324, 0.2050885111093521, 0.04010841250419617, -0.04262885823845863, -0.015956319868564606, -0.06098438799381256, -0.12356745451688766, -0.18789182603359222, 
-0.1054161861538887, -0.11294716596603394, 0.01674163155257702, 0.031429652124643326, 0.06800312548875809, 0.14061281085014343, 0.11563465744256973, 0.012444790452718735, -0.03725677356123924, -0.07258399575948715, 0.06507586687803268, 0.212574303150177, -0.03658689185976982, 0.12009898573160172, -0.08552532643079758, -0.12130878120660782, 0.08363447338342667, 0.032765116542577744, 0.09258618950843811, 0.14234356582164764, 0.01631159335374832, 0.09373423457145691, 0.16219133138656616, 0.1340932697057724, 0.10450520366430283, 0.053674232214689255, -0.05654620751738548, -0.012348517775535583, -0.014165983535349369, -0.04570310190320015, 0.04450669884681702, 0.07933278381824493, -0.09588659554719925, -0.019935818389058113, -0.12413667887449265, 0.043343763798475266, 0.11705521494150162, 0.06059858202934265, -0.19517596065998077, -0.007991977967321873, 0.06983734667301178, 0.007937063463032246, -0.03991718217730522, 0.04673563316464424, -0.031906608492136, -0.1289171278476715, 0.07570863515138626, -0.039919253438711166, 0.07591115683317184, 0.06677773594856262, 0.07482055574655533, -0.013097460381686687, -0.0867016389966011, 0.03525908291339874, 0.08430071920156479, -0.29410722851753235, 0.30191823840141296, -0.0031261921394616365, -0.00624937703832984, -0.065646231174469, -0.01177025306969881, 0.04684963822364807, 0.13821451365947723, 0.15114179253578186, 0.03616536036133766, -0.06905839592218399, -0.11867018789052963, 0.01830291375517845, 0.023695265874266624, 0.07166315615177155, -0.006497399415820837, -0.030128220096230507, -0.062226831912994385, -0.014070126228034496, -0.014770664274692535, 0.10776478797197342, -0.00897882878780365, -0.1177956834435463, 0.07067783921957016, 0.05891796946525574, -0.0005980636924505234, -0.027776727452874184, -0.043817367404699326, -0.14549489319324493, 0.17191529273986816, -0.091702900826931, -0.05070344731211662, -0.10471808910369873, -0.12625139951705933, 0.03340790048241615, -0.0965832844376564, 0.08233225345611572, -0.06914737820625305, -0.006034791469573975, -0.10953859239816666, -0.1473584622144699, 0.12190335988998413, -0.1220584288239479, -0.027683475986123085, -0.08471095561981201, 0.14041545987129211, -0.08291134983301163, 0.04391882196068764, 0.01555322203785181, 0.028234146535396576, -0.09107816964387894, -0.0724056139588356, 0.029293544590473175, -0.08028673380613327, 0.039991509169340134, -0.016783295199275017, -0.05770057067275047, -0.038838695734739304, 0.02065570466220379, -0.04272705689072609, 0.19395262002944946, 0.2787751853466034, -0.06130688264966011, 0.1652306318283081, 0.19389866292476654, -0.06834245473146439, -0.3348325788974762, -0.15330104529857635, -0.18924160301685333, -0.00262203230522573, 0.039756376296281815, -0.0995999351143837, 0.03919558599591255, 0.03538539633154869, -0.07742315530776978, 0.08404258638620377, -0.17764954268932343, -0.10197233408689499, 0.22148115932941437, 0.01088774111121893, 0.47456836700439453, -0.12282955646514893, -0.08088382333517075, -0.050221268087625504, -0.15218311548233032, 0.02762092649936676, 0.024099042639136314, 0.08109649270772934, -0.024746915325522423, 0.050660181790590286, 0.01513382326811552, -0.0837605819106102, 0.12714138627052307, -0.044774819165468216, 0.031336043030023575, -0.1250939816236496, -0.08068486303091049, 0.14078018069267273, 0.015075008384883404, -0.000017024576663970947, -0.05294191464781761, 0.007671106606721878, -0.030219420790672302, -0.02033252827823162, -0.09157484024763107, 0.11807975172996521, 0.010566304437816143, -0.08606815338134766, -0.009927838109433651, 
0.01799631118774414, -0.008573832921683788, -0.03248434513807297, 0.17883317172527313, -0.02207224629819393, 0.1596788763999939, 0.05630655586719513, 0.027069764211773872, -0.12321722507476807, -0.04408915713429451, -0.07132025808095932, -0.09235309809446335, 0.05528583005070686, -0.02225716970860958, 0.021646222099661827, 0.10908228158950806, -0.0008713404531590641, 0.06815633177757263, 0.08529330044984818, 0.0033168073277920485, -0.002914582611992955, 0.17132233083248138, -0.16958922147750854, -0.04853539541363716, 0.0017645707121118903, -0.019547447562217712, 0.08614269644021988, 0.053756773471832275, 0.0696013793349266, 0.011011856608092785, -0.019527092576026917, -0.0008753761649131775, 0.014154796488583088, -0.07743364572525024, 0.043326083570718765, 0.05904213711619377, 0.03302483260631561, -0.11712110042572021, 0.023928595706820488, -0.028773874044418335, -0.1713922768831253, -0.027610182762145996, 0.07125840336084366, -0.1154884472489357, -0.11434871703386307, -0.026178868487477303, 0.03197098150849342, -0.14115841686725616, -0.06515196710824966, -0.07944085448980331, -0.1230631098151207, 0.048969849944114685, 0.1969773918390274, 0.10806266218423843, 0.09408999234437943, 0.0017032782780006528, -0.0335785411298275, -0.008349417708814144, -0.003927908837795258, -0.02944078855216503, 0.02616991102695465, -0.104154072701931, 0.023232081905007362, -0.007570318877696991, 0.13357044756412506, -0.09372954815626144, -0.043094977736473083, -0.1443197876214981, 0.05207586660981178, -0.05210496857762337, -0.0732283964753151, -0.11731022596359253, -0.07537982612848282, 0.03175284340977669, -0.08689343929290771, -0.059140440076589584, -0.030969180166721344, -0.11052776128053665, 0.02816484123468399, 0.056207314133644104, 0.0343519002199173, -0.07226783037185669, -0.02551295794546604, 0.12516142427921295, -0.016443943604826927, 0.07147104293107986, 0.10704057663679123, -0.0505019836127758, 0.07803482562303543, -0.1568131446838379, -0.08116715401411057, 0.09112861007452011, 0.009364469908177853, 0.05192669853568077, 0.04855629429221153, 0.020397471264004707, 0.0823701024055481, 0.00809909775853157, 0.06112604960799217, 0.028695331886410713, -0.11604636907577515, 0.08453771471977234, 0.043670088052749634, -0.15888088941574097, -0.01512014027684927, -0.09394258260726929, 0.08837137371301651, -0.01957358419895172, 0.15401722490787506, -0.04805395007133484, 0.06612566858530045, -0.035643454641103745, 0.02363845892250538, -0.023169830441474915, -0.12481854110956192, -0.03284522518515587, -0.07386983186006546, -0.03480781987309456, -0.004586056340485811, 0.2587958872318268, 0.008226155303418636, -0.04139374569058418, 0.07244918495416641, 0.08092755824327469, -0.009281995706260204, -0.009287863038480282, 0.20125798881053925, 0.07171128690242767, -0.03326113894581795, -0.10566828399896622, 0.07651367038488388, -0.01995013654232025, -0.08530662208795547, 0.08895325660705566, 0.11116973310709, 0.06004393473267555, 0.07463643699884415, 0.056320834904909134, 0.03407914564013481, -0.11109844595193863, -0.19517381489276886, -0.017459996044635773, 0.02278727851808071, 0.02344011515378952, 0.06466013193130493, 0.19179749488830566, -0.031829968094825745, 0.04372319206595421, -0.047385796904563904, 0.0024603407364338636, -0.19378431141376495, -0.1611379235982895, -0.06681805849075317, -0.05854221060872078, 0.05688551440834999, 0.016361072659492493, 0.006732676178216934, 0.13943986594676971, 0.042622897773981094, -0.049696583300828934, 0.061054471880197525, -0.040526896715164185, -0.01510708499699831, 
-0.010526162572205067, -0.0054362453520298, 0.01240385603159666, -0.050650160759687424, -0.02937047742307186, -0.13970676064491272, -0.03845541551709175, -0.03816451504826546, -0.0024171967525035143, -0.03533999249339104, 0.011920741759240627, -0.11455266922712326, -0.09912336617708206, -0.059690069407224655, 0.03897004947066307, -0.015913957729935646, 0.08729181438684464, -0.006604430731385946, 0.06859897822141647, 0.02801515720784664, 0.11767607927322388, -0.04338083043694496, -0.14028288424015045, -0.0462213009595871, 0.1523163765668869, 0.055370163172483444, 0.0879065990447998, -0.007357776165008545, 0.025989549234509468, -0.06315873563289642, 0.28880277276039124, 0.32686182856559753, -0.0208609476685524, 0.0713084265589714, 0.02977585606276989, 0.026499463245272636, 0.06676537543535233, 0.09309699386358261, 0.04940341040492058, 0.28358545899391174, -0.09044936299324036, -0.021919772028923035, -0.05128936097025871, -0.041216108947992325, -0.09959081560373306, 0.06867557764053345, -0.013203146867454052, -0.03170536831021309, -0.04714158549904823, 0.07896358519792557, -0.13675321638584137, 0.09959133714437485, 0.10954934358596802, -0.15547709167003632, -0.0273777823895216, -0.0021241146605461836, 0.13903750479221344, 0.04568047821521759, 0.07892762869596481, -0.0424947626888752, -0.06412800401449203, 0.029353773221373558, 0.026118529960513115, -0.23387230932712555, -0.07240664213895798, 0.1019587442278862, 0.03410616144537926, 0.12309657782316208, -0.03012763150036335, 0.048740118741989136, 0.08979636430740356, 0.0706351175904274, -0.04144630953669548, 0.07619112730026245, 0.03903710097074509, -0.09622051566839218, -0.0452069528400898, -0.07013886421918869, 0.011271871626377106, -0.12221411615610123, 0.03154105320572853, -0.12117307633161545, 0.06076738238334656, -0.0720173716545105, -0.06073034182190895, -0.028776710852980614, 0.09910698980093002, -0.059345465153455734, 0.0633072629570961, 0.041798193007707596, 0.023768359795212746, -0.04269998148083687, -0.055672287940979004, -0.002117288066074252, 0.08340573310852051, -0.14078404009342194, -0.120066799223423, -0.04734553024172783, -0.04197065904736519, 0.029524995014071465, 0.000534452497959137, -0.15803496539592743, -0.051741886883974075, -0.11703240871429443, 0.014840089716017246, -0.14045701920986176, 0.03586776927113533, 0.08950865268707275, 0.045145053416490555, 0.013757213018834591, -0.09910336136817932, 0.03793834149837494, 0.03290550410747528, -0.12357325106859207, -0.09317777305841446 ]
null
null
transformers
This is the SinBERT-small model. SinBERT models are pretrained on a large Sinhala monolingual corpus (sin-cc-15M) using RoBERTa. If you use this model, please cite *BERTifying Sinhala - A Comprehensive Analysis of Pre-trained Language Models for Sinhala Text Classification, LREC 2022*.
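A minimal scoring sketch for this checkpoint (not taken from the original card): it assumes the standard transformers AutoModelForMaskedLM API and the NLPC-UOM/SinBERT-small model id listed in this record; the Sinhala example sentence is only a placeholder.

```python
import torch
from transformers import AutoTokenizer, AutoModelForMaskedLM

# Hypothetical usage sketch: score mask-token candidates with SinBERT-small directly.
tokenizer = AutoTokenizer.from_pretrained("NLPC-UOM/SinBERT-small")
model = AutoModelForMaskedLM.from_pretrained("NLPC-UOM/SinBERT-small")

# Placeholder Sinhala sentence; replace with real text containing one mask token.
inputs = tokenizer(f"මම පොත {tokenizer.mask_token}", return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

# Locate the mask position and print the five highest-scoring candidate tokens.
mask_positions = (inputs.input_ids == tokenizer.mask_token_id).nonzero(as_tuple=True)[1]
top5 = torch.topk(logits[0, mask_positions[0]], k=5).indices
print(tokenizer.convert_ids_to_tokens(top5.tolist()))
```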
{"language": ["si"], "license": "mit"}
fill-mask
NLPC-UOM/SinBERT-small
[ "transformers", "pytorch", "roberta", "fill-mask", "si", "license:mit", "autotrain_compatible", "endpoints_compatible", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[ "si" ]
TAGS #transformers #pytorch #roberta #fill-mask #si #license-mit #autotrain_compatible #endpoints_compatible #region-us
This is the SinBERT-small model. SinBERT models are pretrained on a large Sinhala monolingual corpus (sin-cc-15M) using RoBERTa. If you use this model, please cite *BERTifying Sinhala - A Comprehensive Analysis of Pre-trained Language Models for Sinhala Text Classification, LREC 2022*.
[]
[ "TAGS\n#transformers #pytorch #roberta #fill-mask #si #license-mit #autotrain_compatible #endpoints_compatible #region-us \n" ]
[ 44 ]
[ "passage: TAGS\n#transformers #pytorch #roberta #fill-mask #si #license-mit #autotrain_compatible #endpoints_compatible #region-us \n" ]
[ -0.050591934472322464, 0.021755702793598175, -0.008006316609680653, 0.023109525442123413, 0.1173422560095787, 0.030626652762293816, 0.15884624421596527, 0.08412586897611618, 0.098439060151577, -0.018652938306331635, 0.15864907205104828, 0.23091988265514374, 0.004313276614993811, 0.15920700132846832, -0.04942096769809723, -0.24731194972991943, 0.06998438388109207, 0.024149348959326744, -0.03767886757850647, 0.12188687175512314, 0.0871223732829094, -0.061658408492803574, 0.06543832272291183, -0.010430450551211834, -0.08927353471517563, 0.021802326664328575, 0.06526345014572144, -0.08914679288864136, 0.1394282877445221, 0.044772084802389145, 0.15505777299404144, 0.044967249035835266, -0.0165758915245533, -0.12249351292848587, 0.04184199869632721, -0.027510613203048706, -0.08311492949724197, 0.03072301484644413, -0.020604049786925316, -0.05473697558045387, 0.03610158711671829, 0.07074851542711258, 0.0022161954548209906, 0.0635056272149086, -0.14531400799751282, -0.13316236436367035, -0.05332968011498451, 0.06765146553516388, 0.044183820486068726, 0.056753337383270264, 0.022199153900146484, 0.1933409571647644, -0.11488129943609238, 0.07669957727193832, 0.11882254481315613, -0.30006951093673706, 0.004747388884425163, 0.07490549236536026, 0.07425065338611603, -0.05330498144030571, -0.025558114051818848, 0.06374412029981613, 0.04689822718501091, 0.004546920768916607, 0.018395433202385902, -0.06654002517461777, 0.010261854156851768, 0.017764683812856674, -0.05409367382526398, -0.07574256509542465, 0.12863758206367493, -0.043126821517944336, 0.0049995966255664825, 0.002975303679704666, -0.09533488005399704, -0.02192321978509426, -0.022725427523255348, -0.011198594234883785, -0.010058467276394367, 0.0730784684419632, 0.009870496578514576, -0.02496490068733692, -0.1341504007577896, 0.011249128729104996, -0.24781598150730133, 0.21831387281417847, 0.042341817170381546, 0.07478608936071396, -0.14993970096111298, 0.032446179538965225, -0.01891428232192993, -0.10630571097135544, 0.010110284201800823, -0.07890218496322632, 0.03734961897134781, 0.008956363424658775, -0.030059240758419037, 0.026614850386977196, 0.0917462483048439, 0.23568211495876312, 0.04497336223721504, 0.0038724932819604874, 0.03900599107146263, 0.11643928289413452, -0.011927605606615543, 0.05406658351421356, 0.03905190899968147, -0.0032111809123307467, 0.030604436993598938, -0.14485523104667664, 0.05859575793147087, -0.037171632051467896, -0.1412450522184372, -0.05159803107380867, -0.052909642457962036, 0.08983313292264938, 0.02598431147634983, 0.05604388192296028, -0.08889389038085938, 0.037732746452093124, 0.08433718234300613, -0.053362682461738586, -0.013229918666183949, -0.02647816576063633, 0.06374271959066391, 0.0514199323952198, 0.0036933657247573137, 0.0029142771381884813, -0.0008524816366843879, 0.12478234618902206, -0.08103299885988235, -0.028966860845685005, -0.04079842194914818, -0.04158633574843407, 0.062124621123075485, -0.15139785408973694, 0.04073091223835945, -0.18377017974853516, -0.17878369987010956, 0.04900844022631645, 0.053083907812833786, -0.008530013263225555, -0.036814745515584946, 0.04713711142539978, 0.00935443677008152, 0.007943247444927692, -0.054770369082689285, -0.07461941242218018, -0.05357908084988594, 0.11354707926511765, -0.006160577293485403, 0.09178217500448227, -0.1368788629770279, 0.028155764564871788, -0.12271592020988464, 0.013020443730056286, -0.07976391166448593, -0.05923790857195854, -0.044977497309446335, 0.1653629094362259, -0.0035916378255933523, -0.030951455235481262, -0.10145357251167297, 
0.04371136799454689, -0.023898938670754433, 0.1820956915616989, -0.09497055411338806, -0.10888206958770752, 0.21805422008037567, -0.14110411703586578, -0.1686301976442337, 0.08516230434179306, 0.014378540217876434, 0.06350121647119522, 0.05394897237420082, 0.12251251190900803, 0.04685007035732269, -0.17240381240844727, 0.07732924073934555, 0.1135357990860939, -0.15753090381622314, -0.16576842963695526, 0.06579794734716415, -0.031998410820961, -0.07043245434761047, 0.04593932628631592, 0.03885846957564354, 0.11921340227127075, -0.05947595462203026, -0.07497801631689072, -0.017622267827391624, -0.036919381469488144, 0.0893460214138031, 0.03657286986708641, 0.08213216811418533, -0.09041652828454971, -0.01683838851749897, -0.06918618828058243, 0.010063487105071545, 0.09815219789743423, 0.026850534602999687, -0.11997922509908676, 0.12194175273180008, -0.0063081406988203526, -0.01322436798363924, -0.11142271757125854, -0.06017596647143364, -0.04011519253253937, 0.01259619276970625, -0.007643837947398424, 0.11753854900598526, 0.07976072281599045, -0.047422271221876144, -0.009534258395433426, -0.0023711302783340216, 0.11578580737113953, 0.029382361099123955, -0.013428855687379837, -0.10908713191747665, 0.03355276957154274, -0.055763330310583115, 0.010883855633437634, 0.006010865792632103, 0.01682361401617527, 0.010738704353570938, 0.1175321713089943, -0.035640884190797806, 0.05301870405673981, -0.07242736220359802, 0.01941610313951969, -0.04233291745185852, 0.013681739568710327, 0.10989829152822495, 0.03045118786394596, -0.052027132362127304, 0.21158362925052643, -0.1029428020119667, 0.32900407910346985, 0.1983247995376587, -0.21622496843338013, -0.020492779091000557, 0.027496790513396263, -0.03525153174996376, -0.01159580796957016, 0.03473326936364174, 0.007916728965938091, -0.021562887355685234, -0.006535433232784271, 0.14363828301429749, -0.01274896040558815, -0.008367200382053852, 0.034577153623104095, -0.08381704241037369, -0.05324168875813484, 0.026965441182255745, 0.16023260354995728, -0.1581190675497055, 0.19909481704235077, 0.25860342383384705, 0.027555078268051147, 0.15407633781433105, -0.023708157241344452, 0.003509500762447715, -0.019903946667909622, -0.03149259462952614, -0.017795279622077942, 0.08939072489738464, -0.14766524732112885, -0.01838412694633007, 0.06409000605344772, -0.05517591908574104, 0.03658156841993332, -0.12290366739034653, -0.08145111799240112, 0.016814490780234337, 0.022217722609639168, -0.07100408524274826, 0.14392095804214478, -0.012960716150701046, 0.06922166794538498, -0.011528313159942627, -0.10641634464263916, 0.11287028342485428, 0.011628503911197186, -0.0573701448738575, 0.1288471668958664, -0.10290449857711792, -0.2840704619884491, -0.16073419153690338, -0.16206155717372894, 0.039554890245199203, 0.041257765144109726, 0.08684303611516953, -0.050558313727378845, -0.05163794755935669, 0.08347249776124954, -0.023014752194285393, -0.022694705054163933, 0.045194972306489944, -0.08532359451055527, 0.028088992461562157, -0.04024394229054451, -0.07418694347143173, -0.08546998351812363, -0.015470930375158787, -0.020014377310872078, 0.1250854730606079, -0.0788731649518013, 0.09053310751914978, 0.08313828706741333, -0.004480581730604172, 0.05209716781973839, -0.030155370011925697, 0.17516584694385529, -0.07908747345209122, 0.004076098557561636, 0.19238100945949554, -0.0070248194970190525, 0.08760134130716324, 0.2050885111093521, 0.04010841250419617, -0.04262885823845863, -0.015956319868564606, -0.06098438799381256, -0.12356745451688766, -0.18789182603359222, 
-0.1054161861538887, -0.11294716596603394, 0.01674163155257702, 0.031429652124643326, 0.06800312548875809, 0.14061281085014343, 0.11563465744256973, 0.012444790452718735, -0.03725677356123924, -0.07258399575948715, 0.06507586687803268, 0.212574303150177, -0.03658689185976982, 0.12009898573160172, -0.08552532643079758, -0.12130878120660782, 0.08363447338342667, 0.032765116542577744, 0.09258618950843811, 0.14234356582164764, 0.01631159335374832, 0.09373423457145691, 0.16219133138656616, 0.1340932697057724, 0.10450520366430283, 0.053674232214689255, -0.05654620751738548, -0.012348517775535583, -0.014165983535349369, -0.04570310190320015, 0.04450669884681702, 0.07933278381824493, -0.09588659554719925, -0.019935818389058113, -0.12413667887449265, 0.043343763798475266, 0.11705521494150162, 0.06059858202934265, -0.19517596065998077, -0.007991977967321873, 0.06983734667301178, 0.007937063463032246, -0.03991718217730522, 0.04673563316464424, -0.031906608492136, -0.1289171278476715, 0.07570863515138626, -0.039919253438711166, 0.07591115683317184, 0.06677773594856262, 0.07482055574655533, -0.013097460381686687, -0.0867016389966011, 0.03525908291339874, 0.08430071920156479, -0.29410722851753235, 0.30191823840141296, -0.0031261921394616365, -0.00624937703832984, -0.065646231174469, -0.01177025306969881, 0.04684963822364807, 0.13821451365947723, 0.15114179253578186, 0.03616536036133766, -0.06905839592218399, -0.11867018789052963, 0.01830291375517845, 0.023695265874266624, 0.07166315615177155, -0.006497399415820837, -0.030128220096230507, -0.062226831912994385, -0.014070126228034496, -0.014770664274692535, 0.10776478797197342, -0.00897882878780365, -0.1177956834435463, 0.07067783921957016, 0.05891796946525574, -0.0005980636924505234, -0.027776727452874184, -0.043817367404699326, -0.14549489319324493, 0.17191529273986816, -0.091702900826931, -0.05070344731211662, -0.10471808910369873, -0.12625139951705933, 0.03340790048241615, -0.0965832844376564, 0.08233225345611572, -0.06914737820625305, -0.006034791469573975, -0.10953859239816666, -0.1473584622144699, 0.12190335988998413, -0.1220584288239479, -0.027683475986123085, -0.08471095561981201, 0.14041545987129211, -0.08291134983301163, 0.04391882196068764, 0.01555322203785181, 0.028234146535396576, -0.09107816964387894, -0.0724056139588356, 0.029293544590473175, -0.08028673380613327, 0.039991509169340134, -0.016783295199275017, -0.05770057067275047, -0.038838695734739304, 0.02065570466220379, -0.04272705689072609, 0.19395262002944946, 0.2787751853466034, -0.06130688264966011, 0.1652306318283081, 0.19389866292476654, -0.06834245473146439, -0.3348325788974762, -0.15330104529857635, -0.18924160301685333, -0.00262203230522573, 0.039756376296281815, -0.0995999351143837, 0.03919558599591255, 0.03538539633154869, -0.07742315530776978, 0.08404258638620377, -0.17764954268932343, -0.10197233408689499, 0.22148115932941437, 0.01088774111121893, 0.47456836700439453, -0.12282955646514893, -0.08088382333517075, -0.050221268087625504, -0.15218311548233032, 0.02762092649936676, 0.024099042639136314, 0.08109649270772934, -0.024746915325522423, 0.050660181790590286, 0.01513382326811552, -0.0837605819106102, 0.12714138627052307, -0.044774819165468216, 0.031336043030023575, -0.1250939816236496, -0.08068486303091049, 0.14078018069267273, 0.015075008384883404, -0.000017024576663970947, -0.05294191464781761, 0.007671106606721878, -0.030219420790672302, -0.02033252827823162, -0.09157484024763107, 0.11807975172996521, 0.010566304437816143, -0.08606815338134766, -0.009927838109433651, 
0.01799631118774414, -0.008573832921683788, -0.03248434513807297, 0.17883317172527313, -0.02207224629819393, 0.1596788763999939, 0.05630655586719513, 0.027069764211773872, -0.12321722507476807, -0.04408915713429451, -0.07132025808095932, -0.09235309809446335, 0.05528583005070686, -0.02225716970860958, 0.021646222099661827, 0.10908228158950806, -0.0008713404531590641, 0.06815633177757263, 0.08529330044984818, 0.0033168073277920485, -0.002914582611992955, 0.17132233083248138, -0.16958922147750854, -0.04853539541363716, 0.0017645707121118903, -0.019547447562217712, 0.08614269644021988, 0.053756773471832275, 0.0696013793349266, 0.011011856608092785, -0.019527092576026917, -0.0008753761649131775, 0.014154796488583088, -0.07743364572525024, 0.043326083570718765, 0.05904213711619377, 0.03302483260631561, -0.11712110042572021, 0.023928595706820488, -0.028773874044418335, -0.1713922768831253, -0.027610182762145996, 0.07125840336084366, -0.1154884472489357, -0.11434871703386307, -0.026178868487477303, 0.03197098150849342, -0.14115841686725616, -0.06515196710824966, -0.07944085448980331, -0.1230631098151207, 0.048969849944114685, 0.1969773918390274, 0.10806266218423843, 0.09408999234437943, 0.0017032782780006528, -0.0335785411298275, -0.008349417708814144, -0.003927908837795258, -0.02944078855216503, 0.02616991102695465, -0.104154072701931, 0.023232081905007362, -0.007570318877696991, 0.13357044756412506, -0.09372954815626144, -0.043094977736473083, -0.1443197876214981, 0.05207586660981178, -0.05210496857762337, -0.0732283964753151, -0.11731022596359253, -0.07537982612848282, 0.03175284340977669, -0.08689343929290771, -0.059140440076589584, -0.030969180166721344, -0.11052776128053665, 0.02816484123468399, 0.056207314133644104, 0.0343519002199173, -0.07226783037185669, -0.02551295794546604, 0.12516142427921295, -0.016443943604826927, 0.07147104293107986, 0.10704057663679123, -0.0505019836127758, 0.07803482562303543, -0.1568131446838379, -0.08116715401411057, 0.09112861007452011, 0.009364469908177853, 0.05192669853568077, 0.04855629429221153, 0.020397471264004707, 0.0823701024055481, 0.00809909775853157, 0.06112604960799217, 0.028695331886410713, -0.11604636907577515, 0.08453771471977234, 0.043670088052749634, -0.15888088941574097, -0.01512014027684927, -0.09394258260726929, 0.08837137371301651, -0.01957358419895172, 0.15401722490787506, -0.04805395007133484, 0.06612566858530045, -0.035643454641103745, 0.02363845892250538, -0.023169830441474915, -0.12481854110956192, -0.03284522518515587, -0.07386983186006546, -0.03480781987309456, -0.004586056340485811, 0.2587958872318268, 0.008226155303418636, -0.04139374569058418, 0.07244918495416641, 0.08092755824327469, -0.009281995706260204, -0.009287863038480282, 0.20125798881053925, 0.07171128690242767, -0.03326113894581795, -0.10566828399896622, 0.07651367038488388, -0.01995013654232025, -0.08530662208795547, 0.08895325660705566, 0.11116973310709, 0.06004393473267555, 0.07463643699884415, 0.056320834904909134, 0.03407914564013481, -0.11109844595193863, -0.19517381489276886, -0.017459996044635773, 0.02278727851808071, 0.02344011515378952, 0.06466013193130493, 0.19179749488830566, -0.031829968094825745, 0.04372319206595421, -0.047385796904563904, 0.0024603407364338636, -0.19378431141376495, -0.1611379235982895, -0.06681805849075317, -0.05854221060872078, 0.05688551440834999, 0.016361072659492493, 0.006732676178216934, 0.13943986594676971, 0.042622897773981094, -0.049696583300828934, 0.061054471880197525, -0.040526896715164185, -0.01510708499699831, 
-0.010526162572205067, -0.0054362453520298, 0.01240385603159666, -0.050650160759687424, -0.02937047742307186, -0.13970676064491272, -0.03845541551709175, -0.03816451504826546, -0.0024171967525035143, -0.03533999249339104, 0.011920741759240627, -0.11455266922712326, -0.09912336617708206, -0.059690069407224655, 0.03897004947066307, -0.015913957729935646, 0.08729181438684464, -0.006604430731385946, 0.06859897822141647, 0.02801515720784664, 0.11767607927322388, -0.04338083043694496, -0.14028288424015045, -0.0462213009595871, 0.1523163765668869, 0.055370163172483444, 0.0879065990447998, -0.007357776165008545, 0.025989549234509468, -0.06315873563289642, 0.28880277276039124, 0.32686182856559753, -0.0208609476685524, 0.0713084265589714, 0.02977585606276989, 0.026499463245272636, 0.06676537543535233, 0.09309699386358261, 0.04940341040492058, 0.28358545899391174, -0.09044936299324036, -0.021919772028923035, -0.05128936097025871, -0.041216108947992325, -0.09959081560373306, 0.06867557764053345, -0.013203146867454052, -0.03170536831021309, -0.04714158549904823, 0.07896358519792557, -0.13675321638584137, 0.09959133714437485, 0.10954934358596802, -0.15547709167003632, -0.0273777823895216, -0.0021241146605461836, 0.13903750479221344, 0.04568047821521759, 0.07892762869596481, -0.0424947626888752, -0.06412800401449203, 0.029353773221373558, 0.026118529960513115, -0.23387230932712555, -0.07240664213895798, 0.1019587442278862, 0.03410616144537926, 0.12309657782316208, -0.03012763150036335, 0.048740118741989136, 0.08979636430740356, 0.0706351175904274, -0.04144630953669548, 0.07619112730026245, 0.03903710097074509, -0.09622051566839218, -0.0452069528400898, -0.07013886421918869, 0.011271871626377106, -0.12221411615610123, 0.03154105320572853, -0.12117307633161545, 0.06076738238334656, -0.0720173716545105, -0.06073034182190895, -0.028776710852980614, 0.09910698980093002, -0.059345465153455734, 0.0633072629570961, 0.041798193007707596, 0.023768359795212746, -0.04269998148083687, -0.055672287940979004, -0.002117288066074252, 0.08340573310852051, -0.14078404009342194, -0.120066799223423, -0.04734553024172783, -0.04197065904736519, 0.029524995014071465, 0.000534452497959137, -0.15803496539592743, -0.051741886883974075, -0.11703240871429443, 0.014840089716017246, -0.14045701920986176, 0.03586776927113533, 0.08950865268707275, 0.045145053416490555, 0.013757213018834591, -0.09910336136817932, 0.03793834149837494, 0.03290550410747528, -0.12357325106859207, -0.09317777305841446 ]
null
null
transformers
# Wav2Vec2-Large-Japanese Fine-tuned [facebook/wav2vec2-large-xlsr-53](https://huggingface.co/facebook/wav2vec2-large-xlsr-53) on Japanese using the [Common Voice](https://huggingface.co/datasets/common_voice), [JSUT](https://sites.google.com/site/shinnosuketakamichi/publication/jsut), [TEDxJP](https://github.com/laboroai/TEDxJP-10K) and some other data. This model is a model trained on public data. If you want to use trained model with more 600 hours of data and higher accuracy please contact [email protected] When using this model, make sure that your speech input is sampled at 16kHz. ## Usage The model can be used directly (without a language model) as follows: ```python import torch import librosa from datasets import load_dataset from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor LANG_ID = "ja" MODEL_ID = "NTQAI/wav2vec2-large-japanese" SAMPLES = 3 test_dataset = load_dataset("common_voice", LANG_ID, split=f"test[:{SAMPLES}]") processor = Wav2Vec2Processor.from_pretrained(MODEL_ID) model = Wav2Vec2ForCTC.from_pretrained(MODEL_ID) # Preprocessing the datasets. # We need to read the audio files as arrays def speech_file_to_array_fn(batch): speech_array, sampling_rate = librosa.load(batch["path"], sr=16_000) batch["speech"] = speech_array batch["sentence"] = batch["sentence"].upper() return batch test_dataset = test_dataset.map(speech_file_to_array_fn) inputs = processor(test_dataset["speech"], sampling_rate=16_000, return_tensors="pt", padding=True) with torch.no_grad(): logits = model(inputs.input_values, attention_mask=inputs.attention_mask).logits predicted_ids = torch.argmax(logits, dim=-1) predicted_sentences = processor.batch_decode(predicted_ids) for i, predicted_sentence in enumerate(predicted_sentences): print("-" * 100) print("Reference:", test_dataset[i]["sentence"]) print("Prediction:", predicted_sentence) ``` | Reference | Prediction | | ------------- | ------------- | | 祖母は、おおむね機嫌よく、サイコロをころがしている。 | 祖母思い切れを最布ロぼがしている | | 財布をなくしたので、交番へ行きます。 | 財布をなく時間ので交番でへ行きます | | 飲み屋のおやじ、旅館の主人、医者をはじめ、交際のある人にきいてまわったら、みんな、私より収入が多いはずなのに、税金は安い。 | ロみ屋のおやし旅館の主人に医をはめ交載のあの人に聞いて回ったらみんな私より収入が多い発ずなのに請金は安い | ## Evaluation The model can be evaluated as follows on the Japanese test data of Common Voice. ```python import torch import re import librosa from datasets import load_dataset, load_metric from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor LANG_ID = "ja" MODEL_ID = "NTQAI/wav2vec2-large-japanese" DEVICE = "cuda" CHARS_TO_IGNORE = [",", "?", "¿", ".", "!", "¡", ";", ";", ":", '""', "%", '"', "�", "ʿ", "·", "჻", "~", "՞", "؟", "،", "।", "॥", "«", "»", "„", "“", "”", "「", "」", "‘", "’", "《", "》", "(", ")", "[", "]", "{", "}", "=", "`", "_", "+", "<", ">", "…", "–", "°", "´", "ʾ", "‹", "›", "©", "®", "—", "→", "。", "、", "﹂", "﹁", "‧", "~", "﹏", ",", "{", "}", "(", ")", "[", "]", "【", "】", "‥", "〽", "『", "』", "〝", "〟", "⟨", "⟩", "〜", ":", "!", "?", "♪", "؛", "/", "\\", "º", "−", "^", "'", "ʻ", "ˆ"] test_dataset = load_dataset("common_voice", LANG_ID, split="test") wer = load_metric("wer.py") # https://github.com/jonatasgrosman/wav2vec2-sprint/blob/main/wer.py cer = load_metric("cer.py") # https://github.com/jonatasgrosman/wav2vec2-sprint/blob/main/cer.py chars_to_ignore_regex = f"[{re.escape(''.join(CHARS_TO_IGNORE))}]" processor = Wav2Vec2Processor.from_pretrained(MODEL_ID) model = Wav2Vec2ForCTC.from_pretrained(MODEL_ID) model.to(DEVICE) # Preprocessing the datasets. 
import warnings # We need to read the audio files as arrays def speech_file_to_array_fn(batch): with warnings.catch_warnings(): warnings.simplefilter("ignore") speech_array, sampling_rate = librosa.load(batch["path"], sr=16_000) batch["speech"] = speech_array batch["sentence"] = re.sub(chars_to_ignore_regex, "", batch["sentence"]).upper() return batch test_dataset = test_dataset.map(speech_file_to_array_fn) # Preprocessing the datasets. # We need to read the audio files as arrays def evaluate(batch): inputs = processor(batch["speech"], sampling_rate=16_000, return_tensors="pt", padding=True) with torch.no_grad(): logits = model(inputs.input_values.to(DEVICE), attention_mask=inputs.attention_mask.to(DEVICE)).logits pred_ids = torch.argmax(logits, dim=-1) batch["pred_strings"] = processor.batch_decode(pred_ids) return batch result = test_dataset.map(evaluate, batched=True, batch_size=8) predictions = [x.upper() for x in result["pred_strings"]] references = [x.upper() for x in result["sentence"]] print(f"WER: {wer.compute(predictions=predictions, references=references, chunk_size=1000) * 100}") print(f"CER: {cer.compute(predictions=predictions, references=references, chunk_size=1000) * 100}") ``` **Test Result**: | Model | WER | CER | | ------------- | ------------- | ------------- | | NTQAI/wav2vec2-large-japanese | **73.10%** | **18.15%** | | vumichien/wav2vec2-large-xlsr-japanese | 1108.86% | 23.40% | | qqhann/w2v_hf_jsut_xlsr53 | 1012.18% | 70.77% |
{"language": "ja", "tags": ["audio", "automatic-speech-recognition", "speech"], "datasets": ["common_voice"], "metrics": ["wer", "cer"], "model-index": [{"name": "Wav2Vec2 Japanese by NTQAI", "results": [{"task": {"type": "automatic-speech-recognition", "name": "Speech Recognition"}, "dataset": {"name": "Common Voice ja", "type": "common_voice", "args": "ja"}, "metrics": [{"type": "wer", "value": 81.3, "name": "Test WER"}, {"type": "cer", "value": 21.9, "name": "Test CER"}]}]}]}
automatic-speech-recognition
NTQAI/wav2vec2-large-japanese
[ "transformers", "pytorch", "jax", "wav2vec2", "automatic-speech-recognition", "audio", "speech", "ja", "dataset:common_voice", "model-index", "endpoints_compatible", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[ "ja" ]
TAGS #transformers #pytorch #jax #wav2vec2 #automatic-speech-recognition #audio #speech #ja #dataset-common_voice #model-index #endpoints_compatible #region-us
Wav2Vec2-Large-Japanese ======================= Fine-tuned facebook/wav2vec2-large-xlsr-53 on Japanese using the Common Voice, JSUT, TEDxJP and some other data. This model was trained on public data. If you want a model trained on more than 600 hours of data, with higher accuracy, please contact nha282@URL. When using this model, make sure that your speech input is sampled at 16kHz. Usage ----- The model can be used directly (without a language model) as follows: Evaluation ---------- The model can be evaluated as follows on the Japanese test data of Common Voice. Test Result: Model: NTQAI/wav2vec2-large-japanese, WER: 73.10%, CER: 18.15% Model: vumichien/wav2vec2-large-xlsr-japanese, WER: 1108.86%, CER: 23.40% Model: qqhann/w2v\_hf\_jsut\_xlsr53, WER: 1012.18%, CER: 70.77%
[]
[ "TAGS\n#transformers #pytorch #jax #wav2vec2 #automatic-speech-recognition #audio #speech #ja #dataset-common_voice #model-index #endpoints_compatible #region-us \n" ]
[ 61 ]
[ "passage: TAGS\n#transformers #pytorch #jax #wav2vec2 #automatic-speech-recognition #audio #speech #ja #dataset-common_voice #model-index #endpoints_compatible #region-us \n" ]
[ -0.13124075531959534, 0.08899911493062973, -0.005622770171612501, -0.06805486977100372, 0.10405170172452927, -0.03877341002225876, 0.07911530137062073, 0.10991860181093216, 0.08735337853431702, 0.016482260078191757, 0.038872525095939636, 0.1908925473690033, 0.022578060626983643, 0.02741044946014881, -0.06210148707032204, -0.23748372495174408, 0.08670602738857269, 0.04998859763145447, 0.11794266849756241, 0.08031686395406723, 0.09920649230480194, -0.07381762564182281, 0.015063727274537086, 0.06027871370315552, -0.12341594696044922, 0.027492035180330276, 0.06224464252591133, -0.14478127658367157, 0.10069651156663895, 0.022056739777326584, 0.03659854456782341, 0.030288448557257652, 0.017276868224143982, -0.20080795884132385, 0.014074134640395641, -0.01926051825284958, 0.03588537499308586, 0.016093606129288673, 0.08179441839456558, -0.06309708207845688, -0.01609405316412449, 0.10660365968942642, -0.013132140971720219, 0.09023880958557129, -0.053432922810316086, -0.20472469925880432, 0.01973477192223072, 0.0370248481631279, 0.06869662553071976, 0.06367999315261841, -0.05141906440258026, 0.0961938127875328, -0.07877049595117569, 0.1236947700381279, 0.10408756881952286, -0.2467595338821411, 0.024954816326498985, 0.003697924083098769, 0.1012350395321846, 0.013689554296433926, -0.01892963983118534, 0.08670175820589066, 0.0060213650576770306, 0.033075615763664246, -0.06000882387161255, -0.07321944087743759, -0.192413792014122, -0.022435270249843597, -0.07752649486064911, -0.022412553429603577, 0.25498664379119873, -0.0001072363811545074, 0.0390399694442749, -0.09655621647834778, -0.021359184756875038, -0.014641416259109974, -0.06915435194969177, -0.044997137039899826, -0.031700681895017624, 0.05298662930727005, -0.03908339515328407, 0.0023527080193161964, -0.08357544988393784, -0.07093016058206558, -0.11679434776306152, 0.16190801560878754, 0.010064070113003254, 0.036984849721193314, -0.15520009398460388, -0.015065997838973999, -0.07401102781295776, -0.07000378519296646, 0.011617744341492653, -0.013561992906033993, -0.033415649086236954, -0.0010247440077364445, -0.037935469299554825, -0.108468197286129, 0.14075256884098053, 0.010739706456661224, 0.0628480389714241, 0.04884268715977669, -0.07388439029455185, 0.06089676171541214, 0.06034809350967407, 0.10873089730739594, -0.0521000474691391, -0.07584870606660843, 0.037674661725759506, -0.07957892119884491, 0.0031865148339420557, -0.045461446046829224, -0.11363603919744492, -0.04487717151641846, 0.0926085114479065, 0.0749487429857254, 0.03987998887896538, 0.008772414177656174, -0.08041999489068985, -0.036108143627643585, -0.026087138801813126, -0.07471618801355362, 0.0006810420309193432, 0.04858873039484024, 0.08013574779033661, 0.17808791995048523, -0.015774359926581383, 0.0702216774225235, -0.11636732518672943, 0.010182050988078117, 0.02177674137055874, 0.016238540410995483, 0.05818852037191391, -0.03542524576187134, 0.033688485622406006, -0.1088334247469902, 0.07165177166461945, -0.12594732642173767, -0.06555956602096558, -0.03007853776216507, -0.022547315806150436, 0.0019653516355901957, -0.1252119094133377, -0.08140212297439575, -0.035848796367645264, 0.019279712811112404, -0.10165904462337494, -0.009390256367623806, -0.07455717772245407, 0.08761616051197052, 0.0173660721629858, 0.0908319428563118, -0.05107404291629791, 0.1032601073384285, -0.03340107947587967, -0.019482159987092018, -0.04301043227314949, 0.10848105698823929, -0.055144160985946655, 0.04838120937347412, -0.07424283027648926, -0.030989529564976692, -0.07296358048915863, 
0.08247685432434082, -0.03835789114236832, 0.13001775741577148, -0.17803776264190674, -0.14453987777233124, 0.17974820733070374, -0.08716181665658951, -0.10069429874420166, 0.13375185430049896, 0.04650714248418808, -0.004398350603878498, 0.13675832748413086, 0.33693596720695496, 0.0050239949487149715, -0.1363605558872223, 0.04326647147536278, 0.07605873793363571, -0.05595537647604942, -0.07849977910518646, 0.040770310908555984, -0.07503404468297958, -0.05046813189983368, 0.032815150916576385, 0.0987476333975792, 0.07276799529790878, -0.06160967797040939, -0.052081480622291565, -0.007378994487226009, -0.10283467173576355, 0.006182751152664423, -0.0048544155433773994, 0.040090080350637436, -0.032598141580820084, -0.009426024742424488, -0.00353598827496171, 0.06193196401000023, -0.06553098559379578, 0.04520348832011223, -0.12839260697364807, 0.16829098761081696, -0.12369485944509506, -0.003961542155593634, -0.18866486847400665, 0.17480678856372833, -0.01252052653580904, 0.061438534408807755, 0.06569430977106094, 0.04480452835559845, 0.06741490960121155, -0.07410697638988495, 0.007520502898842096, -0.05411504954099655, 0.17919200658798218, 0.05914929881691933, -0.02788049913942814, -0.15061481297016144, 0.03360615670681, -0.08696901798248291, -0.02171912044286728, 0.014579083770513535, -0.05897677317261696, 0.03934047371149063, 0.13840053975582123, 0.024844080209732056, -0.012387332506477833, 0.07187765091657639, 0.03744339197874069, 0.004568750038743019, 0.03771878406405449, 0.05481954663991928, -0.009169475175440311, -0.040933601558208466, 0.2628037631511688, -0.1451248973608017, 0.26861637830734253, 0.20350778102874756, -0.24078041315078735, 0.06305871158838272, 0.12391496449708939, 0.03224043548107147, -0.01169058121740818, 0.05201776325702667, -0.03923220932483673, 0.23479284346103668, -0.02414383552968502, 0.13432325422763824, -0.06395351141691208, -0.0008388574351556599, 0.05088881030678749, -0.04929092526435852, -0.03827297315001488, 0.03724202886223793, -0.00878793466836214, -0.08288707584142685, 0.10328932106494904, 0.06864967942237854, -0.04175712168216705, 0.20533645153045654, -0.03860219940543175, -0.047332629561424255, 0.059845250099897385, -0.055448271334171295, -0.05713837966322899, 0.05082451552152634, -0.34574708342552185, -0.06938681751489639, 0.06599309295415878, 0.005356551613658667, 0.10101569443941116, -0.11026027798652649, 0.015739375725388527, -0.010933727025985718, -0.06201831251382828, -0.0979069173336029, 0.08197485655546188, 0.013340797275304794, 0.062333494424819946, -0.07687512785196304, -0.11826343089342117, 0.042031511664390564, -0.03675370290875435, -0.10949460417032242, 0.06141241639852524, -0.12176244705915451, -0.27397266030311584, -0.10156526416540146, -0.1324874758720398, 0.00833460595458746, 0.049853116273880005, 0.12198484688997269, -0.1073596179485321, -0.02415517158806324, 0.03791780024766922, 0.047173045575618744, -0.08645399659872055, 0.020533446222543716, -0.003559538396075368, 0.008830929175019264, 0.007998156361281872, -0.09794437140226364, -0.021355368196964264, -0.04297100752592087, 0.04991672933101654, 0.03605581074953079, -0.04545459523797035, 0.06344923377037048, 0.2142288237810135, 0.10450762510299683, 0.044110581278800964, -0.008805845864117146, 0.18759213387966156, -0.11294341087341309, -0.0608726367354393, 0.17066776752471924, -0.07571814209222794, -0.01196493860334158, 0.20287147164344788, 0.021561965346336365, -0.054706647992134094, -0.07346269488334656, -0.05121816322207451, -0.05029710754752159, -0.22982090711593628, 
-0.13434606790542603, -0.11776154488325119, -0.023066116496920586, -0.040774617344141006, 0.03872067481279373, 0.1125570684671402, -0.018166519701480865, 0.004497533664107323, -0.05698217451572418, 0.047210726886987686, 0.017015576362609863, 0.19616439938545227, -0.05317835882306099, 0.11682207137346268, -0.028883252292871475, -0.09446482360363007, 0.04048837348818779, 0.06456021964550018, 0.0650007426738739, 0.16723647713661194, 0.07216859608888626, 0.016147229820489883, 0.08647951483726501, 0.17444846034049988, 0.07526557147502899, 0.03367221727967262, 0.004135389346629381, 0.015052460134029388, -0.03647309169173241, -0.058536797761917114, 0.03278617560863495, 0.2979666292667389, -0.07173792272806168, -0.05103087052702904, -0.1290789693593979, 0.009235559031367302, 0.15615178644657135, 0.10718666762113571, -0.19246727228164673, -0.0001912767911562696, 0.0703396201133728, -0.08483743667602539, -0.07539036870002747, 0.11727581173181534, 0.032900188118219376, -0.09129451215267181, 0.08218652009963989, 0.05589544400572777, 0.08481337130069733, -0.10390304774045944, 0.07497123628854752, -0.09664059430360794, -0.07543917000293732, 0.04454701766371727, 0.044115107506513596, -0.21072392165660858, 0.22929726541042328, 0.022222083061933517, 0.06070145592093468, -0.035451389849185944, 0.022494575008749962, 0.010201612487435341, 0.08275426179170609, 0.1638580560684204, -0.0004142792895436287, -0.06313630193471909, -0.09011087566614151, -0.04172547161579132, 0.040079791098833084, 0.07970929145812988, 0.05932921543717384, -0.047406576573848724, -0.010530542582273483, -0.07177041471004486, -0.008905749768018723, -0.1256898194551468, -0.11055824160575867, -0.09839557856321335, 0.016524286940693855, 0.23454636335372925, 0.14787545800209045, 0.00577518017962575, -0.05958196893334389, -0.15082304179668427, 0.08769068121910095, -0.09352287650108337, -0.02004859782755375, -0.06632930785417557, -0.1656503528356552, 0.14735275506973267, -0.028482962399721146, 0.052628763020038605, -0.0028211395256221294, 0.03567063808441162, -0.0370626226067543, -0.1144421249628067, 0.13563008606433868, -0.13211141526699066, -0.006313554011285305, -0.06741585582494736, 0.2774144113063812, -0.01736464537680149, 0.05703860893845558, 0.05915405601263046, 0.03772563487291336, -0.047192491590976715, 0.01248480100184679, 0.09167733043432236, 0.12028152495622635, -0.09243020415306091, 0.06592000275850296, 0.021941466256976128, -0.20917555689811707, -0.07294689118862152, 0.00022772522061131895, 0.23789118230342865, 0.07277421653270721, -0.034191202372312546, 0.20152129232883453, 0.24851182103157043, -0.050171248614788055, -0.2731817066669464, -0.13485555350780487, -0.009197922423481941, 0.04310956597328186, -0.08935592323541641, -0.14429204165935516, 0.11203140020370483, -0.1196657046675682, -0.06173766776919365, 0.011788995936512947, -0.1921926587820053, -0.09845329076051712, 0.24964311718940735, -0.0805589109659195, 0.3175583481788635, -0.04911881685256958, -0.12227140367031097, -0.04265204071998596, -0.1370532214641571, 0.02254079282283783, -0.03596150130033493, 0.09085220098495483, 0.061135295778512955, 0.09068068116903305, 0.059039801359176636, -0.025320416316390038, 0.08395242691040039, 0.0840827077627182, -0.07480616867542267, -0.02294464036822319, -0.00701666995882988, -0.05104733258485794, 0.03407396003603935, 0.0313970223069191, -0.021657252684235573, 0.022330261766910553, -0.09909392148256302, -0.08283371478319168, -0.09498298168182373, 0.07370411604642868, 0.07569440454244614, -0.009492618963122368, 0.07510560750961304, 
-0.1004260927438736, -0.006215950939804316, 0.02904602885246277, 0.17034925520420074, -0.11225736141204834, 0.1172691285610199, 0.1429930180311203, 0.207240492105484, -0.1685434728860855, -0.09133642911911011, -0.059992264956235886, -0.10136523097753525, 0.11635389924049377, 0.001145420130342245, 0.07605420053005219, 0.06964610517024994, 0.03164936974644661, 0.04685232415795326, 0.06169835478067398, -0.0722857192158699, 0.028176790103316307, 0.07531643658876419, -0.11293989419937134, -0.07752273976802826, -0.04207495227456093, -0.004289867822080851, 0.10010547190904617, 0.0905313640832901, 0.1470135748386383, 0.014181078411638737, -0.03690766915678978, -0.03791423887014389, -0.0024508393835276365, -0.14172227680683136, 0.1728702336549759, 0.03361442685127258, 0.04280252382159233, -0.18634258210659027, 0.05123982951045036, -0.03530125319957733, -0.16955113410949707, 0.019699105992913246, -0.00048483689897693694, -0.06542553752660751, -0.10689103603363037, -0.053343962877988815, 0.11388631165027618, 0.021852202713489532, -0.10623658448457718, 0.005075393244624138, -0.15219131112098694, 0.060871709138154984, 0.18238049745559692, 0.04099303111433983, 0.06981824338436127, -0.10096676647663116, -0.05662330240011215, -0.03320371359586716, 0.024636726826429367, 0.03815222904086113, 0.0030582440085709095, -0.18413963913917542, -0.025651270523667336, 0.008698442950844765, 0.10564959049224854, -0.09130329638719559, -0.08905860781669617, -0.10298450291156769, 0.08344064652919769, -0.10944949835538864, -0.022966129705309868, -0.10214221477508545, -0.010900108143687248, 0.03689330071210861, -0.09275805205106735, -0.0411689393222332, 0.030967840924859047, -0.11004039645195007, 0.05049576237797737, 0.001750656054355204, 0.06282039731740952, -0.06561595946550369, 0.008629883639514446, 0.0420173704624176, -0.039622582495212555, 0.11873438954353333, 0.21149857342243195, -0.13425077497959137, 0.11897388100624084, -0.2044321447610855, -0.20540542900562286, 0.14947634935379028, 0.0345672070980072, -0.002532817190513015, -0.05040350928902626, 0.012625359930098057, 0.13236942887306213, 0.03393346071243286, -0.005285568069666624, 0.09114444255828857, -0.060727037489414215, -0.010976874269545078, -0.08305716514587402, -0.07944810390472412, -0.025350632146000862, -0.03858260065317154, 0.16963830590248108, 0.0619310662150383, 0.10647188127040863, -0.06710217893123627, 0.06370408833026886, -0.038994163274765015, 0.058852411806583405, -0.08223479241132736, -0.13068251311779022, -0.1267867088317871, -0.037489790469408035, 0.05273421108722687, -0.05264871194958687, 0.21373268961906433, -0.02504963055253029, 0.024449462071061134, -0.0006068202201277018, 0.022505998611450195, -0.05068449303507805, 0.04266059771180153, 0.2968370318412781, 0.08677967637777328, -0.03840130195021629, -0.04626556858420372, 0.015143146738409996, 0.033058807253837585, 0.14246590435504913, -0.11071033030748367, 0.13165539503097534, 0.07173599302768707, 0.10417432337999344, 0.135152667760849, 0.0036641042679548264, -0.09375552833080292, -0.04864463582634926, -0.0595364086329937, 0.06812736392021179, -0.015679867938160896, 0.16548119485378265, 0.11737822741270065, 0.011895356699824333, 0.06264764815568924, -0.033749934285879135, -0.023451175540685654, -0.18800614774227142, -0.11384576559066772, -0.10922162234783173, -0.12192948907613754, 0.025202792137861252, -0.06387212872505188, 0.05544564500451088, 0.01766825281083584, 0.03939170390367508, -0.02459913305938244, 0.05851529911160469, 0.02116800844669342, -0.07748550176620483, 0.11086320132017136, 
-0.03963387385010719, -0.026997333392500877, -0.04079553857445717, 0.0021322742104530334, 0.06543850898742676, 0.012922552414238453, -0.008978055790066719, 0.0014223327161744237, -0.14169147610664368, 0.028129657730460167, -0.13018548488616943, -0.12220743298530579, -0.007666947785764933, -0.022446034476161003, 0.004158743191510439, 0.1266946643590927, 0.07608461380004883, -0.07062064856290817, 0.05409550294280052, 0.1652299165725708, -0.09595445543527603, -0.12238729745149612, -0.06745950132608414, 0.13828864693641663, -0.005850967951118946, 0.09991749376058578, -0.006236660294234753, -0.037672754377126694, -0.06924891471862793, 0.237327441573143, 0.27551817893981934, -0.013359995558857918, 0.04404044523835182, -0.01272950042039156, 0.012955059297382832, -0.031542420387268066, -0.01935138739645481, 0.11406540870666504, 0.2465541511774063, 0.012721294537186623, 0.004697503987699747, -0.05937166139483452, -0.05843522772192955, -0.03694978728890419, 0.05336534604430199, -0.013702940195798874, -0.14558236300945282, 0.00944025069475174, 0.13920310139656067, -0.24609166383743286, 0.047938600182533264, -0.12824365496635437, -0.1822081357240677, -0.07257860153913498, -0.03254004195332527, 0.11351148039102554, 0.14469951391220093, 0.0036583817563951015, -0.05284283682703972, -0.07803120464086533, 0.08976632356643677, 0.0020708488300442696, -0.2195701003074646, 0.0262337327003479, 0.0001204296640935354, -0.08441460877656937, -0.0359436571598053, -0.00013418484013527632, 0.128688782453537, 0.0010754511458799243, 0.13418518006801605, 0.0032970162574201822, 0.10662095248699188, -0.0034375020768493414, -0.14562244713306427, 0.03123828023672104, 0.22745682299137115, -0.01756345108151436, 0.07234851270914078, 0.04878014698624611, -0.1526176780462265, 0.04971173405647278, -0.1484355330467224, -0.06302519887685776, -0.051467981189489365, 0.014932747930288315, -0.04548382759094238, 0.043308570981025696, -0.05857618525624275, -0.014820407144725323, -0.04813561215996742, -0.011103708297014236, 0.03014221414923668, 0.053904175758361816, -0.06213147193193436, -0.09954547137022018, -0.1919144243001938, -0.06502608954906464, -0.08306898176670074, -0.003035099944099784, -0.14072857797145844, -0.011169912293553352, -0.08623768389225006, 0.004898558836430311, -0.09526558965444565, 0.03689795359969139, 0.10937173664569855, -0.014085018076002598, 0.010124163702130318, -0.019962357357144356, 0.12149476259946823, 0.13751715421676636, -0.142067089676857, -0.08318552374839783 ]
null
null
transformers
## How to use

```python
from simpletransformers.classification import ClassificationModel, ClassificationArgs

name_file = ['bash', 'c', 'c#', 'c++', 'css', 'haskell', 'java', 'javascript', 'lua',
             'objective-c', 'perl', 'php', 'python', 'r', 'ruby', 'scala', 'sql', 'swift', 'vb.net']

deep_scc_model_args = ClassificationArgs(num_train_epochs=10, max_seq_length=300, use_multiprocessing=False)
deep_scc_model = ClassificationModel("roberta", "NTUYG/DeepSCC-RoBERTa",
                                     num_labels=19, args=deep_scc_model_args, use_cuda=True)

code = ''' public static double getSimilarity(String phrase1, String phrase2) {
        return (getSC(phrase1, phrase2) + getSC(phrase2, phrase1)) / 2.0;
}'''
code = code.replace('\n', ' ').replace('\r', ' ')
predictions, raw_outputs = deep_scc_model.predict([code])
predict = name_file[predictions[0]]
print(predict)
```
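If you prefer to avoid the simpletransformers dependency, a minimal sketch with the plain transformers API is shown below. It assumes the checkpoint exposes a standard RoBERTa sequence-classification head; the label list is copied from the snippet above rather than read from the model config, and the 300-token limit mirrors the max_seq_length used there.

```python
# Minimal sketch with plain transformers (assumption: standard sequence-classification head).
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

# Label names copied from the card above; they may not be stored in the model config.
name_file = ['bash', 'c', 'c#', 'c++', 'css', 'haskell', 'java', 'javascript', 'lua',
             'objective-c', 'perl', 'php', 'python', 'r', 'ruby', 'scala', 'sql', 'swift', 'vb.net']

tokenizer = AutoTokenizer.from_pretrained("NTUYG/DeepSCC-RoBERTa")
model = AutoModelForSequenceClassification.from_pretrained("NTUYG/DeepSCC-RoBERTa")

code = ("public static double getSimilarity(String phrase1, String phrase2) "
        "{ return (getSC(phrase1, phrase2) + getSC(phrase2, phrase1)) / 2.0; }")
inputs = tokenizer(code, truncation=True, max_length=300, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits
# Map the highest-scoring class index back to a language name.
print(name_file[logits.argmax(dim=-1).item()])
```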
{}
text-classification
NTUYG/DeepSCC-RoBERTa
[ "transformers", "pytorch", "jax", "roberta", "text-classification", "autotrain_compatible", "endpoints_compatible", "has_space", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[]
TAGS #transformers #pytorch #jax #roberta #text-classification #autotrain_compatible #endpoints_compatible #has_space #region-us
## How to use
[ "## How to use" ]
[ "TAGS\n#transformers #pytorch #jax #roberta #text-classification #autotrain_compatible #endpoints_compatible #has_space #region-us \n", "## How to use" ]
[ 44, 4 ]
[ "passage: TAGS\n#transformers #pytorch #jax #roberta #text-classification #autotrain_compatible #endpoints_compatible #has_space #region-us \n## How to use" ]
[ 0.019592521712183952, 0.05915647745132446, -0.004610526375472546, 0.026486078277230263, 0.18519534170627594, 0.017777901142835617, 0.05095462501049042, 0.13045820593833923, 0.02024167776107788, 0.005361353978514671, 0.09757017344236374, 0.1776249259710312, -0.03895142301917076, 0.10630647838115692, -0.10768537223339081, -0.30873027443885803, 0.05192986875772476, 0.029123833402991295, -0.03775331750512123, 0.1006578654050827, 0.10368581861257553, -0.10725729167461395, 0.06315074861049652, -0.017971741035580635, -0.15019099414348602, 0.04557972401380539, -0.001324239419773221, -0.1248675137758255, 0.10654926300048828, 0.02974645234644413, 0.1654517650604248, 0.022340331226587296, -0.02587149664759636, -0.14790020883083344, 0.03524458035826683, 0.01291688159108162, -0.08131247013807297, 0.05689634755253792, 0.0626194179058075, -0.09502825140953064, 0.08086410164833069, -0.034384265542030334, 0.032676972448825836, 0.01272613275796175, -0.1562177538871765, -0.10175465792417526, -0.006676000077277422, 0.0040949261747300625, 0.05937008559703827, 0.07564163953065872, -0.0064817676320672035, 0.12869228422641754, -0.15202461183071136, 0.08929410576820374, 0.14368151128292084, -0.28831326961517334, -0.029769081622362137, 0.16508188843727112, 0.11065439879894257, 0.06665998697280884, -0.07544759660959244, 0.04888889566063881, 0.014569347724318504, 0.012906279414892197, 0.07558610290288925, -0.07142337411642075, -0.1607968509197235, 0.0653776079416275, -0.10192784667015076, -0.06980850547552109, 0.23154187202453613, -0.06476649641990662, 0.08581797778606415, -0.05027494579553604, -0.08635696768760681, -0.0682765319943428, -0.030661912634968758, 0.009899749420583248, -0.022992409765720367, 0.05307967960834503, 0.021977195516228676, -0.04374721273779869, -0.13095469772815704, 0.026607858017086983, -0.15311342477798462, 0.1580418199300766, -0.008325733244419098, 0.03318296745419502, -0.1365077644586563, 0.06258118897676468, -0.007581052370369434, -0.09562911093235016, 0.07534066587686539, -0.09869217872619629, 0.01653311960399151, -0.026352571323513985, -0.0950273647904396, -0.038339756429195404, 0.0717509537935257, 0.1349329948425293, 0.028802791610360146, 0.043667666614055634, 0.022280395030975342, 0.09029186517000198, 0.05046005919575691, 0.15214191377162933, -0.0024164197966456413, -0.06631143391132355, 0.03906998783349991, -0.07295296341180801, -0.003042082767933607, -0.08598092943429947, -0.1812048852443695, -0.061813078820705414, 0.07465337961912155, 0.07871144264936447, 0.05546614155173302, 0.06262344866991043, -0.024501118808984756, -0.02075478807091713, 0.06215446814894676, -0.09426108002662659, 0.069389209151268, -0.000052912877436028793, 0.03594685345888138, 0.006809363141655922, 0.012911451980471611, 0.0021134966518729925, -0.04758200794458389, 0.09264792501926422, -0.05940801650285721, 0.022511210292577744, -0.07605203986167908, -0.12840479612350464, 0.022722028195858, -0.1372242271900177, 0.029805751517415047, -0.17527228593826294, -0.05552550032734871, -0.016355376690626144, 0.037824902683496475, -0.020883869379758835, -0.055758025497198105, 0.007048892788589001, -0.02615632303059101, 0.09348780661821365, -0.050960443913936615, -0.023872289806604385, -0.0638602077960968, 0.06548977643251419, -0.04605703055858612, 0.0939662903547287, -0.11312349140644073, 0.07909924536943436, -0.0683579072356224, -0.008866928517818451, -0.13657131791114807, 0.029850609600543976, -0.042989399284124374, 0.15153232216835022, -0.006844366434961557, -0.036232247948646545, -0.07189162820577621, 
0.06396431475877762, -0.04644724354147911, 0.11342043429613113, -0.11711744964122772, -0.04923177510499954, 0.15342923998832703, -0.06197701022028923, -0.11260953545570374, 0.08331446349620819, -0.029059888795018196, 0.02955174818634987, 0.028331942856311798, 0.222988098859787, 0.11939115077257156, -0.0333331860601902, 0.062318287789821625, 0.1280519813299179, -0.050056394189596176, -0.10299713909626007, 0.024866562336683273, 0.0008601996232755482, -0.05780667066574097, 0.02778044156730175, 0.04734303429722786, 0.06827734410762787, -0.01544386800378561, -0.0625089481472969, -0.044350624084472656, -0.01605171523988247, 0.09655661135911942, 0.05309944599866867, 0.113649383187294, -0.0676780492067337, -0.022456230595707893, 0.01342798862606287, 0.003031367203220725, 0.01817048154771328, 0.04259032383561134, -0.035665400326251984, 0.14879438281059265, -0.020131077617406845, 0.014416172169148922, -0.22660453617572784, -0.07882609963417053, -0.03147627413272858, 0.14293541014194489, -0.017254654318094254, 0.17139680683612823, 0.04263416677713394, -0.06554250419139862, -0.012028657831251621, -0.011308333836495876, 0.15828704833984375, 0.04373732581734657, -0.0735677182674408, -0.05144859105348587, 0.011323907412588596, -0.08103752136230469, -0.011128767393529415, -0.07936783879995346, 0.028526969254016876, 0.06699255853891373, 0.09108952432870865, -0.008613131009042263, 0.06968732923269272, -0.013357250019907951, 0.0639505535364151, -0.08886749297380447, 0.010077393613755703, 0.08939651399850845, -0.009916777722537518, -0.042158886790275574, 0.15233156085014343, -0.19251832365989685, 0.29187220335006714, 0.2010555863380432, -0.2652571499347687, -0.006829450838267803, 0.016727903857827187, -0.001554309157654643, 0.05000421032309532, -0.01993541046977043, 0.005095642525702715, 0.03352861851453781, -0.03265361487865448, 0.17602182924747467, -0.025339722633361816, -0.03603143244981766, -0.022630659863352776, -0.06722190976142883, -0.043204065412282944, 0.05759400129318237, 0.03232269734144211, -0.17434102296829224, 0.19482421875, 0.2653607726097107, -0.04531692713499069, 0.17720715701580048, 0.02041679434478283, 0.03338839113712311, 0.04867524281144142, -0.04509353265166283, -0.041504502296447754, -0.03665703907608986, -0.15937648713588715, -0.05638393387198448, 0.08096467703580856, 0.036156173795461655, 0.06631911545991898, -0.11537524312734604, -0.035355959087610245, 0.03115074522793293, 0.040251631289720535, -0.021022258326411247, 0.07929343730211258, 0.07216785848140717, 0.08740366995334625, 0.031925689429044724, -0.0720776617527008, 0.07681477814912796, -0.002707880223169923, -0.05382095277309418, 0.1597137302160263, -0.14968399703502655, -0.3263891637325287, -0.1173856258392334, -0.17029045522212982, 0.005927255377173424, 0.04575937241315842, 0.11204134672880173, -0.09515166282653809, -0.04473932832479477, -0.0018402610439807177, -0.03644616901874542, -0.09946133196353912, 0.01533367671072483, -0.09911512583494186, 0.08139678835868835, -0.04779823496937752, -0.08000922948122025, -0.06933526694774628, 0.002250935649499297, -0.011706279590725899, 0.1420033574104309, -0.055422212928533554, 0.10346485674381256, 0.16592785716056824, -0.030574370175600052, 0.06698720157146454, -0.013648993335664272, 0.1958797574043274, -0.10122311860322952, -0.0012756832875311375, 0.17352771759033203, -0.02928389422595501, 0.0745474249124527, 0.19265905022621155, 0.03383447974920273, -0.017403114587068558, 0.00006713331094942987, -0.029207753017544746, -0.1082734689116478, -0.17546987533569336, -0.15298403799533844, 
-0.16326548159122467, -0.009498312138020992, 0.0672919899225235, 0.08538111299276352, 0.12392083555459976, 0.06851423531770706, 0.03419817239046097, 0.01711568981409073, -0.005590924061834812, 0.0939420834183693, 0.22126895189285278, 0.005257221404463053, 0.16153034567832947, -0.07076665759086609, -0.14233562350273132, 0.06341175734996796, 0.043739017099142075, 0.09976658225059509, 0.09641964733600616, 0.006487066391855478, 0.014501326717436314, 0.12016545236110687, 0.1474461704492569, 0.10916893184185028, 0.04175294190645218, -0.019106250256299973, -0.03485499694943428, -0.00625475961714983, -0.02927342802286148, 0.0623263381421566, 0.1275210678577423, -0.14648225903511047, -0.04190818592905998, -0.1689719557762146, 0.09737256169319153, 0.0848519504070282, 0.07399795204401016, -0.20148323476314545, 0.0379585400223732, 0.10922014713287354, -0.032412879168987274, -0.10764501988887787, 0.05678470805287361, 0.01290473248809576, -0.13762608170509338, 0.06258432567119598, -0.00982892606407404, 0.14022305607795715, -0.03758256882429123, 0.08025054633617401, -0.07476156949996948, -0.13570305705070496, 0.019850194454193115, 0.1177002489566803, -0.24241605401039124, 0.20210279524326324, 0.00234008627012372, -0.08902302384376526, -0.07171231508255005, -0.006631833501160145, 0.07670474052429199, 0.19196954369544983, 0.057111166417598724, 0.009989001788198948, -0.1478234827518463, -0.15301409363746643, 0.026838870719075203, -0.01721457950770855, 0.09783566743135452, -0.02704431489109993, 0.013324434868991375, -0.05900457128882408, -0.0238103698939085, -0.0037190052680671215, 0.05075185000896454, 0.023021455854177475, -0.20944970846176147, 0.0634157657623291, 0.04338591918349266, 0.02646745555102825, 0.02463781088590622, -0.027818815782666206, -0.14734841883182526, 0.1873057633638382, 0.004381042905151844, -0.0464569553732872, -0.12440935522317886, -0.08794496953487396, 0.07740578055381775, -0.06785573810338974, 0.06630447506904602, -0.08261731266975403, 0.054934047162532806, -0.08602757006883621, -0.17096805572509766, 0.10550500452518463, -0.09712643176317215, 0.005456643644720316, -0.04767695441842079, 0.10047908127307892, -0.12472463399171829, 0.02597990445792675, 0.04794417321681976, 0.08351235836744308, -0.15807683765888214, -0.10264037549495697, 0.011577769182622433, -0.002682530554011464, 0.07192932069301605, 0.055355556309223175, -0.07773779332637787, -0.06609693914651871, 0.022926218807697296, 0.02203369140625, 0.3201806843280792, 0.1559583842754364, -0.11542171984910965, 0.13563156127929688, 0.04720505326986313, -0.06397605687379837, -0.34100252389907837, -0.09800273925065994, -0.12819218635559082, -0.0023214765824377537, 0.034566283226013184, -0.11493222415447235, 0.047478437423706055, -0.04396221414208412, -0.050839636474847794, 0.03589519485831261, -0.1425606906414032, -0.07883824408054352, 0.17221201956272125, -0.04827753081917763, 0.2966228425502777, -0.13303929567337036, -0.06773903220891953, -0.04802977666258812, -0.08716165274381638, 0.12180484086275101, -0.06731559336185455, 0.08923446387052536, -0.01067415066063404, 0.051773443818092346, 0.04773787781596184, -0.050641629844903946, 0.1439080387353897, -0.052875276654958725, 0.01237840112298727, -0.12149394303560257, -0.1288757175207138, 0.07309886813163757, -0.07498358190059662, 0.005980555433779955, -0.046062059700489044, -0.0019378035794943571, -0.1856922060251236, -0.013575727120041847, -0.07586432248353958, 0.06947680562734604, 0.020730631425976753, -0.03163512796163559, -0.046423688530921936, -0.013198967091739178, 
0.034242257475852966, 0.00745804188773036, 0.2529895305633545, -0.06239156424999237, 0.19933515787124634, 0.16193532943725586, 0.07728134095668793, -0.07684534043073654, -0.007769674062728882, 0.0011218166910111904, -0.044074516743421555, 0.06783611327409744, -0.1407625526189804, 0.061409130692481995, 0.08583454042673111, -0.062362272292375565, 0.07718796283006668, 0.10728295147418976, 0.029494954273104668, -0.014729307033121586, 0.17427882552146912, -0.19649574160575867, -0.01349856797605753, -0.05565648898482323, -0.021002618595957756, 0.03294824808835983, -0.016518088057637215, 0.11472003906965256, 0.023326437920331955, -0.04832848533987999, 0.004096572287380695, -0.005759719293564558, -0.033495429903268814, 0.032490335404872894, 0.0855930894613266, 0.06862398236989975, -0.11069901287555695, 0.007205888628959656, 0.09818323701620102, -0.1332141011953354, -0.002629705937579274, 0.15210044384002686, -0.11844798177480698, -0.14703935384750366, 0.0019092117436230183, 0.11583507806062698, -0.0869423970580101, -0.030216719955205917, -0.04850292578339577, -0.11645616590976715, 0.03573720157146454, 0.14738473296165466, 0.12089952826499939, 0.040359579026699066, -0.022183069959282875, -0.040264792740345, -0.01625651679933071, 0.03095782920718193, 0.015575305558741093, 0.025673896074295044, -0.14206862449645996, 0.037232376635074615, -0.024241410195827484, 0.14892524480819702, -0.08840052038431168, -0.04872962459921837, -0.17242464423179626, 0.011081659235060215, -0.0858195349574089, -0.0429203137755394, -0.08616400510072708, -0.0206319410353899, 0.005903587676584721, -0.032336827367544174, -0.06369980424642563, -0.06510034948587418, -0.14316824078559875, 0.0021722454112023115, -0.05495043098926544, 0.04970647394657135, -0.06797129660844803, -0.05063759908080101, 0.0880562886595726, -0.03251829370856285, 0.06710802018642426, 0.06853190064430237, -0.051209550350904465, 0.05443861708045006, -0.05963375046849251, -0.12660875916481018, 0.11501745134592056, 0.018994374200701714, 0.08610261231660843, 0.023798564448952675, 0.023962270468473434, 0.03102598711848259, 0.02709367126226425, 0.04025975987315178, 0.0518219880759716, -0.11628831177949905, 0.062031570822000504, -0.06968171894550323, -0.1812518686056137, -0.023910440504550934, -0.0037591378204524517, 0.1053047701716423, 0.009521621279418468, 0.120731420814991, -0.0480506494641304, 0.09288481622934341, -0.05649552866816521, 0.013529582880437374, -0.06468167901039124, -0.19020645320415497, -0.05811583995819092, -0.09639893472194672, 0.029381701722741127, -0.005964312236756086, 0.24338942766189575, 0.1286540925502777, 0.011602258309721947, 0.051100268959999084, 0.06989038735628128, 0.02853669598698616, 0.011989930644631386, 0.1534629464149475, 0.08517062664031982, -0.06364148110151291, -0.03692789375782013, 0.06835979968309402, 0.03369976580142975, 0.031089484691619873, 0.08857874572277069, 0.11138901114463806, 0.026993760839104652, 0.09440192580223083, -0.026243072003126144, -0.026997294276952744, -0.11971835047006607, -0.1287406086921692, -0.011627379804849625, 0.0815436989068985, -0.05594763532280922, -0.020352263003587723, 0.14328533411026, -0.059390876442193985, 0.07568095624446869, -0.07061850279569626, -0.039699453860521317, -0.1821652501821518, -0.06718005239963531, -0.10006141662597656, -0.1091313362121582, -0.039234958589076996, -0.0855954959988594, 0.03425537794828415, 0.11084362864494324, 0.010744347237050533, -0.015441899187862873, 0.07820826023817062, 0.015107857063412666, -0.04704032465815544, 0.07155361771583557, 
-0.02939036302268505, 0.06731147319078445, -0.015381306409835815, 0.005887855309993029, -0.11815594136714935, -0.0552387610077858, -0.03578239306807518, 0.0341784693300724, -0.054534558206796646, 0.02159133367240429, -0.13997377455234528, -0.12588632106781006, -0.04451939836144447, 0.05082060396671295, -0.01598716713488102, 0.15606361627578735, -0.005684970878064632, -0.01137937419116497, 0.022589432075619698, 0.26390767097473145, -0.11186543852090836, -0.05859954655170441, -0.038675859570503235, 0.2072351574897766, 0.08668243885040283, 0.07560979574918747, -0.03348839655518532, -0.03430234640836716, -0.08571536093950272, 0.264354944229126, 0.33523350954055786, -0.05188635736703873, 0.07245303690433502, 0.0243404321372509, 0.021333983168005943, 0.12098365277051926, 0.099974125623703, 0.08348050713539124, 0.2044116109609604, -0.0986861065030098, -0.018525538966059685, -0.039057835936546326, -0.025696538388729095, -0.13297703862190247, 0.0798235684633255, 0.05295998230576515, -0.05822008475661278, -0.037259217351675034, 0.08295001089572906, -0.16588327288627625, 0.11335460841655731, 0.04476459324359894, -0.2475576251745224, -0.0930057242512703, 0.003643392352387309, 0.1584060937166214, -0.03741765022277832, 0.09085292369127274, -0.024561643600463867, -0.10082835704088211, 0.011324354447424412, 0.014453788287937641, -0.17166291177272797, -0.006118299439549446, 0.09266125410795212, -0.012108244933187962, 0.020099777728319168, -0.03303336352109909, 0.0020128993783146143, 0.1176663339138031, 0.061972763389348984, -0.03837895020842552, 0.06486506015062332, 0.02226787805557251, -0.04727500304579735, 0.029892947524785995, 0.03473901376128197, -0.017849456518888474, -0.09528621286153793, 0.07331417500972748, -0.18392707407474518, 0.05781478434801102, -0.09284395724534988, -0.010348817333579063, -0.01575114019215107, 0.010884069837629795, -0.0504363514482975, 0.08129362761974335, 0.10234461724758148, -0.01301206462085247, -0.023442404344677925, -0.036680739372968674, -0.03581174090504646, 0.013249607756733894, -0.0738723948597908, -0.16350729763507843, -0.06616361439228058, -0.07948208600282669, 0.0898079127073288, -0.000991520588286221, -0.15048718452453613, -0.027215274050831795, -0.050113335251808167, 0.04400598630309105, -0.1410246640443802, 0.09949614107608795, 0.07808881253004074, -0.009345689788460732, -0.015493555925786495, -0.05275560915470123, 0.02440495975315571, 0.0649026557803154, -0.12304161489009857, -0.05647892504930496 ]
null
null
transformers
## How to use

```python
import logging
from simpletransformers.seq2seq import Seq2SeqModel, Seq2SeqArgs
from nltk import word_tokenize

logging.basicConfig(level=logging.INFO)
transformers_logger = logging.getLogger("transformers")
transformers_logger.setLevel(logging.WARNING)

model_args = Seq2SeqArgs()
# Load the fine-tuned model
model = Seq2SeqModel(
    encoder_decoder_type="bart",
    encoder_decoder_name="NTUYG/SOTitle-java-BART",
    args=model_args,
)

describe = """ I am a beginner at Android Java development but I have a few years of school + uni experience in Java. I am trying to write to a text file in an assets folder in my app using FileOutputStream but it doesn't seem to write to it at all since I am using InputStream to read the file after and there haven't any updates. Here is my code """

code = """ private void updateTextFile(String update) {
    FileOutputStream fos = null;
    try {
        fos = openFileOutput("Questions", MODE_PRIVATE);
        fos.write("Testing".getBytes());
    } catch (FileNotFoundException e) {
        e.printStackTrace();
    } catch (IOException e) {
        e.printStackTrace();
    } finally {
        if (fos != null) {
            try {
                fos.close();
            } catch (IOException e) {
                e.printStackTrace();
            }
        }
    }
    String text = "";
    try {
        InputStream is = getAssets().open("Questions");
        int size = is.available();
        byte[] buffer = new byte[size];
        is.read(buffer);
        is.close();
        text = new String(buffer);
    } catch (IOException e) {
        e.printStackTrace();
    }
    System.out.println("Tesing output " + text);
} """

describe = describe.replace('\n', ' ').replace('\r', ' ')
describe = ' '.join(word_tokenize(describe))
code = code.replace('\n', ' ').replace('\r', ' ')
code = ' '.join(word_tokenize(code))

# Human-written reference title: Java Android Cant seem to update text file using FileOutputStream
body = describe + ' <code> ' + code + ' </code>'
print(model.predict([body]))
```
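As an alternative to simpletransformers, the sketch below loads the same checkpoint through the plain transformers API. It assumes the checkpoint is a standard BART encoder-decoder with its tokenizer files; the generation parameters (beam size, maximum length) are illustrative choices, not values taken from the authors' setup.

```python
# Minimal sketch with plain transformers (assumption: standard BART seq2seq checkpoint).
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("NTUYG/SOTitle-java-BART")
model = AutoModelForSeq2SeqLM.from_pretrained("NTUYG/SOTitle-java-BART")

# Same input convention as above: question body followed by the code wrapped in <code> ... </code>.
body = ("I am trying to write to a text file in an assets folder using FileOutputStream ... "
        "<code> private void updateTextFile(String update) { ... } </code>")
inputs = tokenizer(body, truncation=True, max_length=512, return_tensors="pt")

# num_beams and max_length are illustrative, not the authors' settings.
summary_ids = model.generate(**inputs, num_beams=4, max_length=48, early_stopping=True)
print(tokenizer.decode(summary_ids[0], skip_special_tokens=True))
```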
{}
text2text-generation
NTUYG/SOTitle-java-BART
[ "transformers", "pytorch", "bart", "text2text-generation", "autotrain_compatible", "endpoints_compatible", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[]
TAGS #transformers #pytorch #bart #text2text-generation #autotrain_compatible #endpoints_compatible #region-us
## How to use
[ "## How to use" ]
[ "TAGS\n#transformers #pytorch #bart #text2text-generation #autotrain_compatible #endpoints_compatible #region-us \n", "## How to use" ]
[ 38, 4 ]
[ "passage: TAGS\n#transformers #pytorch #bart #text2text-generation #autotrain_compatible #endpoints_compatible #region-us \n## How to use" ]
[ -0.026871027424931526, 0.005132749676704407, -0.0063082673586905, -0.001121255336329341, 0.16613808274269104, 0.0098565723747015, 0.1234436109662056, 0.12082110345363617, 0.014957552775740623, -0.029217196628451347, 0.12190449982881546, 0.1964552253484726, -0.01807437650859356, 0.16018450260162354, -0.09238891303539276, -0.27027422189712524, 0.03716328740119934, 0.07053060084581375, 0.045333269983530045, 0.12578599154949188, 0.08620307594537735, -0.07709770649671555, 0.06912164390087128, -0.02605086751282215, -0.18492455780506134, 0.04626167193055153, 0.016025647521018982, -0.11327473819255829, 0.10834567248821259, 0.03442110866308212, 0.14057886600494385, 0.027555333450436592, -0.03156299144029617, -0.15454521775245667, 0.028279000893235207, -0.01582624390721321, -0.05855948105454445, 0.0361272431910038, 0.1073208600282669, -0.08924084156751633, 0.08828268945217133, 0.05199546739459038, 0.009658845141530037, 0.046649057418107986, -0.13070909678936005, -0.02716144546866417, -0.01731134206056595, 0.01883530430495739, 0.10775721818208694, 0.09860992431640625, 0.005566605366766453, 0.11127203702926636, -0.1010153591632843, 0.11898530274629593, 0.14293187856674194, -0.29082465171813965, -0.013891233131289482, 0.05396491661667824, 0.09641750156879425, 0.05849773809313774, -0.03734475001692772, 0.027467628940939903, 0.0063414121977984905, 0.03946166858077049, 0.008455421775579453, -0.08809059858322144, -0.13923490047454834, 0.026607565581798553, -0.06309104710817337, -0.06682614982128143, 0.21515294909477234, -0.08781497180461884, 0.044584669172763824, -0.045389797538518906, -0.09130781888961792, -0.036964282393455505, -0.04887557402253151, 0.0042162504978477955, -0.06806235015392303, 0.06971263885498047, -0.02785077877342701, -0.07897254824638367, -0.14458788931369781, -0.004296088125556707, -0.15842343866825104, 0.14935031533241272, -0.0008200485026463866, 0.05140816047787666, -0.20533180236816406, 0.08875803649425507, 0.020874466747045517, -0.09822528064250946, 0.06027434021234512, -0.09528281539678574, 0.05801042169332504, -0.002246034797281027, -0.07265093177556992, -0.10104699432849884, 0.0750666931271553, 0.15302921831607819, 0.05508870258927345, 0.05020153522491455, -0.042699187994003296, 0.08517823368310928, -0.005087545607239008, 0.1212027370929718, 0.07218832522630692, -0.10104924440383911, 0.06919030845165253, -0.12147609144449234, 0.011768801137804985, -0.06680569797754288, -0.1714947372674942, -0.0622037872672081, 0.041075367480516434, 0.10215237736701965, 0.056694839149713516, 0.04018464684486389, -0.039156679064035416, -0.03499075025320053, 0.09317263215780258, -0.09845608472824097, 0.03352153301239014, 0.004084956366568804, 0.028727909550070763, 0.07638432085514069, 0.023867493495345116, 0.007011861074715853, -0.1078081727027893, 0.11213895678520203, -0.028270309790968895, 0.02101144753396511, -0.07759900391101837, -0.064460888504982, 0.016788357868790627, -0.10293359309434891, 0.023250369355082512, -0.1623286008834839, -0.17823496460914612, 0.006919223349541426, 0.016689077019691467, -0.005856287200003862, -0.02726585417985916, -0.044550370424985886, 0.008785286918282509, 0.06022665649652481, -0.08101605623960495, -0.01158566027879715, -0.03988562896847725, 0.09819931536912918, 0.0001515409239800647, 0.06434237211942673, -0.15619981288909912, 0.06879248470067978, -0.09936077892780304, -0.02857806906104088, -0.09834776073694229, 0.0585598424077034, 0.00009422068978892639, 0.15210802853107452, -0.004179821815341711, -0.004319027066230774, -0.09745682775974274, 
0.043117254972457886, -0.004171763546764851, 0.17326503992080688, -0.10935534536838531, -0.08689373731613159, 0.2600945830345154, -0.08565302193164825, -0.15599636733531952, 0.07541291415691376, 0.00279200985096395, 0.04737785458564758, 0.08800304681062698, 0.16541984677314758, 0.1044781506061554, -0.03615681082010269, 0.11251936852931976, 0.12331975251436234, -0.0747954249382019, -0.1418408900499344, -0.004200024995952845, -0.031032700091600418, -0.09036172181367874, 0.0528908409178257, 0.07843870669603348, 0.09470812976360321, -0.03363850340247154, -0.04313528165221214, -0.04070103541016579, -0.02211015298962593, 0.08067483454942703, 0.028751732781529427, 0.11053059250116348, -0.06899862736463547, 0.004590677097439766, -0.007757545914500952, -0.012832599692046642, -0.015838203951716423, 0.06121647730469704, -0.03827790543437004, 0.10521229356527328, -0.01957397162914276, 0.03732900694012642, -0.20544885098934174, -0.05148268863558769, -0.008403245359659195, 0.15145502984523773, -0.0025124025996774435, 0.08278942853212357, 0.07122506201267242, -0.02272975631058216, -0.0018245093524456024, -0.028270026668906212, 0.1702892929315567, -0.006806069053709507, -0.08346132934093475, -0.02372567728161812, 0.0346323624253273, -0.06878906488418579, -0.016807937994599342, -0.04391453415155411, 0.01940637268126011, 0.028254223987460136, 0.10757570713758469, 0.0032748656813055277, 0.03855452686548233, -0.03504667058587074, 0.04370807111263275, -0.08087283372879028, 0.021587315946817398, 0.0998963862657547, 0.023271145299077034, -0.0835072249174118, 0.16797390580177307, -0.185696542263031, 0.2224149852991104, 0.19633769989013672, -0.2713489830493927, 0.031137466430664062, -0.045287203043699265, -0.016469117254018784, 0.013458115048706532, 0.012241456657648087, -0.00976769533008337, 0.06459424644708633, 0.02941727451980114, 0.19868841767311096, -0.033678289502859116, -0.04420602694153786, -0.026379771530628204, -0.07346820831298828, -0.002668887609615922, 0.02902538888156414, 0.02648070827126503, -0.13362686336040497, 0.16730688512325287, 0.20268996059894562, 0.006802982185035944, 0.178527370095253, 0.028854891657829285, -0.014569115824997425, 0.07614409923553467, 0.013064542785286903, -0.020983191207051277, -0.0726369321346283, -0.181860089302063, -0.02291039749979973, 0.07799743860960007, 0.01902909204363823, 0.09702833741903305, -0.12307851016521454, -0.019057869911193848, 0.006366883870214224, -0.005843223538249731, 0.0007576281786896288, 0.05344058573246002, 0.0645233541727066, 0.07833214849233627, -0.006620761938393116, -0.016956305131316185, 0.10825340449810028, 0.004881326109170914, -0.09859803318977356, 0.16228525340557098, -0.14848560094833374, -0.34969308972358704, -0.18700172007083893, -0.18019957840442657, -0.01623707078397274, 0.05772259831428528, 0.13894344866275787, -0.07168746739625931, -0.04497305676341057, 0.03246014937758446, -0.0018271097214892507, -0.042553115636110306, 0.0022504657972604036, -0.059313323348760605, 0.05712776258587837, -0.048669327050447464, -0.09106648713350296, -0.0582355372607708, 0.013881273567676544, -0.02218237891793251, 0.15310554206371307, -0.11345473676919937, 0.1051529198884964, 0.13448181748390198, -0.005452466197311878, 0.08395304530858994, -0.0020229145884513855, 0.16852445900440216, -0.07639633119106293, -0.01572791114449501, 0.21211153268814087, -0.06528552621603012, 0.08589012175798416, 0.12848219275474548, 0.01741642877459526, -0.05207853764295578, 0.022031664848327637, -0.06991000473499298, -0.0972001850605011, -0.22523804008960724, 
-0.1374012529850006, -0.1412019580602646, 0.048766862601041794, 0.059215668588876724, 0.06508413702249527, 0.13454453647136688, 0.09442252665758133, -0.00872708484530449, 0.04092537611722946, 0.021724583581089973, 0.11109845340251923, 0.17558056116104126, 0.003810567781329155, 0.1456764042377472, -0.08327808231115341, -0.13028624653816223, 0.09119439125061035, 0.04739563912153244, 0.12355680763721466, 0.09705572575330734, 0.047853074967861176, 0.008157598786056042, 0.10087352991104126, 0.1360442191362381, 0.18748606741428375, 0.035699114203453064, -0.011473817750811577, -0.002237329725176096, -0.02141478843986988, -0.07316939532756805, 0.048555757850408554, 0.0459132082760334, -0.13200509548187256, -0.04173649474978447, -0.12763535976409912, 0.0842823013663292, 0.09279221296310425, 0.037654727697372437, -0.220969557762146, 0.025308946147561073, 0.08159886300563812, -0.03833455964922905, -0.12381524592638016, 0.0528271459043026, -0.011874380521476269, -0.163802370429039, 0.08582592755556107, -0.04401037096977234, 0.14902718365192413, -0.02445622719824314, 0.0756712555885315, -0.062116652727127075, -0.11250589042901993, 0.042213305830955505, 0.1252007931470871, -0.33155879378318787, 0.18165534734725952, 0.0042354874312877655, -0.02824638970196247, -0.09818429499864578, 0.00360396527685225, 0.029689788818359375, 0.1308329552412033, 0.06893648207187653, -0.0064045218750834465, -0.060309793800115585, -0.1363750547170639, -0.016721826046705246, 0.01229205634444952, 0.13389168679714203, -0.011037161573767662, 0.012667780742049217, -0.0446355901658535, -0.041347675025463104, -0.03600287064909935, -0.03786207363009453, -0.00994185172021389, -0.20559988915920258, 0.06378590315580368, 0.045077335089445114, 0.07479534298181534, 0.015135499648749828, 0.006027268245816231, -0.02092144265770912, 0.2160588949918747, -0.04045454412698746, -0.08946356922388077, -0.12067606300115585, -0.09876587241888046, 0.05840078741312027, -0.09443023800849915, 0.06026919558644295, -0.08042138069868088, 0.04711567983031273, -0.08861425518989563, -0.19982098042964935, 0.08597545325756073, -0.11113343387842178, -0.0008725368534214795, -0.04491521045565605, 0.1616664081811905, -0.0755908265709877, -0.0014928586315363646, 0.05581286549568176, 0.012594811618328094, -0.13446350395679474, -0.09282337129116058, -0.032651614397764206, 0.0040762098506093025, 0.06248307600617409, 0.018603263422846794, -0.07378936558961868, -0.03709061071276665, -0.022023173049092293, -0.012275248765945435, 0.3135155439376831, 0.12798313796520233, -0.06461317837238312, 0.17268148064613342, 0.11965283006429672, -0.07717964798212051, -0.3064216673374176, -0.1346675157546997, -0.0958080068230629, -0.016675664111971855, -0.014509843662381172, -0.14455711841583252, 0.07824786752462387, -0.0456438884139061, -0.029848050326108932, 0.10115017741918564, -0.1797429770231247, -0.08717402070760727, 0.18123924732208252, 0.0018598699243739247, 0.28746527433395386, -0.13001923263072968, -0.11705535650253296, -0.08616893738508224, -0.17567914724349976, 0.1270042359828949, -0.05261795595288277, 0.07884171605110168, -0.03077678754925728, 0.13287073373794556, 0.04768802225589752, -0.06276610493659973, 0.08937107771635056, -0.019787641242146492, -0.013863351196050644, -0.11216988414525986, -0.027438495308160782, 0.05205252394080162, -0.03763972222805023, 0.05261051282286644, -0.05353183671832085, 0.00858267117291689, -0.1535325050354004, -0.038110990077257156, -0.07889479398727417, 0.059766873717308044, 0.03135919198393822, -0.03513339161872864, 0.0279425960034132, 
-0.09391410648822784, 0.006309619639068842, 0.023781325668096542, 0.1730024516582489, -0.05289915204048157, 0.15978512167930603, 0.15393806993961334, 0.11607462167739868, -0.12046564370393753, 0.025970956310629845, -0.062437333166599274, -0.06843215227127075, 0.04838770255446434, -0.05189165845513344, 0.06543955206871033, 0.10177498310804367, -0.051096610724925995, 0.06667095422744751, 0.09507443755865097, 0.012928767129778862, -0.0023286878131330013, 0.14999990165233612, -0.2394532561302185, 0.04237383231520653, -0.07970180362462997, 0.0274049062281847, 0.04497867077589035, 0.034712474793195724, 0.13977079093456268, 0.04640674591064453, -0.05490630120038986, -0.02539752423763275, -0.008364100940525532, -0.05960983783006668, 0.07569281756877899, 0.033972207456827164, 0.040054529905319214, -0.1375664621591568, 0.024402718991041183, 0.023310653865337372, -0.14574378728866577, -0.014752221293747425, 0.19780106842517853, -0.13207614421844482, -0.11305593699216843, 0.020587578415870667, 0.16738614439964294, -0.1609383076429367, -0.05825703218579292, -0.06616538017988205, -0.10181286185979843, 0.06216766685247421, 0.15132203698158264, 0.07602742314338684, 0.061988912522792816, -0.03417492285370827, -0.011361370794475079, -0.0390254482626915, 0.012528568506240845, 0.05753743276000023, 0.04356985166668892, -0.0733012929558754, 0.06778768450021744, -0.027508508414030075, 0.13529036939144135, -0.08223522454500198, -0.05238870903849602, -0.13170717656612396, 0.04410107061266899, -0.1622736155986786, -0.04093474894762039, -0.09797131270170212, -0.03841983154416084, -0.006628267001360655, -0.015279275365173817, -0.033241838216781616, -0.048132460564374924, -0.12010549753904343, 0.0005908488528802991, -0.060379743576049805, -0.015244649723172188, -0.09600774198770523, -0.013808956369757652, 0.08787433058023453, -0.04297683760523796, 0.07192547619342804, 0.16091392934322357, -0.08344123512506485, 0.08061160892248154, -0.1331515908241272, -0.11944136768579483, 0.10277814418077469, 0.02619290165603161, 0.06884495168924332, 0.08347105234861374, 0.018804533407092094, 0.09347786754369736, 0.030629755929112434, 0.025232478976249695, 0.08836875110864639, -0.12481646984815598, 0.04728831350803375, -0.050315845757722855, -0.1803440898656845, -0.06096196174621582, -0.012820728123188019, 0.0626278668642044, 0.03469894826412201, 0.12243733555078506, -0.06427715718746185, 0.1297326534986496, -0.04015863314270973, 0.011621922254562378, -0.017513392493128777, -0.16337992250919342, -0.07095588743686676, -0.10394106805324554, 0.019892239943146706, 0.01960633508861065, 0.20901215076446533, 0.025135695934295654, 0.08021571487188339, 0.0237569659948349, 0.0542207695543766, 0.032332032918930054, -0.016414932906627655, 0.20040550827980042, 0.07309575378894806, -0.046635087579488754, -0.08689060807228088, 0.08150135725736618, 0.015461531467735767, -0.004780181683599949, 0.12021881341934204, 0.05647995322942734, -0.01808624342083931, 0.11211204528808594, -0.04258043318986893, 0.03950288146734238, -0.16085562109947205, -0.22931407392024994, -0.013258692808449268, 0.04265592247247696, -0.03724539652466774, 0.10055308043956757, 0.1423015147447586, -0.05997082591056824, 0.040745750069618225, -0.013677352108061314, -0.057472776621580124, -0.17894388735294342, -0.11359457671642303, -0.09025232493877411, -0.11298372596502304, -0.00317337061278522, -0.0879945158958435, 0.06793618202209473, 0.04060682654380798, 0.027293531224131584, -0.04603184387087822, 0.12304260581731796, 0.03956669569015503, -0.07372007519006729, 
0.05972444638609886, -0.03811683505773544, 0.08104579150676727, 0.034210484474897385, -0.009113093838095665, -0.12751981616020203, -0.02810581400990486, -0.00427903002128005, 0.05791405960917473, -0.04752090200781822, 0.0162013228982687, -0.11437312513589859, -0.11448998004198074, -0.04277392476797104, 0.05745629593729973, 0.0015714997425675392, 0.17594121396541595, 0.0016210471512749791, -0.006447589956223965, 0.019639821723103523, 0.21707704663276672, -0.10890305787324905, -0.12073413282632828, -0.0220822561532259, 0.19223874807357788, 0.08629036694765091, 0.07890733331441879, -0.01538701169192791, -0.007106703240424395, -0.07313153892755508, 0.3466477692127228, 0.2628476023674011, -0.04479733854532242, 0.043018076568841934, 0.027101531624794006, 0.03865887224674225, 0.12467887997627258, 0.14263927936553955, 0.08195716887712479, 0.2880226969718933, -0.08709407597780228, -0.021977636963129044, -0.02248011901974678, -0.03432050347328186, -0.144236758351326, 0.09547752141952515, -0.00424463115632534, -0.06564425677061081, -0.033181142061948776, 0.10345765948295593, -0.19067609310150146, 0.15471065044403076, -0.042729612439870834, -0.1703765094280243, -0.04825274273753166, 0.006350434385240078, 0.1817147582769394, -0.021605318412184715, 0.08134418725967407, -0.0033424091525375843, -0.07018577307462692, 0.08121032267808914, 0.011702739633619785, -0.21215836703777313, 0.015924392268061638, 0.06216197460889816, -0.1311507672071457, 0.004786741454154253, -0.009511756710708141, 0.03803583234548569, 0.07970475405454636, 0.08722948282957077, -0.041520118713378906, 0.06757446378469467, 0.0032511609606444836, -0.03868487477302551, 0.04988547042012215, 0.06776859611272812, -0.011950112879276276, -0.11299245059490204, 0.0347672700881958, -0.1779002547264099, 0.05402936041355133, -0.0018749075243249536, 0.001843538018874824, -0.013464358635246754, 0.0148546127602458, -0.05559126287698746, 0.07844388484954834, 0.06527743488550186, -0.010345575399696827, -0.001847078325226903, -0.051660433411598206, -0.008536692708730698, 0.007667294703423977, -0.06433559209108353, -0.11431429535150528, -0.1157417744398117, -0.12140149623155594, 0.14463689923286438, 0.007990970276296139, -0.20888261497020721, -0.005260932724922895, -0.08677929639816284, 0.031933028250932693, -0.18869860470294952, 0.10614289343357086, 0.062228016555309296, -0.007063574157655239, 0.01198572013527155, -0.0765434056520462, 0.03175235167145729, 0.08344466984272003, -0.11214213073253632, -0.0843752771615982 ]
null
null
transformers
# Hungarian Sentence-level Sentiment Analysis with Finetuned huBERT Model

For further models, scripts and details, see [our repository](https://github.com/nytud/sentiment-analysis) or [our demo site](https://juniper.nytud.hu/demo/nlp).

- Pretrained model used: huBERT
- Finetuned on Hungarian Twitter Sentiment (HTS) Corpus
- Labels: 0 (negative), 1 (positive)

## Limitations

- max_seq_length = 128

## Results

| Model | HTS2 | HTS5 |
| ------------- | ------------- | ------------- |
| huBERT | **85.56** | 68.99 |
| XLM-RoBERTa | 85.56 | 66.50 |

## Citation

If you use this model, please cite the following paper:

```
@inproceedings{yang-sentiment,
    title = {Improving Performance of Sentence-level Sentiment Analysis with Data Augmentation Methods},
    booktitle = {Proceedings of 12th IEEE International Conference on Cognitive Infocommunications (CogInfoCom 2021)},
    year = {2021},
    publisher = {IEEE},
    address = {Online},
    author = {Laki, László and Yang, Zijian Győző},
    pages = {417--422}
}
```
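The card itself gives no usage snippet; the sketch below is a minimal example with the transformers pipeline API, assuming the checkpoint works with the standard text-classification head. The example sentence is the widget text from the card metadata, and labels 0/1 map to negative/positive as stated above.

```python
# Minimal sketch, assuming a standard BERT text-classification head (label 0 = negative, 1 = positive).
from transformers import pipeline

sentiment = pipeline("text-classification", model="NYTK/sentiment-hts2-hubert-hungarian")
print(sentiment("Jó reggelt! majd küldöm az élményhozókat :)."))
```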
{"language": ["hu"], "license": "apache-2.0", "tags": ["text-classification"], "metrics": ["accuracy"], "widget": [{"text": "J\u00f3 reggelt! majd k\u00fcld\u00f6m az \u00e9lm\u00e9nyhoz\u00f3kat :)."}]}
text-classification
NYTK/sentiment-hts2-hubert-hungarian
[ "transformers", "pytorch", "bert", "text-classification", "hu", "license:apache-2.0", "autotrain_compatible", "endpoints_compatible", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[ "hu" ]
TAGS #transformers #pytorch #bert #text-classification #hu #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us
Hungarian Sentence-level Sentiment Analysis with Finetuned huBERT Model ======================================================================= For further models, scripts and details, see our repository or our demo site. * Pretrained model used: huBERT * Finetuned on Hungarian Twitter Sentiment (HTS) Corpus * Labels: 0 (negative), 1 (positive) Limitations ----------- * max\_seq\_length = 128 Results ------- Model: huBERT, HTS2: 85.56, HTS5: 68.99 Model: XLM-RoBERTa, HTS2: 85.56, HTS5: 66.50 If you use this model, please cite the following paper:
[]
[ "TAGS\n#transformers #pytorch #bert #text-classification #hu #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n" ]
[ 46 ]
[ "passage: TAGS\n#transformers #pytorch #bert #text-classification #hu #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n" ]
[ -0.0434887520968914, 0.1282162070274353, -0.00612996332347393, 0.032373469322919846, 0.11295655369758606, 0.025390559807419777, 0.1268875151872635, 0.11749448627233505, 0.024788297712802887, -0.06507456302642822, 0.1287708878517151, 0.17426928877830505, 0.0074262130074203014, 0.07251251488924026, -0.08164559304714203, -0.24380141496658325, 0.09530210494995117, 0.04145800694823265, -0.01236800942569971, 0.08666938543319702, 0.10412266105413437, -0.051520396023988724, 0.05744433030486107, -0.012828446924686432, -0.05014578253030777, 0.009905134327709675, 0.025493862107396126, -0.10978883504867554, 0.08529175817966461, 0.04358380287885666, 0.09833349287509918, 0.048284973949193954, -0.005097746849060059, -0.1975345015525818, 0.021753674373030663, 0.002822614973410964, -0.08160953223705292, 0.05308414623141289, 0.04982001706957817, -0.03249795362353325, 0.04260817915201187, 0.040700607001781464, -0.02561628259718418, 0.06940847635269165, -0.07914787530899048, -0.14703363180160522, -0.08879883587360382, 0.12048768997192383, 0.07300668954849243, 0.06478115171194077, 0.035202495753765106, 0.1368100494146347, -0.12130257487297058, 0.05514969304203987, 0.1097291111946106, -0.3434688150882721, 0.0030758485663682222, 0.05117661505937576, 0.02030138298869133, 0.038716405630111694, -0.037275008857250214, 0.04951541870832443, 0.060014739632606506, 0.013800493441522121, -0.008618712425231934, -0.06508287042379379, -0.1216626763343811, 0.03844738379120827, -0.04888824373483658, -0.04236017167568207, 0.24112194776535034, -0.0058019282296299934, 0.045837756246328354, 0.0003754347562789917, -0.07033520191907883, -0.01156484242528677, -0.029977379366755486, 0.06108110770583153, 0.01706695184111595, 0.10145435482263565, 0.10239952057600021, -0.003322926117107272, -0.1207592561841011, 0.018795406445860863, -0.20249098539352417, 0.09313890337944031, 0.03081495501101017, 0.06728692352771759, -0.13609135150909424, 0.05705229192972183, 0.0920647531747818, -0.12443763762712479, 0.005105041433125734, -0.08093500137329102, 0.10854659229516983, 0.018917985260486603, -0.06186467781662941, 0.09956154227256775, 0.13442467153072357, 0.2068973034620285, 0.047680411487817764, 0.04367108270525932, -0.04681188613176346, 0.12386085838079453, -0.027340779080986977, 0.09371768683195114, 0.013846096582710743, -0.03977338597178459, 0.10505826026201248, -0.10783082246780396, 0.07095388323068619, -0.038093555718660355, -0.16845954954624176, -0.02328392304480076, 0.01706092059612274, 0.10017697513103485, 0.0333692841231823, 0.05236639454960823, -0.0491148866713047, 0.006991037167608738, 0.11697519570589066, -0.05681236833333969, 0.018716316670179367, 0.022085685282945633, 0.030835358425974846, 0.08593350648880005, 0.034328166395425797, 0.02559693530201912, -0.06389128416776657, 0.09042597562074661, -0.05366957187652588, -0.0038895343896001577, -0.025319885462522507, -0.013444807380437851, 0.08832941204309464, -0.09669759124517441, 0.04561634361743927, -0.1517317295074463, -0.11237457394599915, 0.027954120188951492, 0.060836855322122574, 0.006251898594200611, -0.07666131854057312, 0.03343381732702255, -0.003533633891493082, 0.027593178674578667, -0.08270443975925446, -0.04707803204655647, -0.08973945677280426, 0.06809724122285843, -0.09197565168142319, 0.02841426059603691, -0.1709860861301422, 0.05784766376018524, -0.1001691147685051, 0.0009356352966278791, -0.0760277807712555, -0.03318394720554352, -0.10460098087787628, 0.22501885890960693, -0.020419854670763016, -0.05306515842676163, 0.010000287555158138, 0.005240917671471834, 
-0.05968654528260231, 0.11418697983026505, -0.08652675896883011, -0.06625915318727493, 0.17712461948394775, -0.11950070410966873, -0.18260717391967773, 0.08730577677488327, 0.008291643112897873, -0.009214501827955246, 0.07268434762954712, 0.18006229400634766, 0.10566092282533646, -0.021187148988246918, 0.07113425433635712, 0.15048831701278687, -0.040947288274765015, -0.17277170717716217, 0.03568623960018158, -0.029011648148298264, -0.13211023807525635, 0.0599011555314064, -0.009899661876261234, 0.09420648217201233, -0.0032790498808026314, -0.07821164280176163, -0.04214586690068245, -0.048005275428295135, 0.011028378270566463, 0.02089003100991249, 0.0836382806301117, -0.06815953552722931, 0.005605021025985479, -0.03909703344106674, 0.04076483100652695, 0.05278678238391876, 0.06070273369550705, -0.06702341139316559, 0.08452870696783066, 0.06683379411697388, 0.0291458610445261, -0.11659331619739532, 0.008872576989233494, -0.012856803834438324, 0.03245650976896286, 0.0038708685897290707, 0.08994337171316147, 0.022342825308442116, -0.05871325358748436, -0.0173491258174181, -0.009397649206221104, 0.13024085760116577, 0.0552787221968174, -0.024140357971191406, -0.13943928480148315, 0.035930391401052475, -0.01558198407292366, 0.0674215778708458, -0.03482707962393761, 0.01923820562660694, 0.05571414530277252, 0.11786055564880371, -0.04539300501346588, 0.11341254413127899, -0.04034513607621193, 0.024991214275360107, -0.06731708347797394, 0.001674680970609188, 0.11901775002479553, 0.041440676897764206, -0.09393310546875, 0.16488748788833618, -0.06188854202628136, 0.2677548825740814, 0.20976665616035461, -0.2136308252811432, 0.06348590552806854, -0.00470400508493185, -0.015853185206651688, -0.004086120519787073, 0.04522792994976044, 0.030256042256951332, 0.053597770631313324, 0.0050210426561534405, 0.17264100909233093, -0.039037786424160004, -0.03607746213674545, -0.025279469788074493, -0.051374200731515884, -0.008309570141136646, 0.05910451337695122, 0.1586754322052002, -0.1820884346961975, 0.1822541356086731, 0.2847212553024292, -0.0063355849124491215, 0.08594595640897751, -0.07909215241670609, 0.02161979116499424, 0.06433381140232086, -0.046706460416316986, -0.029632411897182465, 0.017825817689299583, -0.13659560680389404, -0.008348465897142887, 0.08231012523174286, 0.03678017109632492, 0.06272430717945099, -0.14455419778823853, -0.04764006659388542, -0.021808234974741936, -0.0026341069024056196, -0.06277167052030563, 0.04689870402216911, 0.00044693349627777934, 0.09370281547307968, -0.02774418517947197, -0.10532957315444946, 0.1118728443980217, -0.00005384266478358768, -0.08514346927404404, 0.13645783066749573, -0.17396454513072968, -0.26944541931152344, -0.15380650758743286, -0.14226192235946655, -0.028455447405576706, 0.017389362677931786, 0.13474521040916443, -0.06441390514373779, -0.06403996795415878, 0.019975585862994194, -0.07588216662406921, -0.019704043865203857, 0.016278661787509918, 0.013531647622585297, 0.06301607936620712, 0.012834439054131508, -0.1000068187713623, -0.06526842713356018, 0.010732683353126049, -0.008881920948624611, 0.0997009202837944, -0.09366574883460999, 0.07838699221611023, 0.11796671152114868, 0.039720650762319565, 0.04111456871032715, -0.03054552525281906, 0.1344349980354309, -0.045448530465364456, 0.009526710957288742, 0.19920960068702698, -0.045177556574344635, 0.0765223279595375, 0.15859930217266083, 0.041497934609651566, -0.049147624522447586, 0.00705241784453392, -0.05661432445049286, -0.06018070876598358, -0.23849257826805115, -0.13064086437225342, 
-0.12174534052610397, 0.045089565217494965, 0.050018616020679474, 0.08305211365222931, 0.09785227477550507, 0.06854219734668732, -0.019646313041448593, 0.039725515991449356, -0.00964477937668562, 0.06852885335683823, 0.27804622054100037, -0.010500295087695122, 0.10958924889564514, -0.11485037207603455, -0.06006312742829323, 0.11961676925420761, 0.06482097506523132, 0.1247367113828659, 0.13154974579811096, 0.058412108570337296, 0.07013541460037231, 0.15885613858699799, 0.10599739104509354, 0.09429550170898438, 0.03441106155514717, -0.00541922589763999, -0.045866914093494415, -0.010157668963074684, -0.051725875586271286, 0.038199301809072495, -0.034706685692071915, -0.13671709597110748, -0.04592345282435417, -0.13721737265586853, 0.08740247786045074, 0.16800956428050995, 0.038032129406929016, -0.13018716871738434, 0.01351655088365078, 0.10577696561813354, -0.01133508887141943, -0.05284135416150093, 0.11322192847728729, -0.09613724052906036, -0.10717916488647461, 0.14623866975307465, -0.0013063126243650913, 0.15257549285888672, -0.02216845564544201, 0.05267684906721115, -0.0191632229834795, -0.129877507686615, 0.07346407324075699, 0.13437047600746155, -0.2763715982437134, 0.21347050368785858, -0.01665456034243107, -0.051291290670633316, -0.06937649846076965, -0.008359488099813461, 0.08343063294887543, 0.2740291357040405, 0.05933742597699165, 0.029078135266900063, -0.11142256110906601, -0.0803193673491478, -0.06288328021764755, 0.04367838799953461, 0.024803010746836662, -0.012860679998993874, -0.047211721539497375, -0.07242458313703537, -0.015039103105664253, -0.007754213642328978, 0.04183259606361389, -0.03301812708377838, -0.15685515105724335, 0.05907463654875755, 0.1067129522562027, 0.06455303728580475, -0.055116817355155945, -0.03665735572576523, -0.15006887912750244, 0.16299813985824585, -0.09676190465688705, -0.07152728736400604, -0.09035639464855194, -0.12532760202884674, 0.01869715377688408, -0.05955588445067406, 0.042501043528318405, -0.07397680729627609, -0.005241410341113806, -0.05539749935269356, -0.19665028154850006, 0.1060280054807663, -0.1268637627363205, -0.052288565784692764, -0.04870232939720154, 0.14159005880355835, -0.0978550985455513, 0.029907124117016792, 0.027370808646082878, -0.001842310419306159, -0.09787321090698242, -0.1301853060722351, -0.03453226014971733, 0.035310372710227966, 0.059925686568021774, -0.026914602145552635, -0.12340526282787323, -0.01405020710080862, 0.0006237937486730516, -0.04588779807090759, 0.22493396699428558, 0.1684177815914154, -0.0889885351061821, 0.1904640644788742, 0.18444646894931793, -0.09197146445512772, -0.3282652795314789, -0.16268010437488556, -0.1556841880083084, -0.08959272503852844, -0.03780500963330269, -0.15659493207931519, 0.14509673416614532, 0.023510722443461418, -0.07812128961086273, 0.09832952171564102, -0.1598958969116211, -0.08176460862159729, 0.21808704733848572, -0.05656159296631813, 0.34315478801727295, -0.1343366503715515, -0.07723602652549744, -0.09342294931411743, -0.19554737210273743, 0.10376343876123428, -0.07908494025468826, 0.05570331960916519, -0.007573881186544895, 0.021853165701031685, -0.018673541024327278, -0.04128614440560341, 0.13890381157398224, -0.008476145565509796, 0.0103214206174016, -0.13031476736068726, -0.0013552778400480747, 0.07936320453882217, -0.02264109067618847, 0.010469633154571056, -0.14011137187480927, 0.020549016073346138, -0.11857570707798004, -0.016819249838590622, -0.04973926395177841, 0.06857679039239883, -0.0021640374325215816, -0.03004186600446701, -0.036596763879060745, 
-0.012295212596654892, 0.035312775522470474, -0.0015858429251238704, 0.24263815581798553, 0.024654481559991837, 0.09462243318557739, 0.08696931600570679, 0.08596211671829224, -0.23048752546310425, 0.03347031772136688, -0.11214955151081085, -0.08131160587072372, 0.07221675664186478, -0.0905771255493164, 0.05710280314087868, 0.1254386305809021, -0.07342210412025452, 0.07196503132581711, 0.08455442637205124, 0.03732947260141373, -0.048313844949007034, 0.1417766511440277, -0.18195410072803497, 0.0018647537799552083, -0.02354593575000763, 0.0760408267378807, 0.07369842380285263, 0.07086710631847382, 0.12023969739675522, 0.014138750731945038, -0.03971254825592041, 0.03234616667032242, 0.032200586050748825, -0.053519126027822495, 0.024554140865802765, 0.05349737033247948, 0.008772216737270355, -0.13967002928256989, 0.09211256355047226, 0.0308118537068367, -0.11210619658231735, -0.03143756836652756, 0.1102287545800209, -0.17145714163780212, -0.1305808275938034, -0.005850812885910273, 0.09881908446550369, -0.10884305089712143, -0.11918898671865463, -0.06392377614974976, -0.17009294033050537, 0.06238051503896713, 0.11077582091093063, 0.12441360205411911, 0.07130511850118637, -0.017844131216406822, -0.05644327029585838, 0.05272693559527397, -0.006818860769271851, -0.0658203586935997, 0.023090064525604248, -0.12722784280776978, -0.02436378411948681, 0.02353753335773945, 0.11302616447210312, -0.053951993584632874, -0.016352131962776184, -0.11474186182022095, 0.029883980751037598, -0.19172364473342896, 0.0039461953565478325, -0.08545416593551636, 0.004070573952049017, 0.011247806251049042, -0.09255466610193253, -0.057184915989637375, -0.008274481631815434, -0.11594387143850327, -0.011156830936670303, -0.022886618971824646, 0.08024458587169647, -0.09392581880092621, -0.03740031272172928, 0.10652794688940048, -0.02346104383468628, 0.1091407984495163, 0.07828789949417114, -0.07350261509418488, 0.09263576567173004, -0.11525139957666397, -0.115024633705616, 0.09507068991661072, 0.05464620888233185, 0.036374639719724655, -0.01279841922223568, 0.0036954700481146574, 0.10999294370412827, -0.025437625125050545, 0.049679212272167206, 0.05319029837846756, -0.12448521703481674, -0.040947090834379196, -0.029309330508112907, -0.1282980889081955, -0.005256433505564928, -0.09504454582929611, 0.12705212831497192, 0.011149496771395206, 0.1685630977153778, -0.028278924524784088, 0.041259922087192535, -0.05374511703848839, 0.021553559228777885, -0.03536881506443024, -0.16238057613372803, -0.13202747702598572, -0.08800151199102402, -0.02947688102722168, -0.01133103109896183, 0.2791491746902466, 0.009496865794062614, -0.04521911218762398, 0.07748684287071228, 0.09749559313058853, -0.038269806653261185, 0.018027622252702713, 0.24639172852039337, 0.06328576803207397, -0.008771006017923355, -0.09014892578125, -0.0020543038845062256, 0.017523886635899544, -0.08494459837675095, 0.12031934410333633, 0.09604473412036896, 0.031681448221206665, 0.055074453353881836, 0.009379317983984947, 0.012556159868836403, -0.1314580887556076, -0.11174173653125763, -0.0102754021063447, 0.08916011452674866, 0.009072324261069298, 0.12162024527788162, 0.11254798620939255, -0.04368019476532936, 0.02098405547440052, -0.07404977083206177, -0.011079542338848114, -0.17461799085140228, -0.1011018306016922, -0.08224525302648544, -0.14219555258750916, 0.002281083958223462, -0.04191518947482109, 0.0024793900083750486, 0.07121007144451141, 0.04121262580156326, -0.06531098484992981, 0.0013354613911360502, -0.037759020924568176, -0.04110828787088394, 
0.03387119248509407, -0.019085893407464027, -0.016727671027183533, -0.04395937919616699, -0.05649394169449806, -0.10286637395620346, -0.03543619439005852, -0.0525272861123085, 0.029439209029078484, -0.013833541423082352, 0.029673293232917786, -0.10661040991544724, -0.07033941149711609, -0.0394895114004612, 0.014326798729598522, 0.0005730020930059254, 0.15983609855175018, 0.009345872327685356, 0.04251403361558914, 0.09167855978012085, 0.15840794146060944, -0.06753014773130417, -0.14552098512649536, -0.058581799268722534, 0.18460027873516083, 0.045525554567575455, 0.04602377116680145, 0.013214902952313423, 0.020930476486682892, -0.06345731019973755, 0.3370414972305298, 0.2952379286289215, -0.08285108208656311, 0.03298317641019821, -0.016074245795607567, 0.022447671741247177, 0.0935361385345459, 0.1397077739238739, 0.1311754286289215, 0.18173417448997498, -0.06396117061376572, -0.06234751641750336, -0.053676679730415344, -0.002607000758871436, -0.17490500211715698, 0.09575482457876205, -0.004391147289425135, -0.08462657779455185, -0.0347195565700531, 0.08737118542194366, -0.12206001579761505, 0.08615179359912872, -0.0016114325262606144, -0.1555137187242508, -0.0418897308409214, -0.014403024688363075, 0.2009618878364563, 0.012023815885186195, 0.01792079396545887, -0.021795684471726418, -0.07043350487947464, 0.14702469110488892, -0.004562158603221178, -0.18949325382709503, -0.04386605694890022, 0.0943559855222702, -0.06173930689692497, 0.10954106599092484, -0.005378682631999254, 0.02983132191002369, 0.08477696776390076, 0.08922281861305237, -0.06771853566169739, 0.07477451115846634, 0.020855069160461426, -0.05828728526830673, -0.012308647856116295, -0.10375984013080597, -0.020148959010839462, -0.07548663765192032, 0.05589277669787407, -0.08201828598976135, 0.041578009724617004, -0.011092323809862137, -0.068037249147892, -0.021885205060243607, 0.05364800989627838, -0.07794482260942459, 0.06205511465668678, 0.02576792985200882, -0.03830999508500099, -0.06381291151046753, -0.056208476424217224, -0.037184301763772964, 0.011202008463442326, -0.1824050098657608, -0.07511567324399948, -0.01904807612299919, -0.04554937034845352, 0.06046390160918236, 0.0468159057199955, -0.0808715894818306, -0.02246410772204399, -0.09982031583786011, 0.03638887032866478, -0.16978268325328827, 0.05696408823132515, 0.061324767768383026, -0.001957209315150976, -0.010295327752828598, -0.06901649385690689, 0.023620905354619026, 0.01966485008597374, -0.08515617251396179, -0.08638837188482285 ]
null
null
transformers
# Hungarian Sentence-level Sentiment Analysis Model with XLM-RoBERTa

For further models, scripts and details, see [our repository](https://github.com/nytud/sentiment-analysis) or [our demo site](https://juniper.nytud.hu/demo/nlp).

- Pretrained model used: XLM-RoBERTa base
- Finetuned on Hungarian Twitter Sentiment (HTS) Corpus
- Labels: 0 (negative), 1 (positive)

## Limitations

- max_seq_length = 128

## Results

| Model | HTS2 | HTS5 |
| ------------- | ------------- | ------------- |
| huBERT | 85.56 | 68.99 |
| XLM-RoBERTa | **85.56** | 66.50 |

## Citation

If you use this model, please cite the following paper:

```
@inproceedings{laki-yang-sentiment,
    title = {Improving Performance of Sentence-level Sentiment Analysis with Data Augmentation Methods},
    booktitle = {Proceedings of 12th IEEE International Conference on Cognitive Infocommunications (CogInfoCom 2021)},
    year = {2021},
    publisher = {IEEE},
    address = {Online},
    author = {Laki, László and Yang, Zijian Győző},
    pages = {417--422}
}
```
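As a minimal usage sketch (not part of the original card), the model can presumably be loaded through the `transformers` pipeline under the Hub id `NYTK/sentiment-hts2-xlm-roberta-hungarian` listed for this record; label ids 0 and 1 map to negative and positive as described above.

```python
from transformers import pipeline

# Illustrative sketch: binary (negative/positive) Hungarian sentiment classification.
classifier = pipeline(
    "text-classification",
    model="NYTK/sentiment-hts2-xlm-roberta-hungarian",
)

# Example sentence taken from this record's widget text.
print(classifier("Jó reggelt! majd küldöm az élményhozókat :)."))
# Expected output shape: [{'label': ..., 'score': ...}]
```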
{"language": ["hu"], "license": "mit", "tags": ["text-classification"], "metrics": ["accuracy"], "widget": [{"text": "J\u00f3 reggelt! majd k\u00fcld\u00f6m az \u00e9lm\u00e9nyhoz\u00f3kat :)."}]}
text-classification
NYTK/sentiment-hts2-xlm-roberta-hungarian
[ "transformers", "pytorch", "roberta", "text-classification", "hu", "license:mit", "autotrain_compatible", "endpoints_compatible", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[ "hu" ]
TAGS #transformers #pytorch #roberta #text-classification #hu #license-mit #autotrain_compatible #endpoints_compatible #region-us
Hungarian Sentence-level Sentiment Analysis Model with XLM-RoBERTa ================================================================== For further models, scripts and details, see our repository or our demo site. * Pretrained model used: XLM-RoBERTa base * Finetuned on Hungarian Twitter Sentiment (HTS) Corpus * Labels: 0 (negative), 1 (positive) Limitations ----------- * max\_seq\_length = 128 Results ------- Model: huBERT, HTS2: 85.56, HTS5: 68.99 Model: XLM-RoBERTa, HTS2: 85.56, HTS5: 66.50 If you use this model, please cite the following paper:
[]
[ "TAGS\n#transformers #pytorch #roberta #text-classification #hu #license-mit #autotrain_compatible #endpoints_compatible #region-us \n" ]
[ 44 ]
[ "passage: TAGS\n#transformers #pytorch #roberta #text-classification #hu #license-mit #autotrain_compatible #endpoints_compatible #region-us \n" ]
[ -0.023215599358081818, 0.08236850053071976, -0.006232061889022589, 0.020524300634860992, 0.17202818393707275, 0.04555141553282738, 0.13596709072589874, 0.09959522634744644, 0.042192552238702774, -0.04651981592178345, 0.10930123925209045, 0.20743127167224884, 0.002565591363236308, 0.08535442501306534, -0.09685126692056656, -0.28601744771003723, 0.06324534118175507, 0.0471942275762558, 0.0320342518389225, 0.09813487529754639, 0.10577892512083054, -0.07319154590368271, 0.0696936771273613, -0.014901187270879745, -0.10536497086286545, 0.016854848712682724, 0.032613605260849, -0.11867183446884155, 0.1125999391078949, 0.055112093687057495, 0.13030008971691132, 0.06501612067222595, -0.0004277043044567108, -0.1769518107175827, 0.03150719031691551, -0.03631340339779854, -0.09787092357873917, 0.04626844450831413, 0.04108186066150665, -0.06427831202745438, 0.09271425753831863, 0.0670635774731636, -0.0031577374320477247, 0.08039093762636185, -0.13269442319869995, -0.10890623927116394, -0.07730189710855484, 0.10180976241827011, 0.052960354834795, 0.045587409287691116, 0.01263339351862669, 0.1413230001926422, -0.12781359255313873, 0.0675850585103035, 0.08481430262327194, -0.33632692694664, 0.011307251639664173, 0.09468194097280502, 0.05780937150120735, 0.033164091408252716, -0.0675070509314537, 0.06642194837331772, 0.05910918116569519, 0.007239123340696096, -0.03669161722064018, -0.0778246745467186, -0.07860156893730164, 0.045715272426605225, -0.05299019441008568, -0.032244723290205, 0.20598916709423065, -0.039499204605817795, 0.04010153189301491, -0.02365810237824917, -0.05493255332112312, -0.04544627666473389, -0.024821655824780464, 0.03738968446850777, -0.021044975146651268, 0.07802318036556244, 0.059806957840919495, 0.008649749681353569, -0.12064652889966965, 0.030889300629496574, -0.21556687355041504, 0.15829305350780487, 0.03235050290822983, 0.04674415662884712, -0.14077313244342804, 0.05903516709804535, 0.07304566353559494, -0.10417363047599792, 0.010639396496117115, -0.09163665771484375, 0.06475266814231873, -0.029602190479636192, -0.04188380762934685, 0.06490147858858109, 0.09111718088388443, 0.20393340289592743, 0.03992066904902458, 0.037293292582035065, -0.02267667092382908, 0.12317796796560287, 0.01532342005521059, 0.1116829589009285, 0.031703460961580276, -0.03224075213074684, 0.0590486042201519, -0.1326269954442978, 0.03782755881547928, -0.04384314641356468, -0.18901729583740234, -0.032121442258358, 0.01576625369489193, 0.09261792153120041, -0.0024078439455479383, 0.06990066170692444, -0.060054000467061996, 0.009636334143579006, 0.07643076777458191, -0.0413639210164547, 0.017156971618533134, 0.013143365271389484, 0.03313213586807251, 0.076767198741436, -0.00799138005822897, 0.015860555693507195, -0.039369456470012665, 0.13256002962589264, -0.06855552643537521, 0.0003106867370661348, -0.024801047518849373, -0.05814613029360771, 0.06957510113716125, -0.1287786215543747, 0.05057063698768616, -0.16894061863422394, -0.09730150550603867, 0.00772302458062768, 0.02790418080985546, -0.013403951190412045, -0.04105278477072716, 0.005318629089742899, 0.00753295561298728, 0.022151559591293335, -0.06474105268716812, -0.0950535237789154, -0.08097636699676514, 0.1016702950000763, -0.052172500640153885, 0.05127786099910736, -0.16808132827281952, 0.05339885875582695, -0.10130971670150757, -0.005534932482987642, -0.10615607351064682, -0.008506287820637226, -0.07939451187849045, 0.2278202623128891, 0.020708512514829636, -0.056379079818725586, -0.02373112179338932, 0.04645897075533867, -0.0781797394156456, 
0.13254348933696747, -0.06946311146020889, -0.0964190736413002, 0.17278428375720978, -0.11942911148071289, -0.15602944791316986, 0.0883340835571289, -0.018471764400601387, 0.049216460436582565, 0.09094198793172836, 0.2188061624765396, 0.10800661891698837, -0.052004408091306686, 0.0685880109667778, 0.1075407937169075, -0.08903872221708298, -0.1505010724067688, 0.01669575832784176, -0.012621656060218811, -0.10312821716070175, 0.048339325934648514, 0.048841167241334915, 0.08768486231565475, -0.040588587522506714, -0.06110129877924919, -0.006057417020201683, -0.012769598513841629, 0.040680911391973495, 0.04847824573516846, 0.10440155118703842, -0.09058225154876709, 0.006945395842194557, -0.01725999265909195, 0.007740383502095938, 0.05963130667805672, 0.03482937812805176, -0.07592884451150894, 0.10803841799497604, 0.07956578582525253, 0.009022916667163372, -0.13262251019477844, -0.02750345505774021, -0.027463896200060844, 0.08413194864988327, 0.02940569631755352, 0.13335086405277252, 0.009919657371938229, -0.03795408830046654, -0.026755893602967262, 0.0008182887104339898, 0.1517375260591507, 0.04589354991912842, -0.021105878055095673, -0.1092291846871376, 0.04861227050423622, -0.031591445207595825, 0.04065059497952461, -0.0776892602443695, 0.014486267231404781, 0.08039350807666779, 0.11114297062158585, -0.027997098863124847, 0.10033141821622849, -0.049671053886413574, 0.06711621582508087, -0.07364144176244736, 0.018032044172286987, 0.12284181267023087, 0.01604340970516205, -0.07113005220890045, 0.173956036567688, -0.10499840974807739, 0.2817158102989197, 0.2117525339126587, -0.23459947109222412, -0.014816169627010822, -0.03573531284928322, -0.01896936260163784, 0.012243947945535183, 0.043594878166913986, 0.023167721927165985, 0.060374170541763306, -0.017250286415219307, 0.1786908507347107, -0.029408743605017662, -0.0267486572265625, -0.012504584155976772, -0.0549950934946537, -0.032309722155332565, 0.081553153693676, 0.15555818378925323, -0.2380944937467575, 0.18776099383831024, 0.2025606632232666, 0.04315297305583954, 0.1655876487493515, -0.04264834523200989, 0.04667115584015846, 0.04492807015776634, -0.054667484015226364, -0.022962400689721107, 0.027193566784262657, -0.12374117225408554, -0.029393494129180908, 0.06312602758407593, 0.013586352579295635, 0.06771207600831985, -0.14938384294509888, -0.0714067593216896, -0.025173010304570198, 0.024508997797966003, -0.06921068578958511, 0.09587854146957397, 0.025190254673361778, 0.09642661362886429, -0.012989730574190617, -0.09365862607955933, 0.10443460941314697, 0.006536244880408049, -0.08695592731237411, 0.14195753633975983, -0.14380551874637604, -0.27103009819984436, -0.18260622024536133, -0.17765448987483978, 0.013695062138140202, 0.03859695419669151, 0.12332307547330856, -0.06093043088912964, -0.054035771638154984, 0.04828712344169617, -0.027105605229735374, -0.05994974449276924, -0.008617158979177475, -0.045366883277893066, 0.08788598328828812, -0.05638299509882927, -0.0749424546957016, -0.08224577456712723, -0.018998203799128532, 0.006590703036636114, 0.1317763328552246, -0.107121042907238, 0.08772245794534683, 0.12325730919837952, 0.006806079298257828, 0.048803362995386124, -0.06347045302391052, 0.15271787345409393, -0.08467650413513184, -0.0066188848577439785, 0.16736917197704315, -0.03466391563415527, 0.07178139686584473, 0.2021108716726303, 0.05384765937924385, -0.05167198181152344, -0.00013850380491930991, -0.0691635012626648, -0.07414424419403076, -0.21569401025772095, -0.1390211582183838, -0.13033829629421234, 0.04627760127186775, 
0.045320216566324234, 0.07089530676603317, 0.10241681337356567, 0.07576317340135574, -0.00043337620445527136, 0.012086591683328152, -0.007132890168577433, 0.09121411293745041, 0.3232410252094269, -0.003586024045944214, 0.11912936717271805, -0.10376576334238052, -0.10835418850183487, 0.0973687469959259, 0.03498901426792145, 0.12786969542503357, 0.14530324935913086, 0.042619913816452026, 0.06788670271635056, 0.11024606972932816, 0.15420712530612946, 0.06742256134748459, 0.06161779537796974, -0.0237110648304224, -0.032764483243227005, 0.0018114646663889289, -0.030697422102093697, 0.03917498514056206, 0.022942496463656425, -0.1584753692150116, -0.061204638332128525, -0.1468311995267868, 0.07131580263376236, 0.08151870220899582, 0.05411345884203911, -0.16116724908351898, 0.004988902248442173, 0.0965275764465332, -0.011090652085840702, -0.0701184868812561, 0.0983499214053154, -0.08223119378089905, -0.1461295634508133, 0.12287414073944092, -0.0067499433644115925, 0.13547636568546295, -0.03809557110071182, 0.0670686587691307, -0.03735886886715889, -0.12591101229190826, 0.03823632374405861, 0.12099412083625793, -0.27088484168052673, 0.252101331949234, -0.004496950190514326, -0.0430089570581913, -0.047534093260765076, -0.03860824182629585, 0.05328722298145294, 0.24060802161693573, 0.06082427129149437, 0.011473330669105053, -0.13419246673583984, -0.155334010720253, 0.007296083029359579, 0.018946178257465363, 0.08028466254472733, -0.01514141634106636, -0.01639009267091751, -0.07280251383781433, 0.00027127875364385545, -0.026129690930247307, -0.0020040373783558607, -0.0037851519882678986, -0.16086825728416443, 0.054560739547014236, 0.0561402253806591, 0.07462868839502335, -0.016259165480732918, -0.04908544942736626, -0.15869475901126862, 0.151691272854805, -0.09960365295410156, -0.06161416694521904, -0.1036093458533287, -0.11439812928438187, 0.002834177343174815, -0.07101110368967056, 0.039369456470012665, -0.07290481775999069, -0.02223091386258602, -0.08472937345504761, -0.17801177501678467, 0.11928609758615494, -0.09141615033149719, -0.04644513502717018, -0.06339488178491592, 0.15264791250228882, -0.09395953267812729, 0.028379999101161957, 0.03216757997870445, 0.019067145884037018, -0.0923117995262146, -0.1166546568274498, -0.016101287677884102, -0.02221347577869892, 0.05591592192649841, -0.0005722616915591061, -0.12544645369052887, -0.033003512769937515, -0.021095670759677887, -0.054459333419799805, 0.23468010127544403, 0.21560798585414886, -0.07722184807062149, 0.189819797873497, 0.15735647082328796, -0.08733949810266495, -0.33303841948509216, -0.12961214780807495, -0.16481682658195496, -0.05104494094848633, -0.011928399093449116, -0.13778679072856903, 0.09005093574523926, 0.0016080507775768638, -0.0544525645673275, 0.10305257886648178, -0.1580563634634018, -0.0969514548778534, 0.19954413175582886, -0.046360377222299576, 0.3865794241428375, -0.11874134093523026, -0.09090747684240341, -0.06987828761339188, -0.18883948028087616, 0.0962824821472168, 0.012935523875057697, 0.08804747462272644, -0.022735225036740303, 0.05675492808222771, 0.0005874708294868469, -0.039868805557489395, 0.12314847111701965, -0.014029019512236118, 0.024823138490319252, -0.1337219625711441, -0.08908099681138992, 0.10050073266029358, 0.007934453897178173, -0.013884062878787518, -0.0759909376502037, 0.008807114325463772, -0.12794944643974304, -0.033780816942453384, -0.055814217776060104, 0.07183781266212463, 0.0019888330716639757, -0.0423285998404026, -0.05574226751923561, 0.02328132651746273, -0.019781533628702164, 
-0.022484898567199707, 0.23775990307331085, -0.02944856323301792, 0.15672703087329865, 0.055219803005456924, 0.10190945863723755, -0.18189875781536102, 0.05646495148539543, -0.09909456968307495, -0.08620087057352066, 0.059819597750902176, -0.028592167422175407, 0.027247900143265724, 0.14743149280548096, -0.05565252900123596, 0.09064853191375732, 0.09464585036039352, 0.03413316607475281, -0.009699227288365364, 0.17233003675937653, -0.18934722244739532, -0.028028659522533417, -0.04108801111578941, -0.01895536668598652, 0.11044425517320633, 0.05717059597373009, 0.1098533645272255, 0.014150389470160007, -0.04023200646042824, 0.027156101539731026, 0.005175337195396423, -0.034644726663827896, 0.022147804498672485, 0.0648951455950737, 0.01977180503308773, -0.13357596099376678, 0.05296832695603371, 0.044081225991249084, -0.09441284090280533, -0.02591993845999241, 0.12473293393850327, -0.16421496868133545, -0.11897359043359756, -0.06599945574998856, 0.11814979463815689, -0.1638948768377304, -0.07463496923446655, -0.08169170469045639, -0.16951300203800201, 0.04603637382388115, 0.1658271700143814, 0.110027015209198, 0.08837158232927322, -0.031815413385629654, -0.05209353193640709, 0.015736399218440056, -0.00945771113038063, -0.0463312529027462, 0.02748422883450985, -0.12919776141643524, 0.03236839175224304, 0.005313422530889511, 0.12317028641700745, -0.07511641085147858, -0.048759639263153076, -0.17576169967651367, 0.03401974216103554, -0.1375025063753128, -0.023824254050850868, -0.10054460912942886, -0.01120674330741167, 0.0059767733328044415, -0.07675336301326752, -0.07243768870830536, -0.04635736718773842, -0.1175668016076088, 0.022330403327941895, 0.002938732737675309, 0.08880585432052612, -0.07134752720594406, -0.02657129615545273, 0.1053704246878624, -0.012842872180044651, 0.09775935858488083, 0.0699368417263031, -0.06383046507835388, 0.0909709632396698, -0.12689705193042755, -0.10334215313196182, 0.10321789234876633, 0.020362308248877525, 0.053578656166791916, 0.014724534936249256, 0.022006908431649208, 0.08996356278657913, -0.0008207529899664223, 0.07936154305934906, 0.01977093517780304, -0.12086397409439087, 0.007266813423484564, -0.027690360322594643, -0.15698248147964478, -0.022562628611922264, -0.05594121292233467, 0.10235657542943954, -0.020854318514466286, 0.16392944753170013, -0.05678991600871086, 0.05836459621787071, -0.04418587684631348, 0.012016636319458485, -0.03022310882806778, -0.1691918969154358, -0.09333980083465576, -0.11533361673355103, -0.01971288025379181, 0.005893524736166, 0.3003096878528595, 0.049209654331207275, -0.05449298396706581, 0.07498250156641006, 0.11547321081161499, -0.053126152604818344, 0.008184637874364853, 0.2323824167251587, 0.08208087831735611, -0.03149576857686043, -0.11610079556703568, 0.04132983461022377, -0.016303056851029396, -0.08730417490005493, 0.1394370198249817, 0.1001528799533844, -0.016982948407530785, 0.03783285990357399, 0.033366527408361435, 0.01925317756831646, -0.10054873675107956, -0.10966084152460098, 0.0008485006983391941, 0.05435125157237053, -0.003629647195339203, 0.08346199244260788, 0.1493764966726303, -0.051026925444602966, 0.03970398008823395, -0.07891257852315903, -0.02230178564786911, -0.17555777728557587, -0.11064795404672623, -0.08349993079900742, -0.1213020607829094, 0.03657307103276253, -0.033365603536367416, 0.002058430342003703, 0.06517459452152252, 0.040108319371938705, -0.08039265125989914, 0.012281925417482853, -0.02766367234289646, -0.059152934700250626, 0.03215980529785156, -0.022267630323767662, 
0.015192260034382343, -0.07208782434463501, -0.036023929715156555, -0.11737241595983505, -0.05160915479063988, -0.04669763520359993, 0.02692553400993347, -0.0368964709341526, -0.005282644182443619, -0.13880661129951477, -0.08399941772222519, -0.023855729028582573, 0.052146364003419876, 0.004856186453253031, 0.12588010728359222, -0.0002110954374074936, 0.03882293775677681, 0.06559357792139053, 0.17887817323207855, -0.03146222606301308, -0.12007252126932144, -0.031659048050642014, 0.2230897694826126, 0.07535206526517868, 0.07644987851381302, -0.003916256129741669, 0.005045746453106403, -0.04108135402202606, 0.31523191928863525, 0.33779391646385193, -0.06412940472364426, 0.03980107977986336, -0.005282808095216751, 0.031566668301820755, 0.1370268315076828, 0.14318279922008514, 0.07639425992965698, 0.23162944614887238, -0.05783047899603844, -0.050493646413087845, -0.06004573032259941, 0.00016400527965743095, -0.1254124939441681, 0.10406843572854996, 0.035251397639513016, -0.060998398810625076, -0.055435191839933395, 0.11418992280960083, -0.17336909472942352, 0.09893900156021118, 0.03467122092843056, -0.19074071943759918, -0.040952131152153015, -0.014752067625522614, 0.16842758655548096, 0.0026997216045856476, 0.04443088546395302, -0.025042450055480003, -0.0921037420630455, 0.06910824775695801, 0.016645053401589394, -0.2168399542570114, -0.021991314366459846, 0.08602889627218246, -0.04174334183335304, 0.052762020379304886, -0.042491212487220764, 0.03475659340620041, 0.09505649656057358, 0.08576839417219162, -0.023791829124093056, 0.08937326818704605, 0.023460282012820244, -0.060205671936273575, -0.0036443136632442474, -0.06351467221975327, 0.013586051762104034, -0.09156053513288498, 0.06683986634016037, -0.11647924035787582, 0.06387247890233994, -0.06761929392814636, -0.07032106071710587, -0.024066142737865448, 0.07189842313528061, -0.07056467980146408, 0.0728694424033165, 0.05449383333325386, -0.014182218350470066, -0.03500181436538696, -0.0519128255546093, -0.032735053449869156, 0.0427161306142807, -0.15929262340068817, -0.0905010774731636, -0.05213642120361328, -0.07084652036428452, 0.0767972469329834, 0.018510902300477028, -0.16233114898204803, -0.012995350174605846, -0.1213335171341896, 0.05188489332795143, -0.1724989414215088, 0.09156450629234314, 0.06959810107946396, 0.015005971305072308, -0.009749031625688076, -0.0969204530119896, 0.02824798785150051, 0.03697001934051514, -0.10741708427667618, -0.08369214087724686 ]
null
null
transformers
# Hungarian Sentence-level Sentiment Analysis with Finetuned huBERT Model

For further models, scripts and details, see [our repository](https://github.com/nytud/sentiment-analysis) or [our demo site](https://juniper.nytud.hu/demo/nlp).

- Pretrained model used: huBERT
- Finetuned on Hungarian Twitter Sentiment (HTS) Corpus
- Labels: 0 (very negative), 1 (negative), 2 (neutral), 3 (positive), 4 (very positive)

## Limitations

- max_seq_length = 128

## Results

| Model | HTS2 | HTS5 |
| ------------- | ------------- | ------------- |
| huBERT | 85.56 | **68.99** |
| XLM-RoBERTa | 85.56 | 66.50 |

## Citation

If you use this model, please cite the following paper:

```
@inproceedings{yang-sentiment,
    title = {Improving Performance of Sentence-level Sentiment Analysis with Data Augmentation Methods},
    booktitle = {Proceedings of 12th IEEE International Conference on Cognitive Infocommunications (CogInfoCom 2021)},
    year = {2021},
    publisher = {IEEE},
    address = {Online},
    author = {Laki, László and Yang, Zijian Győző},
    pages = {417--422}
}
```
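A minimal sketch (not part of the original card) of five-class scoring with this checkpoint, assuming the Hub id `NYTK/sentiment-hts5-hubert-hungarian` listed for this record and respecting the card's max_seq_length = 128 limitation.

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

# Illustrative sketch: 5-class Hungarian sentiment scoring with the huBERT-based model.
model_id = "NYTK/sentiment-hts5-hubert-hungarian"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)

text = "Jó reggelt! majd küldöm az élményhozókat :)."
# Truncate to the card's stated max_seq_length = 128.
inputs = tokenizer(text, return_tensors="pt", truncation=True, max_length=128)

with torch.no_grad():
    logits = model(**inputs).logits

# Label ids 0..4: very negative, negative, neutral, positive, very positive.
probs = torch.softmax(logits, dim=-1)[0]
print({i: round(p.item(), 3) for i, p in enumerate(probs)})
```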
{"language": ["hu"], "license": "apache-2.0", "tags": ["text-classification"], "metrics": ["accuracy"], "widget": [{"text": "J\u00f3 reggelt! majd k\u00fcld\u00f6m az \u00e9lm\u00e9nyhoz\u00f3kat :)."}]}
text-classification
NYTK/sentiment-hts5-hubert-hungarian
[ "transformers", "pytorch", "bert", "text-classification", "hu", "license:apache-2.0", "autotrain_compatible", "endpoints_compatible", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[ "hu" ]
TAGS #transformers #pytorch #bert #text-classification #hu #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us
Hungarian Sentence-level Sentiment Analysis with Finetuned huBERT Model ======================================================================= For further models, scripts and details, see our repository or our demo site. * Pretrained model used: huBERT * Finetuned on Hungarian Twitter Sentiment (HTS) Corpus * Labels: 0 (very negative), 1 (negative), 2 (neutral), 3 (positive), 4 (very positive) Limitations ----------- * max\_seq\_length = 128 Results ------- Model: huBERT, HTS2: 85.56, HTS5: 68.99 Model: XLM-RoBERTa, HTS2: 85.56, HTS5: 66.50 If you use this model, please cite the following paper:
[]
[ "TAGS\n#transformers #pytorch #bert #text-classification #hu #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n" ]
[ 46 ]
[ "passage: TAGS\n#transformers #pytorch #bert #text-classification #hu #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n" ]
[ -0.0434887520968914, 0.1282162070274353, -0.00612996332347393, 0.032373469322919846, 0.11295655369758606, 0.025390559807419777, 0.1268875151872635, 0.11749448627233505, 0.024788297712802887, -0.06507456302642822, 0.1287708878517151, 0.17426928877830505, 0.0074262130074203014, 0.07251251488924026, -0.08164559304714203, -0.24380141496658325, 0.09530210494995117, 0.04145800694823265, -0.01236800942569971, 0.08666938543319702, 0.10412266105413437, -0.051520396023988724, 0.05744433030486107, -0.012828446924686432, -0.05014578253030777, 0.009905134327709675, 0.025493862107396126, -0.10978883504867554, 0.08529175817966461, 0.04358380287885666, 0.09833349287509918, 0.048284973949193954, -0.005097746849060059, -0.1975345015525818, 0.021753674373030663, 0.002822614973410964, -0.08160953223705292, 0.05308414623141289, 0.04982001706957817, -0.03249795362353325, 0.04260817915201187, 0.040700607001781464, -0.02561628259718418, 0.06940847635269165, -0.07914787530899048, -0.14703363180160522, -0.08879883587360382, 0.12048768997192383, 0.07300668954849243, 0.06478115171194077, 0.035202495753765106, 0.1368100494146347, -0.12130257487297058, 0.05514969304203987, 0.1097291111946106, -0.3434688150882721, 0.0030758485663682222, 0.05117661505937576, 0.02030138298869133, 0.038716405630111694, -0.037275008857250214, 0.04951541870832443, 0.060014739632606506, 0.013800493441522121, -0.008618712425231934, -0.06508287042379379, -0.1216626763343811, 0.03844738379120827, -0.04888824373483658, -0.04236017167568207, 0.24112194776535034, -0.0058019282296299934, 0.045837756246328354, 0.0003754347562789917, -0.07033520191907883, -0.01156484242528677, -0.029977379366755486, 0.06108110770583153, 0.01706695184111595, 0.10145435482263565, 0.10239952057600021, -0.003322926117107272, -0.1207592561841011, 0.018795406445860863, -0.20249098539352417, 0.09313890337944031, 0.03081495501101017, 0.06728692352771759, -0.13609135150909424, 0.05705229192972183, 0.0920647531747818, -0.12443763762712479, 0.005105041433125734, -0.08093500137329102, 0.10854659229516983, 0.018917985260486603, -0.06186467781662941, 0.09956154227256775, 0.13442467153072357, 0.2068973034620285, 0.047680411487817764, 0.04367108270525932, -0.04681188613176346, 0.12386085838079453, -0.027340779080986977, 0.09371768683195114, 0.013846096582710743, -0.03977338597178459, 0.10505826026201248, -0.10783082246780396, 0.07095388323068619, -0.038093555718660355, -0.16845954954624176, -0.02328392304480076, 0.01706092059612274, 0.10017697513103485, 0.0333692841231823, 0.05236639454960823, -0.0491148866713047, 0.006991037167608738, 0.11697519570589066, -0.05681236833333969, 0.018716316670179367, 0.022085685282945633, 0.030835358425974846, 0.08593350648880005, 0.034328166395425797, 0.02559693530201912, -0.06389128416776657, 0.09042597562074661, -0.05366957187652588, -0.0038895343896001577, -0.025319885462522507, -0.013444807380437851, 0.08832941204309464, -0.09669759124517441, 0.04561634361743927, -0.1517317295074463, -0.11237457394599915, 0.027954120188951492, 0.060836855322122574, 0.006251898594200611, -0.07666131854057312, 0.03343381732702255, -0.003533633891493082, 0.027593178674578667, -0.08270443975925446, -0.04707803204655647, -0.08973945677280426, 0.06809724122285843, -0.09197565168142319, 0.02841426059603691, -0.1709860861301422, 0.05784766376018524, -0.1001691147685051, 0.0009356352966278791, -0.0760277807712555, -0.03318394720554352, -0.10460098087787628, 0.22501885890960693, -0.020419854670763016, -0.05306515842676163, 0.010000287555158138, 0.005240917671471834, 
-0.05968654528260231, 0.11418697983026505, -0.08652675896883011, -0.06625915318727493, 0.17712461948394775, -0.11950070410966873, -0.18260717391967773, 0.08730577677488327, 0.008291643112897873, -0.009214501827955246, 0.07268434762954712, 0.18006229400634766, 0.10566092282533646, -0.021187148988246918, 0.07113425433635712, 0.15048831701278687, -0.040947288274765015, -0.17277170717716217, 0.03568623960018158, -0.029011648148298264, -0.13211023807525635, 0.0599011555314064, -0.009899661876261234, 0.09420648217201233, -0.0032790498808026314, -0.07821164280176163, -0.04214586690068245, -0.048005275428295135, 0.011028378270566463, 0.02089003100991249, 0.0836382806301117, -0.06815953552722931, 0.005605021025985479, -0.03909703344106674, 0.04076483100652695, 0.05278678238391876, 0.06070273369550705, -0.06702341139316559, 0.08452870696783066, 0.06683379411697388, 0.0291458610445261, -0.11659331619739532, 0.008872576989233494, -0.012856803834438324, 0.03245650976896286, 0.0038708685897290707, 0.08994337171316147, 0.022342825308442116, -0.05871325358748436, -0.0173491258174181, -0.009397649206221104, 0.13024085760116577, 0.0552787221968174, -0.024140357971191406, -0.13943928480148315, 0.035930391401052475, -0.01558198407292366, 0.0674215778708458, -0.03482707962393761, 0.01923820562660694, 0.05571414530277252, 0.11786055564880371, -0.04539300501346588, 0.11341254413127899, -0.04034513607621193, 0.024991214275360107, -0.06731708347797394, 0.001674680970609188, 0.11901775002479553, 0.041440676897764206, -0.09393310546875, 0.16488748788833618, -0.06188854202628136, 0.2677548825740814, 0.20976665616035461, -0.2136308252811432, 0.06348590552806854, -0.00470400508493185, -0.015853185206651688, -0.004086120519787073, 0.04522792994976044, 0.030256042256951332, 0.053597770631313324, 0.0050210426561534405, 0.17264100909233093, -0.039037786424160004, -0.03607746213674545, -0.025279469788074493, -0.051374200731515884, -0.008309570141136646, 0.05910451337695122, 0.1586754322052002, -0.1820884346961975, 0.1822541356086731, 0.2847212553024292, -0.0063355849124491215, 0.08594595640897751, -0.07909215241670609, 0.02161979116499424, 0.06433381140232086, -0.046706460416316986, -0.029632411897182465, 0.017825817689299583, -0.13659560680389404, -0.008348465897142887, 0.08231012523174286, 0.03678017109632492, 0.06272430717945099, -0.14455419778823853, -0.04764006659388542, -0.021808234974741936, -0.0026341069024056196, -0.06277167052030563, 0.04689870402216911, 0.00044693349627777934, 0.09370281547307968, -0.02774418517947197, -0.10532957315444946, 0.1118728443980217, -0.00005384266478358768, -0.08514346927404404, 0.13645783066749573, -0.17396454513072968, -0.26944541931152344, -0.15380650758743286, -0.14226192235946655, -0.028455447405576706, 0.017389362677931786, 0.13474521040916443, -0.06441390514373779, -0.06403996795415878, 0.019975585862994194, -0.07588216662406921, -0.019704043865203857, 0.016278661787509918, 0.013531647622585297, 0.06301607936620712, 0.012834439054131508, -0.1000068187713623, -0.06526842713356018, 0.010732683353126049, -0.008881920948624611, 0.0997009202837944, -0.09366574883460999, 0.07838699221611023, 0.11796671152114868, 0.039720650762319565, 0.04111456871032715, -0.03054552525281906, 0.1344349980354309, -0.045448530465364456, 0.009526710957288742, 0.19920960068702698, -0.045177556574344635, 0.0765223279595375, 0.15859930217266083, 0.041497934609651566, -0.049147624522447586, 0.00705241784453392, -0.05661432445049286, -0.06018070876598358, -0.23849257826805115, -0.13064086437225342, 
-0.12174534052610397, 0.045089565217494965, 0.050018616020679474, 0.08305211365222931, 0.09785227477550507, 0.06854219734668732, -0.019646313041448593, 0.039725515991449356, -0.00964477937668562, 0.06852885335683823, 0.27804622054100037, -0.010500295087695122, 0.10958924889564514, -0.11485037207603455, -0.06006312742829323, 0.11961676925420761, 0.06482097506523132, 0.1247367113828659, 0.13154974579811096, 0.058412108570337296, 0.07013541460037231, 0.15885613858699799, 0.10599739104509354, 0.09429550170898438, 0.03441106155514717, -0.00541922589763999, -0.045866914093494415, -0.010157668963074684, -0.051725875586271286, 0.038199301809072495, -0.034706685692071915, -0.13671709597110748, -0.04592345282435417, -0.13721737265586853, 0.08740247786045074, 0.16800956428050995, 0.038032129406929016, -0.13018716871738434, 0.01351655088365078, 0.10577696561813354, -0.01133508887141943, -0.05284135416150093, 0.11322192847728729, -0.09613724052906036, -0.10717916488647461, 0.14623866975307465, -0.0013063126243650913, 0.15257549285888672, -0.02216845564544201, 0.05267684906721115, -0.0191632229834795, -0.129877507686615, 0.07346407324075699, 0.13437047600746155, -0.2763715982437134, 0.21347050368785858, -0.01665456034243107, -0.051291290670633316, -0.06937649846076965, -0.008359488099813461, 0.08343063294887543, 0.2740291357040405, 0.05933742597699165, 0.029078135266900063, -0.11142256110906601, -0.0803193673491478, -0.06288328021764755, 0.04367838799953461, 0.024803010746836662, -0.012860679998993874, -0.047211721539497375, -0.07242458313703537, -0.015039103105664253, -0.007754213642328978, 0.04183259606361389, -0.03301812708377838, -0.15685515105724335, 0.05907463654875755, 0.1067129522562027, 0.06455303728580475, -0.055116817355155945, -0.03665735572576523, -0.15006887912750244, 0.16299813985824585, -0.09676190465688705, -0.07152728736400604, -0.09035639464855194, -0.12532760202884674, 0.01869715377688408, -0.05955588445067406, 0.042501043528318405, -0.07397680729627609, -0.005241410341113806, -0.05539749935269356, -0.19665028154850006, 0.1060280054807663, -0.1268637627363205, -0.052288565784692764, -0.04870232939720154, 0.14159005880355835, -0.0978550985455513, 0.029907124117016792, 0.027370808646082878, -0.001842310419306159, -0.09787321090698242, -0.1301853060722351, -0.03453226014971733, 0.035310372710227966, 0.059925686568021774, -0.026914602145552635, -0.12340526282787323, -0.01405020710080862, 0.0006237937486730516, -0.04588779807090759, 0.22493396699428558, 0.1684177815914154, -0.0889885351061821, 0.1904640644788742, 0.18444646894931793, -0.09197146445512772, -0.3282652795314789, -0.16268010437488556, -0.1556841880083084, -0.08959272503852844, -0.03780500963330269, -0.15659493207931519, 0.14509673416614532, 0.023510722443461418, -0.07812128961086273, 0.09832952171564102, -0.1598958969116211, -0.08176460862159729, 0.21808704733848572, -0.05656159296631813, 0.34315478801727295, -0.1343366503715515, -0.07723602652549744, -0.09342294931411743, -0.19554737210273743, 0.10376343876123428, -0.07908494025468826, 0.05570331960916519, -0.007573881186544895, 0.021853165701031685, -0.018673541024327278, -0.04128614440560341, 0.13890381157398224, -0.008476145565509796, 0.0103214206174016, -0.13031476736068726, -0.0013552778400480747, 0.07936320453882217, -0.02264109067618847, 0.010469633154571056, -0.14011137187480927, 0.020549016073346138, -0.11857570707798004, -0.016819249838590622, -0.04973926395177841, 0.06857679039239883, -0.0021640374325215816, -0.03004186600446701, -0.036596763879060745, 
-0.012295212596654892, 0.035312775522470474, -0.0015858429251238704, 0.24263815581798553, 0.024654481559991837, 0.09462243318557739, 0.08696931600570679, 0.08596211671829224, -0.23048752546310425, 0.03347031772136688, -0.11214955151081085, -0.08131160587072372, 0.07221675664186478, -0.0905771255493164, 0.05710280314087868, 0.1254386305809021, -0.07342210412025452, 0.07196503132581711, 0.08455442637205124, 0.03732947260141373, -0.048313844949007034, 0.1417766511440277, -0.18195410072803497, 0.0018647537799552083, -0.02354593575000763, 0.0760408267378807, 0.07369842380285263, 0.07086710631847382, 0.12023969739675522, 0.014138750731945038, -0.03971254825592041, 0.03234616667032242, 0.032200586050748825, -0.053519126027822495, 0.024554140865802765, 0.05349737033247948, 0.008772216737270355, -0.13967002928256989, 0.09211256355047226, 0.0308118537068367, -0.11210619658231735, -0.03143756836652756, 0.1102287545800209, -0.17145714163780212, -0.1305808275938034, -0.005850812885910273, 0.09881908446550369, -0.10884305089712143, -0.11918898671865463, -0.06392377614974976, -0.17009294033050537, 0.06238051503896713, 0.11077582091093063, 0.12441360205411911, 0.07130511850118637, -0.017844131216406822, -0.05644327029585838, 0.05272693559527397, -0.006818860769271851, -0.0658203586935997, 0.023090064525604248, -0.12722784280776978, -0.02436378411948681, 0.02353753335773945, 0.11302616447210312, -0.053951993584632874, -0.016352131962776184, -0.11474186182022095, 0.029883980751037598, -0.19172364473342896, 0.0039461953565478325, -0.08545416593551636, 0.004070573952049017, 0.011247806251049042, -0.09255466610193253, -0.057184915989637375, -0.008274481631815434, -0.11594387143850327, -0.011156830936670303, -0.022886618971824646, 0.08024458587169647, -0.09392581880092621, -0.03740031272172928, 0.10652794688940048, -0.02346104383468628, 0.1091407984495163, 0.07828789949417114, -0.07350261509418488, 0.09263576567173004, -0.11525139957666397, -0.115024633705616, 0.09507068991661072, 0.05464620888233185, 0.036374639719724655, -0.01279841922223568, 0.0036954700481146574, 0.10999294370412827, -0.025437625125050545, 0.049679212272167206, 0.05319029837846756, -0.12448521703481674, -0.040947090834379196, -0.029309330508112907, -0.1282980889081955, -0.005256433505564928, -0.09504454582929611, 0.12705212831497192, 0.011149496771395206, 0.1685630977153778, -0.028278924524784088, 0.041259922087192535, -0.05374511703848839, 0.021553559228777885, -0.03536881506443024, -0.16238057613372803, -0.13202747702598572, -0.08800151199102402, -0.02947688102722168, -0.01133103109896183, 0.2791491746902466, 0.009496865794062614, -0.04521911218762398, 0.07748684287071228, 0.09749559313058853, -0.038269806653261185, 0.018027622252702713, 0.24639172852039337, 0.06328576803207397, -0.008771006017923355, -0.09014892578125, -0.0020543038845062256, 0.017523886635899544, -0.08494459837675095, 0.12031934410333633, 0.09604473412036896, 0.031681448221206665, 0.055074453353881836, 0.009379317983984947, 0.012556159868836403, -0.1314580887556076, -0.11174173653125763, -0.0102754021063447, 0.08916011452674866, 0.009072324261069298, 0.12162024527788162, 0.11254798620939255, -0.04368019476532936, 0.02098405547440052, -0.07404977083206177, -0.011079542338848114, -0.17461799085140228, -0.1011018306016922, -0.08224525302648544, -0.14219555258750916, 0.002281083958223462, -0.04191518947482109, 0.0024793900083750486, 0.07121007144451141, 0.04121262580156326, -0.06531098484992981, 0.0013354613911360502, -0.037759020924568176, -0.04110828787088394, 
0.03387119248509407, -0.019085893407464027, -0.016727671027183533, -0.04395937919616699, -0.05649394169449806, -0.10286637395620346, -0.03543619439005852, -0.0525272861123085, 0.029439209029078484, -0.013833541423082352, 0.029673293232917786, -0.10661040991544724, -0.07033941149711609, -0.0394895114004612, 0.014326798729598522, 0.0005730020930059254, 0.15983609855175018, 0.009345872327685356, 0.04251403361558914, 0.09167855978012085, 0.15840794146060944, -0.06753014773130417, -0.14552098512649536, -0.058581799268722534, 0.18460027873516083, 0.045525554567575455, 0.04602377116680145, 0.013214902952313423, 0.020930476486682892, -0.06345731019973755, 0.3370414972305298, 0.2952379286289215, -0.08285108208656311, 0.03298317641019821, -0.016074245795607567, 0.022447671741247177, 0.0935361385345459, 0.1397077739238739, 0.1311754286289215, 0.18173417448997498, -0.06396117061376572, -0.06234751641750336, -0.053676679730415344, -0.002607000758871436, -0.17490500211715698, 0.09575482457876205, -0.004391147289425135, -0.08462657779455185, -0.0347195565700531, 0.08737118542194366, -0.12206001579761505, 0.08615179359912872, -0.0016114325262606144, -0.1555137187242508, -0.0418897308409214, -0.014403024688363075, 0.2009618878364563, 0.012023815885186195, 0.01792079396545887, -0.021795684471726418, -0.07043350487947464, 0.14702469110488892, -0.004562158603221178, -0.18949325382709503, -0.04386605694890022, 0.0943559855222702, -0.06173930689692497, 0.10954106599092484, -0.005378682631999254, 0.02983132191002369, 0.08477696776390076, 0.08922281861305237, -0.06771853566169739, 0.07477451115846634, 0.020855069160461426, -0.05828728526830673, -0.012308647856116295, -0.10375984013080597, -0.020148959010839462, -0.07548663765192032, 0.05589277669787407, -0.08201828598976135, 0.041578009724617004, -0.011092323809862137, -0.068037249147892, -0.021885205060243607, 0.05364800989627838, -0.07794482260942459, 0.06205511465668678, 0.02576792985200882, -0.03830999508500099, -0.06381291151046753, -0.056208476424217224, -0.037184301763772964, 0.011202008463442326, -0.1824050098657608, -0.07511567324399948, -0.01904807612299919, -0.04554937034845352, 0.06046390160918236, 0.0468159057199955, -0.0808715894818306, -0.02246410772204399, -0.09982031583786011, 0.03638887032866478, -0.16978268325328827, 0.05696408823132515, 0.061324767768383026, -0.001957209315150976, -0.010295327752828598, -0.06901649385690689, 0.023620905354619026, 0.01966485008597374, -0.08515617251396179, -0.08638837188482285 ]
null
null
transformers
# Hungarian Sentence-level Sentiment Analysis Model with XLM-RoBERTa

For further models, scripts and details, see [our repository](https://github.com/nytud/sentiment-analysis) or [our demo site](https://juniper.nytud.hu/demo/nlp).

- Pretrained model used: XLM-RoBERTa base
- Finetuned on Hungarian Twitter Sentiment (HTS) Corpus
- Labels: 0 (very negative), 1 (negative), 2 (neutral), 3 (positive), 4 (very positive)

## Limitations

- max_seq_length = 128

## Results

| Model | HTS2 | HTS5 |
| ------------- | ------------- | ------------- |
| huBERT | 85.56 | **68.99** |
| XLM-RoBERTa | 85.56 | 66.50 |

## Citation

If you use this model, please cite the following paper:

```
@inproceedings{laki-yang-sentiment,
    title = {Improving Performance of Sentence-level Sentiment Analysis with Data Augmentation Methods},
    booktitle = {Proceedings of 12th IEEE International Conference on Cognitive Infocommunications (CogInfoCom 2021)},
    year = {2021},
    publisher = {IEEE},
    address = {Online},
    author = {Laki, László and Yang, Zijian Győző},
    pages = {417--422}
}
```
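A minimal, illustrative sketch (not part of the original card) for the five-class XLM-RoBERTa variant, assuming the Hub id `NYTK/sentiment-hts5-xlm-roberta-hungarian` listed for this record; `top_k=None` asks the pipeline to return scores for all five labels.

```python
from transformers import pipeline

# Illustrative sketch: same 5-class label scheme as above, XLM-RoBERTa backbone.
classifier = pipeline(
    "text-classification",
    model="NYTK/sentiment-hts5-xlm-roberta-hungarian",
    top_k=None,  # return a score for every label instead of only the best one
)
print(classifier("Jó reggelt! majd küldöm az élményhozókat :)."))
```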
{"language": ["hu"], "license": "mit", "tags": ["text-classification"], "metrics": ["accuracy"], "widget": [{"text": "J\u00f3 reggelt! majd k\u00fcld\u00f6m az \u00e9lm\u00e9nyhoz\u00f3kat :)."}]}
text-classification
NYTK/sentiment-hts5-xlm-roberta-hungarian
[ "transformers", "pytorch", "xlm-roberta", "text-classification", "hu", "license:mit", "autotrain_compatible", "endpoints_compatible", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[ "hu" ]
TAGS #transformers #pytorch #xlm-roberta #text-classification #hu #license-mit #autotrain_compatible #endpoints_compatible #region-us
Hungarian Sentence-level Sentiment Analysis Model with XLM-RoBERTa ================================================================== For further models, scripts and details, see our repository or our demo site. * Pretrained model used: XLM-RoBERTa base * Finetuned on Hungarian Twitter Sentiment (HTS) Corpus * Labels: 0 (very negative), 1 (negative), 2 (neutral), 3 (positive), 4 (very positive) Limitations ----------- * max\_seq\_length = 128 Results ------- Model: huBERT, HTS2: 85.56, HTS5: 68.99 Model: XLM-RoBERTa, HTS2: 85.56, HTS5: 66.50 If you use this model, please cite the following paper:
[]
[ "TAGS\n#transformers #pytorch #xlm-roberta #text-classification #hu #license-mit #autotrain_compatible #endpoints_compatible #region-us \n" ]
[ 47 ]
[ "passage: TAGS\n#transformers #pytorch #xlm-roberta #text-classification #hu #license-mit #autotrain_compatible #endpoints_compatible #region-us \n" ]
[ -0.05142524838447571, 0.09008586406707764, -0.005453280173242092, 0.01730443723499775, 0.1807711124420166, 0.04707804322242737, 0.12293757498264313, 0.10998278111219406, 0.06365624815225601, -0.05109674483537674, 0.09276022017002106, 0.2103334367275238, 0.009434251114726067, 0.09324440360069275, -0.10995159298181534, -0.26065224409103394, 0.05487699434161186, 0.05722110718488693, 0.013169308193027973, 0.08514625579118729, 0.08903326094150543, -0.08101417869329453, 0.07910528033971786, -0.029133543372154236, -0.12453645467758179, 0.010075409896671772, 0.03908049687743187, -0.11223632842302322, 0.10022852569818497, 0.06601058691740036, 0.12359333038330078, 0.07019197940826416, 0.005355910863727331, -0.180510014295578, 0.030588196590542793, -0.03574139252305031, -0.10501161962747574, 0.05330212041735649, 0.056032516062259674, -0.05115977302193642, 0.07474526017904282, 0.03905374929308891, -0.02137514017522335, 0.08114343881607056, -0.11759693175554276, -0.11640623211860657, -0.08219355344772339, 0.09659195691347122, 0.05954832583665848, 0.046718500554561615, 0.026114152744412422, 0.15588703751564026, -0.11954488605260849, 0.0823902040719986, 0.11998341977596283, -0.3399004340171814, 0.013418476097285748, 0.10991951823234558, 0.0877850353717804, 0.05392292141914368, -0.06245722621679306, 0.07905729115009308, 0.06047482043504715, 0.004532587248831987, -0.03537748381495476, -0.08539518713951111, -0.050871990621089935, 0.06450607627630234, -0.05551980808377266, -0.017525890842080116, 0.20859190821647644, -0.04295602813363075, 0.041815850883722305, -0.029427671805024147, -0.05175808444619179, -0.06982238590717316, -0.026259370148181915, 0.04416446015238762, -0.01104401983320713, 0.061080820858478546, 0.07126971334218979, 0.016871236264705658, -0.1033710464835167, 0.020894242450594902, -0.2089540809392929, 0.18376797437667847, 0.022591739892959595, 0.049501270055770874, -0.14629466831684113, 0.055657338351011276, 0.09800360351800919, -0.10317275673151016, -0.003442818531766534, -0.07942599803209305, 0.04278067499399185, -0.02407967858016491, -0.036965690553188324, 0.07550603151321411, 0.09419984370470047, 0.1411314159631729, 0.012196253053843975, 0.04531806707382202, -0.022096915170550346, 0.12155158817768097, 0.01463204063475132, 0.13148505985736847, 0.02806546725332737, -0.031245393678545952, 0.06340064108371735, -0.12967419624328613, 0.022446230053901672, -0.04751644283533096, -0.1935807764530182, -0.027073640376329422, 0.009289475157856941, 0.11505589634180069, -0.009583162143826485, 0.06637360900640488, -0.05161772668361664, -0.0059195347130298615, 0.07887862622737885, -0.05454671010375023, 0.03740235045552254, 0.022605033591389656, 0.027506697922945023, 0.08035676926374435, -0.008102085441350937, 0.004235930275171995, -0.05038997530937195, 0.10163309425115585, -0.071820467710495, 0.031188976019620895, -0.032428961247205734, -0.078915536403656, 0.06123090162873268, -0.12199511379003525, 0.040493108332157135, -0.1658485233783722, -0.09479949623346329, 0.001206143177114427, 0.017773309722542763, -0.0034033693373203278, -0.03692660480737686, 0.01264161616563797, -0.00030146041535772383, 0.02366749756038189, -0.05626872554421425, -0.08683241903781891, -0.07824607938528061, 0.09110153466463089, -0.06064258888363838, 0.050650741904973984, -0.15902811288833618, 0.06300341337919235, -0.07997839152812958, -0.0018444197485223413, -0.10202588886022568, -0.00017347873654216528, -0.08759204298257828, 0.21732743084430695, 0.0346127450466156, -0.07171984016895294, 0.005650590639561415, 
0.046431757509708405, -0.08618439733982086, 0.11260704696178436, -0.09058018028736115, -0.09961968660354614, 0.16309021413326263, -0.11185085773468018, -0.1550057828426361, 0.0743849128484726, -0.01393995713442564, 0.021436041221022606, 0.08502763509750366, 0.20233428478240967, 0.12230847030878067, -0.07783133536577225, 0.06314128637313843, 0.12253405153751373, -0.09471683949232101, -0.173273965716362, 0.013749165460467339, 0.009111042134463787, -0.11976740509271622, 0.060096777975559235, 0.006212546024471521, 0.07834496349096298, -0.04086826369166374, -0.06105157360434532, -0.01901419088244438, -0.0072308750823140144, 0.020946815609931946, 0.04856520891189575, 0.10774000734090805, -0.07148770987987518, 0.01995871029794216, 0.010140284895896912, 0.01730678789317608, 0.04724912717938423, 0.03210991248488426, -0.07771947234869003, 0.10488174110651016, 0.057181283831596375, 0.011110852472484112, -0.13260012865066528, -0.03796689212322235, -0.015970125794410706, 0.07005266845226288, 0.020538102835416794, 0.15591339766979218, -0.0022894495632499456, -0.018443265929818153, -0.015596223063766956, -0.011979397386312485, 0.1536964476108551, 0.04478074237704277, -0.04046062380075455, -0.10964678972959518, 0.03855228051543236, -0.035419825464487076, 0.02309272065758705, -0.09237748384475708, 0.015922699123620987, 0.0525045245885849, 0.08317498862743378, -0.017599042505025864, 0.10151451826095581, -0.05460350960493088, 0.07951793819665909, -0.07244347035884857, 0.017495963722467422, 0.12495219707489014, 0.02251599170267582, -0.07958511263132095, 0.179305762052536, -0.11912465840578079, 0.30776649713516235, 0.21033325791358948, -0.21124820411205292, -0.001820901408791542, -0.0395992211997509, -0.014011291787028313, 0.01472297590225935, 0.03653334081172943, 0.028628505766391754, 0.07073165476322174, -0.0261898934841156, 0.1919976770877838, -0.0368490107357502, -0.030911384150385857, -0.01557666901499033, -0.043650656938552856, -0.0315568745136261, 0.0908251404762268, 0.18203666806221008, -0.2215633988380432, 0.17446547746658325, 0.1833423674106598, 0.04740006849169731, 0.15186676383018494, -0.044077347964048386, 0.047957152128219604, 0.025552863255143166, -0.055379677563905716, -0.01486595906317234, 0.03227240964770317, -0.0930887833237648, -0.023516593500971794, 0.060857564210891724, 0.015118787996470928, 0.08298294991254807, -0.15752552449703217, -0.07484855502843857, -0.02051662839949131, 0.006327487528324127, -0.09952441602945328, 0.0900324285030365, 0.030318623408675194, 0.10139190405607224, -0.031989146023988724, -0.0970832034945488, 0.08998177200555801, -0.001582204713486135, -0.10574668645858765, 0.14295223355293274, -0.15506099164485931, -0.2955537438392639, -0.1808956414461136, -0.15867586433887482, -0.0031850431114435196, 0.02267751470208168, 0.11094018816947937, -0.06749732047319412, -0.060295991599559784, 0.035503972321748734, -0.05194335803389549, -0.07171063125133514, -0.0010039922781288624, -0.03320429474115372, 0.09631246328353882, -0.05550164356827736, -0.08105941116809845, -0.08166923373937607, -0.02361874468624592, 0.009713714011013508, 0.1258123219013214, -0.09400302171707153, 0.09305127710103989, 0.1468895524740219, 0.005715357139706612, 0.05724760890007019, -0.05908245965838432, 0.13858485221862793, -0.0716116651892662, -0.015382116660475731, 0.16095608472824097, -0.018147915601730347, 0.06581711769104004, 0.17986246943473816, 0.059919148683547974, -0.05480819195508957, -0.01371268555521965, -0.0750751793384552, -0.07683060318231583, -0.22035232186317444, -0.1438562124967575, 
-0.14095643162727356, 0.03862684965133667, 0.03146182373166084, 0.08368512988090515, 0.10462448745965958, 0.05162378028035164, 0.007085112854838371, 0.013502560555934906, -0.027352066710591316, 0.07572288811206818, 0.2743903398513794, 0.0055446503683924675, 0.113361656665802, -0.10730790346860886, -0.08604298532009125, 0.08646102249622345, 0.04904875159263611, 0.16011030972003937, 0.1437162309885025, 0.06080935522913933, 0.06514838337898254, 0.10255640745162964, 0.16546978056430817, 0.07165590673685074, 0.05923648551106453, -0.034436266869306564, -0.01934634894132614, -0.002668949542567134, -0.004948129877448082, 0.03428349271416664, 0.02781955897808075, -0.1415385752916336, -0.07972465455532074, -0.12772174179553986, 0.07299096882343292, 0.06883230805397034, 0.02223304845392704, -0.1614229381084442, 0.020166490226984024, 0.09771762043237686, -0.009974566288292408, -0.05178805813193321, 0.08030087500810623, -0.06678806245326996, -0.1347304880619049, 0.11615326255559921, -0.010057645849883556, 0.13793867826461792, -0.010407325811684132, 0.05650739744305611, -0.00583612872287631, -0.10404925048351288, 0.04122048616409302, 0.10748074948787689, -0.259624719619751, 0.2535457909107208, 0.005338070914149284, -0.03548397496342659, -0.04332192242145538, -0.03077019192278385, 0.05427756905555725, 0.2541348338127136, 0.05922700837254524, 0.019263921305537224, -0.1703944355249405, -0.18430916965007782, 0.004721554461866617, 0.03048282489180565, 0.07045116275548935, -0.006264122202992439, -0.0052703930996358395, -0.06300525367259979, -0.007304729428142309, -0.019059043377637863, -0.010128223337233067, -0.00866043008863926, -0.16004422307014465, 0.05694587156176567, 0.053694408386945724, 0.049073584377765656, -0.017360223457217216, -0.05794890597462654, -0.20335915684700012, 0.17643189430236816, -0.1067509651184082, -0.057085610926151276, -0.10753656178712845, -0.1139119490981102, -0.006422100588679314, -0.07726559787988663, 0.023558013141155243, -0.059885866940021515, -0.006456331349909306, -0.07245125621557236, -0.1956835389137268, 0.11087694019079208, -0.09832759946584702, -0.03867174685001373, -0.05523695796728134, 0.156859889626503, -0.09701226651668549, 0.03555688261985779, 0.03341323137283325, 0.012039006687700748, -0.07578346878290176, -0.1298850178718567, -0.017214752733707428, 0.013530492782592773, 0.053307536989450455, 0.01632058434188366, -0.13804280757904053, -0.024865517392754555, -0.01883416250348091, -0.0485106036067009, 0.2428654283285141, 0.18741874396800995, -0.0980098769068718, 0.2026846706867218, 0.11262104660272598, -0.10539612174034119, -0.3306334614753723, -0.10330401360988617, -0.1525285392999649, -0.03916784003376961, -0.01109705027192831, -0.12813657522201538, 0.10766436159610748, 0.006577406078577042, -0.037055160850286484, 0.13156287372112274, -0.19694747030735016, -0.09889785200357437, 0.1693613976240158, -0.030634578317403793, 0.3567836880683899, -0.12262469530105591, -0.08669660240411758, -0.06749171018600464, -0.16541160643100739, 0.11027276515960693, 0.004837086424231529, 0.09331551939249039, -0.0345906987786293, 0.041459064930677414, -0.0012474942486733198, -0.04489761218428612, 0.1435445249080658, -0.0006549481186084449, 0.04514412209391594, -0.11508174985647202, -0.09316235035657883, 0.07090248912572861, 0.0014818230411037803, -0.00340858893468976, -0.09299173206090927, 0.004925812128931284, -0.13905109465122223, -0.029487457126379013, -0.04920089989900589, 0.0644974336028099, 0.0040397341363132, -0.024350928142666817, -0.050966132432222366, 0.01082880049943924, 
-0.008497747592628002, -0.029056137427687645, 0.22158809006214142, -0.02890518307685852, 0.15368810296058655, 0.07163635641336441, 0.1042012944817543, -0.20797201991081238, 0.027506249025464058, -0.09111598879098892, -0.07105876505374908, 0.07828652858734131, -0.015201580710709095, 0.03794316202402115, 0.16015410423278809, -0.04684014245867729, 0.10178422927856445, 0.10295680165290833, 0.05928141996264458, -0.013747288845479488, 0.15782520174980164, -0.17235255241394043, -0.05025316774845123, -0.04854463413357735, -0.028886273503303528, 0.10901149362325668, 0.06560995429754257, 0.1049560010433197, 0.014245918951928616, -0.033199746161699295, 0.017843063920736313, 0.014429105445742607, -0.02870846353471279, 0.04345378279685974, 0.07001864910125732, 0.030462155118584633, -0.1497480273246765, 0.044005002826452255, 0.038603734225034714, -0.07559766620397568, -0.033069025725126266, 0.133958101272583, -0.15695035457611084, -0.12176097929477692, -0.04639957845211029, 0.14247989654541016, -0.14915823936462402, -0.08464258909225464, -0.09779828041791916, -0.1797337830066681, 0.05748383328318596, 0.1643170714378357, 0.10496586561203003, 0.07906235009431839, -0.03207134082913399, -0.06438026577234268, 0.001151895965449512, -0.0033269424457103014, -0.054807789623737335, 0.028487129136919975, -0.14700955152511597, 0.07812399417161942, 0.01379395928233862, 0.1418166309595108, -0.07452952861785889, -0.04048185423016548, -0.17604847252368927, 0.028919430449604988, -0.14244602620601654, 0.00039671536069363356, -0.10305139422416687, -0.005343061871826649, 0.004742595367133617, -0.06350557506084442, -0.08144477754831314, -0.0281619131565094, -0.1211511641740799, 0.020240342244505882, -0.0013299998827278614, 0.0865354984998703, -0.05221153795719147, -0.028558064252138138, 0.07733334600925446, -0.009426047094166279, 0.0826466754078865, 0.06830784678459167, -0.06310868263244629, 0.08818365633487701, -0.14318536221981049, -0.10419527441263199, 0.10714192688465118, 0.04069513455033302, 0.05600326135754585, -0.013489126227796078, 0.038099318742752075, 0.10134076327085495, -0.015482117421925068, 0.07668576389551163, 0.0033771067392081022, -0.12479742616415024, -0.003644726239144802, -0.05814215540885925, -0.1258680820465088, -0.03302006796002388, -0.05218816176056862, 0.10689393430948257, -0.0047999536618590355, 0.15555225312709808, -0.057394418865442276, 0.07079461216926575, -0.07072921097278595, 0.003518054960295558, -0.031590789556503296, -0.16159111261367798, -0.09211502969264984, -0.1001041978597641, 0.000681308563798666, 0.014182706363499165, 0.3087388575077057, 0.06558313220739365, -0.051237475126981735, 0.06296499818563461, 0.11068323254585266, -0.04166906327009201, 0.007729705423116684, 0.23801957070827484, 0.07495550811290741, -0.011480988003313541, -0.08312834799289703, 0.046541355550289154, -0.011759789660573006, -0.07654553651809692, 0.16335846483707428, 0.09745799750089645, -0.007180165499448776, 0.03884625434875488, 0.03911367431282997, -0.018078545108437538, -0.09521021693944931, -0.09067341685295105, -0.03535192459821701, 0.06851360946893692, -0.006054883357137442, 0.061852823942899704, 0.14171016216278076, -0.06306188553571701, 0.04681190475821495, -0.06611691415309906, -0.028249287977814674, -0.18558752536773682, -0.10436674952507019, -0.09265363216400146, -0.1278754025697708, 0.024698257446289062, -0.04007194936275482, -0.0028641042299568653, 0.03646691516041756, 0.026189306750893593, -0.06963939964771271, -0.017235487699508667, -0.0253482386469841, -0.0535338893532753, 0.022499145939946175, 
-0.024933742359280586, 0.009740392677485943, -0.06991882622241974, -0.009963453747332096, -0.10505051910877228, -0.07234235852956772, -0.04045592248439789, 0.030104748904705048, -0.03250415250658989, 0.01566777378320694, -0.13744288682937622, -0.07860424369573593, -0.021962866187095642, 0.041516028344631195, 0.007044869475066662, 0.13658078014850616, 0.013012323528528214, 0.02887643501162529, 0.06171724945306778, 0.15011964738368988, -0.031277142465114594, -0.1082271859049797, -0.033509816974401474, 0.21577365696430206, 0.07683626562356949, 0.06646504253149033, -0.00009863082232186571, 0.01764410361647606, -0.04413808509707451, 0.30076172947883606, 0.35477548837661743, -0.0709891989827156, 0.04636537283658981, -0.021312979981303215, 0.026777559891343117, 0.11785537749528885, 0.15691417455673218, 0.0887087732553482, 0.2658764719963074, -0.050612710416316986, -0.06937979906797409, -0.07425009459257126, 0.00534967053681612, -0.13011516630649567, 0.08180008828639984, 0.03568281978368759, -0.062541663646698, -0.036752935498952866, 0.09892074763774872, -0.16698677837848663, 0.11673281341791153, 0.043351445347070694, -0.18009911477565765, -0.03611627593636513, -0.0110305305570364, 0.16667525470256805, -0.004120856057852507, 0.03407398983836174, -0.03684417903423309, -0.08719940483570099, 0.09455941617488861, 0.004149528220295906, -0.22719024121761322, -0.011231325566768646, 0.08592019975185394, -0.05262612923979759, 0.03637532889842987, -0.04197894409298897, 0.05025940388441086, 0.08344951272010803, 0.09486986696720123, -0.025872856378555298, 0.10460727661848068, 0.012550009414553642, -0.08281725645065308, 0.020084043964743614, -0.06017766892910004, 0.008833162486553192, -0.08132299780845642, 0.061912402510643005, -0.09332216531038284, 0.06458550691604614, -0.08911987394094467, -0.07906440645456314, -0.019214114174246788, 0.05314765125513077, -0.06261127442121506, 0.07672956585884094, 0.0594334714114666, -0.012504777871072292, -0.032575663179159164, -0.04504451900720596, -0.030652053654193878, 0.026786498725414276, -0.1381138414144516, -0.08208015561103821, -0.07335740327835083, -0.05284043774008751, 0.06491699814796448, 0.020873986184597015, -0.14416640996932983, -0.02167438343167305, -0.12824618816375732, 0.029789624735713005, -0.16454613208770752, 0.080861896276474, 0.08093652874231339, 0.014499851502478123, -0.011795503087341785, -0.11616556346416473, 0.030937299132347107, 0.04738723114132881, -0.1000836119055748, -0.08626106381416321 ]
null
null
transformers
# Hungarian Abstractive Summarization BART model

For further models, scripts and details, see [our repository](https://github.com/nytud/neural-models) or [our demo site](https://juniper.nytud.hu/demo/nlp).

- BART base model (see Results Table - bold):
  - Pretrained on Webcorpus 2.0
  - Finetuned HI corpus (hvg.hu + index.hu)
    - Segments: 559.162

## Limitations

- tokenized input text (tokenizer: [HuSpaCy](https://huggingface.co/huspacy))
- **max_source_length = 1024**
- max_target_length = 256

## Results

| Model | HI | NOL |
| ------------- | ------------- | ------------- |
| BART-base-512 | 30.18/13.86/22.92 | 46.48/32.40/39.45 |
| BART-base-1024| **31.86/14.59/23.79** | 47.01/32.91/39.97 |

## Citation

If you use this model, please cite the following paper:

```
@inproceedings {yang-bart,
    title = {{BARTerezzünk! - Messze, messze, messze a világtól, - BART kísérleti modellek magyar nyelvre}},
    booktitle = {XVIII. Magyar Számítógépes Nyelvészeti Konferencia},
    year = {2022},
    publisher = {Szegedi Tudományegyetem, Informatikai Intézet},
    address = {Szeged, Magyarország},
    author = {Yang, Zijian Győző},
    pages = {15--29}
}
```
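The card itself gives no usage snippet; the following is a minimal, hedged sketch of how this checkpoint could be loaded with the Hugging Face `transformers` summarization pipeline. The generation arguments and the placeholder article are assumptions, and the input should respect the tokenization (HuSpaCy) and length limits listed above.

```python
# Hedged usage sketch (not from the original card): load the model via the
# transformers summarization pipeline. Generation settings are assumptions;
# per the Limitations section, the input should be pre-tokenized (HuSpaCy)
# and the source length is capped at 1024 tokens.
from transformers import pipeline

summarizer = pipeline(
    "summarization",
    model="NYTK/summarization-hi-bart-base-1024-hungarian",
)

article = "..."  # pre-tokenized Hungarian article text (placeholder)
summary = summarizer(article, max_length=256)[0]["summary_text"]
print(summary)
```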
{"language": ["hu"], "license": "apache-2.0", "tags": ["summarization"], "metrics": ["rouge"], "widget": [{"text": "A Tisza-parti v\u00e1ros \u00e1llatkertj\u00e9ben r\u00e9g\u00f3ta tartanak szurik\u00e1t\u00e1kat ( Suricata suricatta ) , de tavaly tavaszig nem siker\u00fclt szapor\u00edtani \u0151ket , annak ellen\u00e9re , hogy t\u00e1gas h\u00e1z \u00e9s kifut\u00f3 \u00e9p\u00fclt sz\u00e1mukra - k\u00f6z\u00f6lte Veprik R\u00f3bert igazgat\u00f3 . 2010-ben alakult ki az \u00faj - h\u00e1rom Amszterdamb\u00f3l sz\u00e1rmaz\u00f3 n\u0151st\u00e9nyb\u0151l \u00e9s egy budapesti fiatal h\u00edmb\u0151l \u00e1ll\u00f3 - csapat , amely szaporodni kezdett . 2011-ben h\u00e1rom , id\u00e9n pedig egy ut\u00f3ddal \u00f6rvendeztett\u00e9k meg a gondoz\u00f3kat \u00e9s az \u00e1llatbar\u00e1tokat . A szurik\u00e1t\u00e1k ut\u00f3dai - tizenegy hetes vemhess\u00e9g ut\u00e1n - okt\u00f3ber \u00e9s m\u00e1rcius k\u00f6z\u00f6tt vakon \u00e9s sz\u0151rtelen\u00fcl j\u00f6nnek a vil\u00e1gra . A kicsinyek h\u00e1romhetesen b\u00fajnak el\u0151 az \u00fcregb\u0151l , \u00e9s nevel\u00e9s\u00fckben mindk\u00e9t sz\u00fcl\u0151 r\u00e9szt vesz . A szurik\u00e1tacsapatokban a csal\u00e1d tagjai nagyon szoros kapcsolatban \u00e1llnak egym\u00e1ssal , viszont nagyon harciasan fell\u00e9pnek az idegenekkel szemben , ak\u00e1r meg is \u00f6lhetik azt az \u00e1llatot , amelyet betolakod\u00f3nak tekintenek . B\u00e1r a D\u00e9l-Afrik\u00e1ban , a Kalah\u00e1ri sivatagban \u0151shonos cibetmacskaf\u00e9le ragadoz\u00f3kat a szegedi \u00e1llatkertben term\u00e9szetes \u00e9l\u0151hely\u00fckh\u00f6z k\u00e9pest kevesebb vesz\u00e9ly fenyegeti , a vadasparki erd\u0151ben ragadoz\u00f3 madarak is \u00e9lnek , amelyek ak\u00e1r zs\u00e1km\u00e1nyk\u00e9nt is tekinthetn\u00e9nek a szurik\u00e1t\u00e1kra . A szegedi csapatn\u00e1l azonban szigor\u00fa \u0151rs\u00e9g van , mindig lesi valaki k\u00e9t l\u00e1bra \u00e1llva a vesz\u00e9lyforr\u00e1sokat . Az \u0151rszemek figyelm\u00e9t m\u00e9g a s\u00e1rk\u00e1nyrep\u00fcl\u0151k is felkeltik , \u00e9s felbukkan\u00e1sakor valamennyi egyed biztos helyre menek\u00fcl . A szurik\u00e1t\u00e1k a Kalah\u00e1ri sivatag boz\u00f3tos , szikl\u00e1s ter\u00fcletein csapatokban \u00e9lnek . A 700 gramm k\u00f6r\u00fcli testt\u00f6meg\u0171 ragadoz\u00f3k rovarokkal , l\u00e1rv\u00e1kkal , skorpi\u00f3kkal t\u00e1pl\u00e1lkoznak , de n\u00e9ha elfogyasztj\u00e1k a kisebb gerinceseket , toj\u00e1sokat \u00e9s n\u00f6v\u00e9nyi gum\u00f3kat is . A nappal akt\u00edv \u00e1llatok f\u00f6ldalatti \u00fcregrendszert \u00e1snak , amelynek t\u00f6bb bej\u00e1rata is van . Ha a szurik\u00e1t\u00e1k idegen csapattal vagy ragadoz\u00f3val ker\u00fclnek szembe , azonnal elkezdenek \u00e1sni , nagy porfelh\u0151t kavarva . Az is gyakorta el\u0151fordul , hogy szorosan egym\u00e1shoz b\u00fajnak , felborzolj\u00e1k sz\u0151r\u00fcket , megny\u00fajtj\u00e1k test\u00fcket , hogy min\u00e9l nagyobbnak l\u00e1tsz\u00f3djanak . Az el\u0151ad\u00e1suk cs\u00facspontj\u00e1n pedig az eg\u00e9sz csapat a leveg\u0151be ugrik , k\u00f6zben pedig morog . A hangad\u00e1s egy\u00e9bk\u00e9nt is fontos a szurik\u00e1t\u00e1k kapcsolat\u00e1ban , az egyedek legal\u00e1bb t\u00edzf\u00e9le jelz\u00e9st haszn\u00e1lnak a kol\u00f3ni\u00e1n bel\u00fcl ."}]}
summarization
NYTK/summarization-hi-bart-base-1024-hungarian
[ "transformers", "pytorch", "bart", "text2text-generation", "summarization", "hu", "license:apache-2.0", "autotrain_compatible", "endpoints_compatible", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[ "hu" ]
TAGS #transformers #pytorch #bart #text2text-generation #summarization #hu #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us
Hungarian Abstractive Summarization BART model ============================================== For further models, scripts and details, see our repository or our demo site. * BART base model (see Results Table - bold): + Pretrained on Webcorpus 2.0 + Finetuned HI corpus (URL + URL) - Segments: 559.162 Limitations ----------- * tokenized input text (tokenizer: HuSpaCy) * max\_source\_length = 1024 * max\_target\_length = 256 Results ------- Model: BART-base-512, HI: 30.18/13.86/22.92, NOL: 46.48/32.40/39.45 Model: BART-base-1024, HI: 31.86/14.59/23.79, NOL: 47.01/32.91/39.97 If you use this model, please cite the following paper:
[]
[ "TAGS\n#transformers #pytorch #bart #text2text-generation #summarization #hu #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n" ]
[ 52 ]
[ "passage: TAGS\n#transformers #pytorch #bart #text2text-generation #summarization #hu #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n" ]
[ -0.03763221949338913, 0.09045056998729706, -0.00622772378847003, -0.005903321783989668, 0.09766548871994019, 0.021367143839597702, 0.12657077610492706, 0.11909694224596024, -0.012927955947816372, -0.05026744678616524, 0.13250283896923065, 0.1365789771080017, 0.01781429722905159, 0.11761949956417084, -0.0686158761382103, -0.24075445532798767, 0.09058404713869095, 0.0386543944478035, 0.005316436290740967, 0.09374922513961792, 0.11377763003110886, -0.03891528397798538, 0.06224219128489494, -0.01886739209294319, -0.07457423955202103, 0.013743330724537373, 0.004637845326215029, -0.10095703601837158, 0.07875913381576538, 0.04441635683178902, 0.06399630010128021, 0.05935051292181015, -0.01910143531858921, -0.19767697155475616, 0.024185996502637863, -0.014616673812270164, -0.0665985718369484, 0.04531598463654518, 0.054657842963933945, -0.02421688102185726, 0.12835273146629333, 0.047464191913604736, -0.04497377946972847, 0.0688280388712883, -0.08242414891719818, -0.10874903202056885, -0.09509193897247314, 0.08033432066440582, 0.06876575946807861, 0.05750083923339844, 0.014892791397869587, 0.11376810818910599, -0.09492882341146469, 0.05829180032014847, 0.09786463528871536, -0.3445099890232086, -0.0024179767351597548, 0.04181130602955818, 0.058750398457050323, 0.03254202380776405, -0.020017487928271294, 0.046662185341119766, 0.06359859555959702, 0.03271108493208885, -0.007011004723608494, -0.06454012542963028, -0.13983072340488434, 0.031392838805913925, -0.0375542975962162, -0.055083077400922775, 0.2795774042606354, -0.01818937622010708, 0.0492519736289978, -0.03039020672440529, -0.08293955028057098, -0.0008557162364013493, -0.04777277261018753, 0.06526334583759308, -0.005250237882137299, 0.1019728034734726, 0.06996248662471771, -0.026128018274903297, -0.13300956785678864, 0.019412066787481308, -0.1926204413175583, 0.06729467958211899, 0.02832097001373768, 0.06440375000238419, -0.15489375591278076, 0.06287933886051178, 0.07946198433637619, -0.140621617436409, 0.017751017585396767, -0.07481461018323898, 0.09628383070230484, 0.04387955740094185, -0.0649716779589653, 0.03509249538183212, 0.14348414540290833, 0.20958487689495087, 0.03238973021507263, 0.04523094743490219, -0.04251605272293091, 0.12043869495391846, -0.008658022619783878, 0.06321308016777039, 0.019640717655420303, -0.08920376002788544, 0.09021943062543869, -0.1006273701786995, 0.0962204858660698, -0.039242446422576904, -0.14811328053474426, -0.00469821784645319, -0.0004528526624199003, 0.09877286851406097, 0.060927677899599075, 0.05193565413355827, -0.04968037083745003, 0.015289356000721455, 0.08595307171344757, -0.05796189233660698, 0.001791513292118907, 0.009917902760207653, 0.03402552753686905, 0.0989810898900032, 0.03534751757979393, 0.036770693957805634, -0.08401119709014893, 0.08215142786502838, -0.052649252116680145, -0.01714039035141468, -0.03893836960196495, -0.009832041338086128, 0.07785974442958832, -0.06270406395196915, 0.04479178413748741, -0.14309583604335785, -0.15703241527080536, 0.020589584484696388, 0.053150806576013565, -0.014484778046607971, -0.07817638665437698, 0.012051594443619251, -0.014285555109381676, 0.03852049633860588, -0.09106123447418213, 0.0050948793068528175, -0.08701479434967041, 0.067646823823452, -0.06698282063007355, 0.04336929693818092, -0.20427867770195007, 0.06067601591348648, -0.12374912947416306, -0.00214839237742126, -0.04806305095553398, -0.004234221763908863, -0.07800848037004471, 0.20671173930168152, -0.014734753407537937, -0.028024328872561455, -0.0508195161819458, 0.011774428188800812, 
-0.021816611289978027, 0.13512717187404633, -0.13527926802635193, -0.062090761959552765, 0.21618090569972992, -0.1290607899427414, -0.18888764083385468, 0.09519954770803452, 0.02815408445894718, 0.029998840764164925, 0.06354206800460815, 0.19745104014873505, 0.05918384715914726, -0.006548587698489428, 0.065024234354496, 0.14411529898643494, -0.040496956557035446, -0.13651026785373688, 0.026172656565904617, -0.023997705429792404, -0.10357008129358292, 0.07968580722808838, 0.05106288194656372, 0.10105089098215103, -0.01589171029627323, -0.0728413537144661, -0.053163670003414154, -0.029447974637150764, -0.019387781620025635, -0.0036139136645942926, 0.07862724363803864, -0.07201717048883438, 0.005257106851786375, -0.07641023397445679, 0.04294158145785332, 0.03261997550725937, 0.06205906346440315, -0.04664396867156029, 0.10146033018827438, 0.05781075730919838, 0.05182764679193497, -0.11202497780323029, 0.013690195977687836, -0.014303714036941528, 0.037479791790246964, -0.003063549753278494, 0.08256827294826508, 0.013329686596989632, -0.04530830308794975, -0.00956654455512762, -0.0016437673475593328, 0.1373220682144165, 0.03494308888912201, -0.02662176825106144, -0.11827075481414795, 0.038042161613702774, -0.02544964849948883, 0.07443337142467499, -0.00883529707789421, 0.02265392430126667, 0.041148070245981216, 0.13984116911888123, -0.049493033438920975, 0.09061897546052933, -0.021795185282826424, 0.021380309015512466, -0.08316447585821152, 0.005865962244570255, 0.11629040539264679, 0.05453335866332054, -0.10642588883638382, 0.20181907713413239, -0.09225921332836151, 0.22848555445671082, 0.2229742556810379, -0.182373046875, 0.08503415435552597, -0.005380003713071346, -0.03429025784134865, -0.016996493563055992, 0.04603155329823494, 0.0044000050984323025, 0.007652831729501486, 0.007689979858696461, 0.17302857339382172, -0.045825012028217316, -0.029477179050445557, -0.026488328352570534, -0.04174968972802162, -0.004768709186464548, 0.04486777260899544, 0.14702299237251282, -0.17151780426502228, 0.1741742342710495, 0.27235302329063416, 0.019832121208310127, 0.12537862360477448, -0.06917447596788406, 0.0047770654782652855, 0.06868996471166611, -0.019924219697713852, -0.04796747490763664, 0.023314638063311577, -0.1506577581167221, 0.008043097332119942, 0.09302730113267899, 0.03401925414800644, 0.08947886526584625, -0.12727682292461395, -0.032904017716646194, -0.03726470470428467, -0.0032638609409332275, -0.061008501797914505, 0.06745333969593048, 0.011419885791838169, 0.10570176690816879, -0.04347348213195801, -0.06105359271168709, 0.08882124722003937, 0.0008029557066038251, -0.08134201914072037, 0.12640410661697388, -0.17460407316684723, -0.28479310870170593, -0.16217823326587677, -0.08233023434877396, -0.024930953979492188, 0.0034249236341565847, 0.15353091061115265, -0.02151927351951599, -0.05609731003642082, 0.00046756863594055176, -0.03172476217150688, -0.01805698312819004, -0.00582085782662034, 0.01636289618909359, 0.06293001025915146, 0.0027623942587524652, -0.11428564786911011, -0.049904655665159225, 0.012886082753539085, 0.0060647171922028065, 0.12038151174783707, -0.09920699149370193, 0.09993861615657806, 0.09292452782392502, 0.04270691052079201, 0.029958153143525124, -0.028704354539513588, 0.12718811631202698, -0.039491117000579834, 0.006761827506124973, 0.1972167044878006, -0.03807378187775612, 0.07163243740797043, 0.13941296935081482, 0.011479916051030159, -0.06886782497167587, 0.01780710369348526, -0.07020539790391922, -0.06823379546403885, -0.21168257296085358, -0.13525483012199402, 
-0.14237405359745026, 0.08139976114034653, 0.03737109154462814, 0.06879844516515732, 0.040385399013757706, 0.06886129081249237, -0.052476949989795685, 0.0312077347189188, -0.005128837656229734, 0.07469839602708817, 0.2618454694747925, -0.04183804988861084, 0.11835148185491562, -0.11583887040615082, -0.06721889972686768, 0.12330127507448196, 0.0704972967505455, 0.12697479128837585, 0.10214177519083023, 0.06801602989435196, 0.0787055641412735, 0.15301811695098877, 0.09184373915195465, 0.1199888214468956, 0.043259646743535995, -0.0015929309884086251, -0.06059034913778305, -0.026906996965408325, -0.0731167271733284, 0.0493352897465229, -0.04092789441347122, -0.1322110891342163, -0.039466407150030136, -0.0930243730545044, 0.10141311585903168, 0.1649523228406906, 0.04226130247116089, -0.13680866360664368, 0.011476203799247742, 0.10832834243774414, -0.01709054596722126, -0.06770434230566025, 0.1118093803524971, -0.0415651760995388, -0.10009574890136719, 0.1447945535182953, -0.0021850396879017353, 0.15244324505329132, 0.015029422007501125, 0.05993237718939781, -0.062349967658519745, -0.11136962473392487, 0.08071429282426834, 0.13240143656730652, -0.2872105538845062, 0.19047172367572784, -0.032716695219278336, -0.03748749941587448, -0.07593827694654465, 0.005907368380576372, 0.05740375444293022, 0.1792834848165512, 0.06270583719015121, 0.018468890339136124, -0.14566567540168762, -0.023433370515704155, -0.057581108063459396, 0.0681442841887474, 0.04528699442744255, 0.001732130185700953, -0.03423627093434334, -0.07373934984207153, -0.0020484209526330233, -0.016315797343850136, 0.03421356901526451, -0.0382486991584301, -0.1814577430486679, 0.07072315365076065, 0.11244300752878189, 0.0588550791144371, -0.0616103932261467, -0.009969946928322315, -0.08109187334775925, 0.17618420720100403, -0.0541972778737545, -0.07949303090572357, -0.10310330241918564, -0.09937329590320587, 0.05930984020233154, -0.04543091729283333, 0.033061910420656204, -0.07589322328567505, 0.001993467565625906, -0.07052643597126007, -0.20601530373096466, 0.09230482578277588, -0.13473115861415863, -0.043732650578022, -0.04451385885477066, 0.1579730212688446, -0.11745425313711166, 0.02164432778954506, 0.030333327129483223, -0.0018000565469264984, -0.11971430480480194, -0.1329612135887146, -0.05768745020031929, 0.020118730142712593, 0.05331016704440117, -0.012276953086256981, -0.11351658403873444, -0.02484804019331932, 0.017717478796839714, -0.0722678080201149, 0.22813193500041962, 0.16573791205883026, -0.07811696082353592, 0.1972920447587967, 0.16176557540893555, -0.08298609405755997, -0.3242345154285431, -0.1988833248615265, -0.15335775911808014, -0.059509459882974625, -0.012056571431457996, -0.1299349069595337, 0.10364850610494614, 0.004883048590272665, -0.0688013881444931, 0.10691330581903458, -0.19583365321159363, -0.078946053981781, 0.18717439472675323, -0.05349648371338844, 0.31876006722450256, -0.14121605455875397, -0.08739297091960907, -0.10223046690225601, -0.21377742290496826, 0.10322869569063187, -0.0849073976278305, 0.06545615196228027, -0.019960785284638405, 0.05797940865159035, -0.010429051704704762, -0.047499626874923706, 0.12960483133792877, -0.010348222218453884, -0.014282082207500935, -0.11260470002889633, 0.03852298483252525, 0.08378816395998001, -0.006223613861948252, 0.04447924345731735, -0.18215502798557281, 0.03744278475642204, -0.10625399649143219, -0.00850141141563654, -0.048494670540094376, 0.06861825287342072, -0.008481637574732304, -0.02318590320646763, -0.03146900236606598, -0.03875196352601051, 
0.03236065432429314, 0.010957633145153522, 0.21556290984153748, 0.017902227118611336, 0.09988299757242203, 0.11608009785413742, 0.06932543218135834, -0.24493062496185303, 0.07320941984653473, -0.10018078237771988, -0.0763905718922615, 0.05633970722556114, -0.09056801348924637, 0.05900518223643303, 0.12160839885473251, -0.06210573762655258, 0.05370432510972023, 0.07837021350860596, 0.02014162205159664, -0.028634008020162582, 0.1597369909286499, -0.19694454967975616, -0.015234480611979961, -0.03885170817375183, 0.08334523439407349, 0.06791578233242035, 0.04468420892953873, 0.13511188328266144, 0.0161084346473217, -0.03435155004262924, 0.015359065495431423, 0.04576031118631363, -0.06843013316392899, 0.04358139634132385, 0.0060377889312803745, 0.009301768615841866, -0.1463301032781601, 0.09488291293382645, 0.009938215836882591, -0.13413651287555695, -0.026878919452428818, 0.1384759247303009, -0.1632564812898636, -0.12557975947856903, -0.035188622772693634, 0.08691705763339996, -0.11131642758846283, -0.11088436841964722, -0.06445396691560745, -0.17242558300495148, 0.06343519687652588, 0.07174990326166153, 0.101102314889431, 0.07712004333734512, -0.01623843051493168, -0.054811496287584305, 0.038241274654865265, -0.004338306840509176, -0.017475800588726997, 0.028790447860956192, -0.08173917233943939, -0.017054880037903786, 0.013982708565890789, 0.10681317746639252, -0.056582529097795486, -0.013775707222521305, -0.1000395193696022, 0.019405808299779892, -0.20654524862766266, -0.014987640082836151, -0.11138198524713516, -0.024024801328778267, 0.007070464082062244, -0.0766787901520729, -0.0631231740117073, -0.013890625908970833, -0.10210329294204712, -0.027906527742743492, -0.06019094958901405, 0.06444083154201508, -0.09079153835773468, -0.014137363992631435, 0.10135616362094879, -0.03998007997870445, 0.1036924496293068, 0.10453064739704132, -0.07108422368764877, 0.09410328418016434, -0.1200857013463974, -0.13402555882930756, 0.09525029361248016, 0.057681553065776825, 0.022385234013199806, 0.005305006168782711, -0.009083517827093601, 0.13371725380420685, -0.003367204684764147, 0.030109642073512077, 0.03494397923350334, -0.1298186182975769, -0.061045531183481216, -0.041192926466464996, -0.11841069906949997, -0.015956010669469833, -0.10164637863636017, 0.09692510217428207, 0.030949482694268227, 0.14754660427570343, -0.03365393728017807, 0.04279015585780144, -0.06464646756649017, 0.03303400054574013, -0.03441397100687027, -0.1591556817293167, -0.11776424944400787, -0.08647211641073227, -0.021450038999319077, 0.005311750341206789, 0.2704131007194519, -0.004793477710336447, -0.03641544654965401, 0.0669018104672432, 0.08313320577144623, -0.029920484870672226, 0.00987919420003891, 0.25962430238723755, 0.06879810988903046, -0.010221549309790134, -0.12525779008865356, 0.010047639720141888, 0.012616468593478203, -0.08963532000780106, 0.10787323862314224, 0.09320555627346039, 0.038514334708452225, 0.0837748646736145, 0.008057801052927971, 0.006657994352281094, -0.12352705001831055, -0.08765412867069244, 0.0006755553185939789, 0.08513709157705307, 0.002282294211909175, 0.12471475452184677, 0.13926562666893005, -0.0546247623860836, 0.019292229786515236, -0.06735231727361679, -0.005191338248550892, -0.14633731544017792, -0.09925170987844467, -0.07716499269008636, -0.13808217644691467, -0.004118069540709257, -0.05876297503709793, 0.02226887457072735, 0.06254894286394119, 0.03378378227353096, -0.0730864480137825, 0.016053007915616035, -0.013075723312795162, -0.0748930424451828, 0.03280244767665863, 
-0.016277670860290527, -0.003398780943825841, -0.024585548788309097, -0.049754612147808075, -0.08814110606908798, -0.005314778070896864, -0.03468628600239754, 0.05477277562022209, -0.012853099033236504, 0.03874988481402397, -0.10887712240219116, -0.05911590903997421, -0.0488186851143837, 0.019239144399762154, 0.016337549313902855, 0.1369474232196808, 0.022066960111260414, 0.027017410844564438, 0.08897440135478973, 0.189834862947464, -0.07568466663360596, -0.16569937765598297, -0.057385075837373734, 0.1563788801431656, 0.04028889909386635, 0.05061136558651924, 0.01302589662373066, 0.02476346679031849, -0.07612861692905426, 0.34172576665878296, 0.30292633175849915, -0.0842638686299324, 0.013408830389380455, -0.013821525499224663, 0.03196385130286217, 0.07280732691287994, 0.11747576296329498, 0.1419629603624344, 0.22894705832004547, -0.0633167251944542, -0.0342041477560997, -0.06164666637778282, -0.011379384435713291, -0.16521087288856506, 0.10886316746473312, -0.019189288839697838, -0.08866600692272186, -0.014199310913681984, 0.0907660648226738, -0.12409275025129318, 0.07481842488050461, -0.059848852455616, -0.15542776882648468, -0.03254380077123642, 0.0001943191309692338, 0.21452878415584564, 0.021070972084999084, 0.02321447990834713, -0.032849907875061035, -0.04723024368286133, 0.1415794938802719, -0.013134322129189968, -0.18438193202018738, -0.014645610935986042, 0.07338922470808029, -0.12581893801689148, 0.07409758120775223, -0.01630956493318081, 0.01994241587817669, 0.08428996056318283, 0.09921777993440628, -0.0683901384472847, 0.07933018356561661, 0.017007190734148026, -0.0349293127655983, 0.011066017672419548, -0.06293949484825134, -0.019886817783117294, -0.05565422773361206, 0.06234407797455788, -0.09178829193115234, 0.04625855013728142, 0.032076720148324966, -0.042408403009176254, 0.0014522764831781387, 0.037800382822752, -0.07124499976634979, 0.07143672555685043, 0.01960405334830284, -0.04051028937101364, -0.03878381848335266, -0.0547565221786499, -0.016546303406357765, 0.005677100736647844, -0.1321941763162613, -0.04769081249833107, -0.051470790058374405, -0.07895500212907791, 0.07249294221401215, 0.03518173098564148, -0.1310557872056961, -0.004024401307106018, -0.11698086559772491, 0.04297050088644028, -0.16703630983829498, 0.07701706886291504, 0.06942970305681229, -0.023959310725331306, -0.005020285490900278, -0.11605126410722733, 0.028935354202985764, 0.031161418184638023, -0.07446826994419098, -0.08652307838201523 ]
null
null
transformers
# Hungarian Abstractive Summarization BART model

For further models, scripts and details, see [our repository](https://github.com/nytud/neural-models) or [our demo site](https://juniper.nytud.hu/demo/nlp).

- BART base model (see Results Table - bold):
  - Pretrained on Webcorpus 2.0
  - Finetuned HI corpus (hvg.hu + index.hu)
    - Segments: 559.162

## Limitations

- tokenized input text (tokenizer: [HuSpaCy](https://huggingface.co/huspacy))
- max_source_length = 512
- max_target_length = 256

## Results

| Model | HI | NOL |
| ------------- | ------------- | ------------- |
| BART-base-512 | **30.18/13.86/22.92** | 46.48/32.40/39.45 |
| BART-base-1024| 31.86/14.59/23.79 | 47.01/32.91/39.97 |

## Citation

If you use this model, please cite the following paper:

```
@inproceedings {yang-bart,
    title = {{BARTerezzünk! - Messze, messze, messze a világtól, - BART kísérleti modellek magyar nyelvre}},
    booktitle = {XVIII. Magyar Számítógépes Nyelvészeti Konferencia},
    year = {2022},
    publisher = {Szegedi Tudományegyetem, Informatikai Intézet},
    address = {Szeged, Magyarország},
    author = {Yang, Zijian Győző},
    pages = {15--29}
}
```
{"language": ["hu"], "license": "apache-2.0", "tags": ["summarization"], "metrics": ["rouge"], "widget": [{"text": "A Tisza-parti v\u00e1ros \u00e1llatkertj\u00e9ben r\u00e9g\u00f3ta tartanak szurik\u00e1t\u00e1kat ( Suricata suricatta ) , de tavaly tavaszig nem siker\u00fclt szapor\u00edtani \u0151ket , annak ellen\u00e9re , hogy t\u00e1gas h\u00e1z \u00e9s kifut\u00f3 \u00e9p\u00fclt sz\u00e1mukra - k\u00f6z\u00f6lte Veprik R\u00f3bert igazgat\u00f3 . 2010-ben alakult ki az \u00faj - h\u00e1rom Amszterdamb\u00f3l sz\u00e1rmaz\u00f3 n\u0151st\u00e9nyb\u0151l \u00e9s egy budapesti fiatal h\u00edmb\u0151l \u00e1ll\u00f3 - csapat , amely szaporodni kezdett . 2011-ben h\u00e1rom , id\u00e9n pedig egy ut\u00f3ddal \u00f6rvendeztett\u00e9k meg a gondoz\u00f3kat \u00e9s az \u00e1llatbar\u00e1tokat . A szurik\u00e1t\u00e1k ut\u00f3dai - tizenegy hetes vemhess\u00e9g ut\u00e1n - okt\u00f3ber \u00e9s m\u00e1rcius k\u00f6z\u00f6tt vakon \u00e9s sz\u0151rtelen\u00fcl j\u00f6nnek a vil\u00e1gra . A kicsinyek h\u00e1romhetesen b\u00fajnak el\u0151 az \u00fcregb\u0151l , \u00e9s nevel\u00e9s\u00fckben mindk\u00e9t sz\u00fcl\u0151 r\u00e9szt vesz . A szurik\u00e1tacsapatokban a csal\u00e1d tagjai nagyon szoros kapcsolatban \u00e1llnak egym\u00e1ssal , viszont nagyon harciasan fell\u00e9pnek az idegenekkel szemben , ak\u00e1r meg is \u00f6lhetik azt az \u00e1llatot , amelyet betolakod\u00f3nak tekintenek . B\u00e1r a D\u00e9l-Afrik\u00e1ban , a Kalah\u00e1ri sivatagban \u0151shonos cibetmacskaf\u00e9le ragadoz\u00f3kat a szegedi \u00e1llatkertben term\u00e9szetes \u00e9l\u0151hely\u00fckh\u00f6z k\u00e9pest kevesebb vesz\u00e9ly fenyegeti , a vadasparki erd\u0151ben ragadoz\u00f3 madarak is \u00e9lnek , amelyek ak\u00e1r zs\u00e1km\u00e1nyk\u00e9nt is tekinthetn\u00e9nek a szurik\u00e1t\u00e1kra . A szegedi csapatn\u00e1l azonban szigor\u00fa \u0151rs\u00e9g van , mindig lesi valaki k\u00e9t l\u00e1bra \u00e1llva a vesz\u00e9lyforr\u00e1sokat . Az \u0151rszemek figyelm\u00e9t m\u00e9g a s\u00e1rk\u00e1nyrep\u00fcl\u0151k is felkeltik , \u00e9s felbukkan\u00e1sakor valamennyi egyed biztos helyre menek\u00fcl . A szurik\u00e1t\u00e1k a Kalah\u00e1ri sivatag boz\u00f3tos , szikl\u00e1s ter\u00fcletein csapatokban \u00e9lnek . A 700 gramm k\u00f6r\u00fcli testt\u00f6meg\u0171 ragadoz\u00f3k rovarokkal , l\u00e1rv\u00e1kkal , skorpi\u00f3kkal t\u00e1pl\u00e1lkoznak , de n\u00e9ha elfogyasztj\u00e1k a kisebb gerinceseket , toj\u00e1sokat \u00e9s n\u00f6v\u00e9nyi gum\u00f3kat is . A nappal akt\u00edv \u00e1llatok f\u00f6ldalatti \u00fcregrendszert \u00e1snak , amelynek t\u00f6bb bej\u00e1rata is van . Ha a szurik\u00e1t\u00e1k idegen csapattal vagy ragadoz\u00f3val ker\u00fclnek szembe , azonnal elkezdenek \u00e1sni , nagy porfelh\u0151t kavarva . Az is gyakorta el\u0151fordul , hogy szorosan egym\u00e1shoz b\u00fajnak , felborzolj\u00e1k sz\u0151r\u00fcket , megny\u00fajtj\u00e1k test\u00fcket , hogy min\u00e9l nagyobbnak l\u00e1tsz\u00f3djanak . Az el\u0151ad\u00e1suk cs\u00facspontj\u00e1n pedig az eg\u00e9sz csapat a leveg\u0151be ugrik , k\u00f6zben pedig morog . A hangad\u00e1s egy\u00e9bk\u00e9nt is fontos a szurik\u00e1t\u00e1k kapcsolat\u00e1ban , az egyedek legal\u00e1bb t\u00edzf\u00e9le jelz\u00e9st haszn\u00e1lnak a kol\u00f3ni\u00e1n bel\u00fcl ."}]}
summarization
NYTK/summarization-hi-bart-hungarian
[ "transformers", "pytorch", "bart", "text2text-generation", "summarization", "hu", "license:apache-2.0", "autotrain_compatible", "endpoints_compatible", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[ "hu" ]
TAGS #transformers #pytorch #bart #text2text-generation #summarization #hu #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us
Hungarian Abstractive Summarization BART model ============================================== For further models, scripts and details, see our repository or our demo site. * BART base model (see Results Table - bold): + Pretrained on Webcorpus 2.0 + Finetuned HI corpus (URL + URL) - Segments: 559.162 Limitations ----------- * tokenized input text (tokenizer: HuSpaCy) * max\_source\_length = 512 * max\_target\_length = 256 Results ------- Model: BART-base-512, HI: 30.18/13.86/22.92, NOL: 46.48/32.40/39.45 Model: BART-base-1024, HI: 31.86/14.59/23.79, NOL: 47.01/32.91/39.97 If you use this model, please cite the following paper:
[]
[ "TAGS\n#transformers #pytorch #bart #text2text-generation #summarization #hu #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n" ]
[ 52 ]
[ "passage: TAGS\n#transformers #pytorch #bart #text2text-generation #summarization #hu #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n" ]
[ -0.03763221949338913, 0.09045056998729706, -0.00622772378847003, -0.005903321783989668, 0.09766548871994019, 0.021367143839597702, 0.12657077610492706, 0.11909694224596024, -0.012927955947816372, -0.05026744678616524, 0.13250283896923065, 0.1365789771080017, 0.01781429722905159, 0.11761949956417084, -0.0686158761382103, -0.24075445532798767, 0.09058404713869095, 0.0386543944478035, 0.005316436290740967, 0.09374922513961792, 0.11377763003110886, -0.03891528397798538, 0.06224219128489494, -0.01886739209294319, -0.07457423955202103, 0.013743330724537373, 0.004637845326215029, -0.10095703601837158, 0.07875913381576538, 0.04441635683178902, 0.06399630010128021, 0.05935051292181015, -0.01910143531858921, -0.19767697155475616, 0.024185996502637863, -0.014616673812270164, -0.0665985718369484, 0.04531598463654518, 0.054657842963933945, -0.02421688102185726, 0.12835273146629333, 0.047464191913604736, -0.04497377946972847, 0.0688280388712883, -0.08242414891719818, -0.10874903202056885, -0.09509193897247314, 0.08033432066440582, 0.06876575946807861, 0.05750083923339844, 0.014892791397869587, 0.11376810818910599, -0.09492882341146469, 0.05829180032014847, 0.09786463528871536, -0.3445099890232086, -0.0024179767351597548, 0.04181130602955818, 0.058750398457050323, 0.03254202380776405, -0.020017487928271294, 0.046662185341119766, 0.06359859555959702, 0.03271108493208885, -0.007011004723608494, -0.06454012542963028, -0.13983072340488434, 0.031392838805913925, -0.0375542975962162, -0.055083077400922775, 0.2795774042606354, -0.01818937622010708, 0.0492519736289978, -0.03039020672440529, -0.08293955028057098, -0.0008557162364013493, -0.04777277261018753, 0.06526334583759308, -0.005250237882137299, 0.1019728034734726, 0.06996248662471771, -0.026128018274903297, -0.13300956785678864, 0.019412066787481308, -0.1926204413175583, 0.06729467958211899, 0.02832097001373768, 0.06440375000238419, -0.15489375591278076, 0.06287933886051178, 0.07946198433637619, -0.140621617436409, 0.017751017585396767, -0.07481461018323898, 0.09628383070230484, 0.04387955740094185, -0.0649716779589653, 0.03509249538183212, 0.14348414540290833, 0.20958487689495087, 0.03238973021507263, 0.04523094743490219, -0.04251605272293091, 0.12043869495391846, -0.008658022619783878, 0.06321308016777039, 0.019640717655420303, -0.08920376002788544, 0.09021943062543869, -0.1006273701786995, 0.0962204858660698, -0.039242446422576904, -0.14811328053474426, -0.00469821784645319, -0.0004528526624199003, 0.09877286851406097, 0.060927677899599075, 0.05193565413355827, -0.04968037083745003, 0.015289356000721455, 0.08595307171344757, -0.05796189233660698, 0.001791513292118907, 0.009917902760207653, 0.03402552753686905, 0.0989810898900032, 0.03534751757979393, 0.036770693957805634, -0.08401119709014893, 0.08215142786502838, -0.052649252116680145, -0.01714039035141468, -0.03893836960196495, -0.009832041338086128, 0.07785974442958832, -0.06270406395196915, 0.04479178413748741, -0.14309583604335785, -0.15703241527080536, 0.020589584484696388, 0.053150806576013565, -0.014484778046607971, -0.07817638665437698, 0.012051594443619251, -0.014285555109381676, 0.03852049633860588, -0.09106123447418213, 0.0050948793068528175, -0.08701479434967041, 0.067646823823452, -0.06698282063007355, 0.04336929693818092, -0.20427867770195007, 0.06067601591348648, -0.12374912947416306, -0.00214839237742126, -0.04806305095553398, -0.004234221763908863, -0.07800848037004471, 0.20671173930168152, -0.014734753407537937, -0.028024328872561455, -0.0508195161819458, 0.011774428188800812, 
-0.021816611289978027, 0.13512717187404633, -0.13527926802635193, -0.062090761959552765, 0.21618090569972992, -0.1290607899427414, -0.18888764083385468, 0.09519954770803452, 0.02815408445894718, 0.029998840764164925, 0.06354206800460815, 0.19745104014873505, 0.05918384715914726, -0.006548587698489428, 0.065024234354496, 0.14411529898643494, -0.040496956557035446, -0.13651026785373688, 0.026172656565904617, -0.023997705429792404, -0.10357008129358292, 0.07968580722808838, 0.05106288194656372, 0.10105089098215103, -0.01589171029627323, -0.0728413537144661, -0.053163670003414154, -0.029447974637150764, -0.019387781620025635, -0.0036139136645942926, 0.07862724363803864, -0.07201717048883438, 0.005257106851786375, -0.07641023397445679, 0.04294158145785332, 0.03261997550725937, 0.06205906346440315, -0.04664396867156029, 0.10146033018827438, 0.05781075730919838, 0.05182764679193497, -0.11202497780323029, 0.013690195977687836, -0.014303714036941528, 0.037479791790246964, -0.003063549753278494, 0.08256827294826508, 0.013329686596989632, -0.04530830308794975, -0.00956654455512762, -0.0016437673475593328, 0.1373220682144165, 0.03494308888912201, -0.02662176825106144, -0.11827075481414795, 0.038042161613702774, -0.02544964849948883, 0.07443337142467499, -0.00883529707789421, 0.02265392430126667, 0.041148070245981216, 0.13984116911888123, -0.049493033438920975, 0.09061897546052933, -0.021795185282826424, 0.021380309015512466, -0.08316447585821152, 0.005865962244570255, 0.11629040539264679, 0.05453335866332054, -0.10642588883638382, 0.20181907713413239, -0.09225921332836151, 0.22848555445671082, 0.2229742556810379, -0.182373046875, 0.08503415435552597, -0.005380003713071346, -0.03429025784134865, -0.016996493563055992, 0.04603155329823494, 0.0044000050984323025, 0.007652831729501486, 0.007689979858696461, 0.17302857339382172, -0.045825012028217316, -0.029477179050445557, -0.026488328352570534, -0.04174968972802162, -0.004768709186464548, 0.04486777260899544, 0.14702299237251282, -0.17151780426502228, 0.1741742342710495, 0.27235302329063416, 0.019832121208310127, 0.12537862360477448, -0.06917447596788406, 0.0047770654782652855, 0.06868996471166611, -0.019924219697713852, -0.04796747490763664, 0.023314638063311577, -0.1506577581167221, 0.008043097332119942, 0.09302730113267899, 0.03401925414800644, 0.08947886526584625, -0.12727682292461395, -0.032904017716646194, -0.03726470470428467, -0.0032638609409332275, -0.061008501797914505, 0.06745333969593048, 0.011419885791838169, 0.10570176690816879, -0.04347348213195801, -0.06105359271168709, 0.08882124722003937, 0.0008029557066038251, -0.08134201914072037, 0.12640410661697388, -0.17460407316684723, -0.28479310870170593, -0.16217823326587677, -0.08233023434877396, -0.024930953979492188, 0.0034249236341565847, 0.15353091061115265, -0.02151927351951599, -0.05609731003642082, 0.00046756863594055176, -0.03172476217150688, -0.01805698312819004, -0.00582085782662034, 0.01636289618909359, 0.06293001025915146, 0.0027623942587524652, -0.11428564786911011, -0.049904655665159225, 0.012886082753539085, 0.0060647171922028065, 0.12038151174783707, -0.09920699149370193, 0.09993861615657806, 0.09292452782392502, 0.04270691052079201, 0.029958153143525124, -0.028704354539513588, 0.12718811631202698, -0.039491117000579834, 0.006761827506124973, 0.1972167044878006, -0.03807378187775612, 0.07163243740797043, 0.13941296935081482, 0.011479916051030159, -0.06886782497167587, 0.01780710369348526, -0.07020539790391922, -0.06823379546403885, -0.21168257296085358, -0.13525483012199402, 
-0.14237405359745026, 0.08139976114034653, 0.03737109154462814, 0.06879844516515732, 0.040385399013757706, 0.06886129081249237, -0.052476949989795685, 0.0312077347189188, -0.005128837656229734, 0.07469839602708817, 0.2618454694747925, -0.04183804988861084, 0.11835148185491562, -0.11583887040615082, -0.06721889972686768, 0.12330127507448196, 0.0704972967505455, 0.12697479128837585, 0.10214177519083023, 0.06801602989435196, 0.0787055641412735, 0.15301811695098877, 0.09184373915195465, 0.1199888214468956, 0.043259646743535995, -0.0015929309884086251, -0.06059034913778305, -0.026906996965408325, -0.0731167271733284, 0.0493352897465229, -0.04092789441347122, -0.1322110891342163, -0.039466407150030136, -0.0930243730545044, 0.10141311585903168, 0.1649523228406906, 0.04226130247116089, -0.13680866360664368, 0.011476203799247742, 0.10832834243774414, -0.01709054596722126, -0.06770434230566025, 0.1118093803524971, -0.0415651760995388, -0.10009574890136719, 0.1447945535182953, -0.0021850396879017353, 0.15244324505329132, 0.015029422007501125, 0.05993237718939781, -0.062349967658519745, -0.11136962473392487, 0.08071429282426834, 0.13240143656730652, -0.2872105538845062, 0.19047172367572784, -0.032716695219278336, -0.03748749941587448, -0.07593827694654465, 0.005907368380576372, 0.05740375444293022, 0.1792834848165512, 0.06270583719015121, 0.018468890339136124, -0.14566567540168762, -0.023433370515704155, -0.057581108063459396, 0.0681442841887474, 0.04528699442744255, 0.001732130185700953, -0.03423627093434334, -0.07373934984207153, -0.0020484209526330233, -0.016315797343850136, 0.03421356901526451, -0.0382486991584301, -0.1814577430486679, 0.07072315365076065, 0.11244300752878189, 0.0588550791144371, -0.0616103932261467, -0.009969946928322315, -0.08109187334775925, 0.17618420720100403, -0.0541972778737545, -0.07949303090572357, -0.10310330241918564, -0.09937329590320587, 0.05930984020233154, -0.04543091729283333, 0.033061910420656204, -0.07589322328567505, 0.001993467565625906, -0.07052643597126007, -0.20601530373096466, 0.09230482578277588, -0.13473115861415863, -0.043732650578022, -0.04451385885477066, 0.1579730212688446, -0.11745425313711166, 0.02164432778954506, 0.030333327129483223, -0.0018000565469264984, -0.11971430480480194, -0.1329612135887146, -0.05768745020031929, 0.020118730142712593, 0.05331016704440117, -0.012276953086256981, -0.11351658403873444, -0.02484804019331932, 0.017717478796839714, -0.0722678080201149, 0.22813193500041962, 0.16573791205883026, -0.07811696082353592, 0.1972920447587967, 0.16176557540893555, -0.08298609405755997, -0.3242345154285431, -0.1988833248615265, -0.15335775911808014, -0.059509459882974625, -0.012056571431457996, -0.1299349069595337, 0.10364850610494614, 0.004883048590272665, -0.0688013881444931, 0.10691330581903458, -0.19583365321159363, -0.078946053981781, 0.18717439472675323, -0.05349648371338844, 0.31876006722450256, -0.14121605455875397, -0.08739297091960907, -0.10223046690225601, -0.21377742290496826, 0.10322869569063187, -0.0849073976278305, 0.06545615196228027, -0.019960785284638405, 0.05797940865159035, -0.010429051704704762, -0.047499626874923706, 0.12960483133792877, -0.010348222218453884, -0.014282082207500935, -0.11260470002889633, 0.03852298483252525, 0.08378816395998001, -0.006223613861948252, 0.04447924345731735, -0.18215502798557281, 0.03744278475642204, -0.10625399649143219, -0.00850141141563654, -0.048494670540094376, 0.06861825287342072, -0.008481637574732304, -0.02318590320646763, -0.03146900236606598, -0.03875196352601051, 
0.03236065432429314, 0.010957633145153522, 0.21556290984153748, 0.017902227118611336, 0.09988299757242203, 0.11608009785413742, 0.06932543218135834, -0.24493062496185303, 0.07320941984653473, -0.10018078237771988, -0.0763905718922615, 0.05633970722556114, -0.09056801348924637, 0.05900518223643303, 0.12160839885473251, -0.06210573762655258, 0.05370432510972023, 0.07837021350860596, 0.02014162205159664, -0.028634008020162582, 0.1597369909286499, -0.19694454967975616, -0.015234480611979961, -0.03885170817375183, 0.08334523439407349, 0.06791578233242035, 0.04468420892953873, 0.13511188328266144, 0.0161084346473217, -0.03435155004262924, 0.015359065495431423, 0.04576031118631363, -0.06843013316392899, 0.04358139634132385, 0.0060377889312803745, 0.009301768615841866, -0.1463301032781601, 0.09488291293382645, 0.009938215836882591, -0.13413651287555695, -0.026878919452428818, 0.1384759247303009, -0.1632564812898636, -0.12557975947856903, -0.035188622772693634, 0.08691705763339996, -0.11131642758846283, -0.11088436841964722, -0.06445396691560745, -0.17242558300495148, 0.06343519687652588, 0.07174990326166153, 0.101102314889431, 0.07712004333734512, -0.01623843051493168, -0.054811496287584305, 0.038241274654865265, -0.004338306840509176, -0.017475800588726997, 0.028790447860956192, -0.08173917233943939, -0.017054880037903786, 0.013982708565890789, 0.10681317746639252, -0.056582529097795486, -0.013775707222521305, -0.1000395193696022, 0.019405808299779892, -0.20654524862766266, -0.014987640082836151, -0.11138198524713516, -0.024024801328778267, 0.007070464082062244, -0.0766787901520729, -0.0631231740117073, -0.013890625908970833, -0.10210329294204712, -0.027906527742743492, -0.06019094958901405, 0.06444083154201508, -0.09079153835773468, -0.014137363992631435, 0.10135616362094879, -0.03998007997870445, 0.1036924496293068, 0.10453064739704132, -0.07108422368764877, 0.09410328418016434, -0.1200857013463974, -0.13402555882930756, 0.09525029361248016, 0.057681553065776825, 0.022385234013199806, 0.005305006168782711, -0.009083517827093601, 0.13371725380420685, -0.003367204684764147, 0.030109642073512077, 0.03494397923350334, -0.1298186182975769, -0.061045531183481216, -0.041192926466464996, -0.11841069906949997, -0.015956010669469833, -0.10164637863636017, 0.09692510217428207, 0.030949482694268227, 0.14754660427570343, -0.03365393728017807, 0.04279015585780144, -0.06464646756649017, 0.03303400054574013, -0.03441397100687027, -0.1591556817293167, -0.11776424944400787, -0.08647211641073227, -0.021450038999319077, 0.005311750341206789, 0.2704131007194519, -0.004793477710336447, -0.03641544654965401, 0.0669018104672432, 0.08313320577144623, -0.029920484870672226, 0.00987919420003891, 0.25962430238723755, 0.06879810988903046, -0.010221549309790134, -0.12525779008865356, 0.010047639720141888, 0.012616468593478203, -0.08963532000780106, 0.10787323862314224, 0.09320555627346039, 0.038514334708452225, 0.0837748646736145, 0.008057801052927971, 0.006657994352281094, -0.12352705001831055, -0.08765412867069244, 0.0006755553185939789, 0.08513709157705307, 0.002282294211909175, 0.12471475452184677, 0.13926562666893005, -0.0546247623860836, 0.019292229786515236, -0.06735231727361679, -0.005191338248550892, -0.14633731544017792, -0.09925170987844467, -0.07716499269008636, -0.13808217644691467, -0.004118069540709257, -0.05876297503709793, 0.02226887457072735, 0.06254894286394119, 0.03378378227353096, -0.0730864480137825, 0.016053007915616035, -0.013075723312795162, -0.0748930424451828, 0.03280244767665863, 
-0.016277670860290527, -0.003398780943825841, -0.024585548788309097, -0.049754612147808075, -0.08814110606908798, -0.005314778070896864, -0.03468628600239754, 0.05477277562022209, -0.012853099033236504, 0.03874988481402397, -0.10887712240219116, -0.05911590903997421, -0.0488186851143837, 0.019239144399762154, 0.016337549313902855, 0.1369474232196808, 0.022066960111260414, 0.027017410844564438, 0.08897440135478973, 0.189834862947464, -0.07568466663360596, -0.16569937765598297, -0.057385075837373734, 0.1563788801431656, 0.04028889909386635, 0.05061136558651924, 0.01302589662373066, 0.02476346679031849, -0.07612861692905426, 0.34172576665878296, 0.30292633175849915, -0.0842638686299324, 0.013408830389380455, -0.013821525499224663, 0.03196385130286217, 0.07280732691287994, 0.11747576296329498, 0.1419629603624344, 0.22894705832004547, -0.0633167251944542, -0.0342041477560997, -0.06164666637778282, -0.011379384435713291, -0.16521087288856506, 0.10886316746473312, -0.019189288839697838, -0.08866600692272186, -0.014199310913681984, 0.0907660648226738, -0.12409275025129318, 0.07481842488050461, -0.059848852455616, -0.15542776882648468, -0.03254380077123642, 0.0001943191309692338, 0.21452878415584564, 0.021070972084999084, 0.02321447990834713, -0.032849907875061035, -0.04723024368286133, 0.1415794938802719, -0.013134322129189968, -0.18438193202018738, -0.014645610935986042, 0.07338922470808029, -0.12581893801689148, 0.07409758120775223, -0.01630956493318081, 0.01994241587817669, 0.08428996056318283, 0.09921777993440628, -0.0683901384472847, 0.07933018356561661, 0.017007190734148026, -0.0349293127655983, 0.011066017672419548, -0.06293949484825134, -0.019886817783117294, -0.05565422773361206, 0.06234407797455788, -0.09178829193115234, 0.04625855013728142, 0.032076720148324966, -0.042408403009176254, 0.0014522764831781387, 0.037800382822752, -0.07124499976634979, 0.07143672555685043, 0.01960405334830284, -0.04051028937101364, -0.03878381848335266, -0.0547565221786499, -0.016546303406357765, 0.005677100736647844, -0.1321941763162613, -0.04769081249833107, -0.051470790058374405, -0.07895500212907791, 0.07249294221401215, 0.03518173098564148, -0.1310557872056961, -0.004024401307106018, -0.11698086559772491, 0.04297050088644028, -0.16703630983829498, 0.07701706886291504, 0.06942970305681229, -0.023959310725331306, -0.005020285490900278, -0.11605126410722733, 0.028935354202985764, 0.031161418184638023, -0.07446826994419098, -0.08652307838201523 ]
null
null
transformers
# Hungarian Abstractive Summarization BART model

For further models, scripts and details, see [our repository](https://github.com/nytud/neural-models) or [our demo site](https://juniper.nytud.hu/demo/nlp).

- BART base model (see Results Table - bold):
  - Pretrained on Webcorpus 2.0
  - Finetuned NOL corpus (nol.hu)
    - Segments: 397,343

## Limitations

- tokenized input text (tokenizer: [HuSpaCy](https://huggingface.co/huspacy))
- max_source_length = 512
- max_target_length = 256

## Results

| Model | HI | NOL |
| ------------- | ------------- | ------------- |
| BART-base-512 | 30.18/13.86/22.92 | **46.48/32.40/39.45** |
| BART-base-1024| 31.86/14.59/23.79 | 47.01/32.91/39.97 |

## Citation

If you use this model, please cite the following paper:

```
@inproceedings {yang-bart,
    title = {{BARTerezzünk! - Messze, messze, messze a világtól, - BART kísérleti modellek magyar nyelvre}},
    booktitle = {XVIII. Magyar Számítógépes Nyelvészeti Konferencia},
    year = {2022},
    publisher = {Szegedi Tudományegyetem, Informatikai Intézet},
    address = {Szeged, Magyarország},
    author = {Yang, Zijian Győző},
    pages = {15--29}
}
```
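As with the other cards, no usage code is provided; below is a hedged sketch that makes the documented max_source_length of 512 explicit by truncating the encoded input, using the standard `transformers` seq2seq classes. The beam-search settings and the placeholder text are assumptions, not values from the card.

```python
# Hedged sketch (not part of the original card): explicit tokenizer/model use
# with truncation to the documented max_source_length of 512.
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_id = "NYTK/summarization-nol-bart-hungarian"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

text = "..."  # pre-tokenized (HuSpaCy) Hungarian article text (placeholder)
inputs = tokenizer(text, return_tensors="pt", truncation=True, max_length=512)
summary_ids = model.generate(**inputs, max_length=256, num_beams=4)  # assumed settings
print(tokenizer.decode(summary_ids[0], skip_special_tokens=True))
```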
{"language": ["hu"], "license": "apache-2.0", "tags": ["summarization"], "metrics": ["rouge"], "widget": [{"text": "A Tisza-parti v\u00e1ros \u00e1llatkertj\u00e9ben r\u00e9g\u00f3ta tartanak szurik\u00e1t\u00e1kat ( Suricata suricatta ) , de tavaly tavaszig nem siker\u00fclt szapor\u00edtani \u0151ket , annak ellen\u00e9re , hogy t\u00e1gas h\u00e1z \u00e9s kifut\u00f3 \u00e9p\u00fclt sz\u00e1mukra - k\u00f6z\u00f6lte Veprik R\u00f3bert igazgat\u00f3 . 2010-ben alakult ki az \u00faj - h\u00e1rom Amszterdamb\u00f3l sz\u00e1rmaz\u00f3 n\u0151st\u00e9nyb\u0151l \u00e9s egy budapesti fiatal h\u00edmb\u0151l \u00e1ll\u00f3 - csapat , amely szaporodni kezdett . 2011-ben h\u00e1rom , id\u00e9n pedig egy ut\u00f3ddal \u00f6rvendeztett\u00e9k meg a gondoz\u00f3kat \u00e9s az \u00e1llatbar\u00e1tokat . A szurik\u00e1t\u00e1k ut\u00f3dai - tizenegy hetes vemhess\u00e9g ut\u00e1n - okt\u00f3ber \u00e9s m\u00e1rcius k\u00f6z\u00f6tt vakon \u00e9s sz\u0151rtelen\u00fcl j\u00f6nnek a vil\u00e1gra . A kicsinyek h\u00e1romhetesen b\u00fajnak el\u0151 az \u00fcregb\u0151l , \u00e9s nevel\u00e9s\u00fckben mindk\u00e9t sz\u00fcl\u0151 r\u00e9szt vesz . A szurik\u00e1tacsapatokban a csal\u00e1d tagjai nagyon szoros kapcsolatban \u00e1llnak egym\u00e1ssal , viszont nagyon harciasan fell\u00e9pnek az idegenekkel szemben , ak\u00e1r meg is \u00f6lhetik azt az \u00e1llatot , amelyet betolakod\u00f3nak tekintenek . B\u00e1r a D\u00e9l-Afrik\u00e1ban , a Kalah\u00e1ri sivatagban \u0151shonos cibetmacskaf\u00e9le ragadoz\u00f3kat a szegedi \u00e1llatkertben term\u00e9szetes \u00e9l\u0151hely\u00fckh\u00f6z k\u00e9pest kevesebb vesz\u00e9ly fenyegeti , a vadasparki erd\u0151ben ragadoz\u00f3 madarak is \u00e9lnek , amelyek ak\u00e1r zs\u00e1km\u00e1nyk\u00e9nt is tekinthetn\u00e9nek a szurik\u00e1t\u00e1kra . A szegedi csapatn\u00e1l azonban szigor\u00fa \u0151rs\u00e9g van , mindig lesi valaki k\u00e9t l\u00e1bra \u00e1llva a vesz\u00e9lyforr\u00e1sokat . Az \u0151rszemek figyelm\u00e9t m\u00e9g a s\u00e1rk\u00e1nyrep\u00fcl\u0151k is felkeltik , \u00e9s felbukkan\u00e1sakor valamennyi egyed biztos helyre menek\u00fcl . A szurik\u00e1t\u00e1k a Kalah\u00e1ri sivatag boz\u00f3tos , szikl\u00e1s ter\u00fcletein csapatokban \u00e9lnek . A 700 gramm k\u00f6r\u00fcli testt\u00f6meg\u0171 ragadoz\u00f3k rovarokkal , l\u00e1rv\u00e1kkal , skorpi\u00f3kkal t\u00e1pl\u00e1lkoznak , de n\u00e9ha elfogyasztj\u00e1k a kisebb gerinceseket , toj\u00e1sokat \u00e9s n\u00f6v\u00e9nyi gum\u00f3kat is . A nappal akt\u00edv \u00e1llatok f\u00f6ldalatti \u00fcregrendszert \u00e1snak , amelynek t\u00f6bb bej\u00e1rata is van . Ha a szurik\u00e1t\u00e1k idegen csapattal vagy ragadoz\u00f3val ker\u00fclnek szembe , azonnal elkezdenek \u00e1sni , nagy porfelh\u0151t kavarva . Az is gyakorta el\u0151fordul , hogy szorosan egym\u00e1shoz b\u00fajnak , felborzolj\u00e1k sz\u0151r\u00fcket , megny\u00fajtj\u00e1k test\u00fcket , hogy min\u00e9l nagyobbnak l\u00e1tsz\u00f3djanak . Az el\u0151ad\u00e1suk cs\u00facspontj\u00e1n pedig az eg\u00e9sz csapat a leveg\u0151be ugrik , k\u00f6zben pedig morog . A hangad\u00e1s egy\u00e9bk\u00e9nt is fontos a szurik\u00e1t\u00e1k kapcsolat\u00e1ban , az egyedek legal\u00e1bb t\u00edzf\u00e9le jelz\u00e9st haszn\u00e1lnak a kol\u00f3ni\u00e1n bel\u00fcl ."}]}
summarization
NYTK/summarization-nol-bart-hungarian
[ "transformers", "pytorch", "bart", "text2text-generation", "summarization", "hu", "license:apache-2.0", "autotrain_compatible", "endpoints_compatible", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[ "hu" ]
TAGS #transformers #pytorch #bart #text2text-generation #summarization #hu #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us
Hungarian Abstractive Summarization BART model ============================================== For further models, scripts and details, see our repository or our demo site. * BART base model (see Results Table - bold): + Pretrained on Webcorpus 2.0 + Finetuned NOL corpus (URL) - Segments: 397,343 Limitations ----------- * tokenized input text (tokenizer: HuSpaCy) * max\_source\_length = 512 * max\_target\_length = 256 Results ------- Model: BART-base-512, HI: 30.18/13.86/22.92, NOL: 46.48/32.40/39.45 Model: BART-base-1024, HI: 31.86/14.59/23.79, NOL: 47.01/32.91/39.97 If you use this model, please cite the following paper:
[]
[ "TAGS\n#transformers #pytorch #bart #text2text-generation #summarization #hu #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n" ]
[ 52 ]
[ "passage: TAGS\n#transformers #pytorch #bart #text2text-generation #summarization #hu #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n" ]
[ -0.03763221949338913, 0.09045056998729706, -0.00622772378847003, -0.005903321783989668, 0.09766548871994019, 0.021367143839597702, 0.12657077610492706, 0.11909694224596024, -0.012927955947816372, -0.05026744678616524, 0.13250283896923065, 0.1365789771080017, 0.01781429722905159, 0.11761949956417084, -0.0686158761382103, -0.24075445532798767, 0.09058404713869095, 0.0386543944478035, 0.005316436290740967, 0.09374922513961792, 0.11377763003110886, -0.03891528397798538, 0.06224219128489494, -0.01886739209294319, -0.07457423955202103, 0.013743330724537373, 0.004637845326215029, -0.10095703601837158, 0.07875913381576538, 0.04441635683178902, 0.06399630010128021, 0.05935051292181015, -0.01910143531858921, -0.19767697155475616, 0.024185996502637863, -0.014616673812270164, -0.0665985718369484, 0.04531598463654518, 0.054657842963933945, -0.02421688102185726, 0.12835273146629333, 0.047464191913604736, -0.04497377946972847, 0.0688280388712883, -0.08242414891719818, -0.10874903202056885, -0.09509193897247314, 0.08033432066440582, 0.06876575946807861, 0.05750083923339844, 0.014892791397869587, 0.11376810818910599, -0.09492882341146469, 0.05829180032014847, 0.09786463528871536, -0.3445099890232086, -0.0024179767351597548, 0.04181130602955818, 0.058750398457050323, 0.03254202380776405, -0.020017487928271294, 0.046662185341119766, 0.06359859555959702, 0.03271108493208885, -0.007011004723608494, -0.06454012542963028, -0.13983072340488434, 0.031392838805913925, -0.0375542975962162, -0.055083077400922775, 0.2795774042606354, -0.01818937622010708, 0.0492519736289978, -0.03039020672440529, -0.08293955028057098, -0.0008557162364013493, -0.04777277261018753, 0.06526334583759308, -0.005250237882137299, 0.1019728034734726, 0.06996248662471771, -0.026128018274903297, -0.13300956785678864, 0.019412066787481308, -0.1926204413175583, 0.06729467958211899, 0.02832097001373768, 0.06440375000238419, -0.15489375591278076, 0.06287933886051178, 0.07946198433637619, -0.140621617436409, 0.017751017585396767, -0.07481461018323898, 0.09628383070230484, 0.04387955740094185, -0.0649716779589653, 0.03509249538183212, 0.14348414540290833, 0.20958487689495087, 0.03238973021507263, 0.04523094743490219, -0.04251605272293091, 0.12043869495391846, -0.008658022619783878, 0.06321308016777039, 0.019640717655420303, -0.08920376002788544, 0.09021943062543869, -0.1006273701786995, 0.0962204858660698, -0.039242446422576904, -0.14811328053474426, -0.00469821784645319, -0.0004528526624199003, 0.09877286851406097, 0.060927677899599075, 0.05193565413355827, -0.04968037083745003, 0.015289356000721455, 0.08595307171344757, -0.05796189233660698, 0.001791513292118907, 0.009917902760207653, 0.03402552753686905, 0.0989810898900032, 0.03534751757979393, 0.036770693957805634, -0.08401119709014893, 0.08215142786502838, -0.052649252116680145, -0.01714039035141468, -0.03893836960196495, -0.009832041338086128, 0.07785974442958832, -0.06270406395196915, 0.04479178413748741, -0.14309583604335785, -0.15703241527080536, 0.020589584484696388, 0.053150806576013565, -0.014484778046607971, -0.07817638665437698, 0.012051594443619251, -0.014285555109381676, 0.03852049633860588, -0.09106123447418213, 0.0050948793068528175, -0.08701479434967041, 0.067646823823452, -0.06698282063007355, 0.04336929693818092, -0.20427867770195007, 0.06067601591348648, -0.12374912947416306, -0.00214839237742126, -0.04806305095553398, -0.004234221763908863, -0.07800848037004471, 0.20671173930168152, -0.014734753407537937, -0.028024328872561455, -0.0508195161819458, 0.011774428188800812, 
-0.021816611289978027, 0.13512717187404633, -0.13527926802635193, -0.062090761959552765, 0.21618090569972992, -0.1290607899427414, -0.18888764083385468, 0.09519954770803452, 0.02815408445894718, 0.029998840764164925, 0.06354206800460815, 0.19745104014873505, 0.05918384715914726, -0.006548587698489428, 0.065024234354496, 0.14411529898643494, -0.040496956557035446, -0.13651026785373688, 0.026172656565904617, -0.023997705429792404, -0.10357008129358292, 0.07968580722808838, 0.05106288194656372, 0.10105089098215103, -0.01589171029627323, -0.0728413537144661, -0.053163670003414154, -0.029447974637150764, -0.019387781620025635, -0.0036139136645942926, 0.07862724363803864, -0.07201717048883438, 0.005257106851786375, -0.07641023397445679, 0.04294158145785332, 0.03261997550725937, 0.06205906346440315, -0.04664396867156029, 0.10146033018827438, 0.05781075730919838, 0.05182764679193497, -0.11202497780323029, 0.013690195977687836, -0.014303714036941528, 0.037479791790246964, -0.003063549753278494, 0.08256827294826508, 0.013329686596989632, -0.04530830308794975, -0.00956654455512762, -0.0016437673475593328, 0.1373220682144165, 0.03494308888912201, -0.02662176825106144, -0.11827075481414795, 0.038042161613702774, -0.02544964849948883, 0.07443337142467499, -0.00883529707789421, 0.02265392430126667, 0.041148070245981216, 0.13984116911888123, -0.049493033438920975, 0.09061897546052933, -0.021795185282826424, 0.021380309015512466, -0.08316447585821152, 0.005865962244570255, 0.11629040539264679, 0.05453335866332054, -0.10642588883638382, 0.20181907713413239, -0.09225921332836151, 0.22848555445671082, 0.2229742556810379, -0.182373046875, 0.08503415435552597, -0.005380003713071346, -0.03429025784134865, -0.016996493563055992, 0.04603155329823494, 0.0044000050984323025, 0.007652831729501486, 0.007689979858696461, 0.17302857339382172, -0.045825012028217316, -0.029477179050445557, -0.026488328352570534, -0.04174968972802162, -0.004768709186464548, 0.04486777260899544, 0.14702299237251282, -0.17151780426502228, 0.1741742342710495, 0.27235302329063416, 0.019832121208310127, 0.12537862360477448, -0.06917447596788406, 0.0047770654782652855, 0.06868996471166611, -0.019924219697713852, -0.04796747490763664, 0.023314638063311577, -0.1506577581167221, 0.008043097332119942, 0.09302730113267899, 0.03401925414800644, 0.08947886526584625, -0.12727682292461395, -0.032904017716646194, -0.03726470470428467, -0.0032638609409332275, -0.061008501797914505, 0.06745333969593048, 0.011419885791838169, 0.10570176690816879, -0.04347348213195801, -0.06105359271168709, 0.08882124722003937, 0.0008029557066038251, -0.08134201914072037, 0.12640410661697388, -0.17460407316684723, -0.28479310870170593, -0.16217823326587677, -0.08233023434877396, -0.024930953979492188, 0.0034249236341565847, 0.15353091061115265, -0.02151927351951599, -0.05609731003642082, 0.00046756863594055176, -0.03172476217150688, -0.01805698312819004, -0.00582085782662034, 0.01636289618909359, 0.06293001025915146, 0.0027623942587524652, -0.11428564786911011, -0.049904655665159225, 0.012886082753539085, 0.0060647171922028065, 0.12038151174783707, -0.09920699149370193, 0.09993861615657806, 0.09292452782392502, 0.04270691052079201, 0.029958153143525124, -0.028704354539513588, 0.12718811631202698, -0.039491117000579834, 0.006761827506124973, 0.1972167044878006, -0.03807378187775612, 0.07163243740797043, 0.13941296935081482, 0.011479916051030159, -0.06886782497167587, 0.01780710369348526, -0.07020539790391922, -0.06823379546403885, -0.21168257296085358, -0.13525483012199402, 
-0.14237405359745026, 0.08139976114034653, 0.03737109154462814, 0.06879844516515732, 0.040385399013757706, 0.06886129081249237, -0.052476949989795685, 0.0312077347189188, -0.005128837656229734, 0.07469839602708817, 0.2618454694747925, -0.04183804988861084, 0.11835148185491562, -0.11583887040615082, -0.06721889972686768, 0.12330127507448196, 0.0704972967505455, 0.12697479128837585, 0.10214177519083023, 0.06801602989435196, 0.0787055641412735, 0.15301811695098877, 0.09184373915195465, 0.1199888214468956, 0.043259646743535995, -0.0015929309884086251, -0.06059034913778305, -0.026906996965408325, -0.0731167271733284, 0.0493352897465229, -0.04092789441347122, -0.1322110891342163, -0.039466407150030136, -0.0930243730545044, 0.10141311585903168, 0.1649523228406906, 0.04226130247116089, -0.13680866360664368, 0.011476203799247742, 0.10832834243774414, -0.01709054596722126, -0.06770434230566025, 0.1118093803524971, -0.0415651760995388, -0.10009574890136719, 0.1447945535182953, -0.0021850396879017353, 0.15244324505329132, 0.015029422007501125, 0.05993237718939781, -0.062349967658519745, -0.11136962473392487, 0.08071429282426834, 0.13240143656730652, -0.2872105538845062, 0.19047172367572784, -0.032716695219278336, -0.03748749941587448, -0.07593827694654465, 0.005907368380576372, 0.05740375444293022, 0.1792834848165512, 0.06270583719015121, 0.018468890339136124, -0.14566567540168762, -0.023433370515704155, -0.057581108063459396, 0.0681442841887474, 0.04528699442744255, 0.001732130185700953, -0.03423627093434334, -0.07373934984207153, -0.0020484209526330233, -0.016315797343850136, 0.03421356901526451, -0.0382486991584301, -0.1814577430486679, 0.07072315365076065, 0.11244300752878189, 0.0588550791144371, -0.0616103932261467, -0.009969946928322315, -0.08109187334775925, 0.17618420720100403, -0.0541972778737545, -0.07949303090572357, -0.10310330241918564, -0.09937329590320587, 0.05930984020233154, -0.04543091729283333, 0.033061910420656204, -0.07589322328567505, 0.001993467565625906, -0.07052643597126007, -0.20601530373096466, 0.09230482578277588, -0.13473115861415863, -0.043732650578022, -0.04451385885477066, 0.1579730212688446, -0.11745425313711166, 0.02164432778954506, 0.030333327129483223, -0.0018000565469264984, -0.11971430480480194, -0.1329612135887146, -0.05768745020031929, 0.020118730142712593, 0.05331016704440117, -0.012276953086256981, -0.11351658403873444, -0.02484804019331932, 0.017717478796839714, -0.0722678080201149, 0.22813193500041962, 0.16573791205883026, -0.07811696082353592, 0.1972920447587967, 0.16176557540893555, -0.08298609405755997, -0.3242345154285431, -0.1988833248615265, -0.15335775911808014, -0.059509459882974625, -0.012056571431457996, -0.1299349069595337, 0.10364850610494614, 0.004883048590272665, -0.0688013881444931, 0.10691330581903458, -0.19583365321159363, -0.078946053981781, 0.18717439472675323, -0.05349648371338844, 0.31876006722450256, -0.14121605455875397, -0.08739297091960907, -0.10223046690225601, -0.21377742290496826, 0.10322869569063187, -0.0849073976278305, 0.06545615196228027, -0.019960785284638405, 0.05797940865159035, -0.010429051704704762, -0.047499626874923706, 0.12960483133792877, -0.010348222218453884, -0.014282082207500935, -0.11260470002889633, 0.03852298483252525, 0.08378816395998001, -0.006223613861948252, 0.04447924345731735, -0.18215502798557281, 0.03744278475642204, -0.10625399649143219, -0.00850141141563654, -0.048494670540094376, 0.06861825287342072, -0.008481637574732304, -0.02318590320646763, -0.03146900236606598, -0.03875196352601051, 
0.03236065432429314, 0.010957633145153522, 0.21556290984153748, 0.017902227118611336, 0.09988299757242203, 0.11608009785413742, 0.06932543218135834, -0.24493062496185303, 0.07320941984653473, -0.10018078237771988, -0.0763905718922615, 0.05633970722556114, -0.09056801348924637, 0.05900518223643303, 0.12160839885473251, -0.06210573762655258, 0.05370432510972023, 0.07837021350860596, 0.02014162205159664, -0.028634008020162582, 0.1597369909286499, -0.19694454967975616, -0.015234480611979961, -0.03885170817375183, 0.08334523439407349, 0.06791578233242035, 0.04468420892953873, 0.13511188328266144, 0.0161084346473217, -0.03435155004262924, 0.015359065495431423, 0.04576031118631363, -0.06843013316392899, 0.04358139634132385, 0.0060377889312803745, 0.009301768615841866, -0.1463301032781601, 0.09488291293382645, 0.009938215836882591, -0.13413651287555695, -0.026878919452428818, 0.1384759247303009, -0.1632564812898636, -0.12557975947856903, -0.035188622772693634, 0.08691705763339996, -0.11131642758846283, -0.11088436841964722, -0.06445396691560745, -0.17242558300495148, 0.06343519687652588, 0.07174990326166153, 0.101102314889431, 0.07712004333734512, -0.01623843051493168, -0.054811496287584305, 0.038241274654865265, -0.004338306840509176, -0.017475800588726997, 0.028790447860956192, -0.08173917233943939, -0.017054880037903786, 0.013982708565890789, 0.10681317746639252, -0.056582529097795486, -0.013775707222521305, -0.1000395193696022, 0.019405808299779892, -0.20654524862766266, -0.014987640082836151, -0.11138198524713516, -0.024024801328778267, 0.007070464082062244, -0.0766787901520729, -0.0631231740117073, -0.013890625908970833, -0.10210329294204712, -0.027906527742743492, -0.06019094958901405, 0.06444083154201508, -0.09079153835773468, -0.014137363992631435, 0.10135616362094879, -0.03998007997870445, 0.1036924496293068, 0.10453064739704132, -0.07108422368764877, 0.09410328418016434, -0.1200857013463974, -0.13402555882930756, 0.09525029361248016, 0.057681553065776825, 0.022385234013199806, 0.005305006168782711, -0.009083517827093601, 0.13371725380420685, -0.003367204684764147, 0.030109642073512077, 0.03494397923350334, -0.1298186182975769, -0.061045531183481216, -0.041192926466464996, -0.11841069906949997, -0.015956010669469833, -0.10164637863636017, 0.09692510217428207, 0.030949482694268227, 0.14754660427570343, -0.03365393728017807, 0.04279015585780144, -0.06464646756649017, 0.03303400054574013, -0.03441397100687027, -0.1591556817293167, -0.11776424944400787, -0.08647211641073227, -0.021450038999319077, 0.005311750341206789, 0.2704131007194519, -0.004793477710336447, -0.03641544654965401, 0.0669018104672432, 0.08313320577144623, -0.029920484870672226, 0.00987919420003891, 0.25962430238723755, 0.06879810988903046, -0.010221549309790134, -0.12525779008865356, 0.010047639720141888, 0.012616468593478203, -0.08963532000780106, 0.10787323862314224, 0.09320555627346039, 0.038514334708452225, 0.0837748646736145, 0.008057801052927971, 0.006657994352281094, -0.12352705001831055, -0.08765412867069244, 0.0006755553185939789, 0.08513709157705307, 0.002282294211909175, 0.12471475452184677, 0.13926562666893005, -0.0546247623860836, 0.019292229786515236, -0.06735231727361679, -0.005191338248550892, -0.14633731544017792, -0.09925170987844467, -0.07716499269008636, -0.13808217644691467, -0.004118069540709257, -0.05876297503709793, 0.02226887457072735, 0.06254894286394119, 0.03378378227353096, -0.0730864480137825, 0.016053007915616035, -0.013075723312795162, -0.0748930424451828, 0.03280244767665863, 
-0.016277670860290527, -0.003398780943825841, -0.024585548788309097, -0.049754612147808075, -0.08814110606908798, -0.005314778070896864, -0.03468628600239754, 0.05477277562022209, -0.012853099033236504, 0.03874988481402397, -0.10887712240219116, -0.05911590903997421, -0.0488186851143837, 0.019239144399762154, 0.016337549313902855, 0.1369474232196808, 0.022066960111260414, 0.027017410844564438, 0.08897440135478973, 0.189834862947464, -0.07568466663360596, -0.16569937765598297, -0.057385075837373734, 0.1563788801431656, 0.04028889909386635, 0.05061136558651924, 0.01302589662373066, 0.02476346679031849, -0.07612861692905426, 0.34172576665878296, 0.30292633175849915, -0.0842638686299324, 0.013408830389380455, -0.013821525499224663, 0.03196385130286217, 0.07280732691287994, 0.11747576296329498, 0.1419629603624344, 0.22894705832004547, -0.0633167251944542, -0.0342041477560997, -0.06164666637778282, -0.011379384435713291, -0.16521087288856506, 0.10886316746473312, -0.019189288839697838, -0.08866600692272186, -0.014199310913681984, 0.0907660648226738, -0.12409275025129318, 0.07481842488050461, -0.059848852455616, -0.15542776882648468, -0.03254380077123642, 0.0001943191309692338, 0.21452878415584564, 0.021070972084999084, 0.02321447990834713, -0.032849907875061035, -0.04723024368286133, 0.1415794938802719, -0.013134322129189968, -0.18438193202018738, -0.014645610935986042, 0.07338922470808029, -0.12581893801689148, 0.07409758120775223, -0.01630956493318081, 0.01994241587817669, 0.08428996056318283, 0.09921777993440628, -0.0683901384472847, 0.07933018356561661, 0.017007190734148026, -0.0349293127655983, 0.011066017672419548, -0.06293949484825134, -0.019886817783117294, -0.05565422773361206, 0.06234407797455788, -0.09178829193115234, 0.04625855013728142, 0.032076720148324966, -0.042408403009176254, 0.0014522764831781387, 0.037800382822752, -0.07124499976634979, 0.07143672555685043, 0.01960405334830284, -0.04051028937101364, -0.03878381848335266, -0.0547565221786499, -0.016546303406357765, 0.005677100736647844, -0.1321941763162613, -0.04769081249833107, -0.051470790058374405, -0.07895500212907791, 0.07249294221401215, 0.03518173098564148, -0.1310557872056961, -0.004024401307106018, -0.11698086559772491, 0.04297050088644028, -0.16703630983829498, 0.07701706886291504, 0.06942970305681229, -0.023959310725331306, -0.005020285490900278, -0.11605126410722733, 0.028935354202985764, 0.031161418184638023, -0.07446826994419098, -0.08652307838201523 ]
null
null
transformers
# Hungarian GPT-2 news generator

For further models, scripts and details, see [our repository](https://github.com/nytud/neural-models) or [our demo site](https://juniper.nytud.hu/demo/nlp).

- Pretrained on Hungarian Wikipedia
- Finetuned on hin corpus (hvg.hu, index.hu, nol.hu)

## Results

| Model | Perplexity |
| ------------- | ------------- |
| GPT-2 poem | 47.46 |
| **GPT-2 news** | **22.06** |

## Citation

If you use this model, please cite the following paper:

```
@inproceedings {yang-gpt2,
    title = {{"Az invazív medvék nem tolerálják a suzukis agressziót" - Magyar GPT-2 kísérleti modell}},
    booktitle = {XVIII. Magyar Számítógépes Nyelvészeti Konferencia},
    year = {2022},
    publisher = {Szegedi Tudományegyetem, Informatikai Intézet},
    address = {Szeged, Magyarország},
    author = {Yang, Zijian Győző},
    pages = {463--476}
}
```
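The card does not include a usage snippet; below is a minimal sketch of loading this checkpoint with the `transformers` Auto classes. The model id and prompt come from this record's `id` and widget metadata; the decoding settings (`max_length`, `top_k`, `top_p`) are illustrative assumptions rather than values from the card.

```python
from transformers import AutoTokenizer, AutoModelForCausalLM

model_id = "NYTK/text-generation-news-gpt2-small-hungarian"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# Prompt taken from the widget example in this record's metadata
prompt = "Szeptember végén zárul a balatoni szezon"
inputs = tokenizer(prompt, return_tensors="pt")

# Sampling settings are illustrative assumptions, not prescribed by the card
output_ids = model.generate(**inputs, max_length=100, do_sample=True, top_k=50, top_p=0.95)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```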
{"language": ["hu"], "license": "mit", "tags": ["text-generation"], "widget": [{"text": "Szeptember v\u00e9g\u00e9n z\u00e1rul a balatoni szezon"}]}
text-generation
NYTK/text-generation-news-gpt2-small-hungarian
[ "transformers", "pytorch", "gpt2", "text-generation", "hu", "license:mit", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[ "hu" ]
TAGS #transformers #pytorch #gpt2 #text-generation #hu #license-mit #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
Hungarian GPT-2 news generator ============================== For further models, scripts and details, see our repository or our demo site. * Pretrained on Hungarian Wikipedia * Finetuned on hin corpus (URL, URL, URL) Results ------- If you use this model, please cite the following paper:
[]
[ "TAGS\n#transformers #pytorch #gpt2 #text-generation #hu #license-mit #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 54 ]
[ "passage: TAGS\n#transformers #pytorch #gpt2 #text-generation #hu #license-mit #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ -0.024446535855531693, 0.06442651897668839, -0.005283902399241924, 0.005421709269285202, 0.15189245343208313, 0.04103633388876915, 0.1590174287557602, 0.12009090930223465, 0.02282528020441532, -0.0362015962600708, 0.1452028602361679, 0.17255353927612305, 0.02968580275774002, 0.05267444625496864, -0.059397581964731216, -0.2648511528968811, 0.06013985350728035, 0.055165328085422516, 0.027937307953834534, 0.09990787506103516, 0.07869487255811691, -0.0467706173658371, 0.09483157843351364, -0.010166062973439693, -0.13541442155838013, -0.0009215997997671366, 0.030544904991984367, -0.10687468200922012, 0.12062094360589981, 0.051536839455366135, 0.07497936487197876, 0.052261196076869965, -0.007290958426892757, -0.16617117822170258, 0.01957644335925579, -0.007943658158183098, -0.07685668766498566, 0.06487102061510086, 0.0742487907409668, -0.027853649109601974, 0.13975831866264343, 0.10309870541095734, -0.057383809238672256, 0.07458312064409256, -0.14166532456874847, -0.08910268545150757, -0.08228657394647598, 0.0744413286447525, 0.039991091936826706, 0.07266135513782501, 0.004489591345191002, 0.11472947150468826, -0.08204830437898636, 0.05109870433807373, 0.13019758462905884, -0.36239200830459595, 0.028219548985362053, 0.08977200090885162, 0.07286117970943451, 0.043361105024814606, -0.049312274903059006, 0.07975959777832031, 0.058480486273765564, 0.02916320227086544, -0.01088144350796938, -0.07729782164096832, -0.07885213196277618, 0.06760762631893158, -0.0676661878824234, -0.06298419833183289, 0.24132965505123138, -0.053199201822280884, 0.04181264713406563, -0.022645732387900352, -0.0786643773317337, -0.0529494434595108, -0.013763604685664177, 0.04136775806546211, -0.03831791877746582, 0.09706863760948181, 0.06290393322706223, -0.06116681173443794, -0.13713759183883667, -0.028518225997686386, -0.17872998118400574, 0.12485569715499878, 0.02904171496629715, 0.05575539171695709, -0.1558239758014679, 0.11528731882572174, 0.0650847777724266, -0.09255215525627136, -0.002344757318496704, -0.07966698706150055, 0.09047751873731613, 0.0028985857497900724, -0.04140479490160942, 0.037594083696603775, 0.0901261642575264, 0.17127719521522522, -0.01925463229417801, 0.01340628881007433, -0.03309369832277298, 0.12480423599481583, 0.010683214291930199, 0.085645891726017, 0.003732475684955716, -0.006858707405626774, 0.06143209710717201, -0.13637258112430573, 0.03323112055659294, -0.056029610335826874, -0.19282719492912292, -0.03461216762661934, 0.020746605470776558, 0.08625409007072449, -0.0021457262337207794, 0.07621946930885315, -0.04444169998168945, 0.002717532915994525, 0.08717796951532364, -0.044873401522636414, 0.00219765049405396, 0.009420657530426979, 0.04040636122226715, 0.11303050071001053, 0.005569754168391228, 0.02201901562511921, -0.1050248071551323, 0.06383500248193741, -0.08940201252698898, -0.004893074743449688, -0.029474381357431412, -0.05170229822397232, 0.053500574082136154, -0.09550514817237854, 0.021534467115998268, -0.14984089136123657, -0.1671937108039856, 0.02053910866379738, 0.009411088190972805, -0.02797676809132099, -0.047701701521873474, -0.00040723884012550116, -0.023715317249298096, 0.023991303518414497, -0.08017874509096146, -0.04301779344677925, -0.0724930614233017, 0.1126035824418068, -0.05999046936631203, 0.04129078611731529, -0.18048147857189178, 0.07097332179546356, -0.11662252992391586, 0.007652892265468836, -0.06420972943305969, 0.025053562596440315, -0.034609220921993256, 0.15915444493293762, -0.0016363394679501653, -0.053298309445381165, -0.030820703133940697, 
0.04428395256400108, -0.04315691068768501, 0.1601988524198532, -0.07142764329910278, -0.11052264273166656, 0.20445671677589417, -0.11022500693798065, -0.1639515459537506, 0.10048230737447739, 0.017233099788427353, 0.046849723905324936, 0.08650518953800201, 0.21436560153961182, 0.04108290746808052, -0.04139857739210129, 0.060355689376592636, 0.11370433866977692, -0.08151277899742126, -0.13770471513271332, 0.02530621737241745, -0.0374416746199131, -0.11890577524900436, 0.04865839332342148, 0.023252733051776886, 0.09056369960308075, -0.03743365779519081, -0.04593479260802269, -0.028416406363248825, 0.002795846899971366, 0.008359460160136223, 0.00632643885910511, 0.115205317735672, -0.05499560013413429, -0.012998374179005623, -0.0042811911553144455, -0.003246546722948551, -0.004296299070119858, 0.047338295727968216, -0.037186551839113235, 0.147731214761734, 0.02576482482254505, 0.042227063328027725, -0.1009068638086319, -0.03753373771905899, -0.00541055528447032, 0.07240055501461029, 0.02174565941095352, 0.09664860367774963, 0.019362779334187508, 0.007334986701607704, -0.012333037331700325, 0.0020609404891729355, 0.12872414290905, 0.0027728660497814417, -0.036169108003377914, -0.08374753594398499, 0.03686651214957237, -0.022213472053408623, 0.01844276301562786, -0.06616470217704773, 0.01728067547082901, 0.05110599845647812, 0.10243768244981766, -0.040222231298685074, 0.05817754939198494, -0.04080598056316376, 0.027588170021772385, -0.08460060507059097, -0.004183882847428322, 0.11506252735853195, 0.0159800685942173, -0.07466421276330948, 0.20976728200912476, -0.1456717848777771, 0.2178497314453125, 0.20975033938884735, -0.2466924488544464, 0.019782500341534615, -0.08332116156816483, -0.034346774220466614, 0.010166656225919724, 0.05301915481686592, -0.02690899930894375, 0.0962296798825264, -0.01359652541577816, 0.17596982419490814, -0.06034132465720177, -0.04245733097195625, -0.013943384401500225, -0.048455070704221725, -0.007327884901314974, 0.060019783675670624, 0.19667977094650269, -0.16150952875614166, 0.19361229240894318, 0.19388531148433685, 0.05092756077647209, 0.17213115096092224, -0.0361475870013237, -0.0072480193339288235, 0.054641805589199066, -0.012130544520914555, -0.03466791287064552, 0.007752159144729376, -0.15013165771961212, -0.012955992482602596, 0.07595668733119965, 0.03772520646452904, 0.1082787811756134, -0.16357405483722687, -0.06556404381990433, -0.050835367292165756, -0.02280302159488201, -0.0561077855527401, 0.08965763449668884, 0.01764780841767788, 0.10956703126430511, -0.028480472043156624, -0.015541122294962406, 0.10553079098463058, 0.023712221533060074, -0.10376100987195969, 0.16592484712600708, -0.14151129126548767, -0.28058555722236633, -0.19364097714424133, -0.14757555723190308, -0.022010819986462593, 0.0454644113779068, 0.12333601713180542, -0.0708475410938263, -0.0395173504948616, 0.009022719226777554, 0.060825612396001816, -0.08098970353603363, -0.013901540078222752, -0.04827691614627838, 0.043721605092287064, -0.07584574818611145, -0.09094106405973434, -0.07148627936840057, -0.009002693928778172, -0.045703284442424774, 0.14232787489891052, -0.09087596833705902, 0.0613100528717041, 0.15175233781337738, 0.040552977472543716, 0.04367755353450775, -0.053342293947935104, 0.18162685632705688, -0.0728849396109581, 0.0047697704285383224, 0.19319851696491241, -0.016062073409557343, 0.0797395408153534, 0.13064521551132202, 0.03221523016691208, -0.0828077644109726, -0.0033836057409644127, -0.05982210487127304, -0.08082478493452072, -0.22170674800872803, 
-0.12013155966997147, -0.1360362023115158, 0.07026194781064987, 0.048113711178302765, 0.07534574717283249, 0.11162036657333374, 0.07984919100999832, -0.04571521654725075, 0.05665626376867294, -0.0055448212660849094, 0.08708098530769348, 0.2784316837787628, -0.024074938148260117, 0.10440028458833694, -0.0806795135140419, -0.10054991394281387, 0.10526669770479202, 0.08823179453611374, 0.1543513983488083, 0.1049911305308342, 0.08350612223148346, 0.056615348905324936, 0.10715526342391968, 0.1357964128255844, 0.09179457277059555, 0.03654608502984047, -0.014724135398864746, -0.03677833080291748, -0.022476676851511, -0.019802941009402275, 0.05390111356973648, -0.01149095967411995, -0.17413784563541412, -0.045528993010520935, -0.1286485642194748, 0.06454116851091385, 0.08749070763587952, 0.06491164863109589, -0.19882890582084656, 0.010718254372477531, 0.09701590985059738, -0.004895768128335476, -0.10224407911300659, 0.09910045564174652, -0.025299232453107834, -0.13335692882537842, 0.08806368708610535, -0.026758888736367226, 0.12787169218063354, -0.03965023532509804, 0.07060170918703079, -0.023930616676807404, -0.06949058920145035, 0.034488581120967865, 0.1240018755197525, -0.28338509798049927, 0.22057338058948517, -0.014246128499507904, -0.04048345983028412, -0.07417265325784683, 0.0035282259341329336, 0.023177562281489372, 0.16032080352306366, 0.08116693794727325, 0.020816806703805923, -0.07006371766328812, -0.10583944618701935, -0.007792746182531118, 0.049254100769758224, 0.08401104807853699, -0.04200863838195801, -0.012769831344485283, -0.06288731843233109, 0.02033419907093048, -0.02756492793560028, -0.023636765778064728, -0.01809322088956833, -0.16160239279270172, 0.07787936925888062, 0.05223245546221733, 0.1178867295384407, -0.01626501977443695, -0.0217214897274971, -0.16014127433300018, 0.1750488132238388, -0.1077793687582016, -0.10157216340303421, -0.10585608333349228, -0.08707766979932785, 0.035963043570518494, -0.054577577859163284, 0.02513476088643074, -0.07401822507381439, -0.01528279110789299, -0.07684968411922455, -0.19428443908691406, 0.11471965163946152, -0.08545909821987152, -0.0644119381904602, -0.037618789821863174, 0.18264155089855194, -0.08172924071550369, 0.017856858670711517, 0.020340848714113235, 0.014808928593993187, -0.1084558442234993, -0.14088505506515503, 0.0107803326100111, -0.017870590090751648, 0.05620378628373146, -0.012320109643042088, -0.11556130647659302, 0.055993128567934036, -0.010881399735808372, -0.0519438274204731, 0.26426297426223755, 0.1979767233133316, -0.05268546938896179, 0.20718514919281006, 0.15277720987796783, -0.10520489513874054, -0.3226362466812134, -0.1391478180885315, -0.15682242810726166, -0.0487474761903286, -0.031994499266147614, -0.2014162391424179, 0.06625549495220184, 0.02575116790831089, -0.03733588010072708, 0.17788876593112946, -0.23563365638256073, -0.08452305942773819, 0.1635710597038269, -0.023801609873771667, 0.3402602970600128, -0.13858173787593842, -0.10903534293174744, -0.06314775347709656, -0.18811683356761932, 0.1335848867893219, -0.0036088114138692617, 0.1073698177933693, -0.040197741240262985, 0.10855346918106079, 0.0017266037175431848, -0.04917115718126297, 0.12190292030572891, 0.006406732369214296, 0.01216585747897625, -0.11797633767127991, -0.002715898910537362, 0.09382028132677078, 0.022716641426086426, 0.018303727731108665, -0.11863032728433609, 0.02158787101507187, -0.10778754949569702, -0.03566008433699608, -0.06566128879785538, 0.07865149527788162, 0.016420677304267883, -0.06968382000923157, -0.03754795342683792, 
-0.028622964397072792, -0.026134194806218147, -0.005152544472366571, 0.1838589310646057, -0.015139875002205372, 0.14358453452587128, 0.058257389813661575, 0.07447391748428345, -0.1889023780822754, 0.04200546443462372, -0.09747598320245743, -0.07338958978652954, 0.08674562722444534, -0.04970376938581467, 0.02020965702831745, 0.1372532695531845, -0.0433538593351841, 0.08820037543773651, 0.10609222203493118, 0.0014972540084272623, 0.0031424693297594786, 0.13609756529331207, -0.23313502967357635, -0.035710278898477554, -0.06387406587600708, -0.018106726929545403, 0.11611349135637283, 0.07669027149677277, 0.14995162189006805, 0.0008275830186903477, -0.033464595675468445, 0.019735658541321754, 0.021087991073727608, -0.05416246876120567, 0.03781668096780777, 0.010548217222094536, 0.008470321074128151, -0.15467771887779236, 0.07932977378368378, 0.015627630054950714, -0.06456098705530167, -0.0008942377171479166, 0.14071448147296906, -0.14781273901462555, -0.11724218726158142, -0.08085466921329498, 0.09021368622779846, -0.1691446602344513, -0.05805530399084091, -0.05536152422428131, -0.16835367679595947, 0.07036442309617996, 0.08925706148147583, 0.06654699891805649, 0.09770942479372025, -0.043893877416849136, -0.03316112607717514, -0.01412566751241684, -0.029100704938173294, -0.05910516530275345, 0.027339445427060127, -0.09919710457324982, 0.05321282520890236, 0.0031508002430200577, 0.11772789061069489, -0.07127951085567474, -0.05028389394283295, -0.15453168749809265, 0.017703447490930557, -0.16680297255516052, -0.05755423754453659, -0.1061897948384285, -0.035119496285915375, -0.006896600592881441, -0.041298285126686096, -0.06949413567781448, -0.03235795348882675, -0.12105913460254669, 0.0018677649786695838, -0.030675111338496208, 0.07056771963834763, -0.07854915410280228, 0.009825990535318851, 0.09353702515363693, -0.02696269005537033, 0.13759441673755646, 0.10291711986064911, -0.07376792281866074, 0.11211204528808594, -0.13574577867984772, -0.09049578011035919, 0.09984620660543442, 0.024601422250270844, 0.0200074203312397, 0.03813327103853226, 0.0140736298635602, 0.0842052474617958, -0.00204268516972661, 0.060289233922958374, -0.008211317472159863, -0.12060578167438507, -0.014923413284122944, -0.046583499759435654, -0.12093952298164368, -0.03871028497815132, -0.05750548467040062, 0.05078312009572983, 0.004832398146390915, 0.11502831429243088, -0.047292787581682205, 0.06918356567621231, -0.07192657887935638, 0.02355329878628254, 0.0017273377161473036, -0.1526312530040741, -0.10156121850013733, -0.10410602390766144, -0.00379602937027812, 0.026005271822214127, 0.3125312626361847, 0.020517075434327126, -0.0679057165980339, 0.036669813096523285, 0.13470028340816498, -0.009366889484226704, -0.0042932420037686825, 0.2756626009941101, 0.09990078210830688, -0.0357833132147789, -0.142798513174057, 0.053054794669151306, -0.0076651317067444324, -0.0724271759390831, 0.1527814269065857, 0.038776885718107224, -0.028265303000807762, 0.0613541454076767, 0.02463332563638687, -0.020244063809514046, -0.10826198011636734, -0.07701799273490906, 0.01647469960153103, 0.07198598235845566, -0.022806141525506973, 0.13653497397899628, 0.15936963260173798, -0.04437195882201195, 0.02629261463880539, -0.02879677526652813, -0.03379904106259346, -0.16568318009376526, -0.17105640470981598, -0.05596132576465607, -0.1578071266412735, 0.032237254083156586, -0.06640657037496567, 0.045383207499980927, 0.0637032687664032, 0.04862026497721672, -0.08113893121480942, 0.04375147446990013, 0.027203233912587166, -0.10487627983093262, 
0.019505251199007034, -0.023610932752490044, 0.04255659133195877, -0.06507734954357147, -0.044210031628608704, -0.06579286605119705, -0.027198977768421173, -0.011024805717170238, 0.047624632716178894, -0.01599903777241707, 0.019590677693486214, -0.14198802411556244, -0.07055068016052246, -0.04523460939526558, 0.058901771903038025, -0.017872847616672516, 0.12684571743011475, -0.00197883159853518, -0.0008266193326562643, 0.05877339467406273, 0.16617603600025177, -0.047262147068977356, -0.10129278153181076, -0.01829385571181774, 0.2295742928981781, 0.03662879392504692, 0.06899334490299225, 0.0007119099027477205, 0.015585449524223804, -0.054822761565446854, 0.35495829582214355, 0.3228115737438202, -0.08402803540229797, 0.004838899243623018, 0.006045275367796421, 0.03081054985523224, 0.12314484268426895, 0.15377536416053772, 0.08394250273704529, 0.25629183650016785, -0.06426944583654404, -0.06037350371479988, -0.05681953206658363, 0.0014238799922168255, -0.11415211111307144, 0.1209077313542366, 0.03245845437049866, -0.08393373340368271, -0.03646807372570038, 0.09389273077249527, -0.21150927245616913, 0.10761731117963791, -0.05378012731671333, -0.1363501250743866, -0.022183990105986595, 0.01559603214263916, 0.13962422311306, 0.00007521592488046736, 0.04705425351858139, -0.030282815918326378, -0.07627899944782257, 0.10846740007400513, 0.015908652916550636, -0.2360226809978485, 0.007994764484465122, 0.07219292968511581, -0.06302580237388611, 0.043624065816402435, -0.031042711809277534, 0.05396905168890953, 0.07590223848819733, 0.07798869162797928, -0.040760479867458344, 0.0777076855301857, 0.010845441371202469, -0.068447545170784, 0.002405361272394657, -0.033890530467033386, 0.024182306602597237, -0.11423572897911072, 0.054351408034563065, -0.08955278992652893, 0.04647817462682724, -0.024125095456838608, -0.05590018257498741, -0.008409409783780575, -0.0006762280827388167, -0.07073400169610977, 0.07158441096544266, 0.033804383128881454, -0.01287885569036007, -0.02460625022649765, -0.07829184085130692, -0.011827112175524235, 0.02333342842757702, -0.13333307206630707, -0.05124372988939285, -0.09754007309675217, -0.09669775515794754, 0.0958079993724823, 0.005240653175860643, -0.18258430063724518, 0.016731897369027138, -0.1165897399187088, 0.06435561925172806, -0.1856037676334381, 0.07899893820285797, 0.08063572645187378, 0.012318608351051807, -0.0041879527270793915, -0.060362428426742554, 0.026255134493112564, 0.05328713357448578, -0.0959041640162468, -0.07963146269321442 ]
null
null
transformers
# Hungarian GPT-2 poem generator in Petőfi style

For further models, scripts and details, see [our repository](https://github.com/nytud/neural-models) or [our demo site](https://juniper.nytud.hu/demo/nlp).

- Pretrained on Hungarian Wikipedia
- Finetuned on the collected poems of Sándor Petőfi (Petőfi Sándor összes költeményei)

## Results

| Model | Perplexity |
| ------------- | ------------- |
| **GPT-2 poem** | **47.46** |
| GPT-2 news | 22.06 |

## Citation

If you use this model, please cite the following paper:

```
@inproceedings {yang-gpt2,
    title = {{"Az invazív medvék nem tolerálják a suzukis agressziót" - Magyar GPT-2 kísérleti modell}},
    booktitle = {XVIII. Magyar Számítógépes Nyelvészeti Konferencia},
    year = {2022},
    publisher = {Szegedi Tudományegyetem, Informatikai Intézet},
    address = {Szeged, Magyarország},
    author = {Yang, Zijian Győző},
    pages = {463--476}
}
```
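As above, the poem card ships without a usage snippet; a minimal sketch using the `transformers` `pipeline` helper is shown here. The model id and prompt are taken from this record's `id` and widget metadata, while `max_length` and sampling are illustrative assumptions.

```python
from transformers import pipeline

# Model id from this record; generation settings are illustrative assumptions
generator = pipeline(
    "text-generation",
    model="NYTK/text-generation-poem-petofi-gpt2-small-hungarian",
)

# Prompt taken from the widget example in this record's metadata
result = generator("Szegeden, január végén,", max_length=80, do_sample=True)
print(result[0]["generated_text"])
```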
{"language": ["hu"], "license": "mit", "tags": ["text-generation"], "widget": [{"text": "Szegeden, janu\u00e1r v\u00e9g\u00e9n,"}]}
text-generation
NYTK/text-generation-poem-petofi-gpt2-small-hungarian
[ "transformers", "pytorch", "gpt2", "text-generation", "hu", "license:mit", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[ "hu" ]
TAGS #transformers #pytorch #gpt2 #text-generation #hu #license-mit #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
Hungarian GPT-2 poem generator in Petőfi style ============================================== For further models, scripts and details, see our repository or our demo site. * Pretrained on Hungarian Wikipedia * Finetuned on Petőfi Sándor összes költeményei Results ------- If you use this model, please cite the following paper:
[]
[ "TAGS\n#transformers #pytorch #gpt2 #text-generation #hu #license-mit #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 54 ]
[ "passage: TAGS\n#transformers #pytorch #gpt2 #text-generation #hu #license-mit #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ -0.024446535855531693, 0.06442651897668839, -0.005283902399241924, 0.005421709269285202, 0.15189245343208313, 0.04103633388876915, 0.1590174287557602, 0.12009090930223465, 0.02282528020441532, -0.0362015962600708, 0.1452028602361679, 0.17255353927612305, 0.02968580275774002, 0.05267444625496864, -0.059397581964731216, -0.2648511528968811, 0.06013985350728035, 0.055165328085422516, 0.027937307953834534, 0.09990787506103516, 0.07869487255811691, -0.0467706173658371, 0.09483157843351364, -0.010166062973439693, -0.13541442155838013, -0.0009215997997671366, 0.030544904991984367, -0.10687468200922012, 0.12062094360589981, 0.051536839455366135, 0.07497936487197876, 0.052261196076869965, -0.007290958426892757, -0.16617117822170258, 0.01957644335925579, -0.007943658158183098, -0.07685668766498566, 0.06487102061510086, 0.0742487907409668, -0.027853649109601974, 0.13975831866264343, 0.10309870541095734, -0.057383809238672256, 0.07458312064409256, -0.14166532456874847, -0.08910268545150757, -0.08228657394647598, 0.0744413286447525, 0.039991091936826706, 0.07266135513782501, 0.004489591345191002, 0.11472947150468826, -0.08204830437898636, 0.05109870433807373, 0.13019758462905884, -0.36239200830459595, 0.028219548985362053, 0.08977200090885162, 0.07286117970943451, 0.043361105024814606, -0.049312274903059006, 0.07975959777832031, 0.058480486273765564, 0.02916320227086544, -0.01088144350796938, -0.07729782164096832, -0.07885213196277618, 0.06760762631893158, -0.0676661878824234, -0.06298419833183289, 0.24132965505123138, -0.053199201822280884, 0.04181264713406563, -0.022645732387900352, -0.0786643773317337, -0.0529494434595108, -0.013763604685664177, 0.04136775806546211, -0.03831791877746582, 0.09706863760948181, 0.06290393322706223, -0.06116681173443794, -0.13713759183883667, -0.028518225997686386, -0.17872998118400574, 0.12485569715499878, 0.02904171496629715, 0.05575539171695709, -0.1558239758014679, 0.11528731882572174, 0.0650847777724266, -0.09255215525627136, -0.002344757318496704, -0.07966698706150055, 0.09047751873731613, 0.0028985857497900724, -0.04140479490160942, 0.037594083696603775, 0.0901261642575264, 0.17127719521522522, -0.01925463229417801, 0.01340628881007433, -0.03309369832277298, 0.12480423599481583, 0.010683214291930199, 0.085645891726017, 0.003732475684955716, -0.006858707405626774, 0.06143209710717201, -0.13637258112430573, 0.03323112055659294, -0.056029610335826874, -0.19282719492912292, -0.03461216762661934, 0.020746605470776558, 0.08625409007072449, -0.0021457262337207794, 0.07621946930885315, -0.04444169998168945, 0.002717532915994525, 0.08717796951532364, -0.044873401522636414, 0.00219765049405396, 0.009420657530426979, 0.04040636122226715, 0.11303050071001053, 0.005569754168391228, 0.02201901562511921, -0.1050248071551323, 0.06383500248193741, -0.08940201252698898, -0.004893074743449688, -0.029474381357431412, -0.05170229822397232, 0.053500574082136154, -0.09550514817237854, 0.021534467115998268, -0.14984089136123657, -0.1671937108039856, 0.02053910866379738, 0.009411088190972805, -0.02797676809132099, -0.047701701521873474, -0.00040723884012550116, -0.023715317249298096, 0.023991303518414497, -0.08017874509096146, -0.04301779344677925, -0.0724930614233017, 0.1126035824418068, -0.05999046936631203, 0.04129078611731529, -0.18048147857189178, 0.07097332179546356, -0.11662252992391586, 0.007652892265468836, -0.06420972943305969, 0.025053562596440315, -0.034609220921993256, 0.15915444493293762, -0.0016363394679501653, -0.053298309445381165, -0.030820703133940697, 
0.04428395256400108, -0.04315691068768501, 0.1601988524198532, -0.07142764329910278, -0.11052264273166656, 0.20445671677589417, -0.11022500693798065, -0.1639515459537506, 0.10048230737447739, 0.017233099788427353, 0.046849723905324936, 0.08650518953800201, 0.21436560153961182, 0.04108290746808052, -0.04139857739210129, 0.060355689376592636, 0.11370433866977692, -0.08151277899742126, -0.13770471513271332, 0.02530621737241745, -0.0374416746199131, -0.11890577524900436, 0.04865839332342148, 0.023252733051776886, 0.09056369960308075, -0.03743365779519081, -0.04593479260802269, -0.028416406363248825, 0.002795846899971366, 0.008359460160136223, 0.00632643885910511, 0.115205317735672, -0.05499560013413429, -0.012998374179005623, -0.0042811911553144455, -0.003246546722948551, -0.004296299070119858, 0.047338295727968216, -0.037186551839113235, 0.147731214761734, 0.02576482482254505, 0.042227063328027725, -0.1009068638086319, -0.03753373771905899, -0.00541055528447032, 0.07240055501461029, 0.02174565941095352, 0.09664860367774963, 0.019362779334187508, 0.007334986701607704, -0.012333037331700325, 0.0020609404891729355, 0.12872414290905, 0.0027728660497814417, -0.036169108003377914, -0.08374753594398499, 0.03686651214957237, -0.022213472053408623, 0.01844276301562786, -0.06616470217704773, 0.01728067547082901, 0.05110599845647812, 0.10243768244981766, -0.040222231298685074, 0.05817754939198494, -0.04080598056316376, 0.027588170021772385, -0.08460060507059097, -0.004183882847428322, 0.11506252735853195, 0.0159800685942173, -0.07466421276330948, 0.20976728200912476, -0.1456717848777771, 0.2178497314453125, 0.20975033938884735, -0.2466924488544464, 0.019782500341534615, -0.08332116156816483, -0.034346774220466614, 0.010166656225919724, 0.05301915481686592, -0.02690899930894375, 0.0962296798825264, -0.01359652541577816, 0.17596982419490814, -0.06034132465720177, -0.04245733097195625, -0.013943384401500225, -0.048455070704221725, -0.007327884901314974, 0.060019783675670624, 0.19667977094650269, -0.16150952875614166, 0.19361229240894318, 0.19388531148433685, 0.05092756077647209, 0.17213115096092224, -0.0361475870013237, -0.0072480193339288235, 0.054641805589199066, -0.012130544520914555, -0.03466791287064552, 0.007752159144729376, -0.15013165771961212, -0.012955992482602596, 0.07595668733119965, 0.03772520646452904, 0.1082787811756134, -0.16357405483722687, -0.06556404381990433, -0.050835367292165756, -0.02280302159488201, -0.0561077855527401, 0.08965763449668884, 0.01764780841767788, 0.10956703126430511, -0.028480472043156624, -0.015541122294962406, 0.10553079098463058, 0.023712221533060074, -0.10376100987195969, 0.16592484712600708, -0.14151129126548767, -0.28058555722236633, -0.19364097714424133, -0.14757555723190308, -0.022010819986462593, 0.0454644113779068, 0.12333601713180542, -0.0708475410938263, -0.0395173504948616, 0.009022719226777554, 0.060825612396001816, -0.08098970353603363, -0.013901540078222752, -0.04827691614627838, 0.043721605092287064, -0.07584574818611145, -0.09094106405973434, -0.07148627936840057, -0.009002693928778172, -0.045703284442424774, 0.14232787489891052, -0.09087596833705902, 0.0613100528717041, 0.15175233781337738, 0.040552977472543716, 0.04367755353450775, -0.053342293947935104, 0.18162685632705688, -0.0728849396109581, 0.0047697704285383224, 0.19319851696491241, -0.016062073409557343, 0.0797395408153534, 0.13064521551132202, 0.03221523016691208, -0.0828077644109726, -0.0033836057409644127, -0.05982210487127304, -0.08082478493452072, -0.22170674800872803, 
-0.12013155966997147, -0.1360362023115158, 0.07026194781064987, 0.048113711178302765, 0.07534574717283249, 0.11162036657333374, 0.07984919100999832, -0.04571521654725075, 0.05665626376867294, -0.0055448212660849094, 0.08708098530769348, 0.2784316837787628, -0.024074938148260117, 0.10440028458833694, -0.0806795135140419, -0.10054991394281387, 0.10526669770479202, 0.08823179453611374, 0.1543513983488083, 0.1049911305308342, 0.08350612223148346, 0.056615348905324936, 0.10715526342391968, 0.1357964128255844, 0.09179457277059555, 0.03654608502984047, -0.014724135398864746, -0.03677833080291748, -0.022476676851511, -0.019802941009402275, 0.05390111356973648, -0.01149095967411995, -0.17413784563541412, -0.045528993010520935, -0.1286485642194748, 0.06454116851091385, 0.08749070763587952, 0.06491164863109589, -0.19882890582084656, 0.010718254372477531, 0.09701590985059738, -0.004895768128335476, -0.10224407911300659, 0.09910045564174652, -0.025299232453107834, -0.13335692882537842, 0.08806368708610535, -0.026758888736367226, 0.12787169218063354, -0.03965023532509804, 0.07060170918703079, -0.023930616676807404, -0.06949058920145035, 0.034488581120967865, 0.1240018755197525, -0.28338509798049927, 0.22057338058948517, -0.014246128499507904, -0.04048345983028412, -0.07417265325784683, 0.0035282259341329336, 0.023177562281489372, 0.16032080352306366, 0.08116693794727325, 0.020816806703805923, -0.07006371766328812, -0.10583944618701935, -0.007792746182531118, 0.049254100769758224, 0.08401104807853699, -0.04200863838195801, -0.012769831344485283, -0.06288731843233109, 0.02033419907093048, -0.02756492793560028, -0.023636765778064728, -0.01809322088956833, -0.16160239279270172, 0.07787936925888062, 0.05223245546221733, 0.1178867295384407, -0.01626501977443695, -0.0217214897274971, -0.16014127433300018, 0.1750488132238388, -0.1077793687582016, -0.10157216340303421, -0.10585608333349228, -0.08707766979932785, 0.035963043570518494, -0.054577577859163284, 0.02513476088643074, -0.07401822507381439, -0.01528279110789299, -0.07684968411922455, -0.19428443908691406, 0.11471965163946152, -0.08545909821987152, -0.0644119381904602, -0.037618789821863174, 0.18264155089855194, -0.08172924071550369, 0.017856858670711517, 0.020340848714113235, 0.014808928593993187, -0.1084558442234993, -0.14088505506515503, 0.0107803326100111, -0.017870590090751648, 0.05620378628373146, -0.012320109643042088, -0.11556130647659302, 0.055993128567934036, -0.010881399735808372, -0.0519438274204731, 0.26426297426223755, 0.1979767233133316, -0.05268546938896179, 0.20718514919281006, 0.15277720987796783, -0.10520489513874054, -0.3226362466812134, -0.1391478180885315, -0.15682242810726166, -0.0487474761903286, -0.031994499266147614, -0.2014162391424179, 0.06625549495220184, 0.02575116790831089, -0.03733588010072708, 0.17788876593112946, -0.23563365638256073, -0.08452305942773819, 0.1635710597038269, -0.023801609873771667, 0.3402602970600128, -0.13858173787593842, -0.10903534293174744, -0.06314775347709656, -0.18811683356761932, 0.1335848867893219, -0.0036088114138692617, 0.1073698177933693, -0.040197741240262985, 0.10855346918106079, 0.0017266037175431848, -0.04917115718126297, 0.12190292030572891, 0.006406732369214296, 0.01216585747897625, -0.11797633767127991, -0.002715898910537362, 0.09382028132677078, 0.022716641426086426, 0.018303727731108665, -0.11863032728433609, 0.02158787101507187, -0.10778754949569702, -0.03566008433699608, -0.06566128879785538, 0.07865149527788162, 0.016420677304267883, -0.06968382000923157, -0.03754795342683792, 
-0.028622964397072792, -0.026134194806218147, -0.005152544472366571, 0.1838589310646057, -0.015139875002205372, 0.14358453452587128, 0.058257389813661575, 0.07447391748428345, -0.1889023780822754, 0.04200546443462372, -0.09747598320245743, -0.07338958978652954, 0.08674562722444534, -0.04970376938581467, 0.02020965702831745, 0.1372532695531845, -0.0433538593351841, 0.08820037543773651, 0.10609222203493118, 0.0014972540084272623, 0.0031424693297594786, 0.13609756529331207, -0.23313502967357635, -0.035710278898477554, -0.06387406587600708, -0.018106726929545403, 0.11611349135637283, 0.07669027149677277, 0.14995162189006805, 0.0008275830186903477, -0.033464595675468445, 0.019735658541321754, 0.021087991073727608, -0.05416246876120567, 0.03781668096780777, 0.010548217222094536, 0.008470321074128151, -0.15467771887779236, 0.07932977378368378, 0.015627630054950714, -0.06456098705530167, -0.0008942377171479166, 0.14071448147296906, -0.14781273901462555, -0.11724218726158142, -0.08085466921329498, 0.09021368622779846, -0.1691446602344513, -0.05805530399084091, -0.05536152422428131, -0.16835367679595947, 0.07036442309617996, 0.08925706148147583, 0.06654699891805649, 0.09770942479372025, -0.043893877416849136, -0.03316112607717514, -0.01412566751241684, -0.029100704938173294, -0.05910516530275345, 0.027339445427060127, -0.09919710457324982, 0.05321282520890236, 0.0031508002430200577, 0.11772789061069489, -0.07127951085567474, -0.05028389394283295, -0.15453168749809265, 0.017703447490930557, -0.16680297255516052, -0.05755423754453659, -0.1061897948384285, -0.035119496285915375, -0.006896600592881441, -0.041298285126686096, -0.06949413567781448, -0.03235795348882675, -0.12105913460254669, 0.0018677649786695838, -0.030675111338496208, 0.07056771963834763, -0.07854915410280228, 0.009825990535318851, 0.09353702515363693, -0.02696269005537033, 0.13759441673755646, 0.10291711986064911, -0.07376792281866074, 0.11211204528808594, -0.13574577867984772, -0.09049578011035919, 0.09984620660543442, 0.024601422250270844, 0.0200074203312397, 0.03813327103853226, 0.0140736298635602, 0.0842052474617958, -0.00204268516972661, 0.060289233922958374, -0.008211317472159863, -0.12060578167438507, -0.014923413284122944, -0.046583499759435654, -0.12093952298164368, -0.03871028497815132, -0.05750548467040062, 0.05078312009572983, 0.004832398146390915, 0.11502831429243088, -0.047292787581682205, 0.06918356567621231, -0.07192657887935638, 0.02355329878628254, 0.0017273377161473036, -0.1526312530040741, -0.10156121850013733, -0.10410602390766144, -0.00379602937027812, 0.026005271822214127, 0.3125312626361847, 0.020517075434327126, -0.0679057165980339, 0.036669813096523285, 0.13470028340816498, -0.009366889484226704, -0.0042932420037686825, 0.2756626009941101, 0.09990078210830688, -0.0357833132147789, -0.142798513174057, 0.053054794669151306, -0.0076651317067444324, -0.0724271759390831, 0.1527814269065857, 0.038776885718107224, -0.028265303000807762, 0.0613541454076767, 0.02463332563638687, -0.020244063809514046, -0.10826198011636734, -0.07701799273490906, 0.01647469960153103, 0.07198598235845566, -0.022806141525506973, 0.13653497397899628, 0.15936963260173798, -0.04437195882201195, 0.02629261463880539, -0.02879677526652813, -0.03379904106259346, -0.16568318009376526, -0.17105640470981598, -0.05596132576465607, -0.1578071266412735, 0.032237254083156586, -0.06640657037496567, 0.045383207499980927, 0.0637032687664032, 0.04862026497721672, -0.08113893121480942, 0.04375147446990013, 0.027203233912587166, -0.10487627983093262, 
0.019505251199007034, -0.023610932752490044, 0.04255659133195877, -0.06507734954357147, -0.044210031628608704, -0.06579286605119705, -0.027198977768421173, -0.011024805717170238, 0.047624632716178894, -0.01599903777241707, 0.019590677693486214, -0.14198802411556244, -0.07055068016052246, -0.04523460939526558, 0.058901771903038025, -0.017872847616672516, 0.12684571743011475, -0.00197883159853518, -0.0008266193326562643, 0.05877339467406273, 0.16617603600025177, -0.047262147068977356, -0.10129278153181076, -0.01829385571181774, 0.2295742928981781, 0.03662879392504692, 0.06899334490299225, 0.0007119099027477205, 0.015585449524223804, -0.054822761565446854, 0.35495829582214355, 0.3228115737438202, -0.08402803540229797, 0.004838899243623018, 0.006045275367796421, 0.03081054985523224, 0.12314484268426895, 0.15377536416053772, 0.08394250273704529, 0.25629183650016785, -0.06426944583654404, -0.06037350371479988, -0.05681953206658363, 0.0014238799922168255, -0.11415211111307144, 0.1209077313542366, 0.03245845437049866, -0.08393373340368271, -0.03646807372570038, 0.09389273077249527, -0.21150927245616913, 0.10761731117963791, -0.05378012731671333, -0.1363501250743866, -0.022183990105986595, 0.01559603214263916, 0.13962422311306, 0.00007521592488046736, 0.04705425351858139, -0.030282815918326378, -0.07627899944782257, 0.10846740007400513, 0.015908652916550636, -0.2360226809978485, 0.007994764484465122, 0.07219292968511581, -0.06302580237388611, 0.043624065816402435, -0.031042711809277534, 0.05396905168890953, 0.07590223848819733, 0.07798869162797928, -0.040760479867458344, 0.0777076855301857, 0.010845441371202469, -0.068447545170784, 0.002405361272394657, -0.033890530467033386, 0.024182306602597237, -0.11423572897911072, 0.054351408034563065, -0.08955278992652893, 0.04647817462682724, -0.024125095456838608, -0.05590018257498741, -0.008409409783780575, -0.0006762280827388167, -0.07073400169610977, 0.07158441096544266, 0.033804383128881454, -0.01287885569036007, -0.02460625022649765, -0.07829184085130692, -0.011827112175524235, 0.02333342842757702, -0.13333307206630707, -0.05124372988939285, -0.09754007309675217, -0.09669775515794754, 0.0958079993724823, 0.005240653175860643, -0.18258430063724518, 0.016731897369027138, -0.1165897399187088, 0.06435561925172806, -0.1856037676334381, 0.07899893820285797, 0.08063572645187378, 0.012318608351051807, -0.0041879527270793915, -0.060362428426742554, 0.026255134493112564, 0.05328713357448578, -0.0959041640162468, -0.07963146269321442 ]
null
null
transformers
# BART Translation model

For further models, scripts and details, see [our repository](https://github.com/nytud/machine-translation) or [our demo site](https://juniper.nytud.hu/demo/nlp).

- Source language: English
- Target language: Hungarian
- BART base model:
  - Pretrained on English WikiText-103 and Hungarian Wikipedia
  - Finetuned on subcorpora from OPUS
    - Segments: 56.837.602

## Limitations

- tokenized input text (tokenizer: [HuSpaCy](https://huggingface.co/huspacy))
- max_source_length = 128
- max_target_length = 128

## Results

| Model | BLEU | chrF-3 | chrF-6 |
| ------------- | ------------- | ------------- | ------------- |
| Google | 25.30 | 54.09 | 49.0 |
| **BART** | **36.89** | **60.77** | **56.4** |
| mT5 | 27.69 | 53.73 | 48.57 |

## Citation

If you use this model, please cite the following paper:

```
@inproceedings {laki-yang-mt,
    title = {{Jobban fordítunk magyarra, mint a Google!}},
    booktitle = {XVIII. Magyar Számítógépes Nyelvészeti Konferencia},
    year = {2022},
    publisher = {Szegedi Tudományegyetem, Informatikai Intézet},
    address = {Szeged, Magyarország},
    author = {Laki, László and Yang, Zijian Győző},
    pages = {357--372}
}
```
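The card lists limitations but no loading code; below is a minimal translation sketch with `transformers`. The model id and the English sentence come from this record's `id` and widget metadata. Note the card states the model was trained on HuSpaCy-tokenized input with 128-token limits, so feeding raw, untokenized text as done here may cost some quality; `num_beams` is an illustrative assumption.

```python
from transformers import AutoTokenizer, BartForConditionalGeneration

model_id = "NYTK/translation-bart-128-en-hu"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = BartForConditionalGeneration.from_pretrained(model_id)

# Sentence taken from the widget example in this record's metadata
text = ("This may not make much sense to you, sir, but I'd like to ask "
        "your permission to date your daughter.")
inputs = tokenizer(text, return_tensors="pt", max_length=128, truncation=True)

# Beam size is an illustrative assumption; max_length mirrors the card's limit
output_ids = model.generate(**inputs, max_length=128, num_beams=4)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```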
{"language": ["en", "hu"], "license": "apache-2.0", "tags": ["translation"], "metrics": ["sacrebleu", "chrf"], "widget": [{"text": "This may not make much sense to you, sir, but I'd like to ask your permission to date your daughter.", "example_title": "Translation: English-Hungarian"}]}
translation
NYTK/translation-bart-128-en-hu
[ "transformers", "pytorch", "bart", "text2text-generation", "translation", "en", "hu", "license:apache-2.0", "autotrain_compatible", "endpoints_compatible", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[ "en", "hu" ]
TAGS #transformers #pytorch #bart #text2text-generation #translation #en #hu #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us
BART Translation model ====================== For further models, scripts and details, see our repository or our demo site. * Source language: English * Target language: Hungarian * BART base model: + Pretrained on English WikiText-103 and Hungarian Wikipedia + Finetuned on subcorpora from OPUS - Segments: 56.837.602 Limitations ----------- * tokenized input text (tokenizer: HuSpaCy) * max\_source\_length = 128 * max\_target\_length = 128 Results ------- If you use this model, please cite the following paper:
[]
[ "TAGS\n#transformers #pytorch #bart #text2text-generation #translation #en #hu #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n" ]
[ 53 ]
[ "passage: TAGS\n#transformers #pytorch #bart #text2text-generation #translation #en #hu #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n" ]
[ -0.03735842928290367, 0.0708189532160759, -0.0063658771105110645, 0.004717979580163956, 0.08909185230731964, 0.01860145479440689, 0.14654473960399628, 0.10724644362926483, -0.010446462780237198, -0.06789892911911011, 0.15293097496032715, 0.13722877204418182, 0.029058555141091347, 0.08973358571529388, -0.03845420107245445, -0.24617114663124084, 0.08254288136959076, 0.030061660334467888, 0.0031977912876755, 0.09431341290473938, 0.10761198401451111, -0.02981560118496418, 0.07836614549160004, -0.006377441342920065, -0.055549029260873795, 0.02299194410443306, 0.008047383278608322, -0.10016058385372162, 0.08607735484838486, 0.056772008538246155, 0.06834948807954788, 0.046982232481241226, -0.007208147551864386, -0.20808002352714539, 0.013905951753258705, -0.026194928213953972, -0.06792212277650833, 0.03536509722471237, 0.04394847899675369, -0.048289891332387924, 0.10664878040552139, 0.05561313405632973, -0.05682370439171791, 0.07655177265405655, -0.09848194569349289, -0.13398556411266327, -0.09858586639165878, 0.07387565821409225, 0.05558088421821594, 0.07136350870132446, 0.02131223864853382, 0.12403032928705215, -0.07902151346206665, 0.057557448744773865, 0.1308002918958664, -0.3884146213531494, 0.0017745224758982658, 0.007732307072728872, 0.07064133137464523, 0.0705355852842331, -0.004884765017777681, 0.05799822881817818, 0.06246720254421234, 0.031047532334923744, -0.042416881769895554, -0.09446555376052856, -0.12134203314781189, 0.02520992048084736, -0.049436114728450775, -0.04359428212046623, 0.24831293523311615, -0.03951959311962128, 0.028663676232099533, -0.007752809673547745, -0.06125439703464508, 0.0012316082138568163, -0.0426255539059639, 0.034866493195295334, 0.0009554862626828253, 0.09948564320802689, 0.09280042350292206, -0.06012608855962753, -0.14607518911361694, 0.021680910140275955, -0.21356433629989624, 0.07722430676221848, 0.03439454734325409, 0.06495949625968933, -0.1623787134885788, 0.06119706481695175, 0.08728300034999847, -0.13025210797786713, 0.010251384228467941, -0.08394642919301987, 0.1357494294643402, 0.06408490240573883, -0.04571997746825218, 0.0565309040248394, 0.1176072433590889, 0.19620467722415924, 0.02438756823539734, 0.040661562234163284, -0.07458208501338959, 0.1288108080625534, -0.032873447984457016, 0.07882745563983917, 0.02842601202428341, -0.0895468145608902, 0.07992582023143768, -0.12725813686847687, 0.07464101165533066, -0.02039291150867939, -0.19082708656787872, -0.04992600530385971, -0.03709353879094124, 0.11751330643892288, 0.04951104894280434, 0.035345446318387985, -0.03849414363503456, 0.021398814395070076, 0.09036660939455032, -0.04625096917152405, 0.007032121531665325, 0.010462605394423008, 0.0337727852165699, 0.143027201294899, 0.043562985956668854, 0.0407850407063961, -0.08296436071395874, 0.036798227578401566, -0.03785128518939018, 0.006033783312886953, -0.031177077442407608, -0.01528608426451683, 0.0736941322684288, -0.05986093357205391, 0.05384416878223419, -0.15460368990898132, -0.13887791335582733, 0.019885815680027008, 0.0440637432038784, -0.008793863467872143, -0.04768088832497597, -0.01606938987970352, 0.004706487990915775, 0.03214923292398453, -0.10297095775604248, -0.018705692142248154, -0.07395651191473007, 0.06261302530765533, -0.06161259487271309, 0.05837038904428482, -0.19925759732723236, 0.05517185479402542, -0.12758444249629974, -0.00022858093143440783, -0.07196646183729172, 0.0014107590541243553, -0.07659658789634705, 0.20560131967067719, -0.03091990202665329, -0.03924243897199631, -0.04068548604846001, 0.035138197243213654, 
-0.03472454100847244, 0.16374877095222473, -0.13877661526203156, -0.06711002439260483, 0.25354287028312683, -0.10734231770038605, -0.18591700494289398, 0.09648971259593964, 0.02245376445353031, 0.051101356744766235, 0.06351397186517715, 0.19068948924541473, 0.04456101357936859, -0.02839774265885353, 0.11076478660106659, 0.12850551307201385, -0.03891189768910408, -0.13729609549045563, 0.039272960275411606, -0.056502267718315125, -0.09683272242546082, 0.05905988812446594, 0.008246330544352531, 0.12821030616760254, -0.005702444352209568, -0.07229344546794891, -0.029214855283498764, -0.03094756230711937, -0.001333004329353571, -0.00771032040938735, 0.08952383697032928, -0.08022774010896683, 0.00800352729856968, -0.06911735236644745, 0.022681981325149536, 0.03809133917093277, 0.07439914345741272, -0.04740302264690399, 0.0767180323600769, 0.05535530671477318, 0.042411793023347855, -0.07699446380138397, 0.04141481593251228, -0.023818492889404297, 0.05898645520210266, 0.030571015551686287, 0.08439233154058456, 0.033135510981082916, -0.0507357232272625, -0.005244337022304535, -0.0026611380744725466, 0.14063683152198792, 0.025605523958802223, -0.025774765759706497, -0.11387355625629425, 0.060993779450654984, -0.01585279405117035, 0.02726089395582676, -0.01939920336008072, 0.012124531902372837, 0.013165476731956005, 0.13644146919250488, -0.04584013670682907, 0.11468890309333801, -0.047951437532901764, 0.028947537764906883, -0.08284896612167358, 0.0016988790594041348, 0.11051696538925171, 0.019900327548384666, -0.1133105456829071, 0.2610303461551666, -0.09784999489784241, 0.2292214334011078, 0.2298172414302826, -0.213199183344841, 0.07150284945964813, -0.021397829055786133, -0.01812385953962803, -0.008525249548256397, 0.08736217767000198, 0.031289007514715195, 0.045909978449344635, 0.011007173918187618, 0.16837526857852936, -0.05081033334136009, -0.015084093436598778, -0.023579012602567673, -0.07494287192821503, -0.02229919843375683, 0.062227651476860046, 0.12064095586538315, -0.15992619097232819, 0.1858367919921875, 0.31871408224105835, 0.03154115006327629, 0.1524982452392578, -0.06096424162387848, 0.003280099481344223, 0.047542013227939606, -0.011181379668414593, -0.04095815494656563, 0.023698676377534866, -0.13782858848571777, -0.024412527680397034, 0.08512160927057266, 0.05226289480924606, 0.0907956138253212, -0.1330796629190445, -0.05143057182431221, -0.023933352902531624, -0.008138210512697697, -0.017999662086367607, 0.06256125122308731, -0.004928169772028923, 0.08997568488121033, -0.04414178431034088, -0.0535825714468956, 0.09392615407705307, 0.01045303139835596, -0.09130482375621796, 0.12745356559753418, -0.20502224564552307, -0.287886381149292, -0.18609458208084106, -0.1119462177157402, -0.018422001972794533, 0.01211863849312067, 0.1490730494260788, -0.04722539335489273, -0.04971565306186676, 0.018060985952615738, -0.009352610446512699, -0.04980888590216637, -0.03368483856320381, -0.009895212016999722, 0.04757659509778023, -0.018840797245502472, -0.09704112261533737, -0.050014566630125046, 0.033870529383420944, -0.015277755446732044, 0.1034087985754013, -0.11793483793735504, 0.08422217518091202, 0.07717443257570267, 0.045501530170440674, 0.03626032918691635, -0.033867817372083664, 0.1411006897687912, -0.054277289658784866, 0.0028075892478227615, 0.198373481631279, -0.019888650625944138, 0.06610552966594696, 0.14853514730930328, 0.011040296405553818, -0.05744146555662155, 0.003486897796392441, -0.08713267743587494, -0.06319531053304672, -0.22912132740020752, -0.11871354281902313, 
-0.13371442258358002, 0.07070492953062057, 0.013968661427497864, 0.05197202414274216, 0.06450722366571426, 0.07715590298175812, -0.03238489851355553, 0.05174291878938675, -0.015135691501200199, 0.08160726726055145, 0.26983851194381714, -0.043525226414203644, 0.10621801018714905, -0.11354970186948776, -0.07170472294092178, 0.12426650524139404, 0.11078529059886932, 0.10657428950071335, 0.13089054822921753, 0.07763240486383438, 0.07802248746156693, 0.19519709050655365, 0.09642039984464645, 0.10434423387050629, 0.05273790657520294, -0.00913012120872736, -0.061664581298828125, -0.029822198674082756, -0.06435637176036835, 0.05333514139056206, -0.010703850537538528, -0.1387813836336136, -0.029492449015378952, -0.10033861547708511, 0.0812310203909874, 0.11384829133749008, 0.03142758086323738, -0.11102967709302902, 0.02890203148126602, 0.11490664631128311, -0.0037279801908880472, -0.07759895920753479, 0.13244014978408813, -0.04358062148094177, -0.1359173208475113, 0.13616138696670532, -0.001071441569365561, 0.14185042679309845, 0.017034394666552544, 0.05971655249595642, -0.09046122431755066, -0.131461501121521, 0.07976676523685455, 0.12791642546653748, -0.3536975383758545, 0.21171458065509796, -0.01882099360227585, -0.04073343798518181, -0.06990761309862137, -0.011132965795695782, 0.05627656728029251, 0.19909033179283142, 0.08930260688066483, 0.027432743459939957, -0.12880438566207886, -0.024126391857862473, -0.03545636311173439, 0.06034084036946297, 0.056465256959199905, 0.010937356390058994, -0.04104456678032875, -0.06369908899068832, -0.00018953910330310464, -0.04238871857523918, 0.07441409677267075, -0.0528387725353241, -0.17875036597251892, 0.06286247074604034, 0.10427732765674591, 0.0694628357887268, -0.04493080452084541, -0.009582164697349072, -0.1016954854130745, 0.14128297567367554, -0.09775903820991516, -0.07524728029966354, -0.0964631587266922, -0.1410622000694275, 0.07040037959814072, -0.06436016410589218, 0.04296823590993881, -0.08105991035699844, -0.04582658037543297, -0.0744117796421051, -0.1856449842453003, 0.11233552545309067, -0.12672798335552216, -0.04218152537941933, -0.0403851680457592, 0.14533957839012146, -0.09770956635475159, 0.027783505618572235, 0.029350291937589645, -0.022784516215324402, -0.12617844343185425, -0.12760664522647858, -0.048220258206129074, 0.009369192644953728, 0.04164532572031021, -0.02727167308330536, -0.11109064519405365, -0.04274356737732887, -0.008403738029301167, -0.08663244545459747, 0.22498087584972382, 0.17688363790512085, -0.06489399820566177, 0.20617356896400452, 0.19289743900299072, -0.09467557072639465, -0.3003140091896057, -0.19204306602478027, -0.16608810424804688, -0.04091419279575348, -0.000037791436625411734, -0.1440177708864212, 0.07268989086151123, -0.016541671007871628, -0.06700626015663147, 0.12216243892908096, -0.19584503769874573, -0.07794889807701111, 0.20350809395313263, -0.042302608489990234, 0.352806955575943, -0.13830667734146118, -0.09643393009901047, -0.09239218384027481, -0.25465017557144165, 0.09766785055398941, -0.08289084583520889, 0.05200343206524849, -0.0028419436421245337, 0.11247234791517258, -0.014358539134263992, -0.032498642802238464, 0.13281774520874023, -0.02796340361237526, -0.018777750432491302, -0.11821520328521729, 0.029576029628515244, 0.05683484673500061, 0.017797522246837616, 0.02123440057039261, -0.16182996332645416, 0.02194049395620823, -0.12273713201284409, -0.012828427366912365, -0.05853688344359398, 0.07070505619049072, -0.0192741546779871, -0.04020616039633751, -0.02240816131234169, 
-0.029933452606201172, 0.02660970576107502, 0.025957902893424034, 0.2101018875837326, 0.006043384317308664, 0.07968808710575104, 0.09254510700702667, 0.09120960533618927, -0.20598618686199188, 0.10536463558673859, -0.1156899482011795, -0.07408688962459564, 0.05852309614419937, -0.06782142072916031, 0.04227958619594574, 0.1248101219534874, -0.08235998451709747, 0.05750747397542, 0.06209380924701691, -0.004015931859612465, -0.00853781122714281, 0.14214013516902924, -0.17039085924625397, -0.022207556292414665, -0.03684140369296074, 0.07920467108488083, 0.1188640147447586, 0.06441446393728256, 0.15566128492355347, 0.012663294561207294, -0.05295023322105408, 0.0004402738995850086, 0.02895473875105381, -0.08901786804199219, 0.02347887121140957, 0.02940102480351925, -0.012404631823301315, -0.13298459351062775, 0.08785374462604523, -0.002380548045039177, -0.13840822875499725, -0.011602937243878841, 0.15653981268405914, -0.15225176513195038, -0.12419059872627258, -0.02523008920252323, 0.09794725477695465, -0.17347101867198944, -0.11387805640697479, -0.03855196759104729, -0.15765345096588135, 0.043831102550029755, 0.07427781075239182, 0.08032293617725372, 0.06250200420618057, -0.0229028332978487, -0.06177230924367905, 0.08364497870206833, -0.026521150022745132, -0.025631781667470932, 0.01405488420277834, -0.0675983726978302, 0.003943623974919319, 0.011056224815547466, 0.1221824586391449, -0.05104739964008331, -0.010481229983270168, -0.10841026902198792, 0.03283127769827843, -0.22232785820960999, -0.024584736675024033, -0.11487103998661041, -0.02648782916367054, 0.013407229445874691, -0.1025695875287056, -0.05806844308972359, -0.015464379452168941, -0.12787598371505737, -0.007758643478155136, -0.04635034501552582, 0.0721234381198883, -0.07942351698875427, -0.013850534334778786, 0.1086130142211914, -0.014244987629354, 0.09946450591087341, 0.1204531118273735, -0.06933922320604324, 0.10797575116157532, -0.11670500785112381, -0.10494954884052277, 0.07192790508270264, 0.05307643488049507, 0.04698742926120758, 0.048363424837589264, -0.022572461515665054, 0.1329403817653656, 0.027773253619670868, 0.031298112124204636, 0.035803601145744324, -0.1222395971417427, -0.05603553354740143, -0.02392624504864216, -0.12869581580162048, -0.006517562549561262, -0.07643602788448334, 0.09002377092838287, 0.0014069984899833798, 0.13533709943294525, -0.04636082798242569, 0.06356159597635269, -0.025901151821017265, 0.033259909600019455, -0.031284138560295105, -0.14292703568935394, -0.10636031627655029, -0.11002235859632492, -0.03428484499454498, 0.02111552655696869, 0.2675379514694214, -0.007412616163492203, -0.0011902720434591174, 0.047001663595438004, 0.0538569837808609, -0.041549015790224075, 0.007161622866988182, 0.2368210405111313, 0.06566151231527328, -0.016559582203626633, -0.16044844686985016, 0.014925233088433743, -0.0004924998502247036, -0.09223680198192596, 0.0787888765335083, 0.10183033347129822, 0.058205582201480865, 0.10489338636398315, -0.0013836395228281617, 0.020525246858596802, -0.11327996850013733, -0.1452610045671463, 0.02425878681242466, 0.06012146547436714, -0.016985567286610603, 0.14975951611995697, 0.15312522649765015, -0.05440497025847435, 0.01569138467311859, -0.05376313999295235, -0.005363878794014454, -0.16347958147525787, -0.12007642537355423, -0.06031479686498642, -0.1546294242143631, -0.0016970922006294131, -0.04239609092473984, 0.05792216211557388, 0.019824126735329628, 0.06266969442367554, -0.08404241502285004, 0.027103235945105553, -0.007548357360064983, -0.09965439140796661, 
0.03144558146595955, -0.005556364543735981, 0.0426727756857872, -0.03501642122864723, -0.04869592562317848, -0.1017841026186943, -0.02718989923596382, -0.030012471601366997, 0.045262787491083145, -0.02940499782562256, 0.020081210881471634, -0.10129629820585251, -0.05531025305390358, -0.05986063554883003, 0.03019626997411251, 0.013223668560385704, 0.1806917041540146, 0.02095361426472664, 0.02755020000040531, 0.07106707990169525, 0.16085049510002136, -0.06530110538005829, -0.1682053804397583, -0.0479503758251667, 0.15082982182502747, 0.03682824596762657, 0.07699032872915268, -0.02752351388335228, 0.0385272242128849, -0.0812029093503952, 0.3773270845413208, 0.2938134968280792, -0.10734356194734573, 0.003345073899254203, 0.007760334759950638, 0.03554527461528778, 0.07851110398769379, 0.11729753762483597, 0.13349363207817078, 0.23671966791152954, -0.0678587332367897, -0.04882834106683731, -0.05828458070755005, -0.008262738585472107, -0.17006874084472656, 0.09817139804363251, -0.03975054994225502, -0.09048983454704285, -0.002391210524365306, 0.09659437835216522, -0.1522582471370697, 0.07626335322856903, -0.05126909539103508, -0.14105075597763062, -0.019743869081139565, -0.004941374063491821, 0.24059149622917175, 0.04790067300200462, 0.017064275220036507, -0.019664300605654716, -0.0630330890417099, 0.15666911005973816, 0.00010800204472616315, -0.19212886691093445, -0.014136869460344315, 0.07452712208032608, -0.13787908852100372, 0.08068044483661652, -0.009130553342401981, 0.014185438863933086, 0.08352915942668915, 0.10974585264921188, -0.05244855582714081, 0.06941255927085876, 0.02869943156838417, -0.011485883966088295, -0.002988257445394993, -0.07497024536132812, 0.0016435019206255674, -0.10233531892299652, 0.056106358766555786, -0.08443167805671692, 0.07579674571752548, 0.06800483912229538, -0.0482083261013031, 0.004609747789800167, 0.04391481727361679, -0.09131061285734177, 0.06162353232502937, 0.008558813482522964, -0.02852494828402996, -0.04070623591542244, -0.054914895445108414, -0.026274548843503, 0.016836408525705338, -0.11795610189437866, -0.05423274263739586, -0.026548760011792183, -0.09029786288738251, 0.07379382103681564, 0.03921010345220566, -0.13078740239143372, 0.004845517221838236, -0.10464652627706528, 0.04783692955970764, -0.16490855813026428, 0.07324777543544769, 0.0639973059296608, -0.008060651831328869, 0.011700226925313473, -0.12323325127363205, 0.0391155444085598, 0.027319492772221565, -0.07329191267490387, -0.07980701327323914 ]
null
null
transformers
# BART Translation model For further models, scripts and details, see [our repository](https://github.com/nytud/machine-translation) or [our demo site](https://juniper.nytud.hu/demo/nlp). - Source language: English - Target language: Hungarian - Pretrained on English WikiText-103 and Hungarian Wikipedia - Finetuned on subcorpora from OPUS - Segments: 56.837.602 ## Limitations - the input text must be pre-tokenized (tokenizer: [HuSpaCy](https://huggingface.co/huspacy)) ## Results | Model | BLEU | chrF-3 | | ------------- | ------------- | ------------- | | Google en-hu | 25.30 | 54.08 | | **BART-base-enhu** | **34.38** | **58.88** | | Google hu-en | 34.48 | 59.59 | | **BART-base-huen** | **38.03** | **61.37** | ## Citation If you use this model, please cite the following paper: ``` @inproceedings {yang-bart, title = {{BARTerezzünk! Messze, messze, messze a világtól, - BART kísérleti modellek magyar nyelvre}}, booktitle = {XVIII. Magyar Számítógépes Nyelvészeti Konferencia}, year = {2022}, publisher = {Szegedi Tudományegyetem, Informatikai Intézet}, address = {Szeged, Magyarország}, author = {Yang, Zijian Győző}, pages = {15--29} } ```
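The card above has no usage snippet, so here is a minimal sketch of how one might run the English-to-Hungarian model through the Hugging Face transformers seq2seq API. It assumes the checkpoint loads with the generic Auto classes and, for brevity, skips the HuSpaCy pre-tokenization step the Limitations section calls for, so treat it as illustrative rather than the authors' recommended pipeline.

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

# Sketch only: assumes the checkpoint works with the generic Auto classes.
model_name = "NYTK/translation-bart-en-hu"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSeq2SeqLM.from_pretrained(model_name)

# NOTE: the Limitations section expects HuSpaCy-tokenized input; raw text is
# used here only to keep the example self-contained.
text = "I'd like to ask your permission to date your daughter."
inputs = tokenizer(text, return_tensors="pt")
outputs = model.generate(**inputs, num_beams=4, max_length=128)
print(tokenizer.batch_decode(outputs, skip_special_tokens=True)[0])
```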
{"language": ["en", "hu"], "license": "apache-2.0", "tags": ["translation"], "metrics": ["sacrebleu", "chrf"], "widget": [{"text": "This may not make much sense to you, sir, but I'd like to ask your permission to date your daughter.", "example_title": "Translation: English-Hungarian"}]}
translation
NYTK/translation-bart-en-hu
[ "transformers", "pytorch", "bart", "text2text-generation", "translation", "en", "hu", "license:apache-2.0", "autotrain_compatible", "endpoints_compatible", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[ "en", "hu" ]
TAGS #transformers #pytorch #bart #text2text-generation #translation #en #hu #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us
BART Translation model ====================== For further models, scripts and details, see our repository or our demo site. * Source language: English * Target language: Hungarian * Pretrained on English WikiText-103 and Hungarian Wikipedia * Finetuned on subcorpora from OPUS + Segments: 56.837.602 Limitations ----------- * the input text must be pre-tokenized (tokenizer: HuSpaCy) Results ------- Model: Google en-hu, BLEU: 25.30, chrF-3: 54.08 Model: BART-base-enhu, BLEU: 34.38, chrF-3: 58.88 Model: Google hu-en, BLEU: 34.48, chrF-3: 59.59 Model: BART-base-huen, BLEU: 38.03, chrF-3: 61.37 If you use this model, please cite the following paper:
[]
[ "TAGS\n#transformers #pytorch #bart #text2text-generation #translation #en #hu #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n" ]
[ 53 ]
[ "passage: TAGS\n#transformers #pytorch #bart #text2text-generation #translation #en #hu #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n" ]
[ -0.03735842928290367, 0.0708189532160759, -0.0063658771105110645, 0.004717979580163956, 0.08909185230731964, 0.01860145479440689, 0.14654473960399628, 0.10724644362926483, -0.010446462780237198, -0.06789892911911011, 0.15293097496032715, 0.13722877204418182, 0.029058555141091347, 0.08973358571529388, -0.03845420107245445, -0.24617114663124084, 0.08254288136959076, 0.030061660334467888, 0.0031977912876755, 0.09431341290473938, 0.10761198401451111, -0.02981560118496418, 0.07836614549160004, -0.006377441342920065, -0.055549029260873795, 0.02299194410443306, 0.008047383278608322, -0.10016058385372162, 0.08607735484838486, 0.056772008538246155, 0.06834948807954788, 0.046982232481241226, -0.007208147551864386, -0.20808002352714539, 0.013905951753258705, -0.026194928213953972, -0.06792212277650833, 0.03536509722471237, 0.04394847899675369, -0.048289891332387924, 0.10664878040552139, 0.05561313405632973, -0.05682370439171791, 0.07655177265405655, -0.09848194569349289, -0.13398556411266327, -0.09858586639165878, 0.07387565821409225, 0.05558088421821594, 0.07136350870132446, 0.02131223864853382, 0.12403032928705215, -0.07902151346206665, 0.057557448744773865, 0.1308002918958664, -0.3884146213531494, 0.0017745224758982658, 0.007732307072728872, 0.07064133137464523, 0.0705355852842331, -0.004884765017777681, 0.05799822881817818, 0.06246720254421234, 0.031047532334923744, -0.042416881769895554, -0.09446555376052856, -0.12134203314781189, 0.02520992048084736, -0.049436114728450775, -0.04359428212046623, 0.24831293523311615, -0.03951959311962128, 0.028663676232099533, -0.007752809673547745, -0.06125439703464508, 0.0012316082138568163, -0.0426255539059639, 0.034866493195295334, 0.0009554862626828253, 0.09948564320802689, 0.09280042350292206, -0.06012608855962753, -0.14607518911361694, 0.021680910140275955, -0.21356433629989624, 0.07722430676221848, 0.03439454734325409, 0.06495949625968933, -0.1623787134885788, 0.06119706481695175, 0.08728300034999847, -0.13025210797786713, 0.010251384228467941, -0.08394642919301987, 0.1357494294643402, 0.06408490240573883, -0.04571997746825218, 0.0565309040248394, 0.1176072433590889, 0.19620467722415924, 0.02438756823539734, 0.040661562234163284, -0.07458208501338959, 0.1288108080625534, -0.032873447984457016, 0.07882745563983917, 0.02842601202428341, -0.0895468145608902, 0.07992582023143768, -0.12725813686847687, 0.07464101165533066, -0.02039291150867939, -0.19082708656787872, -0.04992600530385971, -0.03709353879094124, 0.11751330643892288, 0.04951104894280434, 0.035345446318387985, -0.03849414363503456, 0.021398814395070076, 0.09036660939455032, -0.04625096917152405, 0.007032121531665325, 0.010462605394423008, 0.0337727852165699, 0.143027201294899, 0.043562985956668854, 0.0407850407063961, -0.08296436071395874, 0.036798227578401566, -0.03785128518939018, 0.006033783312886953, -0.031177077442407608, -0.01528608426451683, 0.0736941322684288, -0.05986093357205391, 0.05384416878223419, -0.15460368990898132, -0.13887791335582733, 0.019885815680027008, 0.0440637432038784, -0.008793863467872143, -0.04768088832497597, -0.01606938987970352, 0.004706487990915775, 0.03214923292398453, -0.10297095775604248, -0.018705692142248154, -0.07395651191473007, 0.06261302530765533, -0.06161259487271309, 0.05837038904428482, -0.19925759732723236, 0.05517185479402542, -0.12758444249629974, -0.00022858093143440783, -0.07196646183729172, 0.0014107590541243553, -0.07659658789634705, 0.20560131967067719, -0.03091990202665329, -0.03924243897199631, -0.04068548604846001, 0.035138197243213654, 
-0.03472454100847244, 0.16374877095222473, -0.13877661526203156, -0.06711002439260483, 0.25354287028312683, -0.10734231770038605, -0.18591700494289398, 0.09648971259593964, 0.02245376445353031, 0.051101356744766235, 0.06351397186517715, 0.19068948924541473, 0.04456101357936859, -0.02839774265885353, 0.11076478660106659, 0.12850551307201385, -0.03891189768910408, -0.13729609549045563, 0.039272960275411606, -0.056502267718315125, -0.09683272242546082, 0.05905988812446594, 0.008246330544352531, 0.12821030616760254, -0.005702444352209568, -0.07229344546794891, -0.029214855283498764, -0.03094756230711937, -0.001333004329353571, -0.00771032040938735, 0.08952383697032928, -0.08022774010896683, 0.00800352729856968, -0.06911735236644745, 0.022681981325149536, 0.03809133917093277, 0.07439914345741272, -0.04740302264690399, 0.0767180323600769, 0.05535530671477318, 0.042411793023347855, -0.07699446380138397, 0.04141481593251228, -0.023818492889404297, 0.05898645520210266, 0.030571015551686287, 0.08439233154058456, 0.033135510981082916, -0.0507357232272625, -0.005244337022304535, -0.0026611380744725466, 0.14063683152198792, 0.025605523958802223, -0.025774765759706497, -0.11387355625629425, 0.060993779450654984, -0.01585279405117035, 0.02726089395582676, -0.01939920336008072, 0.012124531902372837, 0.013165476731956005, 0.13644146919250488, -0.04584013670682907, 0.11468890309333801, -0.047951437532901764, 0.028947537764906883, -0.08284896612167358, 0.0016988790594041348, 0.11051696538925171, 0.019900327548384666, -0.1133105456829071, 0.2610303461551666, -0.09784999489784241, 0.2292214334011078, 0.2298172414302826, -0.213199183344841, 0.07150284945964813, -0.021397829055786133, -0.01812385953962803, -0.008525249548256397, 0.08736217767000198, 0.031289007514715195, 0.045909978449344635, 0.011007173918187618, 0.16837526857852936, -0.05081033334136009, -0.015084093436598778, -0.023579012602567673, -0.07494287192821503, -0.02229919843375683, 0.062227651476860046, 0.12064095586538315, -0.15992619097232819, 0.1858367919921875, 0.31871408224105835, 0.03154115006327629, 0.1524982452392578, -0.06096424162387848, 0.003280099481344223, 0.047542013227939606, -0.011181379668414593, -0.04095815494656563, 0.023698676377534866, -0.13782858848571777, -0.024412527680397034, 0.08512160927057266, 0.05226289480924606, 0.0907956138253212, -0.1330796629190445, -0.05143057182431221, -0.023933352902531624, -0.008138210512697697, -0.017999662086367607, 0.06256125122308731, -0.004928169772028923, 0.08997568488121033, -0.04414178431034088, -0.0535825714468956, 0.09392615407705307, 0.01045303139835596, -0.09130482375621796, 0.12745356559753418, -0.20502224564552307, -0.287886381149292, -0.18609458208084106, -0.1119462177157402, -0.018422001972794533, 0.01211863849312067, 0.1490730494260788, -0.04722539335489273, -0.04971565306186676, 0.018060985952615738, -0.009352610446512699, -0.04980888590216637, -0.03368483856320381, -0.009895212016999722, 0.04757659509778023, -0.018840797245502472, -0.09704112261533737, -0.050014566630125046, 0.033870529383420944, -0.015277755446732044, 0.1034087985754013, -0.11793483793735504, 0.08422217518091202, 0.07717443257570267, 0.045501530170440674, 0.03626032918691635, -0.033867817372083664, 0.1411006897687912, -0.054277289658784866, 0.0028075892478227615, 0.198373481631279, -0.019888650625944138, 0.06610552966594696, 0.14853514730930328, 0.011040296405553818, -0.05744146555662155, 0.003486897796392441, -0.08713267743587494, -0.06319531053304672, -0.22912132740020752, -0.11871354281902313, 
-0.13371442258358002, 0.07070492953062057, 0.013968661427497864, 0.05197202414274216, 0.06450722366571426, 0.07715590298175812, -0.03238489851355553, 0.05174291878938675, -0.015135691501200199, 0.08160726726055145, 0.26983851194381714, -0.043525226414203644, 0.10621801018714905, -0.11354970186948776, -0.07170472294092178, 0.12426650524139404, 0.11078529059886932, 0.10657428950071335, 0.13089054822921753, 0.07763240486383438, 0.07802248746156693, 0.19519709050655365, 0.09642039984464645, 0.10434423387050629, 0.05273790657520294, -0.00913012120872736, -0.061664581298828125, -0.029822198674082756, -0.06435637176036835, 0.05333514139056206, -0.010703850537538528, -0.1387813836336136, -0.029492449015378952, -0.10033861547708511, 0.0812310203909874, 0.11384829133749008, 0.03142758086323738, -0.11102967709302902, 0.02890203148126602, 0.11490664631128311, -0.0037279801908880472, -0.07759895920753479, 0.13244014978408813, -0.04358062148094177, -0.1359173208475113, 0.13616138696670532, -0.001071441569365561, 0.14185042679309845, 0.017034394666552544, 0.05971655249595642, -0.09046122431755066, -0.131461501121521, 0.07976676523685455, 0.12791642546653748, -0.3536975383758545, 0.21171458065509796, -0.01882099360227585, -0.04073343798518181, -0.06990761309862137, -0.011132965795695782, 0.05627656728029251, 0.19909033179283142, 0.08930260688066483, 0.027432743459939957, -0.12880438566207886, -0.024126391857862473, -0.03545636311173439, 0.06034084036946297, 0.056465256959199905, 0.010937356390058994, -0.04104456678032875, -0.06369908899068832, -0.00018953910330310464, -0.04238871857523918, 0.07441409677267075, -0.0528387725353241, -0.17875036597251892, 0.06286247074604034, 0.10427732765674591, 0.0694628357887268, -0.04493080452084541, -0.009582164697349072, -0.1016954854130745, 0.14128297567367554, -0.09775903820991516, -0.07524728029966354, -0.0964631587266922, -0.1410622000694275, 0.07040037959814072, -0.06436016410589218, 0.04296823590993881, -0.08105991035699844, -0.04582658037543297, -0.0744117796421051, -0.1856449842453003, 0.11233552545309067, -0.12672798335552216, -0.04218152537941933, -0.0403851680457592, 0.14533957839012146, -0.09770956635475159, 0.027783505618572235, 0.029350291937589645, -0.022784516215324402, -0.12617844343185425, -0.12760664522647858, -0.048220258206129074, 0.009369192644953728, 0.04164532572031021, -0.02727167308330536, -0.11109064519405365, -0.04274356737732887, -0.008403738029301167, -0.08663244545459747, 0.22498087584972382, 0.17688363790512085, -0.06489399820566177, 0.20617356896400452, 0.19289743900299072, -0.09467557072639465, -0.3003140091896057, -0.19204306602478027, -0.16608810424804688, -0.04091419279575348, -0.000037791436625411734, -0.1440177708864212, 0.07268989086151123, -0.016541671007871628, -0.06700626015663147, 0.12216243892908096, -0.19584503769874573, -0.07794889807701111, 0.20350809395313263, -0.042302608489990234, 0.352806955575943, -0.13830667734146118, -0.09643393009901047, -0.09239218384027481, -0.25465017557144165, 0.09766785055398941, -0.08289084583520889, 0.05200343206524849, -0.0028419436421245337, 0.11247234791517258, -0.014358539134263992, -0.032498642802238464, 0.13281774520874023, -0.02796340361237526, -0.018777750432491302, -0.11821520328521729, 0.029576029628515244, 0.05683484673500061, 0.017797522246837616, 0.02123440057039261, -0.16182996332645416, 0.02194049395620823, -0.12273713201284409, -0.012828427366912365, -0.05853688344359398, 0.07070505619049072, -0.0192741546779871, -0.04020616039633751, -0.02240816131234169, 
-0.029933452606201172, 0.02660970576107502, 0.025957902893424034, 0.2101018875837326, 0.006043384317308664, 0.07968808710575104, 0.09254510700702667, 0.09120960533618927, -0.20598618686199188, 0.10536463558673859, -0.1156899482011795, -0.07408688962459564, 0.05852309614419937, -0.06782142072916031, 0.04227958619594574, 0.1248101219534874, -0.08235998451709747, 0.05750747397542, 0.06209380924701691, -0.004015931859612465, -0.00853781122714281, 0.14214013516902924, -0.17039085924625397, -0.022207556292414665, -0.03684140369296074, 0.07920467108488083, 0.1188640147447586, 0.06441446393728256, 0.15566128492355347, 0.012663294561207294, -0.05295023322105408, 0.0004402738995850086, 0.02895473875105381, -0.08901786804199219, 0.02347887121140957, 0.02940102480351925, -0.012404631823301315, -0.13298459351062775, 0.08785374462604523, -0.002380548045039177, -0.13840822875499725, -0.011602937243878841, 0.15653981268405914, -0.15225176513195038, -0.12419059872627258, -0.02523008920252323, 0.09794725477695465, -0.17347101867198944, -0.11387805640697479, -0.03855196759104729, -0.15765345096588135, 0.043831102550029755, 0.07427781075239182, 0.08032293617725372, 0.06250200420618057, -0.0229028332978487, -0.06177230924367905, 0.08364497870206833, -0.026521150022745132, -0.025631781667470932, 0.01405488420277834, -0.0675983726978302, 0.003943623974919319, 0.011056224815547466, 0.1221824586391449, -0.05104739964008331, -0.010481229983270168, -0.10841026902198792, 0.03283127769827843, -0.22232785820960999, -0.024584736675024033, -0.11487103998661041, -0.02648782916367054, 0.013407229445874691, -0.1025695875287056, -0.05806844308972359, -0.015464379452168941, -0.12787598371505737, -0.007758643478155136, -0.04635034501552582, 0.0721234381198883, -0.07942351698875427, -0.013850534334778786, 0.1086130142211914, -0.014244987629354, 0.09946450591087341, 0.1204531118273735, -0.06933922320604324, 0.10797575116157532, -0.11670500785112381, -0.10494954884052277, 0.07192790508270264, 0.05307643488049507, 0.04698742926120758, 0.048363424837589264, -0.022572461515665054, 0.1329403817653656, 0.027773253619670868, 0.031298112124204636, 0.035803601145744324, -0.1222395971417427, -0.05603553354740143, -0.02392624504864216, -0.12869581580162048, -0.006517562549561262, -0.07643602788448334, 0.09002377092838287, 0.0014069984899833798, 0.13533709943294525, -0.04636082798242569, 0.06356159597635269, -0.025901151821017265, 0.033259909600019455, -0.031284138560295105, -0.14292703568935394, -0.10636031627655029, -0.11002235859632492, -0.03428484499454498, 0.02111552655696869, 0.2675379514694214, -0.007412616163492203, -0.0011902720434591174, 0.047001663595438004, 0.0538569837808609, -0.041549015790224075, 0.007161622866988182, 0.2368210405111313, 0.06566151231527328, -0.016559582203626633, -0.16044844686985016, 0.014925233088433743, -0.0004924998502247036, -0.09223680198192596, 0.0787888765335083, 0.10183033347129822, 0.058205582201480865, 0.10489338636398315, -0.0013836395228281617, 0.020525246858596802, -0.11327996850013733, -0.1452610045671463, 0.02425878681242466, 0.06012146547436714, -0.016985567286610603, 0.14975951611995697, 0.15312522649765015, -0.05440497025847435, 0.01569138467311859, -0.05376313999295235, -0.005363878794014454, -0.16347958147525787, -0.12007642537355423, -0.06031479686498642, -0.1546294242143631, -0.0016970922006294131, -0.04239609092473984, 0.05792216211557388, 0.019824126735329628, 0.06266969442367554, -0.08404241502285004, 0.027103235945105553, -0.007548357360064983, -0.09965439140796661, 
0.03144558146595955, -0.005556364543735981, 0.0426727756857872, -0.03501642122864723, -0.04869592562317848, -0.1017841026186943, -0.02718989923596382, -0.030012471601366997, 0.045262787491083145, -0.02940499782562256, 0.020081210881471634, -0.10129629820585251, -0.05531025305390358, -0.05986063554883003, 0.03019626997411251, 0.013223668560385704, 0.1806917041540146, 0.02095361426472664, 0.02755020000040531, 0.07106707990169525, 0.16085049510002136, -0.06530110538005829, -0.1682053804397583, -0.0479503758251667, 0.15082982182502747, 0.03682824596762657, 0.07699032872915268, -0.02752351388335228, 0.0385272242128849, -0.0812029093503952, 0.3773270845413208, 0.2938134968280792, -0.10734356194734573, 0.003345073899254203, 0.007760334759950638, 0.03554527461528778, 0.07851110398769379, 0.11729753762483597, 0.13349363207817078, 0.23671966791152954, -0.0678587332367897, -0.04882834106683731, -0.05828458070755005, -0.008262738585472107, -0.17006874084472656, 0.09817139804363251, -0.03975054994225502, -0.09048983454704285, -0.002391210524365306, 0.09659437835216522, -0.1522582471370697, 0.07626335322856903, -0.05126909539103508, -0.14105075597763062, -0.019743869081139565, -0.004941374063491821, 0.24059149622917175, 0.04790067300200462, 0.017064275220036507, -0.019664300605654716, -0.0630330890417099, 0.15666911005973816, 0.00010800204472616315, -0.19212886691093445, -0.014136869460344315, 0.07452712208032608, -0.13787908852100372, 0.08068044483661652, -0.009130553342401981, 0.014185438863933086, 0.08352915942668915, 0.10974585264921188, -0.05244855582714081, 0.06941255927085876, 0.02869943156838417, -0.011485883966088295, -0.002988257445394993, -0.07497024536132812, 0.0016435019206255674, -0.10233531892299652, 0.056106358766555786, -0.08443167805671692, 0.07579674571752548, 0.06800483912229538, -0.0482083261013031, 0.004609747789800167, 0.04391481727361679, -0.09131061285734177, 0.06162353232502937, 0.008558813482522964, -0.02852494828402996, -0.04070623591542244, -0.054914895445108414, -0.026274548843503, 0.016836408525705338, -0.11795610189437866, -0.05423274263739586, -0.026548760011792183, -0.09029786288738251, 0.07379382103681564, 0.03921010345220566, -0.13078740239143372, 0.004845517221838236, -0.10464652627706528, 0.04783692955970764, -0.16490855813026428, 0.07324777543544769, 0.0639973059296608, -0.008060651831328869, 0.011700226925313473, -0.12323325127363205, 0.0391155444085598, 0.027319492772221565, -0.07329191267490387, -0.07980701327323914 ]
null
null
transformers
# BART Translation model For further models, scripts and details, see [our repository](https://github.com/nytud/machine-translation) or [our demo site](https://juniper.nytud.hu/demo/nlp). - Source language: Hungarian - Target language: English - Pretrained on English WikiText-103 and Hungarian Wikipedia - Finetuned on subcorpora from OPUS - Segments: 56.837.602 ## Limitations - the input text must be pre-tokenized (tokenizer: [HuSpaCy](https://huggingface.co/huspacy)) ## Results | Model | BLEU | chrF-3 | | ------------- | ------------- | ------------- | | Google en-hu | 25.30 | 54.08 | | **BART-base-enhu** | **34.38** | **58.88** | | Google hu-en | 34.48 | 59.59 | | **BART-base-huen** | **38.03** | **61.37** | ## Citation If you use this model, please cite the following paper: ``` @inproceedings {yang-bart, title = {{BARTerezzünk! - Messze, messze, messze a világtól, - BART kísérleti modellek magyar nyelvre}}, booktitle = {XVIII. Magyar Számítógépes Nyelvészeti Konferencia}, year = {2022}, publisher = {Szegedi Tudományegyetem, Informatikai Intézet}, address = {Szeged, Magyarország}, author = {Yang, Zijian Győző}, pages = {15--29} } ```
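As with the en-hu direction, a minimal usage sketch for the Hungarian-to-English model is given below. The same assumptions apply: the checkpoint is loaded through the generic Auto classes, and the input is passed as raw text instead of the HuSpaCy-tokenized form the card asks for, so this is an illustration rather than the authors' own pipeline.

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

# Sketch only: generic Auto classes, raw (non-HuSpaCy-tokenized) input.
model_name = "NYTK/translation-bart-hu-en"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSeq2SeqLM.from_pretrained(model_name)

text = "Szeretném megragadni az alkalmat, hogy az engedélyét kérjem."
inputs = tokenizer(text, return_tensors="pt")
outputs = model.generate(**inputs, num_beams=4, max_length=128)
print(tokenizer.batch_decode(outputs, skip_special_tokens=True)[0])
```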
{"language": ["hu", "en"], "license": "apache-2.0", "tags": ["translation"], "metrics": ["sacrebleu", "chrf"], "widget": [{"text": "Szeretn\u00e9m megragadni az alkalmat uram, hogy az enged\u00e9ly\u00e9t k\u00e9rjem, hogy tal\u00e1lkozhassak a l\u00e1ny\u00e1val."}]}
translation
NYTK/translation-bart-hu-en
[ "transformers", "pytorch", "bart", "text2text-generation", "translation", "hu", "en", "license:apache-2.0", "autotrain_compatible", "endpoints_compatible", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[ "hu", "en" ]
TAGS #transformers #pytorch #bart #text2text-generation #translation #hu #en #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us
BART Translation model ====================== For further models, scripts and details, see our repository or our demo site. * Source language: Hungarian * Target language: English * Pretrained on English WikiText-103 and Hungarian Wikipedia * Finetuned on subcorpora from OPUS + Segments: 56.837.602 Limitations ----------- * the input text must be pre-tokenized (tokenizer: HuSpaCy) Results ------- Model: Google en-hu, BLEU: 25.30, chrF-3: 54.08 Model: BART-base-enhu, BLEU: 34.38, chrF-3: 58.88 Model: Google hu-en, BLEU: 34.48, chrF-3: 59.59 Model: BART-base-huen, BLEU: 38.03, chrF-3: 61.37 If you use this model, please cite the following paper:
[]
[ "TAGS\n#transformers #pytorch #bart #text2text-generation #translation #hu #en #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n" ]
[ 53 ]
[ "passage: TAGS\n#transformers #pytorch #bart #text2text-generation #translation #hu #en #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n" ]
[ -0.03549858182668686, 0.06765908747911453, -0.006466731894761324, 0.006390314549207687, 0.09173344820737839, 0.019595088437199593, 0.1421384960412979, 0.1056770533323288, -0.008825202472507954, -0.06697764992713928, 0.15233270823955536, 0.14075937867164612, 0.0290214940905571, 0.08613088726997375, -0.03740636631846428, -0.24995194375514984, 0.08237611502408981, 0.029244424775242805, -0.0005900588002987206, 0.09452252089977264, 0.10677144676446915, -0.02915605902671814, 0.0776088610291481, -0.007316349074244499, -0.05692455917596817, 0.023185327649116516, 0.008690381422638893, -0.10114215314388275, 0.0868205726146698, 0.058139074593782425, 0.06972190737724304, 0.04438723996281624, -0.010678955353796482, -0.20933543145656586, 0.01423552818596363, -0.028265735134482384, -0.06832373887300491, 0.033535681664943695, 0.04445628449320793, -0.04722035676240921, 0.10384010523557663, 0.059310875833034515, -0.053839992731809616, 0.07459928095340729, -0.10060961544513702, -0.13367748260498047, -0.09516400098800659, 0.06714965403079987, 0.05579029768705368, 0.0717916488647461, 0.02171521633863449, 0.12375343590974808, -0.07975552976131439, 0.05999911203980446, 0.13123276829719543, -0.3901515007019043, 0.0009034280083142221, 0.010648532770574093, 0.07090131938457489, 0.07109448313713074, -0.005963413510471582, 0.05508958175778389, 0.0604388453066349, 0.0309202391654253, -0.045218273997306824, -0.09577105939388275, -0.12071015685796738, 0.02536943554878235, -0.05073419213294983, -0.04504747316241264, 0.24822495877742767, -0.04071049019694328, 0.030098073184490204, -0.0055220904760062695, -0.06342701613903046, 0.000514509913045913, -0.04213429242372513, 0.034006983041763306, 0.0022131612058728933, 0.10034674406051636, 0.09216935932636261, -0.057406023144721985, -0.1452079713344574, 0.02513103187084198, -0.21636246144771576, 0.07955442368984222, 0.033300504088401794, 0.0641888752579689, -0.16139985620975494, 0.06064121425151825, 0.07858438044786453, -0.13105814158916473, 0.009760415181517601, -0.08431071788072586, 0.13403037190437317, 0.0627174824476242, -0.044062793254852295, 0.05526137351989746, 0.11822180449962616, 0.19554466009140015, 0.02456231229007244, 0.04155997186899185, -0.07766233384609222, 0.1275143176317215, -0.030642474070191383, 0.08149223774671555, 0.029222363606095314, -0.09419506788253784, 0.0788826122879982, -0.12903866171836853, 0.07336200028657913, -0.02075919695198536, -0.19321314990520477, -0.04696034640073776, -0.03564596176147461, 0.11622311919927597, 0.04906318336725235, 0.03637190908193588, -0.04049098119139671, 0.020457159727811813, 0.08687112480401993, -0.047011662274599075, 0.006527082994580269, 0.010474731214344501, 0.03570547699928284, 0.13972583413124084, 0.04164787754416466, 0.038627997040748596, -0.08193187415599823, 0.036860350519418716, -0.038534071296453476, 0.005927755497395992, -0.03208986669778824, -0.016684485599398613, 0.07291098684072495, -0.0611703097820282, 0.055960386991500854, -0.15500958263874054, -0.1403968334197998, 0.01723327301442623, 0.044812824577093124, -0.01035565696656704, -0.046466439962387085, -0.018426649272441864, 0.003122881753370166, 0.034126296639442444, -0.10092410445213318, -0.019516052678227425, -0.076692596077919, 0.06386633962392807, -0.060574330389499664, 0.05918479338288307, -0.1973360925912857, 0.054655954241752625, -0.1290668249130249, -0.0023031297605484724, -0.07257840037345886, 0.0011808364652097225, -0.07740607857704163, 0.20091110467910767, -0.02822635881602764, -0.03738837316632271, -0.043773554265499115, 0.036692578345537186, 
-0.03702608868479729, 0.16674213111400604, -0.140282541513443, -0.06866217404603958, 0.25375398993492126, -0.10545119643211365, -0.18670502305030823, 0.09482894092798233, 0.021227890625596046, 0.05399894341826439, 0.06318825483322144, 0.1935800462961197, 0.04336598888039589, -0.026439491659402847, 0.1131066307425499, 0.13258524239063263, -0.03479631617665291, -0.13441114127635956, 0.038795553147792816, -0.056930061429739, -0.09917576611042023, 0.06116928905248642, 0.00839642807841301, 0.12818162143230438, -0.005876512732356787, -0.06998377293348312, -0.02847508154809475, -0.0278276689350605, -0.00003342304262332618, -0.005053303204476833, 0.08944696187973022, -0.08120433241128922, 0.008281520567834377, -0.06440666317939758, 0.021518882364034653, 0.03950163349509239, 0.0730048418045044, -0.04450575262308121, 0.07712454348802567, 0.05275502800941467, 0.042731136083602905, -0.08209922909736633, 0.038921162486076355, -0.02508688159286976, 0.06010124087333679, 0.029091104865074158, 0.0868108719587326, 0.036117393523454666, -0.05177730321884155, -0.005338860210031271, -0.0015941071324050426, 0.1404571235179901, 0.027078725397586823, -0.026333095505833626, -0.11530524492263794, 0.06471062451601028, -0.01775253191590309, 0.02514924667775631, -0.019274301826953888, 0.010494692251086235, 0.015185094438493252, 0.1341346651315689, -0.04274948686361313, 0.11293654143810272, -0.04750969260931015, 0.029556971043348312, -0.08341032266616821, 0.002677967306226492, 0.10903428494930267, 0.021986572071909904, -0.11013883352279663, 0.2612898349761963, -0.10003873705863953, 0.23665373027324677, 0.23112744092941284, -0.21384765207767487, 0.07158505171537399, -0.021794598549604416, -0.017680218443274498, -0.00704703526571393, 0.08751346170902252, 0.029436076059937477, 0.04572143033146858, 0.011715472675859928, 0.16931965947151184, -0.05083689093589783, -0.014014346525073051, -0.02290698140859604, -0.07515458762645721, -0.023125510662794113, 0.0642508864402771, 0.11978510767221451, -0.1588902324438095, 0.19023744761943817, 0.3206992447376251, 0.03225656598806381, 0.15754534304141998, -0.060655392706394196, 0.0049214549362659454, 0.04737424477934837, -0.012157352641224861, -0.041677284985780716, 0.021289922297000885, -0.14389291405677795, -0.023178936913609505, 0.08572740852832794, 0.05194595828652382, 0.08920995891094208, -0.13250066339969635, -0.05161053314805031, -0.021212592720985413, -0.005432938691228628, -0.015531233511865139, 0.060965750366449356, -0.004208390135318041, 0.08913461863994598, -0.042801834642887115, -0.057175297290086746, 0.09482548385858536, 0.010942474007606506, -0.08992395550012589, 0.13072112202644348, -0.2067774087190628, -0.29043057560920715, -0.18377585709095, -0.1153017058968544, -0.018201982602477074, 0.011826174333691597, 0.14992928504943848, -0.04882654920220375, -0.050084684044122696, 0.019630281254649162, -0.009915608912706375, -0.05172102898359299, -0.03342226520180702, -0.01272892951965332, 0.04648309573531151, -0.024503402411937714, -0.09632959961891174, -0.05128772184252739, 0.032757360488176346, -0.014471213333308697, 0.10465841740369797, -0.12049389630556107, 0.08417119830846786, 0.08213763684034348, 0.04826866835355759, 0.03769949823617935, -0.03349674493074417, 0.14502117037773132, -0.05626993253827095, 0.001993690151721239, 0.19333712756633759, -0.0198568943887949, 0.0653095468878746, 0.15332181751728058, 0.012745805084705353, -0.05993325635790825, 0.0031950618140399456, -0.08757198601961136, -0.06393256783485413, -0.2309676855802536, -0.11838552355766296, 
-0.13539037108421326, 0.07118549197912216, 0.011975736357271671, 0.05178242549300194, 0.06588154286146164, 0.07476943731307983, -0.029167944565415382, 0.05109082907438278, -0.012925670482218266, 0.08072865754365921, 0.272083044052124, -0.04158049449324608, 0.1071668192744255, -0.11067567765712738, -0.07345503568649292, 0.12476801127195358, 0.11179754137992859, 0.10469533503055573, 0.13179545104503632, 0.08024509996175766, 0.07518209517002106, 0.189095139503479, 0.09586833417415619, 0.10418865084648132, 0.05518224090337753, -0.009935922920703888, -0.06257157027721405, -0.029802368953824043, -0.06606598198413849, 0.05153387412428856, -0.00850958377122879, -0.13725662231445312, -0.02679026499390602, -0.10159595310688019, 0.08141601085662842, 0.11080900579690933, 0.029622184112668037, -0.11063611507415771, 0.029173249378800392, 0.11506709456443787, -0.003909852355718613, -0.07961154729127884, 0.1336737424135208, -0.04639945924282074, -0.1371856927871704, 0.13328686356544495, -0.00213432009331882, 0.14052799344062805, 0.015394323505461216, 0.062308572232723236, -0.09008185565471649, -0.136961430311203, 0.07953968644142151, 0.12647618353366852, -0.3536711037158966, 0.21379084885120392, -0.0179551113396883, -0.04253724589943886, -0.06947971135377884, -0.010854361578822136, 0.054817117750644684, 0.1994798183441162, 0.09291482716798782, 0.026525137946009636, -0.12934251129627228, -0.026286669075489044, -0.032101210206747055, 0.0569387748837471, 0.0569545179605484, 0.00910685770213604, -0.041173551231622696, -0.0626075342297554, -0.0017272880068048835, -0.04315824806690216, 0.07209969311952591, -0.047343660145998, -0.18113556504249573, 0.06409414857625961, 0.09657751023769379, 0.06879691779613495, -0.04211020469665527, -0.01022566482424736, -0.10090833902359009, 0.14246800541877747, -0.09986431896686554, -0.07396528124809265, -0.09702523797750473, -0.13990676403045654, 0.07011990249156952, -0.0652322769165039, 0.044774509966373444, -0.08350979536771774, -0.04708198830485344, -0.0759388655424118, -0.1855810284614563, 0.11556622385978699, -0.12805329263210297, -0.03819727897644043, -0.04048198461532593, 0.14560139179229736, -0.10074906796216965, 0.027970312163233757, 0.028667591512203217, -0.018919426947832108, -0.12676282227039337, -0.12531936168670654, -0.04730776697397232, 0.009917899034917355, 0.04132930934429169, -0.02700737863779068, -0.10858794301748276, -0.04101177304983139, -0.006537122651934624, -0.08608530461788177, 0.22774137556552887, 0.17790628969669342, -0.06414470821619034, 0.2042168527841568, 0.19024622440338135, -0.09579238295555115, -0.3004281222820282, -0.1908157765865326, -0.16397026181221008, -0.04139905050396919, 0.0007242225692607462, -0.1444641649723053, 0.0736692026257515, -0.017152512446045876, -0.06644435226917267, 0.12356630712747574, -0.1987079679965973, -0.07703768461942673, 0.20379920303821564, -0.04018338769674301, 0.35722315311431885, -0.14044351875782013, -0.09764108061790466, -0.09069708734750748, -0.25271841883659363, 0.09820324927568436, -0.08081929385662079, 0.053068604320287704, -0.003088295226916671, 0.10810182243585587, -0.012951484881341457, -0.033983297646045685, 0.13268138468265533, -0.02982264570891857, -0.01849103532731533, -0.11737823486328125, 0.027279434725642204, 0.05015110969543457, 0.018504170700907707, 0.018885361030697823, -0.15310165286064148, 0.019944030791521072, -0.12449055165052414, -0.013733498752117157, -0.05896923318505287, 0.070488840341568, -0.01904766634106636, -0.04061678424477577, -0.019398348405957222, -0.03020736761391163, 
0.025630749762058258, 0.025318099185824394, 0.21525388956069946, 0.003674433333799243, 0.08104076236486435, 0.09543675184249878, 0.09211058169603348, -0.19948546588420868, 0.09813079237937927, -0.11533517390489578, -0.0731322169303894, 0.05796674266457558, -0.07040508091449738, 0.04113210737705231, 0.12477365136146545, -0.08078356087207794, 0.05720764026045799, 0.06251655519008636, -0.005113575607538223, -0.011468724347651005, 0.1418249011039734, -0.16952992975711823, -0.023737365379929543, -0.03842872753739357, 0.07819520682096481, 0.1210193783044815, 0.061608634889125824, 0.15512427687644958, 0.014904814772307873, -0.052875813096761703, 0.0017188458004966378, 0.02740364708006382, -0.08539263904094696, 0.025326471775770187, 0.030712541192770004, -0.012710769660770893, -0.13375957310199738, 0.08767374604940414, -0.0028658790979534388, -0.1443762481212616, -0.01073376089334488, 0.15821745991706848, -0.15327833592891693, -0.12501895427703857, -0.022178631275892258, 0.09828176349401474, -0.17374928295612335, -0.11111193895339966, -0.03867383673787117, -0.15558193624019623, 0.04491939768195152, 0.08055781573057175, 0.0825095996260643, 0.0632035955786705, -0.02328989841043949, -0.06039819493889809, 0.08744853734970093, -0.023331252858042717, -0.026405083015561104, 0.014627908356487751, -0.06609891355037689, -0.0025946833193302155, 0.01031573861837387, 0.12248679995536804, -0.05182528868317604, -0.011053628288209438, -0.1125030517578125, 0.032511431723833084, -0.217254638671875, -0.024307383224368095, -0.11424382030963898, -0.026300087571144104, 0.01362545881420374, -0.10061898827552795, -0.05862952023744583, -0.01725754328072071, -0.12937885522842407, -0.007549150846898556, -0.04703173041343689, 0.06979170441627502, -0.0763704925775528, -0.014994636178016663, 0.11036611348390579, -0.015904610976576805, 0.09824073314666748, 0.12144549190998077, -0.06797390431165695, 0.11077778041362762, -0.11803363263607025, -0.10365992784500122, 0.07377855479717255, 0.05363307520747185, 0.049412667751312256, 0.05126699060201645, -0.020940937101840973, 0.1323661357164383, 0.0288582481443882, 0.03125322610139847, 0.03443850204348564, -0.12331870198249817, -0.05373541638255119, -0.021508801728487015, -0.1332063525915146, -0.005571283400058746, -0.0764768049120903, 0.09016256034374237, -0.0013989058788865805, 0.13499559462070465, -0.047524694353342056, 0.06462052464485168, -0.025579456239938736, 0.03176397457718849, -0.03172171860933304, -0.14515528082847595, -0.10508978366851807, -0.10990186035633087, -0.03442465141415596, 0.021118782460689545, 0.2648831605911255, -0.005337374284863472, 0.0012814834481105208, 0.04757673665881157, 0.054363690316677094, -0.03685133531689644, 0.007122877519577742, 0.23439310491085052, 0.06838024407625198, -0.0170397087931633, -0.15886057913303375, 0.018881281837821007, -0.00029291684040799737, -0.08804967999458313, 0.07782614976167679, 0.10193083435297012, 0.0590987503528595, 0.10643068701028824, -0.0014022745890542865, 0.022171083837747574, -0.11255063861608505, -0.14739079773426056, 0.02453802339732647, 0.061320606619119644, -0.015085319057106972, 0.148247629404068, 0.1538991779088974, -0.054011065512895584, 0.01587776094675064, -0.055076248943805695, -0.00511200912296772, -0.16579687595367432, -0.11958655714988708, -0.06243414804339409, -0.15489375591278076, -0.0021427576430141926, -0.04482037201523781, 0.05829314514994621, 0.025453194975852966, 0.06489785015583038, -0.08174669742584229, 0.029153384268283844, -0.005546310916543007, -0.1018201932311058, 0.03291289508342743, 
-0.0061867316253483295, 0.041669849306344986, -0.03593461588025093, -0.04636894166469574, -0.10637789219617844, -0.029033398255705833, -0.031560346484184265, 0.04554250091314316, -0.030077103525400162, 0.017830070108175278, -0.10179177671670914, -0.05530541017651558, -0.05987798422574997, 0.032342538237571716, 0.00988853070884943, 0.18253985047340393, 0.02003398723900318, 0.02724720537662506, 0.07256946712732315, 0.16459809243679047, -0.0657791793346405, -0.16680262982845306, -0.04718565568327904, 0.15148746967315674, 0.038652099668979645, 0.08124640583992004, -0.029272859916090965, 0.03857157379388809, -0.07916721701622009, 0.3752616345882416, 0.292827308177948, -0.1053149625658989, 0.0033097034320235252, 0.013900203630328178, 0.03630892187356949, 0.07913520932197571, 0.11803770065307617, 0.13366371393203735, 0.23882296681404114, -0.07088543474674225, -0.04957311227917671, -0.05907680094242096, -0.010196390561759472, -0.16879765689373016, 0.09755412489175797, -0.035922128707170486, -0.09010794758796692, -0.003909599501639605, 0.098528653383255, -0.1530018001794815, 0.07680101692676544, -0.05139228701591492, -0.14355266094207764, -0.02236301265656948, -0.00709264213219285, 0.23845860362052917, 0.047594644129276276, 0.017791958525776863, -0.01820891909301281, -0.06764550507068634, 0.15346106886863708, 0.0018395523075014353, -0.19666363298892975, -0.015460237860679626, 0.07367359101772308, -0.13531483709812164, 0.07907529175281525, -0.009247825480997562, 0.014479131437838078, 0.08238725364208221, 0.11055996268987656, -0.04856988787651062, 0.06833895295858383, 0.03149348124861717, -0.010429680347442627, -0.003245649393647909, -0.07518105208873749, 0.0030713132582604885, -0.1038050577044487, 0.05629083141684532, -0.08729172497987747, 0.07629577070474625, 0.0710417851805687, -0.0495808869600296, 0.004276629537343979, 0.04677342250943184, -0.09312651306390762, 0.059433531016111374, 0.010366233065724373, -0.02605981007218361, -0.04089651629328728, -0.05410965159535408, -0.02597547322511673, 0.016477933153510094, -0.11129258573055267, -0.055782560259103775, -0.027025800198316574, -0.09070666134357452, 0.07387197762727737, 0.03957268223166466, -0.13001255691051483, 0.003519404446706176, -0.1014084741473198, 0.04918479919433594, -0.16684433817863464, 0.07441151142120361, 0.06271760165691376, -0.00783870555460453, 0.011641022749245167, -0.12295518815517426, 0.040618252009153366, 0.027962731197476387, -0.07359583675861359, -0.0802050530910492 ]
null
null
transformers
# Marian Translation model For further models, scripts and details, see [our repository](https://github.com/nytud/machine-translation) or [our demo site](https://juniper.nytud.hu/demo/nlp). A description of the REST API of our service is also available there. This model was trained with [MarianNMT](https://github.com/marian-nmt/marian-dev) v1.10.23 (commit 42f0b8b7) using the transformer-big model type. This repository contains our English-Hungarian (en-hu) translation model, which was published at the MSZNY 2022 conference. - Source language: English - Target language: Hungarian - Pretrained on subcorpora from OPUS - Segments: 56.837.602 ## Limitations ## Results | Model | BLEU | chrF-3 | | ------------- | ------------- | ------------- | | Google en-hu | 25.30 | 54.08 | | **Marian-big-enhu** | **37.30** | **61.61** | ## Citation If you use this model, please cite the following paper: ``` @inproceedings {laki-yang-mt, title = {{Jobban fordítunk magyarra, mint a Google!}}, booktitle = {XVIII. Magyar Számítógépes Nyelvészeti Konferencia}, year = {2022}, publisher = {Szegedi Tudományegyetem, Informatikai Intézet}, address = {Szeged, Magyarország}, author = {Laki, László and Yang, Zijian Győző}, pages = {357--372} } ```
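Since the card notes the model was trained with the MarianNMT toolkit but is published in transformers format (per its tags), a minimal sketch of loading it with the standard transformers Marian classes follows. It assumes the checkpoint and its tokenizer are compatible with MarianMTModel and MarianTokenizer; if they are not, the generic Auto classes would be the fallback.

```python
from transformers import MarianMTModel, MarianTokenizer

# Sketch only: assumes compatibility with the standard Marian classes.
model_name = "NYTK/translation-marianmt-en-hu"
tokenizer = MarianTokenizer.from_pretrained(model_name)
model = MarianMTModel.from_pretrained(model_name)

text = "This may not make much sense to you, sir."
batch = tokenizer([text], return_tensors="pt")
generated = model.generate(**batch, num_beams=4, max_length=128)
print(tokenizer.batch_decode(generated, skip_special_tokens=True)[0])
```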
{"language": ["en", "hu"], "license": "gpl-3.0", "tags": ["translation"], "metrics": ["sacrebleu", "chrf"], "widget": [{"text": "This may not make much sense to you, sir, but I'd like to ask your permission to date your daughter.", "example_title": "Translation: English-Hungarian"}]}
translation
NYTK/translation-marianmt-en-hu
[ "transformers", "pytorch", "marian", "text2text-generation", "translation", "en", "hu", "license:gpl-3.0", "autotrain_compatible", "endpoints_compatible", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[ "en", "hu" ]
TAGS #transformers #pytorch #marian #text2text-generation #translation #en #hu #license-gpl-3.0 #autotrain_compatible #endpoints_compatible #region-us
Marian Translation model ======================== For further models, scripts and details, see our repository or our demo site. A description of the REST API of our service is also available there. This model was trained with MarianNMT v1.10.23 (commit 42f0b8b7) using the transformer-big model type. This repository contains our English-Hungarian (en-hu) translation model, which was published at the MSZNY 2022 conference. * Source language: English * Target language: Hungarian * Pretrained on subcorpora from OPUS + Segments: 56.837.602 Limitations ----------- Results ------- Model: Google en-hu, BLEU: 25.30, chrF-3: 54.08 Model: Marian-big-enhu, BLEU: 37.30, chrF-3: 61.61 If you use this model, please cite the following paper:
[]
[ "TAGS\n#transformers #pytorch #marian #text2text-generation #translation #en #hu #license-gpl-3.0 #autotrain_compatible #endpoints_compatible #region-us \n" ]
[ 54 ]
[ "passage: TAGS\n#transformers #pytorch #marian #text2text-generation #translation #en #hu #license-gpl-3.0 #autotrain_compatible #endpoints_compatible #region-us \n" ]
[ -0.03982246294617653, 0.060247208923101425, -0.006599431857466698, 0.024083083495497704, 0.1429542601108551, 0.03137698397040367, 0.1352204531431198, 0.08991644531488419, -0.018424801528453827, -0.05819499492645264, 0.14203888177871704, 0.1746869534254074, 0.0125145697966218, 0.06582781672477722, -0.05189390853047371, -0.29253801703453064, 0.05831745266914368, 0.043569572269916534, -0.0009060234879143536, 0.08596526831388474, 0.09827691316604614, -0.04610617458820343, 0.1089295819401741, -0.003743057604879141, -0.0868421345949173, 0.025141119956970215, 0.017442556098103523, -0.10891643166542053, 0.08816611021757126, 0.07440773397684097, 0.072719044983387, 0.07067558169364929, 0.013590732589364052, -0.18843291699886322, 0.0152318449690938, -0.04873841255903244, -0.07846008241176605, 0.03386329486966133, 0.06911739706993103, -0.061306729912757874, 0.1530361771583557, 0.04576868563890457, -0.05673055350780487, 0.07782268524169922, -0.13922631740570068, -0.08511760085821152, -0.09591495245695114, 0.06605695933103561, 0.02012445218861103, 0.06267563998699188, 0.019117359071969986, 0.11949793994426727, -0.07455295324325562, 0.06481709331274033, 0.1464611291885376, -0.3960771858692169, 0.0016433355631306767, 0.07145688682794571, 0.08674900978803635, 0.057048723101615906, 0.023203393444418907, 0.1235203966498375, 0.051688019186258316, 0.010421478189527988, -0.07787527143955231, -0.10835402458906174, -0.07339181751012802, 0.0308778528124094, -0.06527221947908401, -0.034404877573251724, 0.23855076730251312, -0.05686654895544052, 0.028656132519245148, -0.02098102495074272, -0.036750707775354385, -0.0043408991768956184, -0.030997496098279953, 0.0038075761403888464, -0.0006306271534413099, 0.07459788769483566, 0.06049608439207077, -0.06628260016441345, -0.14243365824222565, 0.024984711781144142, -0.2308036834001541, 0.12999360263347626, 0.028989512473344803, 0.06024986132979393, -0.14639459550380707, 0.05522279813885689, 0.038722116500139236, -0.10263974219560623, -0.0010803978657349944, -0.09440214186906815, 0.13185617327690125, 0.015227559953927994, -0.037853680551052094, 0.019744232296943665, 0.07427939772605896, 0.11278866231441498, -0.018755493685603142, 0.03187983110547066, -0.0412822961807251, 0.13781200349330902, 0.011061600409448147, 0.08959071338176727, -0.0008008757722564042, -0.044079236686229706, 0.06840376555919647, -0.15698547661304474, 0.05511670932173729, -0.026283733546733856, -0.22059698402881622, -0.058883149176836014, -0.060634877532720566, 0.12004408240318298, 0.011935919523239136, 0.05235243961215019, -0.05080585554242134, 0.0056757028214633465, 0.05088125541806221, -0.03218870982527733, 0.020081598311662674, 0.0036886699963361025, 0.018053434789180756, 0.15825456380844116, 0.024478472769260406, 0.028053566813468933, -0.07787357270717621, 0.06726320832967758, -0.05299898236989975, 0.012641318142414093, -0.021198896691203117, -0.04474511370062828, 0.07573647797107697, -0.0731460303068161, 0.05847820267081261, -0.17925024032592773, -0.11371936649084091, 0.008373036049306393, 0.02396521531045437, -0.04629949852824211, -0.029686138033866882, -0.02873247303068638, 0.01018210407346487, 0.02842402085661888, -0.09114798158407211, -0.04803132265806198, -0.0736394003033638, 0.07922574877738953, -0.020842691883444786, 0.06235600262880325, -0.17155107855796814, 0.051368553191423416, -0.10582055151462555, -0.011867661029100418, -0.07627937197685242, 0.031569529324769974, -0.0594557449221611, 0.16555069386959076, -0.01511226687580347, -0.06982959061861038, -0.03936656937003136, 
0.06844110041856766, -0.03611801937222481, 0.18586789071559906, -0.09779155999422073, -0.07689907401800156, 0.22233149409294128, -0.10361720621585846, -0.1580118089914322, 0.08954864740371704, 0.021261150017380714, 0.055358804762363434, 0.068097785115242, 0.18888625502586365, 0.008192284032702446, -0.06206442043185234, 0.12081527709960938, 0.10531759262084961, -0.05317474156618118, -0.13001807034015656, 0.037330787628889084, -0.04565746337175369, -0.06479151546955109, 0.04788137972354889, -0.002091356785967946, 0.10428964346647263, -0.04522936791181564, -0.04916759580373764, -0.0005506716552190483, -0.01576307788491249, 0.00824549701064825, 0.020754827186465263, 0.10488446056842804, -0.08609747141599655, 0.004205604083836079, -0.02174496278166771, -0.012171288952231407, 0.026987584307789803, 0.06350193172693253, -0.055283620953559875, 0.12377023696899414, 0.03813453018665314, 0.03090031072497368, -0.08324402570724487, -0.004629572853446007, -0.04024110734462738, 0.09789381176233292, 0.05924958735704422, 0.10203021764755249, 0.02652576006948948, -0.020784521475434303, -0.02800508588552475, 0.026213321834802628, 0.14529314637184143, 0.020377587527036667, -0.015552245080471039, -0.09582541882991791, 0.06477722525596619, -0.012265501543879509, -0.012904765084385872, -0.06464166194200516, -0.004173939116299152, 0.07686459273099899, 0.11742591112852097, -0.04013124853372574, 0.0923098549246788, -0.04412519931793213, 0.07347965240478516, -0.07472807914018631, 0.006547986995428801, 0.12894372642040253, 0.0016309418715536594, -0.0908140242099762, 0.28307288885116577, -0.1307586282491684, 0.2595599889755249, 0.2286950796842575, -0.2725001275539398, 0.02432309277355671, -0.043351978063583374, -0.0035461043007671833, 0.021164074540138245, 0.07538611441850662, 0.04564926400780678, 0.07687249779701233, -0.008561722002923489, 0.18733252584934235, -0.05707545578479767, 0.0053220451809465885, -0.0033906311728060246, -0.08143351227045059, -0.07691936939954758, 0.07626032829284668, 0.1363440752029419, -0.20104096829891205, 0.1743215024471283, 0.23737213015556335, 0.05310637503862381, 0.18686942756175995, -0.019320568069815636, 0.020322609692811966, 0.034692708402872086, -0.024604197591543198, -0.024601422250270844, 0.004709382075816393, -0.1327686756849289, -0.051350243389606476, 0.0608646385371685, 0.058814577758312225, 0.09742151200771332, -0.1443001627922058, -0.07343149930238724, -0.005074227694422007, -0.001533192815259099, -0.03741070255637169, 0.09871584922075272, -0.008153777569532394, 0.09328451752662659, -0.02708052285015583, -0.024020548909902573, 0.09163615107536316, 0.016152260825037956, -0.09709810465574265, 0.146553635597229, -0.1592170000076294, -0.2920166254043579, -0.2130100578069687, -0.163958340883255, -0.018038678914308548, 0.036553382873535156, 0.1294773817062378, -0.0420793853700161, -0.05192315950989723, 0.030545178800821304, 0.0314292348921299, -0.11608506739139557, -0.0598364919424057, -0.041581325232982635, 0.05881721153855324, -0.06936201453208923, -0.08268089592456818, -0.06250263750553131, 0.008111446164548397, -0.040557511150836945, 0.10447689890861511, -0.12281684577465057, 0.09737060964107513, 0.08206894993782043, 0.01325998269021511, 0.051334358751773834, -0.05411477014422417, 0.14059333503246307, -0.07447066903114319, -0.031066183000802994, 0.20858034491539001, 0.006663524080067873, 0.0534573495388031, 0.13388372957706451, 0.028103958815336227, -0.06642686575651169, -0.023282112553715706, -0.08966183662414551, -0.08015602082014084, -0.2364071160554886, -0.13581199944019318, 
-0.1325961798429489, 0.056268442422151566, 0.0049598608165979385, 0.033828165382146835, 0.04226673021912575, 0.08926906436681747, -0.0016214862698689103, 0.02782411500811577, -0.0015773907070979476, 0.09181087464094162, 0.330051064491272, -0.031777553260326385, 0.11366862803697586, -0.09534206986427307, -0.07990776747465134, 0.11237441003322601, 0.08945228159427643, 0.10309762507677078, 0.12346809357404709, 0.057074908167123795, 0.07777944952249527, 0.1313352882862091, 0.11699366569519043, 0.07549741119146347, 0.07667182385921478, -0.019250234588980675, -0.04832470044493675, -0.02588089182972908, -0.03707404062151909, 0.03442797809839249, 0.03153160586953163, -0.16713467240333557, -0.08011334389448166, -0.07368011027574539, 0.05555097013711929, 0.03887490555644035, 0.03756671026349068, -0.13326582312583923, 0.029182737693190575, 0.10988926142454147, 0.03273474797606468, -0.092989981174469, 0.1303187906742096, -0.023461444303393364, -0.15437808632850647, 0.13175556063652039, 0.0010958256898447871, 0.12673339247703552, -0.05304130166769028, 0.055793680250644684, -0.07123423367738724, -0.11938939988613129, 0.07233232259750366, 0.13151146471500397, -0.32898837327957153, 0.26162880659103394, 0.0006285416311584413, -0.042223501950502396, -0.051017649471759796, -0.032680243253707886, 0.022694820538163185, 0.21377691626548767, 0.11430712044239044, 0.023077968508005142, -0.13832540810108185, -0.09085220843553543, 0.012118805199861526, 0.027563020586967468, 0.10340113192796707, 0.03876212611794472, -0.03637661784887314, -0.05796961858868599, 0.005212212447077036, -0.06239071488380432, 0.08317791670560837, -0.04977747052907944, -0.18434356153011322, 0.0658528059720993, 0.06700444966554642, 0.07006066292524338, -0.011859970167279243, -0.04250162094831467, -0.15320999920368195, 0.18015126883983612, -0.05547358840703964, -0.0684751570224762, -0.10571932047605515, -0.11586176604032516, 0.05702942982316017, -0.0706528052687645, 0.04237781837582588, -0.07591833174228668, -0.06068265065550804, -0.08133028447628021, -0.1652512550354004, 0.1269058734178543, -0.10407844185829163, -0.03355472907423973, -0.039553407579660416, 0.1573730707168579, -0.09778248518705368, 0.02093593403697014, 0.03252136707305908, 0.0023886391427367926, -0.0909373089671135, -0.11475273966789246, -0.005937389098107815, 0.0007111963350325823, 0.059637341648340225, -0.011835909448564053, -0.12288510054349899, -0.015161625109612942, -0.007971981540322304, -0.0990181490778923, 0.24559645354747772, 0.18987183272838593, -0.058149971067905426, 0.1790049821138382, 0.16735194623470306, -0.1084149032831192, -0.3222751021385193, -0.13090626895427704, -0.15650202333927155, -0.018639640882611275, 0.006701847538352013, -0.1447596400976181, 0.015405998565256596, -0.0009679451468400657, -0.04586038738489151, 0.1186026930809021, -0.22686494886875153, -0.09137401729822159, 0.15617038309574127, -0.0016553744208067656, 0.3891366124153137, -0.134832963347435, -0.10358119010925293, -0.06496183574199677, -0.2622004449367523, 0.10330671817064285, -0.010594364255666733, 0.08245284855365753, -0.004968917928636074, 0.1229347512125969, 0.00320058292709291, -0.033127062022686005, 0.11826857179403305, -0.05156389996409416, -0.024332698434591293, -0.13067477941513062, -0.03440698981285095, 0.07122620195150375, 0.03252752497792244, 0.002514185616746545, -0.08845537155866623, 0.012102918699383736, -0.11267027258872986, -0.03642494976520538, -0.06308335810899734, 0.05644063279032707, -0.011219770647585392, -0.038429852575063705, -0.014545382931828499, 
-0.00985063798725605, -0.004518007859587669, 0.007448259741067886, 0.22064530849456787, -0.047058697789907455, 0.08266885578632355, 0.07123645395040512, 0.1332242786884308, -0.18819250166416168, 0.11294175684452057, -0.1153898760676384, -0.07965560257434845, 0.04719803109765053, -0.0369156077504158, 0.02202659659087658, 0.13347531855106354, -0.07071422040462494, 0.04577192664146423, 0.07161439210176468, 0.0006647033151239157, 0.016959693282842636, 0.15620845556259155, -0.17483678460121155, -0.045990847051143646, -0.06575144827365875, 0.005485850386321545, 0.18027128279209137, 0.07254433631896973, 0.16649162769317627, 0.019150374457240105, -0.05234125629067421, -0.004602005705237389, 0.0082780746743083, -0.07960517704486847, 0.03442151099443436, 0.01354991551488638, -0.01866151951253414, -0.1274731457233429, 0.08521836251020432, 0.0016363771865144372, -0.14350396394729614, -0.004929027520120144, 0.16771727800369263, -0.14351686835289001, -0.130251944065094, -0.07136455923318863, 0.06562274694442749, -0.2174961119890213, -0.0805710107088089, -0.049953803420066833, -0.1572050005197525, 0.03146584331989288, 0.09874195605516434, 0.068329356610775, 0.05539197474718094, -0.042440932244062424, -0.06213689595460892, 0.04611987993121147, -0.045498888939619064, -0.019617652520537376, 0.011138916946947575, -0.10442529618740082, 0.05185993015766144, -0.007394832093268633, 0.1320023238658905, -0.06078147143125534, -0.021336957812309265, -0.14070665836334229, 0.038840651512145996, -0.1637415587902069, -0.04984772577881813, -0.11177753657102585, -0.036253128200769424, -0.009052593261003494, -0.08490899205207825, -0.06525323539972305, -0.022328907623887062, -0.12529471516609192, 0.011308906599879265, -0.03611273318529129, 0.0665121078491211, -0.04764266312122345, -0.012131769210100174, 0.08725592494010925, 0.0028998132329434156, 0.09486240148544312, 0.10131412744522095, -0.06289069354534149, 0.10092181712388992, -0.14756521582603455, -0.09343916922807693, 0.0883777067065239, 0.034924812614917755, 0.06063735857605934, 0.04488515481352806, 0.011787069961428642, 0.1266961693763733, 0.028313931077718735, 0.04988742992281914, 0.02198030985891819, -0.11686436086893082, -0.03487618640065193, -0.02217867039144039, -0.14278410375118256, -0.007589556742459536, -0.03220684826374054, 0.08421855419874191, -0.024695180356502533, 0.12030918896198273, -0.07125094532966614, 0.06568346172571182, 0.0008140761055983603, 0.03292497992515564, -0.012060028500854969, -0.147907093167305, -0.10500548034906387, -0.12480746954679489, -0.006050729658454657, 0.028264464810490608, 0.29731085896492004, 0.038199711591005325, -0.008484852500259876, 0.04357009008526802, 0.0790257528424263, -0.012049623765051365, 0.0022956556640565395, 0.21750572323799133, 0.06932459026575089, -0.026801468804478645, -0.15929050743579865, 0.07575202733278275, -0.013730374164879322, -0.06352702528238297, 0.11687324941158295, 0.06811674684286118, 0.013975189998745918, 0.09058070182800293, 0.027182605117559433, 0.0265812911093235, -0.09669099748134613, -0.16777028143405914, 0.04139094799757004, 0.0470161996781826, -0.04587871581315994, 0.0748606026172638, 0.14686673879623413, -0.05832866206765175, 0.02253577671945095, -0.06815174221992493, -0.016652755439281464, -0.1897427886724472, -0.1667560338973999, -0.06720205396413803, -0.13873332738876343, 0.02133006975054741, -0.050604213029146194, 0.05393235385417938, -0.013329745270311832, 0.06764804571866989, -0.10325950384140015, 0.019236935302615166, -0.027689510956406593, -0.10730480402708054, 0.03419572487473488, 
0.0013911868445575237, 0.07741120457649231, -0.032108232378959656, -0.025580009445548058, -0.12111686915159225, -0.054960742592811584, -0.021324368193745613, 0.060399655252695084, -0.04995674267411232, 0.004871029406785965, -0.1122121512889862, -0.062095437198877335, -0.06411665678024292, 0.06905494630336761, 0.026284990832209587, 0.16544696688652039, -0.002937095472589135, 0.005886625498533249, 0.045292459428310394, 0.1625366359949112, -0.026799453422427177, -0.1172366663813591, -0.0049366517923772335, 0.18925879895687103, 0.06025054678320885, 0.11506258696317673, -0.03059283271431923, 0.029487524181604385, -0.0643734559416771, 0.38531264662742615, 0.3472530245780945, -0.10075505077838898, 0.007147164549678564, 0.04710977524518967, 0.03918956592679024, 0.12545573711395264, 0.11888911575078964, 0.09682652354240417, 0.2711331248283386, -0.06702335178852081, -0.06017507240176201, -0.06295382231473923, 0.019396044313907623, -0.13011153042316437, 0.125177800655365, -0.006385018117725849, -0.07621147483587265, -0.010464292950928211, 0.10906848311424255, -0.18614701926708221, 0.1272786557674408, -0.015366186387836933, -0.17708393931388855, -0.011486810632050037, -0.014345983043313026, 0.1790640652179718, 0.04792778193950653, 0.051582928746938705, -0.02272799238562584, -0.08321769535541534, 0.10176485031843185, 0.032218463718891144, -0.2300364077091217, -0.004749396815896034, 0.04382834583520889, -0.07881951332092285, 0.05066438764333725, -0.01962277665734291, 0.03718530014157295, 0.0851556733250618, 0.09972003102302551, -0.028413549065589905, 0.06746221333742142, 0.02432462014257908, -0.04013274610042572, 0.011612179689109325, -0.05483820289373398, -0.00007733907841611654, -0.10794729739427567, 0.05727575719356537, -0.09546452760696411, 0.09372437745332718, 0.01995893381536007, -0.050031598657369614, 0.013039394281804562, 0.05667632073163986, -0.09473668783903122, 0.05292470008134842, 0.04692121595144272, 0.0010016130981966853, -0.03777975216507912, -0.0622624009847641, -0.013245970010757446, 0.045068930834531784, -0.09940982609987259, -0.0553489550948143, -0.04794752970337868, -0.10736902803182602, 0.07150579988956451, 0.039505910128355026, -0.156833678483963, 0.022881923243403435, -0.10190700739622116, 0.05162711814045906, -0.1705024540424347, 0.08721653372049332, 0.08346975594758987, -0.0005935790250077844, 0.00423855846747756, -0.11673199385404587, 0.06420018523931503, 0.04400194436311722, -0.09808410704135895, -0.07294831424951553 ]
null
null
transformers
# mT5 Translation model

For further models, scripts and details, see [our repository](https://github.com/nytud/machine-translation) or [our demo site](https://juniper.nytud.hu/demo/nlp).

- Source language: English
- Target language: Hungarian
- Pretrained model used: mT5-small
- Finetuned on subcorpora from OPUS
  - Segments: 56.837.602
- prefix: "translate English to Hungarian: "

## Limitations

- tokenized input text (tokenizer: [HuSpaCy](https://huggingface.co/huspacy))
- max_source_length = 128
- max_target_length = 128

## Results

| Model | BLEU | chrF-3 | chrF-6 |
| ------------- | ------------- | ------------- | ------------- |
| Google en-hu | 25.30 | 54.08 | 49.06 |
| BART | 36.89 | 60.77 | 56.4 |
| **mT5** | **27.69** | **53.73** | **48.57** |

## Citation

If you use this model, please cite the following paper:

```
@inproceedings {laki-yang-mt,
    title = {{Jobban fordítunk magyarra, mint a Google!}},
    booktitle = {XVIII. Magyar Számítógépes Nyelvészeti Konferencia},
    year = {2022},
    publisher = {Szegedi Tudományegyetem, Informatikai Intézet},
    address = {Szeged, Magyarország},
    author = {Laki, László and Yang, Zijian Győző},
    pages = {357--372}
}
```
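A minimal usage sketch, not part of the original card: it assumes the checkpoint id listed in this record (NYTK/translation-mt5-small-128-en-hu) and the standard Hugging Face seq2seq API, and it reuses the prompt from the record's widget metadata. The HuSpaCy pre-tokenization mentioned under Limitations is omitted here.

```python
# Hedged sketch: translate English to Hungarian with the mT5 checkpoint named in this record.
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_id = "NYTK/translation-mt5-small-128-en-hu"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

# The card specifies the task prefix and a 128-token source/target limit.
text = ("translate English to Hungarian: This may not make much sense to you, sir, "
        "but I'd like to ask your permission to date your daughter.")
inputs = tokenizer(text, return_tensors="pt", max_length=128, truncation=True)
outputs = model.generate(**inputs, max_length=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```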
{"language": ["en", "hu"], "license": "apache-2.0", "tags": ["translation"], "metrics": ["sacrebleu", "chrf"], "widget": [{"text": "translate English to Hungarian: This may not make much sense to you, sir, but I'd like to ask your permission to date your daughter."}]}
translation
NYTK/translation-mt5-small-128-en-hu
[ "transformers", "pytorch", "mt5", "text2text-generation", "translation", "en", "hu", "license:apache-2.0", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[ "en", "hu" ]
TAGS #transformers #pytorch #mt5 #text2text-generation #translation #en #hu #license-apache-2.0 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
mT5 Translation model
=====================

For further models, scripts and details, see our repository or our demo site.

* Source language: English
* Target language: Hungarian
* Pretrained model used: mT5-small
* Finetuned on subcorpora from OPUS
	+ Segments: 56.837.602
* prefix: "translate English to Hungarian: "

Limitations
-----------

* tokenized input text (tokenizer: HuSpaCy)
* max\_source\_length = 128
* max\_target\_length = 128

Results
-------

If you use this model, please cite the following paper:
[]
[ "TAGS\n#transformers #pytorch #mt5 #text2text-generation #translation #en #hu #license-apache-2.0 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 64 ]
[ "passage: TAGS\n#transformers #pytorch #mt5 #text2text-generation #translation #en #hu #license-apache-2.0 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ -0.03012421913444996, 0.06119458004832268, -0.005736252758651972, 0.023281440138816833, 0.12731942534446716, 0.0076950350776314735, 0.14911067485809326, 0.12369327247142792, -0.035028308629989624, -0.05117388814687729, 0.15245118737220764, 0.15901730954647064, 0.030278697609901428, 0.05043081194162369, -0.05533650517463684, -0.25035035610198975, 0.06782747060060501, 0.017396027222275734, -0.0011117703979834914, 0.1040365993976593, 0.09201504290103912, -0.03494405373930931, 0.10699646919965744, -0.012998126447200775, -0.052233725786209106, 0.01638597622513771, 0.015052243135869503, -0.11135940253734589, 0.08684442192316055, 0.05516466125845909, 0.033983733505010605, 0.04060162231326103, -0.007035200949758291, -0.2034655213356018, 0.015819260850548744, 0.007812012452632189, -0.07048661261796951, 0.05086607486009598, 0.07340044528245926, -0.05074597895145416, 0.12816502153873444, 0.02663305588066578, -0.0634818896651268, 0.08309716731309891, -0.10017582774162292, -0.049059052020311356, -0.09160304069519043, 0.06242898479104042, 0.07556197792291641, 0.09295839071273804, 0.016893550753593445, 0.1192094162106514, -0.07855207473039627, 0.07269471883773804, 0.16452713310718536, -0.38936519622802734, 0.008413265459239483, 0.03411959856748581, 0.06018590182065964, 0.1142965778708458, -0.00801903661340475, 0.07070142775774002, 0.062450800091028214, 0.029098644852638245, -0.016464730724692345, -0.08082984387874603, -0.12188971787691116, 0.03449036553502083, -0.08448049426078796, -0.04405617713928223, 0.2665291428565979, -0.03990503028035164, 0.04538101702928543, -0.00913927610963583, -0.07445546984672546, -0.028978217393159866, -0.019216105341911316, 0.04323025047779083, -0.0025353487581014633, 0.09441318362951279, 0.09166225790977478, -0.05185571312904358, -0.12595579028129578, 0.001065360032953322, -0.18651463091373444, 0.008448274806141853, 0.023514145985245705, 0.055465273559093475, -0.17271284759044647, 0.07454252988100052, 0.07482399791479111, -0.12564003467559814, 0.01550989132374525, -0.08665879815816879, 0.12444666028022766, 0.0464007593691349, -0.045733388513326645, 0.00015876004181336612, 0.10627930611371994, 0.12203966826200485, 0.0180306788533926, 0.04425225034356117, -0.09386339783668518, 0.1297619789838791, -0.021654987707734108, 0.05492526665329933, -0.010093934834003448, -0.08730962127447128, 0.078253835439682, -0.108038529753685, 0.05981012061238289, -0.03361833840608597, -0.18563330173492432, -0.05426274240016937, -0.021471407264471054, 0.12141531705856323, 0.04375864192843437, 0.06763410568237305, -0.011275000870227814, -0.00008769508713157848, 0.054886896163225174, -0.06016379967331886, -0.0037580912467092276, 0.017350561916828156, 0.014259646646678448, 0.16605456173419952, 0.06285741925239563, 0.033525701612234116, -0.1340465545654297, 0.03004252165555954, -0.057593934237957, 0.015041333623230457, -0.03687087818980217, -0.03726638853549957, 0.07625441998243332, -0.058812886476516724, 0.02001000940799713, -0.1480977088212967, -0.1564067155122757, 0.013149525970220566, 0.026952624320983887, -0.011156138963997364, -0.059448957443237305, -0.025356220081448555, -0.03298146650195122, 0.04151412472128868, -0.09741539508104324, 0.020851075649261475, -0.0735306441783905, 0.06341809034347534, -0.09451916813850403, 0.05830970034003258, -0.16081947088241577, 0.06973076611757278, -0.11208424717187881, 0.004371588584035635, -0.08578280359506607, 0.03589853644371033, -0.05852457508444786, 0.18214423954486847, -0.05400312319397926, -0.04761961102485657, -0.01582031138241291, 
0.020412355661392212, -0.028517739847302437, 0.18764781951904297, -0.12065570801496506, -0.06934020668268204, 0.19983789324760437, -0.10422919690608978, -0.17790716886520386, 0.10343849658966064, 0.028771143406629562, 0.03241072595119476, 0.06401997804641724, 0.17865099012851715, 0.04244399815797806, -0.026027530431747437, 0.08462335169315338, 0.12980584800243378, -0.03719453886151314, -0.14576347172260284, 0.048096612095832825, -0.04326016083359718, -0.0893280953168869, 0.04696172848343849, 0.01408229861408472, 0.09572596848011017, -0.01777653768658638, -0.0650668814778328, -0.03730258345603943, -0.022761719301342964, -0.012939833104610443, -0.022857731208205223, 0.09077388793230057, -0.05497691035270691, 0.011317715980112553, -0.02240855060517788, 0.015369780361652374, 0.005341912154108286, 0.08262153714895248, -0.03736351802945137, 0.07701487094163895, 0.016605475917458534, 0.05017327144742012, -0.09575734287500381, 0.04062708467245102, -0.026187706738710403, 0.05950850620865822, 0.035156067460775375, 0.04699363559484482, 0.029038794338703156, -0.023389369249343872, -0.020826509222388268, 0.0032077766954898834, 0.13564889132976532, 0.010257105343043804, -0.057491153478622437, -0.1461641788482666, 0.04441084712743759, -0.011876088567078114, 0.015241594985127449, -0.03951026871800423, 0.027127936482429504, -0.00030413095373660326, 0.12066660076379776, -0.04671173170208931, 0.10524633526802063, -0.03287224471569061, 0.023401357233524323, -0.07767730206251144, -0.0017246556235477328, 0.10633203387260437, 0.009058291092514992, -0.10697442293167114, 0.26158323884010315, -0.13140517473220825, 0.19880321621894836, 0.22798112034797668, -0.24039611220359802, 0.08436658978462219, -0.05812081694602966, -0.01869501732289791, -0.009613468311727047, 0.07138217985630035, 0.015785973519086838, 0.06104080006480217, 0.008302322588860989, 0.17938071489334106, -0.07084432244300842, -0.02960101328790188, -0.018147150054574013, -0.06447846442461014, -0.0246974378824234, 0.08521544933319092, 0.16260872781276703, -0.1675427407026291, 0.1778912991285324, 0.29513558745384216, 0.014394816011190414, 0.15813997387886047, -0.05042546987533569, -0.026576533913612366, 0.06257107853889465, -0.011109651997685432, -0.03552859276533127, -0.03671756759285927, -0.15083667635917664, -0.023446757346391678, 0.08170377463102341, 0.08133476972579956, 0.1108366847038269, -0.10940375924110413, -0.03756774961948395, -0.031034188345074654, -0.015633538365364075, -0.03470692038536072, 0.05080723762512207, 0.007017847150564194, 0.11908596009016037, -0.047556135803461075, -0.017793947830796242, 0.09049830585718155, -0.0008746909443289042, -0.122430719435215, 0.15728537738323212, -0.20113463699817657, -0.27319836616516113, -0.19699493050575256, -0.14135020971298218, -0.06516861915588379, 0.008550547063350677, 0.13846483826637268, -0.08352125436067581, -0.03991347551345825, -0.020584315061569214, 0.04421471059322357, -0.067540742456913, -0.02079600840806961, -0.009950279258191586, 0.056513313204050064, -0.03392341732978821, -0.10505062341690063, -0.04085954278707504, 0.027627594769001007, -0.05149156227707863, 0.10441377758979797, -0.11440613865852356, 0.06609752774238586, 0.1342397928237915, 0.04574650526046753, 0.034933026880025864, -0.033453501760959625, 0.14334796369075775, -0.039124034345149994, 0.0036912281066179276, 0.22942227125167847, -0.009901626035571098, 0.06936300545930862, 0.12903271615505219, 0.0015375487273558974, -0.056166332215070724, 0.02001749351620674, -0.06314808875322342, -0.062310196459293365, -0.2874743342399597, 
-0.1251056045293808, -0.133767768740654, 0.08381181955337524, 0.023802319541573524, 0.051390331238508224, 0.08413155376911163, 0.061462219804525375, -0.04447343945503235, 0.054815929383039474, 0.012159453704953194, 0.0849570780992508, 0.244098499417305, -0.034714117646217346, 0.11137198656797409, -0.10206661373376846, -0.05461069196462631, 0.10967826843261719, 0.09504836052656174, 0.12705904245376587, 0.09280721843242645, 0.0817730501294136, 0.0722009688615799, 0.14845600724220276, 0.10466955602169037, 0.12295109778642654, 0.043556418269872665, 0.007391633465886116, -0.058094605803489685, -0.05058877542614937, -0.052579350769519806, 0.04908564314246178, -0.022284729406237602, -0.12449901551008224, -0.06305757910013199, -0.03149304538965225, 0.08713509887456894, 0.14077389240264893, 0.0358286090195179, -0.15055179595947266, 0.031049001961946487, 0.11285743117332458, -0.010917854495346546, -0.07881584018468857, 0.13919998705387115, -0.007977585308253765, -0.12267641723155975, 0.12660111486911774, -0.005152266472578049, 0.15843690931797028, -0.0034413356333971024, 0.0655745193362236, -0.07003867626190186, -0.09323133528232574, 0.0526110976934433, 0.1232118308544159, -0.3687989413738251, 0.203800767660141, -0.004952235147356987, -0.04038519412279129, -0.08649324625730515, -0.00345149589702487, 0.053383585065603256, 0.19824975728988647, 0.09548717737197876, 0.022661352530121803, -0.10238807648420334, 0.0005816256161779165, -0.05296789109706879, 0.050663262605667114, 0.06922199577093124, 0.018153825774788857, -0.02883920632302761, -0.05503254756331444, -0.0016501692589372396, -0.038230910897254944, 0.08369750529527664, -0.05382871627807617, -0.16806438565254211, 0.06019318103790283, 0.08512952923774719, 0.07824735343456268, -0.024206412956118584, -0.020988021045923233, -0.11300548911094666, 0.14392724633216858, -0.09530454128980637, -0.08228807896375656, -0.10897024720907211, -0.12126200646162033, 0.06283251941204071, -0.059203267097473145, 0.028184857219457626, -0.06384647637605667, -0.013597486540675163, -0.06546809524297714, -0.20185086131095886, 0.11262129992246628, -0.12115216255187988, -0.057091765105724335, -0.029509123414754868, 0.16432592272758484, -0.0850682258605957, 0.03671884909272194, 0.01853145658969879, -0.018102411180734634, -0.10449302941560745, -0.12976574897766113, -0.027649283409118652, 0.04677760973572731, 0.05065310001373291, 0.004855792038142681, -0.1290539801120758, -0.025947105139493942, -0.025074588134884834, -0.06780260801315308, 0.25578564405441284, 0.1578608900308609, -0.054066598415374756, 0.2001018226146698, 0.19476766884326935, -0.12609867751598358, -0.3163837492465973, -0.14233146607875824, -0.1483708918094635, -0.04830048605799675, -0.02117086760699749, -0.15600991249084473, 0.07496500015258789, 0.006920720916241407, -0.04332192614674568, 0.10978484153747559, -0.25323376059532166, -0.08757973462343216, 0.15790767967700958, -0.024529358372092247, 0.32607489824295044, -0.1416778862476349, -0.09744871407747269, -0.100704625248909, -0.23325522243976593, 0.11692418903112411, -0.1353263556957245, 0.06987981498241425, -0.00897324550896883, 0.09442006796598434, -0.010224561206996441, -0.039290424436330795, 0.12443143129348755, 0.015933021903038025, -0.011581756174564362, -0.11459632217884064, 0.05620882287621498, 0.0756182074546814, 0.005747559480369091, 0.028197001665830612, -0.16635967791080475, 0.018306707963347435, -0.1349550187587738, -0.007569972425699234, -0.06155870854854584, 0.057668160647153854, -0.015977511182427406, -0.04860885068774223, 
-0.009687214158475399, -0.03732816129922867, 0.06163955479860306, 0.02299511432647705, 0.20342621207237244, -0.015214715152978897, 0.1007375419139862, 0.15086930990219116, 0.1327122300863266, -0.18767917156219482, 0.09518581628799438, -0.10030939429998398, -0.06662168353796005, 0.071778304874897, -0.10483893007040024, 0.04790119081735611, 0.12589839100837708, -0.0716283991932869, 0.07056192308664322, 0.08124373108148575, 0.008073332719504833, -0.015901876613497734, 0.12308186292648315, -0.1846296638250351, -0.010648190043866634, -0.05083993077278137, 0.07209920883178711, 0.09331245720386505, 0.07639497518539429, 0.17989611625671387, -0.011733146384358406, -0.03662256523966789, 0.004171303007751703, 0.04966158792376518, -0.08508508652448654, 0.05979374423623085, 0.013640842400491238, -0.002728187944740057, -0.14585570991039276, 0.12796664237976074, 0.011150338687002659, -0.12945353984832764, -0.005568896885961294, 0.18696686625480652, -0.1500338912010193, -0.1332184374332428, -0.011519922874867916, 0.08907816559076309, -0.13855206966400146, -0.09257099777460098, -0.04235365241765976, -0.1697482019662857, 0.07168670743703842, 0.09250195324420929, 0.057966481894254684, 0.045075807720422745, -0.03538932278752327, -0.07792988419532776, 0.03281556814908981, -0.016774678602814674, -0.04439006373286247, 0.0034198432695120573, -0.08853990584611893, 0.045505549758672714, -0.005447360686957836, 0.12115491181612015, -0.05027720704674721, -0.013888903893530369, -0.10599911957979202, 0.027878517284989357, -0.20302452147006989, -0.023529600352048874, -0.09825914353132248, -0.035322535783052444, -0.015669485554099083, -0.06045733392238617, -0.08482389152050018, -0.010956056416034698, -0.12580721080303192, -0.015199168585240841, -0.06393686681985855, 0.08716177940368652, -0.07269575446844101, -0.0009857548866420984, 0.08794190734624863, -0.030140118673443794, 0.11957558244466782, 0.11024752259254456, -0.09566573798656464, 0.10738423466682434, -0.11288941651582718, -0.09865406155586243, 0.08220367133617401, 0.05312743037939072, 0.036566171795129776, 0.04823172464966774, -0.014958469197154045, 0.1151602566242218, 0.0108727365732193, 0.034253984689712524, 0.007650292012840509, -0.11521503329277039, -0.02935955487191677, -0.05061638355255127, -0.10216113179922104, -0.02769636921584606, -0.06377580761909485, 0.071767158806324, 0.0042570107616484165, 0.12974560260772705, -0.05394069850444794, 0.07459127902984619, -0.05934203416109085, 0.03505942225456238, -0.001601535128429532, -0.1322481781244278, -0.09796611964702606, -0.10145173221826553, 0.0009078817092813551, 0.00476880231872201, 0.2566799819469452, 0.0023661628365516663, -0.038641054183244705, 0.05090663582086563, 0.08238908648490906, -0.04883802309632301, 0.009795521385967731, 0.23493756353855133, 0.06576208770275116, -0.015168624930083752, -0.12139841914176941, 0.010144807398319244, 0.01832139864563942, -0.02712063305079937, 0.133482426404953, 0.09552353620529175, 0.07281477749347687, 0.11148224771022797, 0.0090011116117239, 0.01428293902426958, -0.09465676546096802, -0.09822990000247955, 0.03279714658856392, 0.09098482131958008, -0.026268374174833298, 0.11324656009674072, 0.16926753520965576, -0.04993331432342529, 0.01503363810479641, -0.046922020614147186, -0.01830534264445305, -0.18156084418296814, -0.13239015638828278, -0.06130644679069519, -0.16155219078063965, -0.0055177160538733006, -0.0801444798707962, 0.0657796785235405, 0.03319879621267319, 0.08123582601547241, -0.08243367820978165, 0.014373372308909893, -0.005408754106611013, 
-0.11214099079370499, 0.04130071774125099, -0.024720098823308945, 0.06359905004501343, -0.053086359053850174, -0.03321593999862671, -0.052932385355234146, -0.032906461507081985, -0.024480214342474937, 0.056466855108737946, -0.02297704853117466, 0.03419407084584236, -0.10488206893205643, -0.062945656478405, -0.04560399800539017, 0.02855812944471836, -0.0009456022526137531, 0.19882771372795105, 0.018396373838186264, -0.016532670706510544, 0.06436686962842941, 0.162070631980896, -0.07604042440652847, -0.1452019065618515, -0.043812043964862823, 0.1804371178150177, 0.039099182933568954, 0.07090497016906738, -0.022758053615689278, 0.028458524495363235, -0.08665724098682404, 0.3612220883369446, 0.30973950028419495, -0.1134825199842453, -0.007668625097721815, 0.0036473581567406654, 0.033586472272872925, 0.08699434250593185, 0.12055911868810654, 0.14118662476539612, 0.20168565213680267, -0.06354645639657974, -0.015258733183145523, -0.06132802367210388, -0.008808441460132599, -0.1508403718471527, 0.11659428477287292, -0.025694193318486214, -0.10763968527317047, 0.019213492050766945, 0.08999517560005188, -0.19624316692352295, 0.07307286560535431, -0.09839751571416855, -0.13997170329093933, -0.03110342100262642, -0.01112690195441246, 0.2011229395866394, 0.051375292241573334, 0.020163850858807564, -0.013468805700540543, -0.06277480721473694, 0.1273600310087204, 0.00728821475058794, -0.19275033473968506, -0.01873440109193325, 0.06736741960048676, -0.12474706768989563, 0.05606522038578987, 0.004357900004833937, 0.004093902185559273, 0.07940870523452759, 0.09155236184597015, -0.053531620651483536, 0.05625979229807854, 0.021882787346839905, -0.01664876751601696, 0.009470565244555473, -0.04140753671526909, 0.0008544982993043959, -0.07065396010875702, 0.04895639419555664, -0.08845516294240952, 0.05557098984718323, 0.034238480031490326, -0.05112785845994949, 0.008946594782173634, 0.014253830537199974, -0.07233871519565582, 0.060636021196842194, 0.03936086595058441, -0.03061705082654953, -0.022037280723452568, -0.07153080403804779, -0.03844400495290756, -0.016073493286967278, -0.13511987030506134, -0.042565569281578064, -0.06801547855138779, -0.09303370863199234, 0.04587217792868614, 0.04396318271756172, -0.16389402747154236, 0.010259256698191166, -0.0965229719877243, 0.032804396003484726, -0.20356489717960358, 0.06962146610021591, 0.08108217269182205, -0.014685899950563908, 0.006107510533183813, -0.11384868621826172, 0.05323712155222893, 0.06078515946865082, -0.08244850486516953, -0.07623148709535599 ]
null
null
transformers
# My Awesome Model
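The card itself carries no usage details, so the sketch below is speculative: it is based only on this record's metadata (a GPT-2 based conversational model under the id nabarun/DialoGPT-small-joshua) and follows the common DialoGPT single-turn chat pattern rather than anything stated by the author.

```python
# Speculative sketch of one chat turn with the conversational checkpoint named in this record.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "nabarun/DialoGPT-small-joshua"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# Encode one user turn, append the end-of-sequence token, then generate a reply.
user_input = "Hello, how are you?"
input_ids = tokenizer.encode(user_input + tokenizer.eos_token, return_tensors="pt")
reply_ids = model.generate(input_ids, max_length=100, pad_token_id=tokenizer.eos_token_id)

# Decode only the newly generated tokens, i.e. the model's reply.
print(tokenizer.decode(reply_ids[0][input_ids.shape[-1]:], skip_special_tokens=True))
```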
{"tags": ["conversational"]}
text-generation
nabarun/DialoGPT-small-joshua
[ "transformers", "pytorch", "gpt2", "text-generation", "conversational", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[]
TAGS #transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
# My Awesome Model
[ "# My Awesome Model" ]
[ "TAGS\n#transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "# My Awesome Model" ]
[ 51, 4 ]
[ "passage: TAGS\n#transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n# My Awesome Model" ]
[ -0.05259015038609505, 0.05521034821867943, -0.005910294596105814, 0.017722278833389282, 0.15250112116336823, 0.02286236733198166, 0.07657632976770401, 0.09513414651155472, -0.025391526520252228, -0.047348517924547195, 0.15119488537311554, 0.19781284034252167, -0.020334534347057343, 0.101333387196064, -0.04688440263271332, -0.3143521845340729, 0.06439975649118423, 0.05463787540793419, -0.015605635941028595, 0.12023304402828217, 0.09468326717615128, -0.0530015267431736, 0.08742043375968933, -0.012155864387750626, -0.1293085366487503, -0.0027921805158257484, -0.002384399762377143, -0.10180269181728363, 0.11194873601198196, 0.033712033182382584, 0.05166437849402428, 0.0182647667825222, -0.05843055993318558, -0.139859139919281, 0.03845210000872612, -0.015005595050752163, -0.05602653697133064, 0.05648263916373253, 0.059830192476511, -0.07164353132247925, 0.1669619083404541, 0.13275989890098572, -0.04237370565533638, 0.056127581745386124, -0.17620700597763062, 0.017941240221261978, 0.01800798624753952, 0.019184142351150513, 0.05306641012430191, 0.10830496996641159, -0.03932326287031174, 0.09217294305562973, -0.11410652846097946, 0.08313368260860443, 0.07800983637571335, -0.29151955246925354, -0.025312699377536774, 0.10440942645072937, 0.06437138468027115, 0.048375632613897324, -0.013386772945523262, 0.0621674507856369, 0.02149512618780136, 0.008602659218013287, 0.02225899137556553, -0.06727100163698196, -0.05789240449666977, 0.032748885452747345, -0.0967593789100647, -0.03634428232908249, 0.19753605127334595, -0.024647634476423264, 0.053590498864650726, -0.06265407055616379, -0.11300963163375854, -0.039751436561346054, -0.050429005175828934, -0.029761891812086105, -0.05090925097465515, 0.09489558637142181, 0.004352911841124296, -0.09534718841314316, -0.13405443727970123, -0.01370926946401596, -0.1618979275226593, 0.15892250835895538, 0.012579603120684624, 0.046201955527067184, -0.19210097193717957, 0.11465331166982651, -0.03857925534248352, -0.08259090781211853, 0.030513519421219826, -0.12010065466165543, 0.03160654753446579, -0.008132083341479301, -0.019599268212914467, -0.049325279891490936, 0.061037879437208176, 0.08101806789636612, 0.018783701583743095, 0.005755073390901089, 0.018167443573474884, 0.05343452841043472, 0.05891622602939606, 0.10033947974443436, -0.02891627699136734, -0.0625043511390686, 0.0025436533614993095, -0.12051084637641907, -0.01122665498405695, -0.05357983708381653, -0.18095199763774872, 0.002246231772005558, 0.02455340512096882, 0.05192234739661217, 0.011778532527387142, 0.09955989569425583, -0.028496338054537773, -0.026898741722106934, 0.06898727267980576, 0.002862759632989764, -0.015707949176430702, -0.005368964280933142, -0.010934269987046719, 0.11485416442155838, -0.023099146783351898, 0.04774846136569977, -0.12022071331739426, 0.020393015816807747, -0.07851235568523407, -0.0019349842332303524, -0.06214260309934616, -0.04864754155278206, -0.0019346009939908981, -0.06985589861869812, 0.021118074655532837, -0.14833110570907593, -0.17990200221538544, -0.005064866971224546, 0.021302316337823868, -0.052403319627046585, -0.09162671118974686, -0.0982397273182869, -0.02586611732840538, 0.03574685752391815, -0.05873546749353409, 0.013170980848371983, -0.06884536147117615, 0.06542801111936569, 0.0029820678755640984, 0.05682007595896721, -0.14085575938224792, 0.08719147741794586, -0.12582023441791534, -0.023288866505026817, -0.061977192759513855, 0.1109607070684433, 0.024780582636594772, 0.1267160177230835, 0.004311583004891872, -0.0033308975398540497, -0.08729329705238342, 
0.08271238207817078, -0.04243258014321327, 0.22770646214485168, -0.10479787737131119, -0.08809807151556015, 0.2632525563240051, -0.05423165112733841, -0.16432519257068634, 0.10179096460342407, -0.014350244775414467, 0.12198644131422043, 0.13850919902324677, 0.16080057621002197, 0.007628654129803181, 0.03313867375254631, 0.10115300863981247, 0.08631709218025208, -0.08573295921087265, -0.0611947737634182, 0.023627014830708504, -0.011463395319879055, -0.10670105367898941, 0.046802595257759094, 0.04794782027602196, 0.08188598603010178, -0.04982871189713478, -0.028600862249732018, -0.01972118206322193, -0.044152840971946716, 0.05264130234718323, 0.007675500120967627, 0.13217447698116302, -0.03674980252981186, -0.03692879155278206, -0.023745311424136162, 0.01699630729854107, -0.03115241602063179, 0.007061392068862915, -0.05687357112765312, 0.11091547459363937, -0.03406180441379547, 0.051789235323667526, -0.16953988373279572, -0.04873261600732803, -0.02087729424238205, 0.1402055323123932, 0.04973345249891281, 0.1329866498708725, 0.06287940591573715, -0.010758201591670513, 0.00859389640390873, 0.007998145185410976, 0.13181665539741516, 0.007865442894399166, -0.07660657912492752, -0.047718439251184464, 0.09176599979400635, -0.05973208695650101, 0.06147782504558563, -0.098741315305233, -0.004747362341731787, -0.01433002483099699, 0.08674649894237518, 0.006352655589580536, 0.029382232576608658, -0.006192679051309824, 0.003654100699350238, -0.06161240115761757, 0.017873648554086685, 0.12492607533931732, -0.01421504095196724, -0.07439801841974258, 0.22084392607212067, -0.15798072516918182, 0.18006981909275055, 0.18165533244609833, -0.3081994652748108, 0.024602634832262993, -0.08860466629266739, -0.036338552832603455, 0.03426366671919823, 0.0491504967212677, -0.034147560596466064, 0.16587987542152405, -0.016766328364610672, 0.201018825173378, -0.03547777235507965, -0.01287798210978508, -0.010399105958640575, -0.03656993433833122, -0.010632630437612534, 0.09065473079681396, 0.15122920274734497, -0.1677125245332718, 0.18270380795001984, 0.1660280078649521, 0.06873020529747009, 0.17776396870613098, 0.034313347190618515, -0.006856906693428755, 0.07112615555524826, -0.022670727223157883, -0.07675548642873764, -0.049287427216768265, -0.26302891969680786, -0.027947327122092247, 0.06471601128578186, 0.04510856419801712, 0.11924877762794495, -0.10971947014331818, -0.037208184599876404, 0.010892451740801334, -0.013165894895792007, 0.02132410928606987, 0.09682225435972214, 0.01171150617301464, 0.11804302036762238, -0.021027036011219025, -0.05209195241332054, 0.0898953229188919, 0.02727191150188446, -0.0787680521607399, 0.19168277084827423, -0.10074768215417862, -0.3233809769153595, -0.11354339867830276, -0.18166927993297577, -0.017843691632151604, 0.05878754332661629, 0.08049646019935608, -0.09228580445051193, -0.02625267766416073, -0.01639235019683838, 0.0758359357714653, -0.09145816415548325, -0.015880629420280457, -0.09367848187685013, 0.034986745566129684, -0.10827737301588058, -0.07011983543634415, -0.05141967162489891, -0.03368452936410904, -0.04457031562924385, 0.13157756626605988, -0.12242637574672699, 0.06396433711051941, 0.2076517641544342, 0.06227295100688934, 0.05622440204024315, -0.0229496993124485, 0.23288212716579437, -0.10842552781105042, 0.02383521944284439, 0.1717897206544876, -0.03566030040383339, 0.0727933868765831, 0.13435456156730652, 0.006721907295286655, -0.08144525438547134, 0.03465581312775612, -0.04592517390847206, -0.08630958944559097, -0.20441576838493347, -0.14156180620193481, 
-0.12814727425575256, 0.07913564145565033, 0.03285396471619606, 0.05478321388363838, 0.15024253726005554, 0.11386489123106003, 0.007987297140061855, 0.00976672861725092, -0.006888182368129492, 0.05438044294714928, 0.17482298612594604, -0.05838097631931305, 0.10041683167219162, -0.037591226398944855, -0.1924494504928589, 0.08022978901863098, 0.04309763014316559, 0.08280511945486069, 0.07474655658006668, 0.0856199786067009, 0.013537914492189884, 0.03723837807774544, 0.10897084325551987, 0.1165735274553299, 0.031679023057222366, -0.038079675287008286, -0.04882059991359711, -0.026300756260752678, -0.03285675123333931, 0.05745977535843849, 0.07790146768093109, -0.1608346849679947, -0.06348084658384323, -0.06350091099739075, 0.07662643492221832, 0.09017108380794525, 0.11811108142137527, -0.21219493448734283, 0.01579318381845951, 0.092556893825531, -0.0494147390127182, -0.1304239183664322, 0.07402537018060684, -0.00466050673276186, -0.1397053301334381, 0.037663187831640244, -0.014095795340836048, 0.1359514445066452, -0.0778401643037796, 0.10336452722549438, -0.08307972550392151, -0.06147889420390129, 0.03632286190986633, 0.1355396956205368, -0.30774354934692383, 0.2137020230293274, -0.022472934797406197, -0.05296783149242401, -0.10508129745721817, -0.011727629229426384, 0.020913105458021164, 0.09079049527645111, 0.10090240091085434, -0.0025442070327699184, 0.0061299679800868034, -0.0345483273267746, -0.053218815475702286, 0.024456629529595375, 0.07957815378904343, -0.08542889356613159, 0.0017540202243253589, -0.02361489273607731, -0.004407065454870462, -0.032844748347997665, -0.01189463958144188, -0.011617658659815788, -0.16786961257457733, 0.06556065380573273, -0.002625665394589305, 0.11129079759120941, 0.03491498529911041, 0.0024013579823076725, -0.1009332686662674, 0.19977013766765594, 0.01796281896531582, -0.08052749931812286, -0.08830537647008896, -0.03254766762256622, 0.03660419583320618, -0.06121435388922691, 0.027481911703944206, -0.06916457414627075, 0.033381566405296326, -0.06441576033830643, -0.18325145542621613, 0.1268530637025833, -0.10945470631122589, -0.03609596937894821, -0.04321056231856346, 0.18323224782943726, -0.00929707009345293, -0.0011623724130913615, 0.05866571143269539, 0.0032208464108407497, -0.1347510665655136, -0.10740556567907333, 0.020214511081576347, -0.015275230631232262, 0.009142245166003704, 0.05559912323951721, -0.009665844030678272, 0.00045268211397342384, -0.039558928459882736, -0.023234419524669647, 0.32348164916038513, 0.10732097923755646, -0.04944206401705742, 0.17007054388523102, 0.13087597489356995, -0.0827672928571701, -0.30699312686920166, -0.10971353948116302, -0.10529600828886032, -0.026918673887848854, -0.037983208894729614, -0.19617970287799835, 0.09504909813404083, -0.03528566658496857, -0.022136637941002846, 0.11253651231527328, -0.2759084105491638, -0.0770430713891983, 0.1826775223016739, 0.003314757253974676, 0.3998824954032898, -0.10265109688043594, -0.08777514100074768, -0.06741699576377869, -0.1120782196521759, 0.2033512443304062, -0.05560711398720741, 0.08663415163755417, -0.00517998356372118, 0.15513743460178375, 0.055607251822948456, -0.02176513522863388, 0.08932057023048401, -0.005811662413179874, -0.0546204075217247, -0.1219351515173912, -0.03444604203104973, -0.009159418754279613, 0.007239421829581261, 0.03589896112680435, -0.04242607578635216, 0.01279151439666748, -0.1399589478969574, -0.045490626245737076, -0.0764620453119278, 0.024699507281184196, 0.021008269861340523, -0.0652410089969635, -0.01643640361726284, -0.03945036977529526, 
-0.012804778292775154, 0.03164318576455116, 0.15236099064350128, -0.06478006392717361, 0.1476556956768036, 0.04904455319046974, 0.15412139892578125, -0.14745712280273438, -0.02258288487792015, -0.06896031647920609, -0.05498642474412918, 0.04900865629315376, -0.10053684562444687, 0.050061121582984924, 0.1202658861875534, -0.0742902010679245, 0.0987328365445137, 0.0922594666481018, -0.01938629150390625, 0.0012483424507081509, 0.1226617842912674, -0.2489612102508545, -0.07742628455162048, -0.10509459674358368, 0.013337249867618084, 0.10138551890850067, 0.06995654851198196, 0.17304721474647522, -0.0037713919300585985, -0.036284226924180984, -0.0064643872901797295, 0.025414984673261642, -0.03540204465389252, 0.05724727362394333, -0.002706433180719614, 0.016663886606693268, -0.15213344991207123, 0.060368724167346954, -0.00024176653823815286, -0.1438901126384735, -0.013603870756924152, 0.16073721647262573, -0.11208858340978622, -0.15145981311798096, -0.007263668347150087, 0.13685113191604614, -0.13171035051345825, -0.03302847594022751, -0.03708777576684952, -0.170182466506958, 0.07439173012971878, 0.1024777740240097, 0.08549231290817261, 0.08025266975164413, -0.06620611250400543, -0.00807863101363182, -0.011656313203275204, -0.026087598875164986, 0.031810320913791656, -0.023377234116196632, -0.09044221043586731, 0.03872343525290489, -0.026654237881302834, 0.13591371476650238, -0.09607382118701935, -0.09331836551427841, -0.135749951004982, 0.039314381778240204, -0.12405620515346527, -0.08138058334589005, -0.12200927734375, -0.0591500885784626, 0.00224387738853693, -0.0001289021165575832, -0.035674065351486206, -0.06687422841787338, -0.13582271337509155, 0.04366770386695862, -0.04484611004590988, 0.0013091047294437885, -0.040241483598947525, 0.04561002552509308, 0.06766383349895477, -0.03493715822696686, 0.13722217082977295, 0.11722734570503235, -0.07864081114530563, 0.08946478366851807, -0.16657429933547974, -0.0683990865945816, 0.08854512125253677, 0.008173754438757896, 0.06165994703769684, 0.06743349134922028, 0.033807408064603806, 0.06109451875090599, 0.04151686280965805, 0.03488299250602722, 0.01739438995718956, -0.09271225333213806, 0.015541021712124348, 0.022296719253063202, -0.1294609159231186, -0.04801803454756737, -0.029226921498775482, 0.00939185917377472, 0.008117396384477615, 0.11003357172012329, -0.0426274873316288, 0.09439733624458313, -0.05888751894235611, 0.036728594452142715, 0.016222506761550903, -0.16461637616157532, -0.020102784037590027, -0.11915475130081177, 0.028684545308351517, -0.0033096212428063154, 0.25625869631767273, 0.06346847862005234, 0.020517030730843544, 0.01250078622251749, 0.08567021042108536, 0.07241600006818771, 0.02562166005373001, 0.1956365555524826, 0.10854171961545944, -0.05020022392272949, -0.12334850430488586, 0.09686340391635895, 0.034720368683338165, 0.06432123482227325, 0.13385434448719025, -0.026959087699651718, 0.002498799469321966, 0.11019360274076462, 0.011678861454129219, 0.04961980879306793, -0.09859088063240051, -0.16400282084941864, -0.00994415208697319, 0.061864156275987625, -0.04559077322483063, 0.12240655720233917, 0.11382720619440079, -0.020697353407740593, 0.03180128335952759, -0.010503606870770454, -0.05694027617573738, -0.16998925805091858, -0.1630837321281433, -0.08357038348913193, -0.11794789135456085, -0.0027763545513153076, -0.11386270076036453, 0.013879159465432167, 0.06452289968729019, 0.0604364387691021, -0.09019444137811661, 0.08891061693429947, 0.0687386617064476, -0.11843101680278778, 0.08828350901603699, 
-0.033263903111219406, 0.07249268144369125, 0.0015160300536081195, 0.003872724948450923, -0.13800905644893646, 0.032393742352724075, -0.008493867702782154, 0.04159298539161682, -0.09244006127119064, 0.022458361461758614, -0.11297028511762619, -0.07659684121608734, -0.07971972227096558, 0.05093973129987717, -0.03541257977485657, 0.1390930563211441, 0.001295371213927865, -0.035233911126852036, 0.024190181866288185, 0.22729112207889557, -0.06350252777338028, -0.030667411163449287, -0.0618741400539875, 0.21414142847061157, 0.024466563016176224, 0.10703565180301666, -0.016775688156485558, 0.019240234047174454, -0.0764411985874176, 0.3689337372779846, 0.344390869140625, -0.1225387305021286, -0.0015968306688591838, 0.031062176451086998, 0.036916591227054596, 0.11621878296136856, 0.12602226436138153, 0.057955991476774216, 0.2995031177997589, -0.08396036922931671, -0.002026971662417054, -0.02688612788915634, -0.03624163940548897, -0.04409930482506752, 0.10547586530447006, 0.06835740804672241, -0.03330419585108757, -0.027012333273887634, 0.1376710683107376, -0.2966996431350708, 0.12323499470949173, -0.15714547038078308, -0.1487535685300827, -0.06873904913663864, -0.005042468197643757, 0.08589684963226318, 0.04748665541410446, 0.1069009080529213, -0.019124338403344154, -0.08203735202550888, 0.05766449123620987, 0.0320524163544178, -0.22844897210597992, 0.011852608993649483, 0.08361081779003143, -0.06153005734086037, 0.011767351068556309, -0.017906347289681435, 0.038472190499305725, 0.07790610194206238, 0.025976579636335373, -0.032770540565252304, 0.06325861811637878, -0.005814229138195515, -0.05033424496650696, 0.04302205145359039, 0.05059972032904625, 0.017107632011175156, -0.1511564701795578, 0.07320158183574677, -0.1762860119342804, 0.0566408596932888, -0.005331212189048529, -0.04948166385293007, 0.000018263708625454456, 0.01998119056224823, -0.06808236241340637, 0.05880929157137871, 0.0952666699886322, -0.012173139490187168, -0.002317852806299925, -0.056667573750019073, 0.007662574760615826, -0.0679154172539711, -0.0747012197971344, -0.10497893393039703, -0.1338900774717331, -0.11392296850681305, 0.10846775025129318, -0.011928223073482513, -0.19833622872829437, 0.02906924858689308, -0.11258108913898468, 0.04933213070034981, -0.13360801339149475, 0.08599711954593658, 0.1282832771539688, 0.021543797105550766, -0.01265349704772234, 0.04020093381404877, 0.01591683179140091, 0.08550478518009186, -0.09200563281774521, -0.10515180230140686 ]
null
null
transformers
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. -->

# distilbert-base-uncased-finetuned-down-sampled-evaluating-student-writing

This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 2.3408

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 64
- eval_batch_size: 64
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 3.0

### Training results

| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:----:|:---------------:|
| 2.5869 | 1.0 | 157 | 2.3949 |
| 2.4142 | 2.0 | 314 | 2.3551 |
| 2.3792 | 3.0 | 471 | 2.2840 |

### Framework versions

- Transformers 4.12.5
- Pytorch 1.9.1
- Datasets 1.16.1
- Tokenizers 0.10.3
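The card lists no usage code, so the following is only an illustrative sketch: it assumes the checkpoint id given in this record and the standard transformers fill-mask pipeline (matching the record's fill-mask pipeline tag); the example sentence is made up.

```python
# Hedged sketch: query the fine-tuned masked-language model named in this record.
from transformers import pipeline

model_id = "NahedAbdelgaber/distilbert-base-uncased-finetuned-down-sampled-evaluating-student-writing"
fill_mask = pipeline("fill-mask", model=model_id)

# DistilBERT uses the [MASK] token; the pipeline returns the top predictions with scores.
for prediction in fill_mask("The student wrote a [MASK] essay."):
    print(prediction["token_str"], round(prediction["score"], 3))
```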
{"license": "apache-2.0", "tags": ["generated_from_trainer"], "model-index": [{"name": "distilbert-base-uncased-finetuned-down-sampled-evaluating-student-writing", "results": []}]}
fill-mask
NahedAbdelgaber/distilbert-base-uncased-finetuned-down-sampled-evaluating-student-writing
[ "transformers", "pytorch", "tensorboard", "distilbert", "fill-mask", "generated_from_trainer", "license:apache-2.0", "autotrain_compatible", "endpoints_compatible", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[]
TAGS #transformers #pytorch #tensorboard #distilbert #fill-mask #generated_from_trainer #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us
distilbert-base-uncased-finetuned-down-sampled-evaluating-student-writing
=========================================================================

This model is a fine-tuned version of distilbert-base-uncased on an unknown dataset.
It achieves the following results on the evaluation set:

* Loss: 2.3408

Model description
-----------------

More information needed

Intended uses & limitations
---------------------------

More information needed

Training and evaluation data
----------------------------

More information needed

Training procedure
------------------

### Training hyperparameters

The following hyperparameters were used during training:

* learning\_rate: 2e-05
* train\_batch\_size: 64
* eval\_batch\_size: 64
* seed: 42
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: linear
* num\_epochs: 3.0

### Training results

### Framework versions

* Transformers 4.12.5
* Pytorch 1.9.1
* Datasets 1.16.1
* Tokenizers 0.10.3
[ "### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 64\n* eval\\_batch\\_size: 64\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 3.0", "### Training results", "### Framework versions\n\n\n* Transformers 4.12.5\n* Pytorch 1.9.1\n* Datasets 1.16.1\n* Tokenizers 0.10.3" ]
[ "TAGS\n#transformers #pytorch #tensorboard #distilbert #fill-mask #generated_from_trainer #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n", "### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 64\n* eval\\_batch\\_size: 64\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 3.0", "### Training results", "### Framework versions\n\n\n* Transformers 4.12.5\n* Pytorch 1.9.1\n* Datasets 1.16.1\n* Tokenizers 0.10.3" ]
[ 57, 98, 4, 31 ]
[ "passage: TAGS\n#transformers #pytorch #tensorboard #distilbert #fill-mask #generated_from_trainer #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 64\n* eval\\_batch\\_size: 64\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 3.0### Training results### Framework versions\n\n\n* Transformers 4.12.5\n* Pytorch 1.9.1\n* Datasets 1.16.1\n* Tokenizers 0.10.3" ]
[ -0.11097101122140884, 0.05671774223446846, -0.0023508314043283463, 0.12307032942771912, 0.1574009358882904, 0.026409825310111046, 0.1249372735619545, 0.1136503592133522, -0.09463489055633545, 0.018202858045697212, 0.1314517706632614, 0.16646255552768707, 0.011054330505430698, 0.1348450630903244, -0.025093520060181618, -0.242731511592865, -0.011722954921424389, 0.04722221568226814, -0.10818060487508774, 0.13814495503902435, 0.0922146812081337, -0.13147437572479248, 0.06882426142692566, 0.018038036301732063, -0.21533453464508057, 0.009961939416825771, 0.019853821024298668, -0.06587382405996323, 0.14972279965877533, 0.000472562707727775, 0.14334997534751892, -0.006151910871267319, 0.08600883930921555, -0.15171311795711517, 0.014299712143838406, 0.0534483902156353, 0.008915268816053867, 0.08155018836259842, 0.04308692738413811, 0.009477450512349606, 0.09631228446960449, -0.08305923640727997, 0.05853509530425072, 0.023553304374217987, -0.12589389085769653, -0.2596920430660248, -0.07980577647686005, 0.01467780489474535, 0.06332506984472275, 0.10972393304109573, 0.007212888915091753, 0.1466134786605835, -0.08938312530517578, 0.09084519743919373, 0.23214596509933472, -0.2741619646549225, -0.06981772929430008, 0.01981351524591446, 0.021370628848671913, 0.03638388216495514, -0.10462427884340286, -0.022127678617835045, 0.04338894411921501, 0.05599820613861084, 0.16067181527614594, -0.032582323998212814, -0.09501132369041443, 0.00046274453052319586, -0.13725388050079346, -0.04113246873021126, 0.11143223196268082, 0.027487969025969505, -0.03659077361226082, -0.03243571147322655, -0.0688529834151268, -0.1393333524465561, -0.046834129840135574, -0.008254223503172398, 0.04186150059103966, -0.04270806536078453, -0.078280009329319, -0.004292390774935484, -0.10644230991601944, -0.07862970978021622, -0.0704263374209404, 0.17721818387508392, 0.04208306595683098, 0.02501276135444641, -0.02868807315826416, 0.10742678493261337, 0.005107004661113024, -0.14640338718891144, 0.031897980719804764, 0.03410245478153229, -0.0056366994976997375, -0.027639025822281837, -0.07319214940071106, -0.07053748518228531, 0.013092086650431156, 0.13240762054920197, -0.06205172836780548, 0.04390697181224823, 0.05400848016142845, 0.052997250109910965, -0.11761519312858582, 0.18386942148208618, -0.03946630656719208, -0.006277984008193016, 0.016119709238409996, 0.05156506225466728, 0.022466929629445076, -0.002407738007605076, -0.1033468022942543, 0.0018240237841382623, 0.09245490282773972, 0.012500313110649586, -0.05729355290532112, 0.05769236758351326, -0.06275399774312973, -0.01315303798764944, 0.013052663765847683, -0.10128869861364365, 0.025806806981563568, -0.0114951366558671, -0.07512752711772919, -0.027272893115878105, 0.050626952201128006, 0.002746719168499112, -0.011276829056441784, 0.11331330984830856, -0.08474794775247574, 0.035174574702978134, -0.11270200461149216, -0.10629244893789291, 0.005885119084268808, -0.07433994859457016, 0.021331265568733215, -0.10109689086675644, -0.17345662415027618, -0.0010568902362138033, 0.07929279655218124, -0.02332754246890545, -0.04569878801703453, -0.019063374027609825, -0.07475867867469788, 0.00786494929343462, -0.007741761859506369, 0.1639249473810196, -0.05604800209403038, 0.12059339135885239, 0.05529296025633812, 0.08768423646688461, -0.044871192425489426, 0.05452701449394226, -0.09425909072160721, 0.011895091272890568, -0.20104797184467316, 0.021874748170375824, -0.05089766904711723, 0.05882801115512848, -0.0825384184718132, -0.12203342467546463, -0.005414751823991537, 
-0.008070740848779678, 0.08622584491968155, 0.09500688314437866, -0.16328415274620056, -0.0795942023396492, 0.16376571357250214, -0.07013178616762161, -0.10258513689041138, 0.12174207717180252, -0.05082519352436066, 0.0442529171705246, 0.05165995657444, 0.12763163447380066, 0.07099790871143341, -0.11109044402837753, 0.038002219051122665, 0.005453215911984444, 0.0393022783100605, -0.07195132970809937, 0.06495638936758041, -0.003013165667653084, -0.007818893529474735, 0.03374223783612251, -0.03612709790468216, 0.07689431309700012, -0.09690573066473007, -0.10252297669649124, -0.04311935603618622, -0.10848312824964523, 0.07445622235536575, 0.07532674819231033, 0.08076170831918716, -0.09458929300308228, -0.08514105528593063, 0.042775169014930725, 0.0794219896197319, -0.05165279284119606, 0.02891542576253414, -0.05636553838849068, 0.06578066945075989, -0.05275287851691246, -0.026531437411904335, -0.1898442953824997, -0.014564595185220242, 0.00617574155330658, -0.03163398429751396, 0.01500257384032011, -0.0008805934921838343, 0.09126686304807663, 0.07624643296003342, -0.06064425781369209, -0.028188422322273254, -0.06384644657373428, -0.008949371986091137, -0.12263912707567215, -0.1900530606508255, -0.04664570465683937, -0.02214939333498478, 0.12274882197380066, -0.15372775495052338, 0.03009440004825592, -0.0676022320985794, 0.07114645838737488, 0.0020068984013050795, -0.013979504816234112, -0.04698893800377846, 0.08320135623216629, -0.014571749605238438, -0.05615217611193657, 0.06798050552606583, 0.003564253216609359, -0.08364176750183105, -0.049096789211034775, -0.08588419109582901, 0.18197767436504364, 0.1305548995733261, -0.12181177735328674, -0.08829483389854431, 0.034461088478565216, -0.07567812502384186, -0.039309803396463394, -0.04045958071947098, 0.038742292672395706, 0.15880180895328522, -0.00963196624070406, 0.13886266946792603, -0.06382579356431961, -0.040931880474090576, 0.03297886997461319, -0.041543006896972656, 0.029795348644256592, 0.08854854851961136, 0.1255166232585907, -0.04206248000264168, 0.13526655733585358, 0.17098163068294525, -0.12178424000740051, 0.1299303025007248, -0.039113324135541916, -0.07759533077478409, -0.018096376210451126, -0.02038475126028061, 0.008745383471250534, 0.12266888469457626, -0.1271645575761795, 0.0068464637733995914, 0.02637021243572235, 0.00009948558727046475, 0.01964007504284382, -0.23591071367263794, -0.04377368092536926, 0.027840999886393547, -0.03837915137410164, -0.018281375989317894, -0.0022505701053887606, 0.00541576137766242, 0.09659863263368607, -0.0012691101292148232, -0.08798395842313766, 0.04473239555954933, 0.007508721202611923, -0.06107628345489502, 0.2174770087003708, -0.08832155913114548, -0.16353817284107208, -0.12733317911624908, -0.06902823597192764, -0.03234335035085678, 0.012815400026738644, 0.054612476378679276, -0.0945233404636383, -0.03505020961165428, -0.038175810128450394, 0.014616127125918865, 0.01254376769065857, 0.05375314876437187, 0.014610004611313343, -0.006091432645916939, 0.09893307834863663, -0.11176077276468277, -0.008370638824999332, -0.046702515333890915, -0.059695493429899216, 0.05814917013049126, 0.056511376053094864, 0.11435960978269577, 0.1424391269683838, -0.015745462849736214, 0.0038061949890106916, -0.015902897343039513, 0.23038159310817719, -0.07227211445569992, -0.025745844468474388, 0.14979968965053558, -0.0008169698412530124, 0.06285487860441208, 0.0881582722067833, 0.07558298856019974, -0.0887628123164177, 0.011323255486786366, 0.02450954169034958, -0.04313500598073006, -0.2026609182357788, 
-0.042833756655454636, -0.06598573178052902, -0.05162816867232323, 0.09980257600545883, 0.023166097700595856, 0.046643923968076706, 0.06838671118021011, 0.04180781543254852, 0.07906801253557205, -0.056322913616895676, 0.043079789727926254, 0.08029258251190186, 0.04586440697312355, 0.12056761980056763, -0.036494944244623184, -0.07607600837945938, 0.01897326670587063, -0.014188941568136215, 0.22750671207904816, -0.0016947761178016663, 0.10900726169347763, 0.07138312608003616, 0.20596617460250854, -0.009599583223462105, 0.10520622879266739, 0.0020850524306297302, -0.053026121109724045, -0.0038246421609073877, -0.0490058958530426, -0.034393224865198135, 0.00967267993837595, -0.043551549315452576, 0.0702485665678978, -0.10642796009778976, -0.012485108338296413, 0.03995175659656525, 0.2714453637599945, 0.031040146946907043, -0.334468275308609, -0.09270206838846207, -0.01594102941453457, -0.021306170150637627, -0.023285577073693275, 0.0005451858160085976, 0.08665519952774048, -0.09343495219945908, 0.03268403559923172, -0.0813184306025505, 0.08226402848958969, 0.0067596142180264, 0.039211202412843704, 0.07464820146560669, 0.11341404914855957, 0.020031364634633064, 0.06996570527553558, -0.3112756907939911, 0.29231157898902893, -0.0019880046602338552, 0.09123066812753677, -0.08514691144227982, 0.013125461526215076, 0.04758569598197937, 0.02817460149526596, 0.0721469447016716, -0.011584192514419556, 0.002414104761555791, -0.19204580783843994, -0.059769850224256516, 0.026611292734742165, 0.08486974984407425, -0.019098086282610893, 0.08656114339828491, -0.016749883070588112, -0.0060530491173267365, 0.07876115292310715, 0.003856638679280877, -0.0724988579750061, -0.0847976878285408, -0.0027269385755062103, 0.01657848246395588, -0.0700821802020073, -0.075752854347229, -0.1231205090880394, -0.1168215349316597, 0.14709752798080444, 0.005628833547234535, -0.03511848300695419, -0.11649467796087265, 0.08324559777975082, 0.10341226309537888, -0.08346150070428848, 0.06902576982975006, 0.0009037468698807061, 0.06288528442382812, 0.01939893327653408, -0.07274925708770752, 0.10092397779226303, -0.08148651570081711, -0.14582166075706482, -0.061036765575408936, 0.10737677663564682, 0.03844381496310234, 0.06989990919828415, -0.017087191343307495, 0.018368447199463844, -0.03912831470370293, -0.08300784975290298, 0.04025401175022125, -0.039619434624910355, 0.07694835960865021, 0.027622273191809654, -0.05008586123585701, 0.011134452186524868, -0.050068456679582596, -0.01861737109720707, 0.16567601263523102, 0.23660777509212494, -0.09754902869462967, 0.022388696670532227, 0.0426514632999897, -0.051313772797584534, -0.20765788853168488, 0.03035096824169159, 0.07369176298379898, 0.018211835995316505, 0.05386089161038399, -0.17822790145874023, 0.12240049242973328, 0.08530975133180618, -0.017848143354058266, 0.1226930320262909, -0.3336997330188751, -0.1253713220357895, 0.13379301130771637, 0.15739230811595917, 0.136666938662529, -0.13554337620735168, -0.01412287075072527, -0.01894170232117176, -0.11402962356805801, 0.06649487465620041, -0.07328623533248901, 0.13089995086193085, -0.04146890714764595, 0.08164630085229874, -0.0006621305947192013, -0.07738673686981201, 0.1273137331008911, -0.005717543885111809, 0.0928165540099144, -0.05299708619713783, 0.0033316307235509157, 0.06426640599966049, -0.03189043328166008, 0.011430318467319012, -0.06965037435293198, 0.029820067808032036, -0.044990409165620804, -0.013090081512928009, -0.0929354727268219, 0.06614195555448532, -0.029645562171936035, -0.0581827312707901, 
-0.01952192559838295, 0.025714805349707603, 0.04116999730467796, -0.01984510011970997, 0.09825891256332397, 0.048885881900787354, 0.15907789766788483, 0.10195557028055191, 0.03204367682337761, -0.047499772161245346, -0.1070818081498146, -0.015299643389880657, -0.018693774938583374, 0.0683991089463234, -0.1162789836525917, 0.01844370923936367, 0.1268763393163681, 0.03009268455207348, 0.1200154647231102, 0.08542048186063766, -0.03144059330224991, 0.009485212154686451, 0.07430502772331238, -0.16428881883621216, -0.06552406400442123, 0.0048252735286951065, -0.06640846282243729, -0.1146763488650322, 0.03819472715258598, 0.07334595173597336, -0.07356845587491989, -0.006360506638884544, -0.00949056912213564, 0.005470167845487595, -0.07945552468299866, 0.22567147016525269, 0.05953093245625496, 0.05408433452248573, -0.10044870525598526, 0.06469454616308212, 0.03575424477458, -0.07065840065479279, -0.0073096659034490585, 0.06315632164478302, -0.07471182197332382, -0.03629132732748985, 0.12034065276384354, 0.15347634255886078, -0.033091481775045395, -0.04141398146748543, -0.15186549723148346, -0.11268418282270432, 0.06898108869791031, 0.14707635343074799, 0.11132705956697464, 0.006238215137273073, -0.05674345791339874, 0.023235293105244637, -0.11573848873376846, 0.07659072428941727, 0.05401628836989403, 0.06785022467374802, -0.12481337785720825, 0.1685415506362915, 0.01827045902609825, 0.06210203841328621, -0.025756636634469032, 0.030956273898482323, -0.09012202173471451, 0.02028517797589302, -0.11042068153619766, -0.040912020951509476, -0.016353562474250793, -0.009918230585753918, -0.009619849734008312, -0.05614781007170677, -0.05454074963927269, 0.03237329050898552, -0.1239686906337738, -0.038007285445928574, 0.04102250933647156, 0.03335786238312721, -0.11261436343193054, -0.04850197955965996, 0.038581784814596176, -0.0658906102180481, 0.05175304040312767, 0.06650836020708084, 0.012262833304703236, 0.05952058359980583, -0.13778196275234222, -0.022605041041970253, 0.07457730174064636, 0.011936905793845654, 0.06316103786230087, -0.09203002601861954, -0.014368751086294651, -0.006418999284505844, 0.06426689773797989, 0.004166888538748026, 0.08476167172193527, -0.1494934856891632, 0.001449030707590282, -0.033033620566129684, -0.08261381089687347, -0.062467485666275024, 0.012227525003254414, 0.08300907164812088, 0.016104809939861298, 0.19141310453414917, -0.08993839472532272, 0.05225927010178566, -0.21142332255840302, 0.0006816591485403478, -0.026745950803160667, -0.09817998856306076, -0.11494731903076172, -0.04784141108393669, 0.06944451481103897, -0.05015261098742485, 0.12301347404718399, 0.012515210546553135, 0.049781039357185364, 0.019507700577378273, -0.01818959228694439, 0.01847982220351696, 0.011567835696041584, 0.21084295213222504, 0.0231484305113554, -0.03395431488752365, 0.08026700466871262, 0.063263438642025, 0.09671717882156372, 0.11381793022155762, 0.19868265092372894, 0.15798628330230713, 0.01808294840157032, 0.09853745251893997, 0.024704111739993095, -0.04925797879695892, -0.16283012926578522, 0.017667701467871666, -0.0391167476773262, 0.0976598784327507, -0.01579483598470688, 0.1923743635416031, 0.06306204199790955, -0.16419275104999542, 0.0618336945772171, -0.03680628165602684, -0.08752015233039856, -0.09859681129455566, -0.0389428548514843, -0.07037520408630371, -0.12734487652778625, 0.008525453507900238, -0.08639944344758987, 0.011726225726306438, 0.10798699408769608, -0.0002407114952802658, -0.023088036105036736, 0.20353269577026367, 0.03535061702132225, 0.02904689498245716, 
0.04466927424073219, 0.006797177717089653, -0.03155212476849556, -0.07973624020814896, -0.06838840991258621, -0.02597047947347164, -0.007013274356722832, 0.03998454287648201, -0.07006502896547318, -0.08817342668771744, 0.051957789808511734, -0.01158162858337164, -0.103473961353302, 0.014017298817634583, 0.013136743567883968, 0.0673544779419899, 0.04760565981268883, 0.009204373694956303, 0.025074193254113197, -0.023588448762893677, 0.18976230919361115, -0.08387517184019089, -0.09767764806747437, -0.09533651918172836, 0.2602348029613495, 0.03640354424715042, -0.024883463978767395, 0.03178089112043381, -0.054204393178224564, -0.012624591588973999, 0.2554813325405121, 0.202531099319458, -0.08018668740987778, -0.00260761845856905, 0.011559032835066319, -0.01675419509410858, -0.0435592383146286, 0.1252269297838211, 0.14262959361076355, 0.0636211484670639, -0.1034412607550621, -0.050098568201065063, -0.06820463389158249, -0.018040550872683525, -0.06501893699169159, 0.03577527031302452, 0.034127313643693924, 0.0006987846572883427, -0.03803187236189842, 0.05321739986538887, -0.05430250242352486, -0.11054898053407669, 0.0893804207444191, -0.1929842233657837, -0.16317251324653625, -0.012358151376247406, 0.10966217517852783, 0.006654183845967054, 0.06986784189939499, -0.03166131302714348, 0.014461781829595566, 0.07325063645839691, -0.01860218495130539, -0.07340381294488907, -0.09757313877344131, 0.10119930654764175, -0.09973905235528946, 0.21284042298793793, -0.036911725997924805, 0.07680694013834, 0.11796358972787857, 0.07430633902549744, -0.06879826635122299, 0.0634157657623291, 0.03962176665663719, -0.08986658602952957, 0.025933513417840004, 0.11001808196306229, -0.030911333858966827, 0.019178735092282295, 0.022536389529705048, -0.11368346214294434, 0.018444379791617393, -0.07119572907686234, -0.04101870581507683, -0.03698044642806053, -0.03566835820674896, -0.06005791947245598, 0.11433517932891846, 0.2156403511762619, -0.022994915023446083, 0.003303414210677147, -0.0829700157046318, 0.021988147869706154, 0.07076860219240189, 0.011485098861157894, -0.09815896302461624, -0.2297934889793396, 0.014540615491569042, 0.02308056317269802, -0.033978600054979324, -0.217027947306633, -0.10660538822412491, 0.001080575049854815, -0.0871092900633812, -0.08937147259712219, 0.06377711147069931, 0.06494847685098648, 0.058858081698417664, -0.043277230113744736, -0.07458103448152542, -0.08222668617963791, 0.1465541273355484, -0.17588955163955688, -0.0878753587603569 ]
null
null
transformers
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # distilbert-base-uncased-finetuned-evaluating-student-writing This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on an unknown dataset. It achieves the following results on the evaluation set: - Loss: 1.9917 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 2e-05 - train_batch_size: 64 - eval_batch_size: 64 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 3.0 - mixed_precision_training: Native AMP ### Training results | Training Loss | Epoch | Step | Validation Loss | |:-------------:|:-----:|:----:|:---------------:| | 2.3485 | 1.0 | 878 | 2.0959 | | 2.1407 | 2.0 | 1756 | 2.0162 | | 2.0843 | 3.0 | 2634 | 1.9846 | ### Framework versions - Transformers 4.12.5 - Pytorch 1.9.1 - Datasets 1.16.1 - Tokenizers 0.10.3
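The card above documents only the fine-tuning run; as a hedged illustration (not part of the original card), the checkpoint named in this record can be queried like any other fill-mask model through the `transformers` pipeline. The input sentence below is invented for demonstration.

```python
from transformers import pipeline

# Minimal sketch: load the fine-tuned checkpoint referenced in this record.
# It keeps distilbert-base-uncased's masked-LM head, so [MASK] is the mask token.
fill_mask = pipeline(
    "fill-mask",
    model="NahedAbdelgaber/distilbert-base-uncased-finetuned-evaluating-student-writing",
)

# Illustrative query; prints the top predicted fillers and their scores.
for prediction in fill_mask("The student's essay makes a strong [MASK]."):
    print(prediction["token_str"], round(prediction["score"], 4))
```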
{"license": "apache-2.0", "tags": ["generated_from_trainer"], "model-index": [{"name": "distilbert-base-uncased-finetuned-evaluating-student-writing", "results": []}]}
fill-mask
NahedAbdelgaber/distilbert-base-uncased-finetuned-evaluating-student-writing
[ "transformers", "pytorch", "tensorboard", "distilbert", "fill-mask", "generated_from_trainer", "license:apache-2.0", "autotrain_compatible", "endpoints_compatible", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[]
TAGS #transformers #pytorch #tensorboard #distilbert #fill-mask #generated_from_trainer #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us
distilbert-base-uncased-finetuned-evaluating-student-writing ============================================================ This model is a fine-tuned version of distilbert-base-uncased on an unknown dataset. It achieves the following results on the evaluation set: * Loss: 1.9917 Model description ----------------- More information needed Intended uses & limitations --------------------------- More information needed Training and evaluation data ---------------------------- More information needed Training procedure ------------------ ### Training hyperparameters The following hyperparameters were used during training: * learning\_rate: 2e-05 * train\_batch\_size: 64 * eval\_batch\_size: 64 * seed: 42 * optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 * lr\_scheduler\_type: linear * num\_epochs: 3.0 * mixed\_precision\_training: Native AMP ### Training results ### Framework versions * Transformers 4.12.5 * Pytorch 1.9.1 * Datasets 1.16.1 * Tokenizers 0.10.3
[ "### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 64\n* eval\\_batch\\_size: 64\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 3.0\n* mixed\\_precision\\_training: Native AMP", "### Training results", "### Framework versions\n\n\n* Transformers 4.12.5\n* Pytorch 1.9.1\n* Datasets 1.16.1\n* Tokenizers 0.10.3" ]
[ "TAGS\n#transformers #pytorch #tensorboard #distilbert #fill-mask #generated_from_trainer #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n", "### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 64\n* eval\\_batch\\_size: 64\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 3.0\n* mixed\\_precision\\_training: Native AMP", "### Training results", "### Framework versions\n\n\n* Transformers 4.12.5\n* Pytorch 1.9.1\n* Datasets 1.16.1\n* Tokenizers 0.10.3" ]
[ 57, 113, 4, 31 ]
[ "passage: TAGS\n#transformers #pytorch #tensorboard #distilbert #fill-mask #generated_from_trainer #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 64\n* eval\\_batch\\_size: 64\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 3.0\n* mixed\\_precision\\_training: Native AMP### Training results### Framework versions\n\n\n* Transformers 4.12.5\n* Pytorch 1.9.1\n* Datasets 1.16.1\n* Tokenizers 0.10.3" ]
[ -0.09795556217432022, 0.05221904069185257, -0.002713699359446764, 0.10620268434286118, 0.15347066521644592, 0.020438440144062042, 0.12674158811569214, 0.12048611789941788, -0.09215293079614639, 0.01751546561717987, 0.11589160561561584, 0.14104007184505463, 0.024642786011099815, 0.1385703831911087, -0.02804696187376976, -0.2744530141353607, 0.000911591574549675, 0.03609232231974602, -0.10236357897520065, 0.1306752860546112, 0.09193388372659683, -0.12349597364664078, 0.0572713166475296, 0.024668164551258087, -0.1850128024816513, 0.015339920297265053, 0.019382167607545853, -0.06363258510828018, 0.15077240765094757, 0.015540467575192451, 0.13728272914886475, 0.005187901668250561, 0.09527500718832016, -0.18516485393047333, 0.015539131127297878, 0.06218503415584564, 0.013760139234364033, 0.08078470826148987, 0.06279703229665756, 0.015405629761517048, 0.0963517427444458, -0.07280600070953369, 0.07318448275327682, 0.02127937413752079, -0.12782739102840424, -0.28033873438835144, -0.08791951835155487, 0.009502816945314407, 0.06718183308839798, 0.10157062113285065, 0.002406545216217637, 0.1455298662185669, -0.0920582190155983, 0.08892691135406494, 0.2214270830154419, -0.2867872416973114, -0.06518719345331192, 0.004084492102265358, 0.03769718483090401, 0.04683450236916542, -0.10723494738340378, -0.028765788301825523, 0.02935057133436203, 0.053019773215055466, 0.1655544489622116, -0.02483084797859192, -0.0975915938615799, -0.0006728930748067796, -0.13920801877975464, -0.040819596499204636, 0.10939910262823105, 0.03756580129265785, -0.03529743850231171, -0.03934568539261818, -0.06094874441623688, -0.14016224443912506, -0.04973120987415314, -0.014283071272075176, 0.04688243567943573, -0.026894254609942436, -0.07944200187921524, -0.013182071037590504, -0.10398341715335846, -0.0797213464975357, -0.0673721581697464, 0.15275081992149353, 0.04508304223418236, 0.013271773234009743, -0.021637607365846634, 0.10162470489740372, -0.016080787405371666, -0.1393287628889084, 0.024381227791309357, 0.03167524188756943, -0.04106811061501503, -0.036809880286455154, -0.06653566658496857, -0.08077124506235123, 0.006811018101871014, 0.14371350407600403, -0.0626194104552269, 0.05784890800714493, 0.019741153344511986, 0.048677943646907806, -0.11163840442895889, 0.1835811585187912, -0.04974326491355896, 0.006191217340528965, 0.0026731116231530905, 0.05599787458777428, 0.015633678063750267, -0.011868580244481564, -0.09779394418001175, 0.016373006626963615, 0.1005050390958786, 0.012590022757649422, -0.05760052055120468, 0.05684279650449753, -0.05998977646231651, -0.01809583231806755, 0.004841041285544634, -0.09909866750240326, 0.03833548352122307, -0.0076112584210932255, -0.07703728973865509, -0.02703360840678215, 0.044820625334978104, 0.01949135959148407, -0.01565771922469139, 0.11252151429653168, -0.07966762036085129, 0.03774302825331688, -0.11414974182844162, -0.12798413634300232, 0.01843986101448536, -0.052422262728214264, 0.013449818827211857, -0.09378743916749954, -0.17233245074748993, -0.006827419623732567, 0.0831647664308548, -0.02902822196483612, -0.04287848249077797, -0.018059831112623215, -0.057563357055187225, 0.020134294405579567, -0.01583796739578247, 0.15205566585063934, -0.06031282991170883, 0.11304805427789688, 0.03864626586437225, 0.07517916709184647, -0.040389854460954666, 0.06348071247339249, -0.09699969738721848, 0.020119940862059593, -0.19086013734340668, 0.030538367107510567, -0.0600435808300972, 0.056108538061380386, -0.09532185643911362, -0.11377248167991638, -0.0073051112703979015, 
-0.010562330484390259, 0.09842250496149063, 0.08239170163869858, -0.17062678933143616, -0.07533826678991318, 0.16829220950603485, -0.07697288691997528, -0.09566255658864975, 0.1301446110010147, -0.05691390857100487, 0.03445061296224594, 0.05068065971136093, 0.14514844119548798, 0.07055220752954483, -0.11132746189832687, 0.046896521002054214, -0.018100060522556305, 0.04490628466010094, -0.048388250172138214, 0.07227447628974915, -0.005875132512301207, -0.005112949293106794, 0.02425067313015461, -0.025323575362563133, 0.08448699116706848, -0.09170442074537277, -0.09623781591653824, -0.03611697629094124, -0.09132417291402817, 0.06429752707481384, 0.06916564702987671, 0.0725843533873558, -0.0955011174082756, -0.10077093541622162, 0.0585976243019104, 0.07888183742761612, -0.05855188146233559, 0.03411174938082695, -0.0641200840473175, 0.05350212752819061, -0.046735912561416626, -0.019347213208675385, -0.1895698755979538, -0.026864169165492058, 0.011674364097416401, -0.0401526540517807, 0.030142445117235184, 0.0006500285235233605, 0.0906836986541748, 0.07546460628509521, -0.05574168264865875, -0.03189779072999954, -0.06255397200584412, -0.0020077412482351065, -0.11712810397148132, -0.19938039779663086, -0.04867181554436684, -0.027259329333901405, 0.13030585646629333, -0.1743859499692917, 0.02726929262280464, -0.04704038053750992, 0.07840597629547119, 0.0175790973007679, -0.011084762401878834, -0.04368507117033005, 0.08801031112670898, -0.017032941803336143, -0.0566134974360466, 0.06727215647697449, 0.012137860991060734, -0.0888095423579216, -0.02373259700834751, -0.11419393867254257, 0.16803842782974243, 0.12213703244924545, -0.10648354887962341, -0.08348706364631653, 0.0036600453313440084, -0.06345373392105103, -0.05104091018438339, -0.03840802237391472, 0.04059864208102226, 0.15934504568576813, 0.0022044056095182896, 0.14327631890773773, -0.06206483393907547, -0.034234143793582916, 0.029219498857855797, -0.037627361714839935, 0.019084708765149117, 0.0869683250784874, 0.12319090217351913, -0.06180392950773239, 0.13127677142620087, 0.17025423049926758, -0.11243472248315811, 0.13926200568675995, -0.04064158722758293, -0.08617367595434189, -0.023813702166080475, -0.01938861981034279, 0.013082575984299183, 0.11441435664892197, -0.11823096871376038, 0.012159877456724644, 0.025701502338051796, 0.015580804087221622, 0.01966671273112297, -0.23295913636684418, -0.03303695470094681, 0.03622455149888992, -0.04593503102660179, -0.027649883180856705, -0.012548048049211502, 0.010959274135529995, 0.09491395205259323, 0.0019432601984590292, -0.08986024558544159, 0.038671497255563736, 0.0035413787700235844, -0.06559351086616516, 0.20000876486301422, -0.09965644776821136, -0.17092156410217285, -0.1284305304288864, -0.07883597165346146, -0.025044012814760208, 0.005912286229431629, 0.058624010533094406, -0.0804319679737091, -0.03720453381538391, -0.047159139066934586, 0.012816284783184528, 0.0039957244880497456, 0.03582217916846275, 0.0029556637164205313, -0.012006652541458607, 0.10151936858892441, -0.10769345611333847, -0.006759026553481817, -0.040864475071430206, -0.05752323940396309, 0.059151146560907364, 0.05599956586956978, 0.11598677188158035, 0.15022793412208557, -0.01988132670521736, 0.005417577922344208, -0.023463856428861618, 0.21678370237350464, -0.07085074484348297, -0.02528292126953602, 0.13868668675422668, -0.008266939781606197, 0.06668955832719803, 0.10833670198917389, 0.07343348860740662, -0.08026912063360214, 0.01788799837231636, 0.03670423850417137, -0.03602435439825058, -0.2022721916437149, 
-0.03775912895798683, -0.06379310041666031, -0.04818906635046005, 0.10421977192163467, 0.022822709754109383, 0.05217202007770538, 0.05656412988901138, 0.037792086601257324, 0.0600501149892807, -0.03658908233046532, 0.056734584271907806, 0.1051846444606781, 0.04805203154683113, 0.12867088615894318, -0.03190552815794945, -0.07479614019393921, 0.024742960929870605, -0.006127221044152975, 0.23171453177928925, 0.01681627705693245, 0.1302075833082199, 0.06151857227087021, 0.19469726085662842, 0.005566464737057686, 0.094133660197258, 0.010984901338815689, -0.04370525851845741, -0.007664699573069811, -0.0452825203537941, -0.0311620831489563, 0.019466014578938484, -0.021644916385412216, 0.04381146654486656, -0.11288733780384064, -0.022289210930466652, 0.034312404692173004, 0.2923630475997925, 0.03120640106499195, -0.326263427734375, -0.0855104923248291, -0.01050463318824768, -0.04176606982946396, -0.026570679619908333, 0.0024144938215613365, 0.08339064568281174, -0.09606251865625381, 0.03864497318863869, -0.08825249969959259, 0.09426908195018768, -0.0008360020583495498, 0.034723397344350815, 0.07639645785093307, 0.11358580738306046, 0.013304388150572777, 0.07071974873542786, -0.30203768610954285, 0.29184025526046753, -0.0028703713323920965, 0.07879754900932312, -0.07642244547605515, 0.017416035756468773, 0.05195973813533783, 0.022702466696500778, 0.05200543627142906, -0.0183849073946476, -0.05341656878590584, -0.19985276460647583, -0.061174582690000534, 0.021069683134555817, 0.08453536033630371, -0.01276671513915062, 0.10302712023258209, -0.03079351782798767, 0.00023716958821751177, 0.08177746087312698, -0.004281550645828247, -0.09264524281024933, -0.09331411123275757, 0.007040034048259258, 0.023233581334352493, -0.03182203322649002, -0.08428633213043213, -0.12421375513076782, -0.1090175211429596, 0.1522095650434494, 0.005796882789582014, -0.02196858637034893, -0.12147501856088638, 0.08780104666948318, 0.10442947596311569, -0.08127538114786148, 0.058883488178253174, 0.004031189251691103, 0.0742059126496315, 0.022890470921993256, -0.07006295770406723, 0.11548236757516861, -0.07475478202104568, -0.14505736529827118, -0.059415530413389206, 0.10029131919145584, 0.03444092720746994, 0.07195711135864258, -0.017633136361837387, 0.022913534194231033, -0.02552822045981884, -0.0797773152589798, 0.036722876131534576, -0.03894126042723656, 0.05423491820693016, 0.02950257435441017, -0.053699858486652374, 0.01753918081521988, -0.051848456263542175, -0.026132674887776375, 0.17673428356647491, 0.2439061999320984, -0.0962943509221077, 0.03458694741129875, 0.052951380610466, -0.05865074321627617, -0.1955355405807495, 0.025983955711126328, 0.06979396194219589, 0.01677456684410572, 0.06986937671899796, -0.1951414793729782, 0.0946643054485321, 0.09504419565200806, -0.02135143242776394, 0.09936236590147018, -0.3222299814224243, -0.12789386510849, 0.13727699220180511, 0.14527074992656708, 0.09238902479410172, -0.142705500125885, -0.017137771472334862, -0.014928285963833332, -0.10410749167203903, 0.08333346247673035, -0.07761821895837784, 0.13302379846572876, -0.030061140656471252, 0.07698848098516464, 0.00475630396977067, -0.07062868028879166, 0.12766337394714355, -0.02927619405090809, 0.10002689808607101, -0.05850823596119881, 0.028064550831913948, 0.06211274489760399, -0.03841559216380119, 0.016331922262907028, -0.07381889224052429, 0.01787632517516613, -0.05077666416764259, -0.020809734240174294, -0.08314937353134155, 0.049263834953308105, -0.025380220264196396, -0.05776335671544075, -0.01846611127257347, 
0.039353080093860626, 0.04726143181324005, -0.017186954617500305, 0.11064747720956802, 0.026168324053287506, 0.1583564281463623, 0.11196326464414597, 0.049830127507448196, -0.045054156333208084, -0.09576822072267532, -0.016524367034435272, -0.020831260830163956, 0.06699089705944061, -0.10186438262462616, 0.03277001529932022, 0.1375444233417511, 0.018338678404688835, 0.14112737774848938, 0.07815708965063095, -0.049318473786115646, 0.015243039466440678, 0.07230018079280853, -0.14484521746635437, -0.09641852229833603, 0.004978744313120842, -0.02539539523422718, -0.11033646762371063, 0.015538913197815418, 0.09144583344459534, -0.0704813152551651, -0.007014913484454155, 0.0006296528154052794, 0.012950828298926353, -0.0664166510105133, 0.22244657576084137, 0.042373839765787125, 0.050280217081308365, -0.09729068726301193, 0.07338199764490128, 0.057979270815849304, -0.09320854395627975, 0.006473762448877096, 0.07816850394010544, -0.06384219229221344, -0.03569558635354042, 0.09980639070272446, 0.16052401065826416, -0.03194953873753548, -0.05678383260965347, -0.146465465426445, -0.12294360995292664, 0.07425154000520706, 0.14218725264072418, 0.09310147166252136, 0.004681310150772333, -0.05346551537513733, 0.0249693151563406, -0.10961884260177612, 0.08549118041992188, 0.06562014669179916, 0.06484343111515045, -0.13229382038116455, 0.16267386078834534, 0.01990371011197567, 0.05040819197893143, -0.021504158154129982, 0.016033513471484184, -0.09180942177772522, 0.01621941477060318, -0.1371692717075348, -0.036369264125823975, -0.025113269686698914, -0.002087465487420559, -0.0024041186552494764, -0.05644397437572479, -0.057494696229696274, 0.0244685560464859, -0.12594486773014069, -0.040865007787942886, 0.027707539498806, 0.037719547748565674, -0.11731883883476257, -0.05226053297519684, 0.04381919279694557, -0.06670372933149338, 0.056064896285533905, 0.04964207485318184, 0.017385944724082947, 0.054767265915870667, -0.13862158358097076, -0.017295310273766518, 0.06118965148925781, 0.01160958968102932, 0.05730839818716049, -0.11331415176391602, -0.016368664801120758, -0.003912237007170916, 0.05839540436863899, -0.000669952598400414, 0.07685501128435135, -0.14618012309074402, -0.01656770333647728, -0.03118138574063778, -0.08823094516992569, -0.059213247150182724, 0.02171345241367817, 0.07026813179254532, 0.0310820359736681, 0.19528646767139435, -0.09950204938650131, 0.04933803156018257, -0.21694250404834747, 0.0048372503370046616, -0.026395505294203758, -0.09371975809335709, -0.09225770831108093, -0.05327621102333069, 0.06455809623003006, -0.05207769572734833, 0.12182304263114929, 0.015666209161281586, 0.053510162979364395, 0.02640491910278797, -0.04061734676361084, 0.021508146077394485, 0.015324820764362812, 0.2026205211877823, 0.0293501615524292, -0.03604836016893387, 0.07575385272502899, 0.054472461342811584, 0.09001877158880234, 0.13038092851638794, 0.19789187610149384, 0.16117540001869202, 0.02714432217180729, 0.08668201416730881, 0.03940790519118309, -0.0657094344496727, -0.15174274146556854, 0.04312353953719139, -0.030592218041419983, 0.10995138436555862, -0.021988043561577797, 0.2080436795949936, 0.07840510457754135, -0.1623391956090927, 0.06751440465450287, -0.041662659496068954, -0.08768976479768753, -0.10375228524208069, -0.05110977962613106, -0.0782153457403183, -0.14477618038654327, -0.002197509165853262, -0.0893981009721756, 0.03224540874361992, 0.09966469556093216, 0.007477893494069576, -0.01627388410270214, 0.1745830476284027, 0.0274337325245142, 0.022712133824825287, 0.04520295932888985, 
-0.0016530879074707627, -0.0270516499876976, -0.07484405487775803, -0.08001077175140381, -0.004572503734380007, -0.007187072187662125, 0.03829674422740936, -0.05765603110194206, -0.07375086098909378, 0.054203230887651443, -0.018466951325535774, -0.10628313571214676, 0.013890358619391918, 0.012377583421766758, 0.0678177997469902, 0.057382140308618546, 0.01868828944861889, 0.017200004309415817, -0.021423090249300003, 0.20174351334571838, -0.08259673416614532, -0.09853146225214005, -0.11421532928943634, 0.2596128582954407, 0.03148290142416954, -0.020943965762853622, 0.030841967090964317, -0.06128788739442825, -0.0033829137682914734, 0.22978581488132477, 0.18970416486263275, -0.07536392658948898, -0.005694493651390076, 0.0031965873204171658, -0.01948739029467106, -0.04624159261584282, 0.12160585075616837, 0.13717243075370789, 0.05193359777331352, -0.10265611112117767, -0.044626787304878235, -0.06674555689096451, -0.023521164432168007, -0.06444591283798218, 0.039399076253175735, 0.038707513362169266, 0.005802573170512915, -0.037009187042713165, 0.0617404542863369, -0.04271448403596878, -0.09709338843822479, 0.055463388562202454, -0.17901931703090668, -0.16504095494747162, -0.01620483584702015, 0.10009519010782242, 0.003604242578148842, 0.05892566591501236, -0.03370342031121254, 0.015455138869583607, 0.07059568911790848, -0.015366611070930958, -0.06641363352537155, -0.10063577443361282, 0.10954035818576813, -0.09007957577705383, 0.21338966488838196, -0.03703644499182701, 0.06264127790927887, 0.12202766537666321, 0.07505351305007935, -0.06771015375852585, 0.07352156937122345, 0.0421641543507576, -0.09312829375267029, 0.02963845431804657, 0.12088589370250702, -0.03517043590545654, 0.05033542960882187, 0.03058283030986786, -0.12562678754329681, 0.023532304912805557, -0.10214661806821823, -0.05196072906255722, -0.027117978781461716, -0.03246212750673294, -0.054945167154073715, 0.12079275399446487, 0.21843858063220978, -0.024894867092370987, 0.010389016941189766, -0.08465559780597687, 0.015549208968877792, 0.06040939688682556, 0.028954362496733665, -0.08827868103981018, -0.22833259403705597, 0.018478674814105034, 0.02430649846792221, -0.023408496752381325, -0.22943098843097687, -0.11455782502889633, 0.004854592494666576, -0.07822137326002121, -0.08745983242988586, 0.08649537712335587, 0.08072478324174881, 0.055873289704322815, -0.049014005810022354, -0.09233646094799042, -0.07783137261867523, 0.15024907886981964, -0.16863712668418884, -0.07165078818798065 ]
null
null
transformers
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # evaluating-student-writing-distibert-ner-with-metric This model is a fine-tuned version of [NahedAbdelgaber/evaluating-student-writing-distibert-ner](https://huggingface.co/NahedAbdelgaber/evaluating-student-writing-distibert-ner) on an unknown dataset. It achieves the following results on the evaluation set: - Loss: 0.7535 - Precision: 0.0614 - Recall: 0.2590 - F1: 0.0993 - Accuracy: 0.6188 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 2e-05 - train_batch_size: 8 - eval_batch_size: 8 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 3 ### Training results | Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1 | Accuracy | |:-------------:|:-----:|:----:|:---------------:|:---------:|:------:|:------:|:--------:| | 0.7145 | 1.0 | 1755 | 0.7683 | 0.0546 | 0.2194 | 0.0875 | 0.6191 | | 0.6608 | 2.0 | 3510 | 0.7504 | 0.0570 | 0.2583 | 0.0934 | 0.6136 | | 0.5912 | 3.0 | 5265 | 0.7535 | 0.0614 | 0.2590 | 0.0993 | 0.6188 | ### Framework versions - Transformers 4.12.5 - Pytorch 1.9.1 - Datasets 1.16.1 - Tokenizers 0.10.3
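A quick consistency check on the reported metrics: with P = 0.0614 and R = 0.2590, F1 = 2PR/(P + R) = 2 · 0.0614 · 0.2590 / 0.3204 ≈ 0.0993, which matches the value in the card. As a hedged sketch (the label set is not documented in the card, so the predicted entity names depend entirely on the id2label mapping saved with the checkpoint), the model can be run through the token-classification pipeline:

```python
from transformers import pipeline

# Sketch only: entity labels come from whatever id2label mapping was saved
# with the checkpoint; they are not listed in the card itself.
ner = pipeline(
    "token-classification",
    model="NahedAbdelgaber/evaluating-student-writing-distibert-ner-with-metric",
    aggregation_strategy="simple",  # merge word pieces into whole predicted spans
)

# Illustrative input; any student-writing sentence works the same way.
for span in ner("Some students believe that online classes help them manage their time."):
    print(span["entity_group"], span["word"], round(span["score"], 4))
```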
{"license": "apache-2.0", "tags": ["generated_from_trainer"], "metrics": ["precision", "recall", "f1", "accuracy"], "model-index": [{"name": "evaluating-student-writing-distibert-ner-with-metric", "results": []}]}
token-classification
NahedAbdelgaber/evaluating-student-writing-distibert-ner-with-metric
[ "transformers", "pytorch", "tensorboard", "distilbert", "token-classification", "generated_from_trainer", "license:apache-2.0", "autotrain_compatible", "endpoints_compatible", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[]
TAGS #transformers #pytorch #tensorboard #distilbert #token-classification #generated_from_trainer #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us
evaluating-student-writing-distibert-ner-with-metric ==================================================== This model is a fine-tuned version of NahedAbdelgaber/evaluating-student-writing-distibert-ner on an unknown dataset. It achieves the following results on the evaluation set: * Loss: 0.7535 * Precision: 0.0614 * Recall: 0.2590 * F1: 0.0993 * Accuracy: 0.6188 Model description ----------------- More information needed Intended uses & limitations --------------------------- More information needed Training and evaluation data ---------------------------- More information needed Training procedure ------------------ ### Training hyperparameters The following hyperparameters were used during training: * learning\_rate: 2e-05 * train\_batch\_size: 8 * eval\_batch\_size: 8 * seed: 42 * optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 * lr\_scheduler\_type: linear * num\_epochs: 3 ### Training results ### Framework versions * Transformers 4.12.5 * Pytorch 1.9.1 * Datasets 1.16.1 * Tokenizers 0.10.3
[ "### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 8\n* eval\\_batch\\_size: 8\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 3", "### Training results", "### Framework versions\n\n\n* Transformers 4.12.5\n* Pytorch 1.9.1\n* Datasets 1.16.1\n* Tokenizers 0.10.3" ]
[ "TAGS\n#transformers #pytorch #tensorboard #distilbert #token-classification #generated_from_trainer #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n", "### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 8\n* eval\\_batch\\_size: 8\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 3", "### Training results", "### Framework versions\n\n\n* Transformers 4.12.5\n* Pytorch 1.9.1\n* Datasets 1.16.1\n* Tokenizers 0.10.3" ]
[ 58, 98, 4, 31 ]
[ "passage: TAGS\n#transformers #pytorch #tensorboard #distilbert #token-classification #generated_from_trainer #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 8\n* eval\\_batch\\_size: 8\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 3### Training results### Framework versions\n\n\n* Transformers 4.12.5\n* Pytorch 1.9.1\n* Datasets 1.16.1\n* Tokenizers 0.10.3" ]
[ -0.1014125868678093, 0.09075818955898285, -0.002467429032549262, 0.12267791479825974, 0.17413294315338135, 0.018761955201625824, 0.11100985109806061, 0.11889481544494629, -0.10679095983505249, 0.010913259349763393, 0.11900975555181503, 0.19080479443073273, 0.0009212163276970387, 0.11131377518177032, -0.04684891924262047, -0.25607025623321533, -0.009142452850937843, 0.05835021659731865, -0.09046708047389984, 0.13730183243751526, 0.10210499912500381, -0.14031027257442474, 0.0784536749124527, 0.014068050310015678, -0.2301352471113205, 0.010712897405028343, 0.01438791025429964, -0.06616786122322083, 0.14944903552532196, 0.008072900585830212, 0.13147179782390594, -0.0005818838835693896, 0.08502921462059021, -0.1671520322561264, 0.00593516044318676, 0.047634851187467575, 0.01844220794737339, 0.09046745300292969, 0.062065090984106064, 0.0056029921397566795, 0.10299374908208847, -0.06961929053068161, 0.05340816453099251, 0.021402830258011818, -0.11562331765890121, -0.2540675699710846, -0.08329880982637405, 0.03281364589929581, 0.07223544269800186, 0.09830613434314728, 0.006675561424344778, 0.14151328802108765, -0.09705425053834915, 0.09183768928050995, 0.23003524541854858, -0.27415671944618225, -0.0611264668405056, 0.04277321696281433, -0.0005139838322065771, 0.05467066168785095, -0.10958898812532425, -0.03534365072846413, 0.05275692790746689, 0.050816625356674194, 0.14617007970809937, -0.038833051919937134, -0.11152318865060806, 0.01486247032880783, -0.14885364472866058, -0.032841846346855164, 0.13347873091697693, 0.03113054297864437, -0.03243713453412056, -0.036620672792196274, -0.05652482435107231, -0.152609184384346, -0.03308582678437233, -0.016345912590622902, 0.04622365161776543, -0.04052136838436127, -0.06273304671049118, 0.012513183057308197, -0.10803531110286713, -0.06773343682289124, -0.07194725424051285, 0.1496744453907013, 0.04907630383968353, 0.015004717744886875, -0.020501311868429184, 0.11695768684148788, 0.02086743898689747, -0.12612444162368774, 0.030065808445215225, 0.027420666068792343, 0.00292726862244308, -0.050327666103839874, -0.06888934969902039, -0.03426756337285042, 0.009252399206161499, 0.12208397686481476, -0.06419100612401962, 0.041229721158742905, 0.05274280160665512, 0.044533103704452515, -0.09924724698066711, 0.19605840742588043, -0.03795261308550835, 0.0029092810582369566, 0.0038837939500808716, 0.04387401416897774, -0.0035609458573162556, 0.00716844666749239, -0.11188700795173645, -0.001159797771833837, 0.12649764120578766, 0.018490154296159744, -0.07460496574640274, 0.06657669693231583, -0.056355249136686325, -0.03531349077820778, 0.015896417200565338, -0.09810607880353928, 0.03034871816635132, -0.008009626530110836, -0.08942050486803055, -0.009575676172971725, 0.023619787767529488, 0.01253758929669857, -0.02003627084195614, 0.11668211966753006, -0.08153509348630905, 0.040615662932395935, -0.09783627837896347, -0.09655247628688812, 0.009304361417889595, -0.08013652265071869, 0.03341028466820717, -0.1051836758852005, -0.15782125294208527, -0.009284218773245811, 0.060137826949357986, -0.01620856113731861, -0.06256274878978729, -0.03226739913225174, -0.06996475160121918, -0.009248308837413788, -0.018680356442928314, 0.14344677329063416, -0.054735004901885986, 0.11174842715263367, 0.03781531751155853, 0.06286393851041794, -0.041894685477018356, 0.06313630193471909, -0.1055997908115387, 0.008291548117995262, -0.19189366698265076, 0.031435031443834305, -0.05479273945093155, 0.0725594311952591, -0.09866545349359512, -0.12340289354324341, 0.03352949023246765, 
-0.015840919688344002, 0.06539065390825272, 0.0778081938624382, -0.15069155395030975, -0.07613537460565567, 0.13500145077705383, -0.06787216663360596, -0.10927927494049072, 0.11239717900753021, -0.05700390040874481, 0.04921993240714073, 0.06944793462753296, 0.1584271788597107, 0.08774491399526596, -0.06558320671319962, 0.02641044184565544, 0.00006797917740186676, 0.046516627073287964, -0.07930173724889755, 0.06259425729513168, 0.005916349124163389, -0.00814844947308302, 0.03985147178173065, -0.04318093881011009, 0.07387175410985947, -0.10077293962240219, -0.09462931752204895, -0.039354316890239716, -0.104148268699646, 0.05673662945628166, 0.07944861799478531, 0.09273205697536469, -0.08633115142583847, -0.0586748905479908, 0.08739485591650009, 0.08628086000680923, -0.054711613804101944, 0.026821264997124672, -0.053402792662382126, 0.06380469352006912, -0.04352298751473427, -0.027935443446040154, -0.19500845670700073, -0.004881743807345629, 0.016453087329864502, -0.025133680552244186, 0.016280759125947952, 0.019081946462392807, 0.06758318096399307, 0.06222737208008766, -0.055852893739938736, -0.028363991528749466, -0.02769485116004944, 0.0004970168229192495, -0.13388371467590332, -0.18501269817352295, -0.03653061389923096, -0.015087821520864964, 0.10813824087381363, -0.1855277568101883, 0.033749211579561234, -0.030989928171038628, 0.08200264722108841, 0.0035695445258170366, -0.005968616809695959, -0.04826213791966438, 0.08871178328990936, -0.03165549039840698, -0.051425330340862274, 0.07619890570640564, 0.0002165260520996526, -0.08110260218381882, -0.052780117839574814, -0.07746853679418564, 0.1878650039434433, 0.1320020705461502, -0.13502947986125946, -0.08195916563272476, -0.007336829788982868, -0.06740163266658783, -0.03696572780609131, -0.0408419705927372, 0.04801449924707413, 0.1684926301240921, -0.01666688360273838, 0.1516459733247757, -0.06834262609481812, -0.05136967822909355, 0.021649779751896858, -0.03691391274333, 0.03371569141745567, 0.11677756160497665, 0.1168564185500145, -0.07869498431682587, 0.14435267448425293, 0.17218555510044098, -0.10278567671775818, 0.11082373559474945, -0.05550200864672661, -0.06911862641572952, -0.014105632901191711, -0.01712469942867756, -0.004115298390388489, 0.09025029093027115, -0.11840084195137024, 0.0019155838526785374, 0.02562630921602249, 0.028037268668413162, 0.020016565918922424, -0.23130278289318085, -0.03727160394191742, 0.027440091595053673, -0.03432244807481766, 0.011067723855376244, -0.012957178987562656, 0.011673924513161182, 0.10107330232858658, -0.0020931384060531855, -0.0997939258813858, 0.045948147773742676, 0.008884645067155361, -0.07191096246242523, 0.21761208772659302, -0.08730953931808472, -0.14726294577121735, -0.11994501948356628, -0.07425177842378616, -0.04628434032201767, 0.00926303956657648, 0.06257305294275284, -0.09286876767873764, -0.02572893723845482, -0.046247147023677826, 0.017415953800082207, -0.006331046111881733, 0.04391733556985855, 0.0036721520591527224, 0.0085928188636899, 0.08354000002145767, -0.10812465101480484, -0.006076974794268608, -0.053513769060373306, -0.06750627607107162, 0.05300552397966385, 0.03969081863760948, 0.10140358656644821, 0.15686213970184326, -0.022396478801965714, 0.006557372398674488, -0.027524571865797043, 0.22920164465904236, -0.05628824979066849, -0.033465076237916946, 0.14817290008068085, -0.002351973671466112, 0.058272190392017365, 0.09863971173763275, 0.07497283816337585, -0.0850599929690361, 0.007446563336998224, 0.02864236570894718, -0.03405527397990227, -0.2179020345211029, 
-0.05101267248392105, -0.05031958594918251, -0.027534998953342438, 0.10426028072834015, 0.02945934794843197, 0.05010470002889633, 0.07329826802015305, 0.048523783683776855, 0.0820775106549263, -0.04734272137284279, 0.05395292863249779, 0.12215416878461838, 0.045620135962963104, 0.12212542444467545, -0.039796244353055954, -0.07431069016456604, 0.029590817168354988, -0.003917716443538666, 0.23724381625652313, 0.0006786203593946993, 0.10864491015672684, 0.0635596513748169, 0.1993224024772644, -0.004213239066302776, 0.08556944131851196, -0.002664931584149599, -0.03984560817480087, -0.012358914129436016, -0.035823117941617966, -0.03382076323032379, 0.012328770942986012, -0.05377194657921791, 0.06118446961045265, -0.11559144407510757, -0.018216896802186966, 0.050214625895023346, 0.26525765657424927, 0.02070482447743416, -0.32176077365875244, -0.08905867487192154, -0.008906709030270576, -0.03691934794187546, -0.01462319865822792, 0.022837389260530472, 0.07532095909118652, -0.10044645518064499, 0.019819121807813644, -0.07284487783908844, 0.0898677334189415, -0.040775611996650696, 0.044401150196790695, 0.08444000035524368, 0.08623065054416656, 0.023497765883803368, 0.08643178641796112, -0.3181987702846527, 0.2717965543270111, -0.003805592656135559, 0.06659778207540512, -0.07494669407606125, 0.0025883864145725965, 0.034827910363674164, 0.06387096643447876, 0.05661039426922798, -0.009209955111145973, -0.023828286677598953, -0.21725864708423615, -0.048356954008340836, 0.02526845782995224, 0.08153552561998367, -0.02999722585082054, 0.08108000457286835, -0.030810214579105377, 0.008999685756862164, 0.07544448226690292, -0.03313062712550163, -0.04966875538229942, -0.08041016012430191, -0.00858659204095602, 0.016736069694161415, -0.040352560579776764, -0.0663783848285675, -0.11385336518287659, -0.13421018421649933, 0.1462317705154419, -0.012565976940095425, -0.04024636745452881, -0.11458701640367508, 0.07884563505649567, 0.09452766180038452, -0.08805044740438461, 0.05639554187655449, 0.004546948242932558, 0.05461203306913376, 0.040367189794778824, -0.07355312258005142, 0.10348596423864365, -0.06749822199344635, -0.16583867371082306, -0.052311986684799194, 0.10335959494113922, 0.03757540509104729, 0.06083470210433006, -0.010931707918643951, 0.011040536686778069, -0.042630378156900406, -0.09371862560510635, 0.020396219566464424, -0.013931818306446075, 0.07884365320205688, 0.025536363944411278, -0.05939440801739693, 0.0200610663741827, -0.06416560709476471, -0.03479636460542679, 0.1788552701473236, 0.23192334175109863, -0.10063673555850983, 0.015130789019167423, 0.041986316442489624, -0.06377039849758148, -0.1895148903131485, 0.033080216497182846, 0.06988605856895447, -0.00749611621722579, 0.04663105681538582, -0.17522278428077698, 0.14334334433078766, 0.10837924480438232, -0.016759183257818222, 0.09940455853939056, -0.33376237750053406, -0.12303275614976883, 0.13480070233345032, 0.1532336324453354, 0.11940544843673706, -0.13693992793560028, -0.01705029420554638, -0.009134764783084393, -0.12412462383508682, 0.09578665345907211, -0.06075488030910492, 0.12003934383392334, -0.04133094847202301, 0.08548875153064728, 0.0023391598369926214, -0.06140637770295143, 0.11735318601131439, 0.034574657678604126, 0.10163091123104095, -0.052287645637989044, -0.03163446858525276, 0.030638208612799644, -0.03522863611578941, 0.013734662905335426, -0.05792967975139618, 0.03784271702170372, -0.08838833123445511, -0.015075955539941788, -0.08881045132875443, 0.052188847213983536, -0.028114615008234978, -0.06617336720228195, 
-0.039252039045095444, 0.0317319892346859, 0.04582614451646805, -0.01691739447414875, 0.1282726675271988, 0.04250316321849823, 0.14632780849933624, 0.12208203226327896, 0.04782870039343834, -0.06052170321345329, -0.08708420395851135, -0.03043731302022934, -0.015595758333802223, 0.06304178386926651, -0.1267162263393402, 0.03227095678448677, 0.14863671362400055, 0.021368062123656273, 0.12645110487937927, 0.08493506163358688, -0.01477496325969696, 0.0030314496252685785, 0.05777708441019058, -0.16186542809009552, -0.06637313216924667, -0.0037124434020370245, -0.05529704689979553, -0.09843015670776367, 0.06777544319629669, 0.0807940885424614, -0.0745311826467514, -0.01519420463591814, -0.00526744220405817, 0.0005248524248600006, -0.0654555931687355, 0.21784372627735138, 0.06514397263526917, 0.05003305897116661, -0.10579697042703629, 0.07534564286470413, 0.0585285946726799, -0.06719841063022614, -0.016583452001214027, 0.05988636240363121, -0.08992882817983627, -0.0431877039372921, 0.10101819038391113, 0.1533958613872528, -0.07634861022233963, -0.04209554195404053, -0.14227508008480072, -0.12185247242450714, 0.07918932288885117, 0.13632746040821075, 0.1304212510585785, 0.020150460302829742, -0.06087858974933624, 0.009334135800600052, -0.12143420428037643, 0.08005441725254059, 0.04769893363118172, 0.07947983592748642, -0.15658511221408844, 0.16573114693164825, 0.011549719609320164, 0.05873943865299225, -0.02260264754295349, 0.02463473752140999, -0.09760038554668427, 0.02030974254012108, -0.11333828419446945, -0.035862889140844345, -0.023935817182064056, 0.007404348347336054, -0.0017535947263240814, -0.06407768279314041, -0.045034896582365036, 0.026242300868034363, -0.123744897544384, -0.01950187236070633, 0.03717142343521118, 0.060158006846904755, -0.10978809744119644, -0.04935046657919884, 0.024052247405052185, -0.056262481957674026, 0.06540629267692566, 0.04763885959982872, 0.015522106550633907, 0.05871525779366493, -0.12310691177845001, -0.009900948964059353, 0.08198992908000946, 0.013335069641470909, 0.06779351830482483, -0.09678349643945694, -0.009118112735450268, 0.005767220631241798, 0.06130923703312874, 0.013516428880393505, 0.06803887337446213, -0.1526506245136261, -0.014211378991603851, -0.03730376064777374, -0.08114766329526901, -0.07053985446691513, 0.014656107872724533, 0.08728232234716415, 0.01794968731701374, 0.19904780387878418, -0.07194003462791443, 0.04165142774581909, -0.20692576467990875, -0.004274177830666304, -0.022740544751286507, -0.11688142269849777, -0.13123759627342224, -0.05896218121051788, 0.05629758536815643, -0.04899691417813301, 0.13555683195590973, 0.022283749654889107, 0.044029656797647476, 0.025440368801355362, -0.02453080378472805, 0.013749277219176292, 0.024739032611250877, 0.2178729623556137, 0.030104152858257294, -0.029567157849669456, 0.0731731727719307, 0.05678229406476021, 0.08963661640882492, 0.1077963188290596, 0.1796475201845169, 0.16280141472816467, -0.02676352858543396, 0.0776941180229187, 0.02332521416246891, -0.046601422131061554, -0.1729491800069809, 0.03519928455352783, -0.03795383498072624, 0.09665229916572571, -0.01817985065281391, 0.21660158038139343, 0.05218779668211937, -0.17006585001945496, 0.04962167143821716, -0.046951837837696075, -0.08363942801952362, -0.09564687311649323, -0.031539276242256165, -0.07612583786249161, -0.14431096613407135, -0.0015012508956715465, -0.10047055780887604, 0.010250158607959747, 0.11854425817728043, 0.00844356045126915, -0.028230363503098488, 0.15843532979488373, 0.031939808279275894, 0.024723095819354057, 
0.04989280551671982, -0.0009115756838582456, -0.03418908268213272, -0.10280431061983109, -0.06341537088155746, -0.0257462989538908, -0.006393080111593008, 0.033940523862838745, -0.06875063478946686, -0.07718493044376373, 0.02984398789703846, -0.025363082066178322, -0.09293609857559204, 0.019189603626728058, 0.014973671175539494, 0.05871713161468506, 0.03690188750624657, 0.004224845673888922, 0.016709230840206146, -0.01690727472305298, 0.21213582158088684, -0.08012492209672928, -0.08345185965299606, -0.08883672207593918, 0.29011666774749756, 0.047774672508239746, -0.012927907519042492, 0.040540434420108795, -0.055724553763866425, -0.00543086277320981, 0.24750523269176483, 0.18574637174606323, -0.0827755331993103, -0.012005025520920753, 0.0079543087631464, -0.01659558154642582, -0.02513514645397663, 0.12058744579553604, 0.14366629719734192, 0.0410347543656826, -0.10172156989574432, -0.036451779305934906, -0.06284210085868835, -0.009373300708830357, -0.05773118510842323, 0.05361120030283928, 0.04175867512822151, 0.0019398032454773784, -0.042814530432224274, 0.053477879613637924, -0.06302352994680405, -0.09382207691669464, 0.06957395374774933, -0.18737579882144928, -0.16337791085243225, -0.01357231754809618, 0.11188734322786331, 0.0011467322474345565, 0.05575171113014221, -0.03373391926288605, 0.008252647705376148, 0.06924227625131607, -0.02208365872502327, -0.08349192142486572, -0.08319561928510666, 0.10574822872877121, -0.09376432001590729, 0.1935030221939087, -0.040225207805633545, 0.07725731283426285, 0.11884785443544388, 0.06555818766355515, -0.07671859115362167, 0.05519767105579376, 0.03876644745469093, -0.08010345697402954, 0.030505791306495667, 0.08597806096076965, -0.02138400636613369, 0.05205347016453743, 0.025679927319288254, -0.12657803297042847, 0.014548941515386105, -0.06632713228464127, -0.0444754995405674, -0.043008774518966675, -0.04919106513261795, -0.04910591244697571, 0.12136314809322357, 0.21105042099952698, -0.02995743602514267, 0.004743493162095547, -0.08183210343122482, 0.015959106385707855, 0.06032324954867363, 0.0030218136962503195, -0.08020364493131638, -0.22026097774505615, 0.012107840739190578, 0.04898585006594658, -0.02604782022535801, -0.19058160483837128, -0.09825283288955688, 0.004347836598753929, -0.08622192591428757, -0.09621527045965195, 0.08068482577800751, 0.05796568840742111, 0.05512860044836998, -0.05104370787739754, -0.06944847106933594, -0.09444984793663025, 0.14648741483688354, -0.15935665369033813, -0.09036308526992798 ]
null
null
transformers
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # evaluating-student-writing-distibert-ner This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on an unknown dataset. It achieves the following results on the evaluation set: - Loss: 0.7688 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 2e-05 - train_batch_size: 8 - eval_batch_size: 8 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 2 ### Training results | Training Loss | Epoch | Step | Validation Loss | |:-------------:|:-----:|:----:|:---------------:| | 0.871 | 1.0 | 1755 | 0.8158 | | 0.7476 | 2.0 | 3510 | 0.7688 | ### Framework versions - Transformers 4.12.5 - Pytorch 1.9.1 - Datasets 1.16.1 - Tokenizers 0.10.3
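As an illustrative reconstruction (a sketch, not the author's actual training script), the hyperparameters listed in this card map onto `transformers.TrainingArguments` roughly as shown below; dataset loading, tokenization, and the `Trainer` call are omitted, and the output directory name is assumed.

```python
from transformers import TrainingArguments

# Hedged reconstruction of the hyperparameters listed in the card above.
# Only the values the card names are set; everything else stays at its default.
training_args = TrainingArguments(
    output_dir="evaluating-student-writing-distibert-ner",  # assumed name
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    num_train_epochs=2,
    seed=42,
    lr_scheduler_type="linear",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)
```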
{"license": "apache-2.0", "tags": ["generated_from_trainer"], "model-index": [{"name": "evaluating-student-writing-distibert-ner", "results": []}]}
token-classification
NahedAbdelgaber/evaluating-student-writing-distibert-ner
[ "transformers", "pytorch", "tensorboard", "distilbert", "token-classification", "generated_from_trainer", "license:apache-2.0", "autotrain_compatible", "endpoints_compatible", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[]
TAGS #transformers #pytorch #tensorboard #distilbert #token-classification #generated_from_trainer #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us
evaluating-student-writing-distibert-ner ======================================== This model is a fine-tuned version of distilbert-base-uncased on an unknown dataset. It achieves the following results on the evaluation set: * Loss: 0.7688 Model description ----------------- More information needed Intended uses & limitations --------------------------- More information needed Training and evaluation data ---------------------------- More information needed Training procedure ------------------ ### Training hyperparameters The following hyperparameters were used during training: * learning\_rate: 2e-05 * train\_batch\_size: 8 * eval\_batch\_size: 8 * seed: 42 * optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 * lr\_scheduler\_type: linear * num\_epochs: 2 ### Training results ### Framework versions * Transformers 4.12.5 * Pytorch 1.9.1 * Datasets 1.16.1 * Tokenizers 0.10.3
[ "### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 8\n* eval\\_batch\\_size: 8\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 2", "### Training results", "### Framework versions\n\n\n* Transformers 4.12.5\n* Pytorch 1.9.1\n* Datasets 1.16.1\n* Tokenizers 0.10.3" ]
[ "TAGS\n#transformers #pytorch #tensorboard #distilbert #token-classification #generated_from_trainer #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n", "### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 8\n* eval\\_batch\\_size: 8\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 2", "### Training results", "### Framework versions\n\n\n* Transformers 4.12.5\n* Pytorch 1.9.1\n* Datasets 1.16.1\n* Tokenizers 0.10.3" ]
[ 58, 98, 4, 31 ]
[ "passage: TAGS\n#transformers #pytorch #tensorboard #distilbert #token-classification #generated_from_trainer #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 8\n* eval\\_batch\\_size: 8\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 2### Training results### Framework versions\n\n\n* Transformers 4.12.5\n* Pytorch 1.9.1\n* Datasets 1.16.1\n* Tokenizers 0.10.3" ]
[ -0.10157131403684616, 0.09349068999290466, -0.00252246274612844, 0.12250173091888428, 0.17314772307872772, 0.018792646005749702, 0.11104336380958557, 0.11949589848518372, -0.10486545413732529, 0.011507238261401653, 0.11870589852333069, 0.18957743048667908, 0.0013091450091451406, 0.1112174466252327, -0.047133736312389374, -0.25668635964393616, -0.009756161831319332, 0.058383531868457794, -0.09222351759672165, 0.13705015182495117, 0.10175320506095886, -0.14017581939697266, 0.0779552236199379, 0.014424161054193974, -0.2305837869644165, 0.010303880088031292, 0.015565106645226479, -0.06580176204442978, 0.14989560842514038, 0.008418084122240543, 0.13169923424720764, -0.00044614693615585566, 0.08555957674980164, -0.167671337723732, 0.005785465706139803, 0.04761800542473793, 0.01826784946024418, 0.09121037274599075, 0.06193725764751434, 0.005496647208929062, 0.10368841141462326, -0.07003939151763916, 0.05238093063235283, 0.021121419966220856, -0.11529729515314102, -0.2559807598590851, -0.08268439024686813, 0.03321302682161331, 0.07201096415519714, 0.09743687510490417, 0.006608219351619482, 0.1408289074897766, -0.09667883068323135, 0.09106003493070602, 0.2290431410074234, -0.27509182691574097, -0.060329779982566833, 0.04048286750912666, -0.0014901269460096955, 0.054959144443273544, -0.11005833745002747, -0.03598234802484512, 0.05279902368783951, 0.04959215596318245, 0.14550065994262695, -0.038771338760852814, -0.11224057525396347, 0.01463452260941267, -0.14778469502925873, -0.033552102744579315, 0.13239313662052155, 0.03099949285387993, -0.0323869064450264, -0.034913524985313416, -0.058082204312086105, -0.15111291408538818, -0.03297216445207596, -0.016349755227565765, 0.04650384932756424, -0.039269860833883286, -0.062162142246961594, 0.011456837877631187, -0.10848429054021835, -0.06808838248252869, -0.07236099988222122, 0.1497027575969696, 0.049222588539123535, 0.01566082052886486, -0.020979519933462143, 0.11669053882360458, 0.019860317930579185, -0.12700094282627106, 0.03021078370511532, 0.028566626831889153, 0.0026680196169763803, -0.04936431348323822, -0.06888812780380249, -0.03345296159386635, 0.009168497286736965, 0.12169317156076431, -0.06484970450401306, 0.040503110736608505, 0.05143202096223831, 0.04515545442700386, -0.0996192991733551, 0.19596342742443085, -0.03694561868906021, 0.003415626473724842, 0.004234118852764368, 0.04310545325279236, -0.002950364025309682, 0.006793114356696606, -0.11250622570514679, -0.0022249058820307255, 0.12672019004821777, 0.017719469964504242, -0.07436680793762207, 0.0651831328868866, -0.05665809288620949, -0.034780409187078476, 0.014697679318487644, -0.09838169813156128, 0.03163037449121475, -0.0071814716793596745, -0.08955083042383194, -0.010922719724476337, 0.02379988133907318, 0.012941040098667145, -0.020290840417146683, 0.11621030420064926, -0.08162897080183029, 0.04124804586172104, -0.09756466746330261, -0.09646149724721909, 0.009812173433601856, -0.08033905923366547, 0.033428821712732315, -0.10490447282791138, -0.1572360098361969, -0.00852180551737547, 0.06041090562939644, -0.016315707936882973, -0.06332248449325562, -0.03256785124540329, -0.06997314095497131, -0.009052609093487263, -0.01840241625905037, 0.14318585395812988, -0.054495859891176224, 0.11226916313171387, 0.0366857685148716, 0.06353183835744858, -0.04224229231476784, 0.06274763494729996, -0.10610365122556686, 0.00853379163891077, -0.19090856611728668, 0.03177057206630707, -0.05396650731563568, 0.07099489867687225, -0.09914238005876541, -0.12244203686714172, 0.032671622931957245, 
-0.01639375276863575, 0.06528753787279129, 0.07808302342891693, -0.15164630115032196, -0.07684312015771866, 0.13759492337703705, -0.06856616586446762, -0.1079990342259407, 0.11296457797288895, -0.057406000792980194, 0.05042954534292221, 0.06933726370334625, 0.1593238115310669, 0.08840915560722351, -0.06477148085832596, 0.026725798845291138, 0.00066765007795766, 0.04646060988306999, -0.078978031873703, 0.06305713951587677, 0.005717976950109005, -0.007645385339856148, 0.03914140164852142, -0.04255397990345955, 0.07365205883979797, -0.10087122768163681, -0.09490270912647247, -0.03984110429883003, -0.10380860418081284, 0.05819815397262573, 0.07935849577188492, 0.09347687661647797, -0.08544919639825821, -0.05958665907382965, 0.08793900161981583, 0.08635881543159485, -0.054603319615125656, 0.02730610780417919, -0.05332687869668007, 0.06365387886762619, -0.04371881112456322, -0.027696670964360237, -0.19554241001605988, -0.0036673718132078648, 0.016876129433512688, -0.02689436450600624, 0.016917003318667412, 0.020393062382936478, 0.06735631823539734, 0.06229816749691963, -0.055630240589380264, -0.0278757493942976, -0.028791220858693123, 0.0003005470789503306, -0.13475508987903595, -0.1854153573513031, -0.035661693662405014, -0.015086770989000797, 0.11008621007204056, -0.18677321076393127, 0.03375377133488655, -0.029780017212033272, 0.08262377977371216, 0.004351509269326925, -0.005956398788839579, -0.04943365976214409, 0.08770736306905746, -0.03179140016436577, -0.05096161365509033, 0.07594846189022064, 0.0005478553357534111, -0.0817658081650734, -0.052252646535634995, -0.07700439542531967, 0.18690763413906097, 0.13215841352939606, -0.1347247064113617, -0.08083002269268036, -0.006462545599788427, -0.06719977408647537, -0.037321191281080246, -0.04163702204823494, 0.04854147508740425, 0.16845574975013733, -0.01653023436665535, 0.15211237967014313, -0.06846822053194046, -0.050825271755456924, 0.02203814685344696, -0.037703417241573334, 0.034685224294662476, 0.1169273853302002, 0.11658647656440735, -0.07931557297706604, 0.14434698224067688, 0.1726522445678711, -0.10337045788764954, 0.11057896167039871, -0.0555478036403656, -0.07000288367271423, -0.014874213375151157, -0.017154060304164886, -0.004304046742618084, 0.09040926396846771, -0.11877954006195068, 0.002469810890033841, 0.026330605149269104, 0.027714503929018974, 0.01983053609728813, -0.23270221054553986, -0.03629384934902191, 0.027419667690992355, -0.03471803292632103, 0.009533523581922054, -0.01397203654050827, 0.012443175539374352, 0.10156866908073425, -0.0021597337909042835, -0.09904389828443527, 0.04577483981847763, 0.008723175153136253, -0.07211662828922272, 0.2169746309518814, -0.08723098784685135, -0.14722590148448944, -0.12021668255329132, -0.07177209854125977, -0.04448215290904045, 0.00998769048601389, 0.06286855041980743, -0.09159588813781738, -0.02515576221048832, -0.04587024822831154, 0.01752874068915844, -0.005947553552687168, 0.0443168543279171, 0.003746612463146448, 0.009657477028667927, 0.08383160829544067, -0.1077953577041626, -0.0057969288900494576, -0.05328642576932907, -0.06759145110845566, 0.05299430340528488, 0.040345411747694016, 0.1003018394112587, 0.15734457969665527, -0.023215923458337784, 0.006444083992391825, -0.026949608698487282, 0.22798584401607513, -0.05671955272555351, -0.03264690563082695, 0.14779682457447052, -0.0037057027220726013, 0.058874886482954025, 0.0995480939745903, 0.07460452616214752, -0.08423647284507751, 0.006641348823904991, 0.028369424864649773, -0.03368440642952919, -0.2178128957748413, 
-0.051267724484205246, -0.05008760839700699, -0.027573034167289734, 0.10509137809276581, 0.029115833342075348, 0.049163322895765305, 0.07285648584365845, 0.048564471304416656, 0.08265149593353271, -0.04738842323422432, 0.05367567017674446, 0.12184527516365051, 0.045836325734853745, 0.12146265804767609, -0.04028896242380142, -0.07427579909563065, 0.029950492084026337, -0.004168899729847908, 0.23791225254535675, 0.0007414097199216485, 0.10968799144029617, 0.06256458908319473, 0.20025143027305603, -0.0036088929045945406, 0.08452253043651581, -0.0020167052280157804, -0.03890075907111168, -0.012549259699881077, -0.03546467795968056, -0.03464210033416748, 0.014344841241836548, -0.05336162447929382, 0.060573771595954895, -0.11530958861112595, -0.017629727721214294, 0.0498148649930954, 0.2647526264190674, 0.022327696904540062, -0.3204793930053711, -0.08963557332754135, -0.007978083565831184, -0.03741117939352989, -0.015011219307780266, 0.022595852613449097, 0.07748547941446304, -0.10020129382610321, 0.01885417476296425, -0.07295472919940948, 0.09012960642576218, -0.04098981246352196, 0.04436150938272476, 0.08237243443727493, 0.08490699529647827, 0.024304987862706184, 0.0872531458735466, -0.3162821829319, 0.27251148223876953, -0.0035118621308356524, 0.06602108478546143, -0.07557486742734909, 0.0024755194317549467, 0.03443848341703415, 0.0631520003080368, 0.05663766339421272, -0.00847687292844057, -0.0259780902415514, -0.21586869657039642, -0.048727549612522125, 0.025256292894482613, 0.08177629113197327, -0.030732588842511177, 0.08087407052516937, -0.03112526424229145, 0.008841573260724545, 0.0753592774271965, -0.03406723216176033, -0.04871645197272301, -0.08016218990087509, -0.008461657911539078, 0.01645497977733612, -0.040045976638793945, -0.06686762720346451, -0.11361823976039886, -0.13151128590106964, 0.146908700466156, -0.014478667639195919, -0.04019096493721008, -0.1141204759478569, 0.0807265043258667, 0.09433779865503311, -0.08906488865613937, 0.056427713483572006, 0.004763653501868248, 0.05618469417095184, 0.0403650626540184, -0.07339448481798172, 0.10392115265130997, -0.06638116389513016, -0.16689422726631165, -0.05233537405729294, 0.10404738783836365, 0.03746505454182625, 0.06103800609707832, -0.010518264025449753, 0.01139659434556961, -0.04276663437485695, -0.09297512471675873, 0.020595569163560867, -0.01335721556097269, 0.0787399560213089, 0.025067994371056557, -0.059371452778577805, 0.019681911915540695, -0.06369323283433914, -0.034875307232141495, 0.17943419516086578, 0.2329009622335434, -0.10168200731277466, 0.01639571413397789, 0.042330898344516754, -0.06417684257030487, -0.1901925802230835, 0.032355792820453644, 0.07048771530389786, -0.0072563448920845985, 0.04654482752084732, -0.17575795948505402, 0.14269855618476868, 0.1080239936709404, -0.01723964512348175, 0.10025654733181, -0.3346121609210968, -0.12317898869514465, 0.13435600697994232, 0.15178608894348145, 0.12070640921592712, -0.13616013526916504, -0.018153192475438118, -0.009529996663331985, -0.12443296611309052, 0.09639251977205276, -0.06180505454540253, 0.11932124942541122, -0.0407995767891407, 0.08657896518707275, 0.0028171264566481113, -0.06089264526963234, 0.11797226965427399, 0.03450179845094681, 0.10074040293693542, -0.05168309435248375, -0.032468438148498535, 0.031002087518572807, -0.035184595733881, 0.01280198898166418, -0.05727643892168999, 0.03797644004225731, -0.08832880109548569, -0.01472645066678524, -0.08819061517715454, 0.05219382792711258, -0.027814041823148727, -0.06603356450796127, -0.03946244716644287, 
0.030892642214894295, 0.04626777768135071, -0.016503699123859406, 0.12867487967014313, 0.043196119368076324, 0.14550930261611938, 0.1241157054901123, 0.046416446566581726, -0.05978991463780403, -0.08584525436162949, -0.030413251370191574, -0.015340480022132397, 0.06364728510379791, -0.12871761620044708, 0.03382185101509094, 0.1485341489315033, 0.0222712941467762, 0.12661974132061005, 0.08456214517354965, -0.014846193604171276, 0.003613626817241311, 0.05737221613526344, -0.16194765269756317, -0.06907642632722855, -0.0033905457239598036, -0.054840344935655594, -0.09808161854743958, 0.06678633391857147, 0.08172890543937683, -0.07489777356386185, -0.01545796263962984, -0.005316929426044226, 0.000732658663764596, -0.06578140705823898, 0.21678611636161804, 0.06494463980197906, 0.050055477768182755, -0.10536226630210876, 0.07577268779277802, 0.059774260967969894, -0.06833784282207489, -0.016467630863189697, 0.05887994170188904, -0.08945135027170181, -0.043142858892679214, 0.10109204798936844, 0.15372608602046967, -0.07561752200126648, -0.042544666677713394, -0.14167149364948273, -0.12121687084436417, 0.07915420085191727, 0.1362648755311966, 0.13042211532592773, 0.02055777609348297, -0.061073895543813705, 0.009461781941354275, -0.12138591706752777, 0.08071893453598022, 0.04903947189450264, 0.07905827462673187, -0.15618456900119781, 0.1672392189502716, 0.011263718828558922, 0.05939793586730957, -0.022746413946151733, 0.024558519944548607, -0.09722242504358292, 0.02003728225827217, -0.11402522772550583, -0.03415508195757866, -0.02346336841583252, 0.0069771138951182365, -0.0014554181834682822, -0.06508858501911163, -0.04496513307094574, 0.02575247548520565, -0.12406784296035767, -0.019407285377383232, 0.03598460182547569, 0.05989709496498108, -0.11034476011991501, -0.04966014251112938, 0.025180626660585403, -0.056477561593055725, 0.06575867533683777, 0.04826177656650543, 0.01585858315229416, 0.0589401014149189, -0.12228889018297195, -0.009552311152219772, 0.08242971450090408, 0.014030206017196178, 0.06789232045412064, -0.0971333235502243, -0.009098833426833153, 0.0056411391124129295, 0.06151135265827179, 0.012682363390922546, 0.06958896666765213, -0.15275073051452637, -0.014238297939300537, -0.03813319280743599, -0.08243647962808609, -0.07064160704612732, 0.014369173906743526, 0.0861504003405571, 0.01801864057779312, 0.19914422929286957, -0.07132250815629959, 0.041957974433898926, -0.20711378753185272, -0.004042515996843576, -0.022621940821409225, -0.11596845090389252, -0.12988221645355225, -0.05970165133476257, 0.05591471493244171, -0.04924644157290459, 0.13415658473968506, 0.02225290611386299, 0.04384450986981392, 0.025474073365330696, -0.02543630450963974, 0.012524261139333248, 0.025091852992773056, 0.21838481724262238, 0.030924208462238312, -0.029757333919405937, 0.07221536338329315, 0.055553585290908813, 0.08995425701141357, 0.10823005437850952, 0.17972326278686523, 0.1630435734987259, -0.026054412126541138, 0.07775992155075073, 0.022853700444102287, -0.04651666432619095, -0.174968883395195, 0.03722415119409561, -0.03874797374010086, 0.09635321795940399, -0.01793629489839077, 0.2182619720697403, 0.052919235080480576, -0.17034445703029633, 0.04970400780439377, -0.046197857707738876, -0.08386348187923431, -0.09611968696117401, -0.03198215365409851, -0.07731682062149048, -0.1443140208721161, -0.001933841616846621, -0.09962309151887894, 0.011216786690056324, 0.11773963272571564, 0.007312968373298645, -0.028157910332083702, 0.15686410665512085, 0.03073590248823166, 0.024024371057748795, 
0.049498751759529114, -0.00018909445498138666, -0.03390111029148102, -0.10221646726131439, -0.063963383436203, -0.025020528584718704, -0.005351644940674305, 0.03391876444220543, -0.06820616126060486, -0.07622960209846497, 0.029339371249079704, -0.025939155369997025, -0.09253719449043274, 0.019156161695718765, 0.014912665821611881, 0.05862680822610855, 0.035182662308216095, 0.004545911680907011, 0.016919463872909546, -0.016842158511281013, 0.21368074417114258, -0.08080590516328812, -0.08407122641801834, -0.08877523988485336, 0.29008781909942627, 0.04748781397938728, -0.01257461216300726, 0.04126480221748352, -0.056648679077625275, -0.0053301104344427586, 0.2465147078037262, 0.18474853038787842, -0.08315616846084595, -0.011853124015033245, 0.008229353465139866, -0.01676662638783455, -0.024868473410606384, 0.12024780362844467, 0.14298218488693237, 0.04193711280822754, -0.10185492783784866, -0.03721879422664642, -0.06289547681808472, -0.009268596768379211, -0.05904142186045647, 0.05562226474285126, 0.03969120979309082, 0.0017080010147765279, -0.04345044493675232, 0.05299265682697296, -0.06207139045000076, -0.09576716274023056, 0.07043609023094177, -0.18805314600467682, -0.1636521965265274, -0.012589734978973866, 0.1133674830198288, 0.0009165772935375571, 0.05533875897526741, -0.03429296612739563, 0.00785968266427517, 0.07024545967578888, -0.021957632154226303, -0.08417799323797226, -0.08190391957759857, 0.10478020459413528, -0.09594832360744476, 0.1936238706111908, -0.03998677432537079, 0.07774116843938828, 0.11844697594642639, 0.06628917902708054, -0.07656536996364594, 0.05419738590717316, 0.03906223922967911, -0.08002494275569916, 0.029949678108096123, 0.08698388189077377, -0.02180090919137001, 0.051260314881801605, 0.02691015414893627, -0.12712417542934418, 0.013321330770850182, -0.06654039770364761, -0.04546433687210083, -0.04259371757507324, -0.05045497789978981, -0.04919775202870369, 0.12036410719156265, 0.20986242592334747, -0.030026080086827278, 0.005432017147541046, -0.081694096326828, 0.015403516590595245, 0.060688119381666183, 0.004151199944317341, -0.07960010319948196, -0.21983881294727325, 0.011987659148871899, 0.049943890422582626, -0.025073792785406113, -0.18960802257061005, -0.09829900413751602, 0.0037656286731362343, -0.08612839132547379, -0.09564343094825745, 0.08121465146541595, 0.05799389258027077, 0.054150696843862534, -0.05114701762795448, -0.07339171320199966, -0.09389763325452805, 0.14657104015350342, -0.15874528884887695, -0.09063027799129486 ]
null
null
transformers
Test

```python
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("NakHyun/electra_kr_v1")
model = AutoModel.from_pretrained("NakHyun/electra_kr_v1")
```
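The snippet above only loads the tokenizer and model. Since the pipeline tag below is feature-extraction, the following hedged continuation sketches how token-level features would typically be pulled out of the loaded ELECTRA encoder; the example sentence is an arbitrary placeholder and is not part of the original card.

```python
# Sketch of feature extraction with the checkpoint loaded above (assumes it is public).
import torch
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("NakHyun/electra_kr_v1")
model = AutoModel.from_pretrained("NakHyun/electra_kr_v1")

inputs = tokenizer("Replace me by any text you'd like.", return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# Token-level embeddings with shape (batch_size, sequence_length, hidden_size)
features = outputs.last_hidden_state
print(features.shape)
```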
{}
feature-extraction
NakHyun/electra_kr_v1
[ "transformers", "pytorch", "electra", "feature-extraction", "endpoints_compatible", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[]
TAGS #transformers #pytorch #electra #feature-extraction #endpoints_compatible #region-us
Test from transformers import AutoTokenizer, AutoModel tokenizer = AutoTokenizer.from_pretrained("NakHyun/electra_kr_v1") model = AutoModel.from_pretrained("NakHyun/electra_kr_v1")
[]
[ "TAGS\n#transformers #pytorch #electra #feature-extraction #endpoints_compatible #region-us \n" ]
[ 30 ]
[ "passage: TAGS\n#transformers #pytorch #electra #feature-extraction #endpoints_compatible #region-us \n" ]
[ -0.07975126057863235, 0.013228539377450943, -0.008717965334653854, 0.0005652483669109643, 0.17895299196243286, 0.030872993171215057, -0.04548921063542366, 0.08153141289949417, 0.05322251841425896, 0.01501636952161789, 0.11114959418773651, 0.2656104862689972, -0.02554168738424778, 0.06629408895969391, -0.07814864814281464, -0.2829738259315491, 0.0996851772069931, 0.09957648813724518, -0.053218159824609756, 0.09103173017501831, 0.06503909826278687, -0.09707950055599213, 0.05788804963231087, -0.0020796358585357666, -0.12059040367603302, 0.05761711671948433, 0.0014988153707236052, -0.11447042226791382, 0.12541234493255615, 0.044451165944337845, 0.17340326309204102, -0.007852103561162949, -0.09485998004674911, -0.1724833846092224, 0.03126726299524307, -0.01424057874828577, -0.07514118403196335, 0.0287702064961195, 0.03934376314282417, -0.11897691339254379, 0.10361217707395554, 0.11986999213695526, 0.030615607276558876, 0.01662052422761917, -0.16053521633148193, -0.15828438103199005, -0.046094249933958054, 0.026999905705451965, 0.029413625597953796, 0.104136161506176, 0.0131473233923316, 0.16007021069526672, -0.15614217519760132, 0.07499712705612183, 0.1846417635679245, -0.23557943105697632, 0.016101744025945663, 0.020876213908195496, 0.13717424869537354, -0.007830524817109108, 0.007435155101120472, 0.024285534396767616, 0.008768629282712936, 0.026845304295420647, -0.005279554985463619, -0.0608995296061039, -0.051700882613658905, 0.07372178137302399, -0.10274198651313782, -0.10504782944917679, 0.22384469211101532, -0.01605829782783985, 0.0648103877902031, 0.019357100129127502, -0.10316639393568039, -0.08141423761844635, -0.008584714494645596, 0.011911441572010517, -0.03755253925919533, 0.0491216666996479, 0.04484780877828598, 0.007420660927891731, -0.11204095929861069, 0.026651205494999886, -0.2012098729610443, 0.26606041193008423, 0.030946342274546623, 0.09488136321306229, -0.1879807859659195, 0.04517587646842003, -0.04959866404533386, -0.08674703538417816, 0.009586879052221775, -0.08235432207584381, 0.03309215232729912, -0.00026125286240130663, -0.08662062883377075, -0.0036277580074965954, 0.08106914907693863, 0.18340478837490082, 0.018101466819643974, 0.014939660206437111, 0.05728588253259659, 0.10301002115011215, 0.068782739341259, 0.108731210231781, -0.00419941870495677, 0.012119399383664131, 0.021682417020201683, -0.1367391049861908, -0.01808958314359188, -0.06230565533041954, -0.11832704395055771, -0.03827453404664993, 0.03130539506673813, 0.1076701283454895, 0.027019545435905457, -0.014169634319841862, -0.0965222716331482, -0.01970943808555603, 0.059046968817710876, -0.08004218339920044, -0.009877580218017101, -0.012588908895850182, -0.0000809586126706563, 0.18774954974651337, -0.019328497350215912, -0.015675239264965057, -0.05984220653772354, 0.07282158732414246, -0.081241674721241, 0.006030561402440071, -0.04663662984967232, -0.06432642042636871, 0.04778647422790527, -0.20038902759552002, 0.07847640663385391, -0.1804504096508026, -0.11331234872341156, 0.024923766031861305, 0.05147114768624306, 0.00016070638957899064, -0.005094334948807955, -0.010797188617289066, -0.010099806822836399, -0.0057349056005477905, -0.06762165576219559, -0.12872333824634552, -0.0692242905497551, 0.10589279979467392, 0.015581343322992325, 0.053894031792879105, -0.1071445494890213, 0.08844299614429474, -0.08977712690830231, 0.013172886334359646, -0.15997636318206787, -0.0015831842320039868, -0.004914687015116215, 0.16621315479278564, -0.01796848140656948, -0.07167142629623413, -0.08373596519231796, 
0.05589810386300087, -0.038116585463285446, 0.140919491648674, -0.03454277664422989, -0.1293887048959732, 0.18863651156425476, -0.15312761068344116, -0.16845650970935822, 0.05457654967904091, -0.006283015478402376, -0.03739482909440994, 0.06701026111841202, 0.12873634696006775, 0.07904829829931259, -0.06362074613571167, 0.039268024265766144, 0.06777035444974899, -0.16436335444450378, -0.1574442982673645, 0.03385745361447334, -0.025416402146220207, -0.02597181499004364, 0.027035698294639587, 0.029592260718345642, 0.11748123914003372, -0.07502547651529312, -0.029503902420401573, -0.017801474779844284, -0.013899890705943108, 0.09550482779741287, 0.053734008222818375, 0.07020214945077896, -0.03148980066180229, -0.013148372992873192, 0.03528037294745445, 0.016498388722538948, -0.015243849717080593, 0.03276162967085838, -0.10607010126113892, 0.19772940874099731, -0.09009984135627747, 0.0020353556610643864, -0.264507919549942, -0.08777495473623276, -0.02269466407597065, 0.037877801805734634, -0.046344030648469925, 0.1475614309310913, 0.06552953273057938, -0.06771820783615112, 0.042607422918081284, -0.04770008102059364, 0.04966551437973976, 0.015863219276070595, -0.006036301143467426, -0.03018331527709961, -0.016981635242700577, -0.08852780610322952, -0.06921114772558212, -0.04321291297674179, -0.00677314680069685, 0.0659976601600647, 0.11103405803442001, 0.003450920572504401, 0.041572924703359604, -0.05606473609805107, 0.0641997680068016, -0.05152468755841255, 0.017149297520518303, 0.0704510509967804, -0.013807657174766064, -0.06005179136991501, 0.10324951261281967, -0.13244596123695374, 0.3336680829524994, 0.17116715013980865, -0.2853701114654541, 0.0160454660654068, -0.016749458387494087, -0.005376600660383701, 0.03404312953352928, 0.042773179709911346, -0.024159282445907593, 0.10180145502090454, 0.034237440675497055, 0.13534998893737793, -0.024803852662444115, -0.017678748816251755, 0.01575644314289093, -0.037999700754880905, -0.03886386752128601, 0.06893280148506165, 0.138071671128273, -0.14914625883102417, 0.1559152454137802, 0.17900478839874268, 0.005311029963195324, 0.1130446344614029, -0.0587434321641922, -0.038595836609601974, 0.03340664878487587, 0.0011338302865624428, -0.026796849444508553, 0.013377377763390541, -0.22129112482070923, -0.035620901733636856, 0.06734651327133179, 0.00929180532693863, 0.08849363029003143, -0.1187056228518486, -0.04433098062872887, 0.04622649773955345, 0.016978934407234192, -0.08401510119438171, 0.07531425356864929, 0.053462713956832886, 0.04944980517029762, 0.0010704711312428117, -0.08343631029129028, 0.097700335085392, 0.024639934301376343, -0.039427485316991806, 0.18418832123279572, -0.10854988545179367, -0.2732328176498413, -0.12938417494297028, -0.10337808728218079, 0.045887839049100876, 0.02803945541381836, 0.10158901661634445, -0.056575510650873184, -0.04984118789434433, 0.07168640941381454, 0.04107372835278511, -0.11689796298742294, 0.011672618798911572, -0.04417510703206062, 0.04222816228866577, -0.09591475874185562, -0.08092901110649109, -0.056530483067035675, -0.057272545993328094, 0.023776547983288765, 0.11642104387283325, -0.10204865038394928, 0.11598237603902817, 0.1394692212343216, 0.031042762100696564, 0.0781932845711708, -0.0006158645264804363, 0.15662018954753876, -0.07675057649612427, -0.08369571715593338, 0.24728062748908997, -0.040322195738554, 0.09096495062112808, 0.09831584244966507, 0.01792307384312153, -0.0655917152762413, -0.02121458388864994, -0.0728529840707779, -0.11975870281457901, -0.20165081322193146, -0.12280479818582535, 
-0.13724978268146515, 0.004936037585139275, 0.03375361114740372, 0.052771709859371185, 0.10091763734817505, 0.08637923002243042, 0.04704452306032181, -0.08524936437606812, -0.0423298105597496, 0.03749385476112366, 0.233053058385849, 0.009277258068323135, 0.09522668272256851, -0.06728071719408035, -0.11434231698513031, 0.06428046524524689, 0.020384352654218674, 0.28862258791923523, 0.10554949194192886, 0.05884178355336189, 0.04892561584711075, 0.14234581589698792, 0.14491945505142212, 0.17808018624782562, 0.016511302441358566, -0.025136925280094147, -0.0037247634027153254, 0.03076115809381008, -0.06903129816055298, 0.02211008220911026, 0.183019757270813, -0.13306458294391632, -0.09973535686731339, -0.16983669996261597, 0.059729184955358505, 0.08308405429124832, 0.02057812735438347, -0.24731557071208954, -0.00424434058368206, 0.04568008333444595, -0.030408242717385292, -0.06239582598209381, 0.049747955054044724, -0.05104197934269905, -0.13373053073883057, 0.05354537069797516, -0.0633578896522522, 0.1001935675740242, -0.03343118354678154, 0.04730112850666046, -0.05711689591407776, -0.11069042980670929, 0.06442993134260178, 0.07333261519670486, -0.1543048918247223, 0.26777878403663635, -0.029466597363352776, -0.03939468413591385, -0.06098778918385506, 0.012997339479625225, 0.01174523402005434, 0.13362795114517212, 0.09875590354204178, 0.0010104253888130188, -0.13643300533294678, -0.1746012270450592, 0.05228845030069351, 0.03974445164203644, 0.09650742262601852, -0.05740637332201004, 0.010435737669467926, -0.025400858372449875, 0.003463223110884428, -0.03723874315619469, 0.02638002671301365, 0.10657165944576263, -0.15091116726398468, 0.03205259516835213, -0.03057526797056198, 0.10444937646389008, 0.009391237050294876, -0.019762082025408745, -0.02139825001358986, 0.08341127634048462, -0.10447879880666733, -0.07324764877557755, -0.11099875718355179, -0.08410975337028503, 0.1322663426399231, -0.10731025785207748, 0.14327050745487213, -0.054098136723041534, -0.0013054092414677143, -0.055807776749134064, -0.17620569467544556, 0.12249738723039627, -0.11120586842298508, 0.05011958256363869, -0.01507540326565504, 0.1351047158241272, -0.025458209216594696, -0.012950251810252666, 0.033199772238731384, 0.03907063975930214, -0.13127674162387848, -0.07765430957078934, -0.022318199276924133, 0.0018533346010372043, 0.07570883631706238, 0.11292707175016403, -0.016734914854168892, 0.017834192141890526, 0.013650454580783844, 0.023115767166018486, 0.244540274143219, 0.13714243471622467, -0.044600676745176315, 0.09204702079296112, 0.1395150125026703, -0.02236473187804222, -0.23822186887264252, -0.11307679116725922, -0.11351732909679413, -0.02309395745396614, 0.029662970453500748, -0.16450589895248413, 0.11853472143411636, 0.07705652713775635, -0.009711038321256638, 0.14149731397628784, -0.23944030702114105, -0.03298620134592056, 0.1584709882736206, -0.00360135012306273, 0.39042362570762634, -0.11187373846769333, -0.07177730649709702, -0.0072438218630850315, -0.24779574573040009, 0.08644714206457138, 0.05328495800495148, 0.06454228609800339, -0.022433822974562645, 0.060150351375341415, 0.03522184118628502, -0.06072523444890976, 0.12971241772174835, 0.028064202517271042, 0.028078528121113777, -0.07394811511039734, -0.09276772290468216, 0.02046889066696167, -0.02170201949775219, 0.03488045930862427, 0.02498980239033699, 0.016767585650086403, -0.18310804665088654, -0.039251141250133514, -0.10932015627622604, 0.08341920375823975, 0.07111708074808121, -0.010001208633184433, -0.011767618358135223, -0.04973473772406578, 
0.01050595287233591, 0.03693966940045357, 0.2714177668094635, -0.018161725252866745, 0.14646054804325104, 0.003293430432677269, 0.057011619210243225, -0.17836447060108185, -0.1862558275461197, -0.0855146050453186, -0.031160874292254448, 0.09857257455587387, -0.04462972283363342, 0.060989443212747574, 0.1532907485961914, -0.030870938673615456, 0.03997436538338661, 0.12055464088916779, 0.0035813837312161922, 0.03087995946407318, 0.13379040360450745, -0.14922462403774261, 0.01102323830127716, -0.024050580337643623, -0.011923084035515785, 0.08065382391214371, 0.08827652037143707, 0.0961829423904419, 0.057436905801296234, -0.030109699815511703, -0.04121651500463486, -0.02667907439172268, -0.04913060739636421, 0.05445883050560951, 0.04262576624751091, 0.04978203773498535, -0.14644832909107208, 0.04294656589627266, 0.00014384221867658198, -0.2881898880004883, -0.019938580691814423, 0.06990357488393784, -0.1114058569073677, -0.11321679502725601, -0.06772186607122421, 0.10282948613166809, -0.26965129375457764, -0.06508246809244156, -0.076045922935009, -0.10778965055942535, 0.08611826598644257, 0.2504741847515106, 0.09314929693937302, 0.08882984519004822, -0.02191621996462345, -0.0014355962630361319, -0.01766072027385235, -0.07251518964767456, 0.05426134914159775, 0.0005476965452544391, -0.12471113353967667, -0.03391699120402336, 0.00504904193803668, 0.14899364113807678, -0.07171031832695007, -0.0837659165263176, -0.10669288039207458, 0.08865801244974136, -0.05722878873348236, -0.05089295655488968, -0.13012874126434326, -0.05322761461138725, 0.02886841632425785, -0.03418932855129242, -0.03387526050209999, -0.014967662282288074, -0.1399095058441162, 0.048216789960861206, -0.0013704135781154037, -0.011786703020334244, -0.06474016606807709, -0.03462399169802666, 0.10898532718420029, -0.06056598201394081, 0.09472924470901489, 0.18529395759105682, -0.07336372137069702, 0.11632540076971054, -0.10202792286872864, -0.15101325511932373, 0.12986592948436737, 0.018413515761494637, 0.061212245374917984, 0.025450773537158966, 0.052198637276887894, 0.06718982011079788, -0.016504919156432152, 0.013031188398599625, -0.09930837899446487, -0.14478996396064758, -0.018767938017845154, -0.001338280737400055, -0.16206951439380646, -0.05890277028083801, -0.08583755046129227, 0.13095340132713318, 0.024707365781068802, 0.14739203453063965, 0.024900302290916443, 0.10121426731348038, -0.014291719533503056, 0.0023171231150627136, -0.01087226439267397, -0.1523393988609314, 0.03337506949901581, -0.06093747541308403, -0.005243719555437565, -0.001501764403656125, 0.24448759853839874, -0.026950785890221596, 0.0688345730304718, 0.011539481580257416, 0.06974007934331894, 0.03866555541753769, 0.030348336324095726, 0.25386661291122437, 0.12129001319408417, -0.04315360262989998, -0.09469097852706909, 0.08694335073232651, 0.006282500457018614, -0.043884534388780594, 0.11518506705760956, 0.08474475890398026, 0.03828169032931328, 0.10916043072938919, 0.025192735716700554, 0.05024275183677673, -0.09910264611244202, -0.2522429823875427, 0.011392517946660519, 0.052873946726322174, 0.08602789044380188, 0.08083923161029816, 0.16372832655906677, -0.012347138486802578, 0.0695614218711853, -0.007725466508418322, -0.013160713016986847, -0.13120722770690918, -0.09806905686855316, -0.06602475792169571, -0.13620175421237946, 0.024103427305817604, -0.07684215903282166, 0.009868301451206207, 0.21458494663238525, 0.012583228759467602, -0.02948855608701706, 0.08657270669937134, 0.07180085778236389, -0.0715281292796135, 0.02714904397726059, 
-0.027250083163380623, 0.00841011106967926, 0.08909905701875687, -0.01466073002666235, -0.10132870823144913, -0.072747603058815, -0.03289971873164177, 0.024446522817015648, -0.10722018778324127, 0.04648353159427643, -0.13491475582122803, -0.11824722588062286, -0.037216391414403915, 0.05696401745080948, -0.07511013001203537, 0.10084313154220581, -0.008445566520094872, 0.0037502041086554527, 0.02280675433576107, 0.1696174144744873, -0.053550999611616135, -0.02420194260776043, -0.04790832847356796, 0.12435431778430939, 0.14046768844127655, 0.10456053912639618, 0.002867148257791996, -0.004481680225580931, -0.06160091608762741, 0.22057892382144928, 0.16673420369625092, 0.014762576669454575, 0.035671114921569824, 0.043346647173166275, 0.03428850322961807, 0.09223761409521103, 0.06545384973287582, 0.1106155514717102, 0.30893802642822266, -0.08850914239883423, -0.04664571210741997, -0.05865371599793434, 0.01738993264734745, -0.12146604806184769, -0.01841397024691105, 0.05152241140604019, -0.06578745692968369, -0.03945121914148331, 0.13280420005321503, -0.1923556923866272, 0.12411537021398544, 0.09844283014535904, -0.17619571089744568, -0.0555637925863266, -0.07906384766101837, 0.17494899034500122, 0.03481259569525719, 0.11692555993795395, -0.04805618152022362, -0.1635507345199585, 0.05788544565439224, 0.05993019789457321, -0.2931281328201294, -0.10165923088788986, 0.10148472338914871, 0.05540014058351517, -0.03156530484557152, -0.035910241305828094, 0.018221035599708557, 0.07300417125225067, 0.09282472729682922, -0.004588735289871693, 0.06366895139217377, 0.031587664037942886, -0.07267936319112778, -0.019923510029911995, 0.017027540132403374, 0.0057847388088703156, -0.08249214291572571, 0.026295410469174385, -0.1836441308259964, 0.037262093275785446, -0.020129075273871422, 0.000710327411070466, 0.021105431020259857, -0.009900368750095367, -0.04712478443980217, 0.04121611639857292, 0.03969786688685417, 0.0065671829506754875, -0.030024314299225807, -0.05382302403450012, 0.01569458283483982, 0.08698170632123947, -0.0685315728187561, -0.16629739105701447, -0.06753115355968475, -0.07877429574728012, 0.06528545916080475, -0.04124598205089569, -0.06158700957894325, -0.03194165602326393, -0.07725261151790619, 0.03468053415417671, -0.11508652567863464, 0.0407571904361248, 0.05525625869631767, 0.0359155647456646, 0.003930909093469381, 0.02967100962996483, 0.03132113069295883, 0.08464955538511276, -0.13966840505599976, -0.08546464890241623 ]
null
null
transformers
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. -->

# distilbert-base-uncased-finetuned-cola

This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on the glue dataset.
It achieves the following results on the evaluation set:
- Loss: 0.8239
- Matthews Correlation: 0.5495

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 5

### Training results

| Training Loss | Epoch | Step | Validation Loss | Matthews Correlation |
|:-------------:|:-----:|:----:|:---------------:|:--------------------:|
| 0.5235        | 1.0   | 535  | 0.5402          | 0.4156               |
| 0.3484        | 2.0   | 1070 | 0.5272          | 0.5233               |
| 0.2381        | 3.0   | 1605 | 0.6665          | 0.5050               |
| 0.1746        | 4.0   | 2140 | 0.7512          | 0.5429               |
| 0.1308        | 5.0   | 2675 | 0.8239          | 0.5495               |

### Framework versions

- Transformers 4.12.5
- Pytorch 1.10.0+cu111
- Datasets 1.16.1
- Tokenizers 0.10.3
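The card above has no usage section either. As a hedged illustration, here is a minimal sketch of scoring a sentence for linguistic acceptability (the CoLA task) with this checkpoint, assuming the repo id `NaliniK/distilbert-base-uncased-finetuned-cola` listed in the metadata below is publicly loadable; the mapping of the two output classes to "unacceptable"/"acceptable" follows the usual CoLA convention and is an assumption, since the card does not document `id2label`.

```python
# Minimal sketch: CoLA-style acceptability scoring with the fine-tuned checkpoint.
# The label order is assumed (index 0 = unacceptable, index 1 = acceptable), as the
# card itself does not document the label mapping.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_id = "NaliniK/distilbert-base-uncased-finetuned-cola"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)

inputs = tokenizer("The cat sat on the mat.", return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

probs = torch.softmax(logits, dim=-1).squeeze().tolist()
print({"unacceptable": probs[0], "acceptable": probs[1]})  # assumed label order
```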
{"license": "apache-2.0", "tags": ["generated_from_trainer"], "datasets": ["glue"], "metrics": ["matthews_correlation"], "model-index": [{"name": "distilbert-base-uncased-finetuned-cola", "results": [{"task": {"type": "text-classification", "name": "Text Classification"}, "dataset": {"name": "glue", "type": "glue", "args": "cola"}, "metrics": [{"type": "matthews_correlation", "value": 0.5494735380761103, "name": "Matthews Correlation"}]}]}]}
text-classification
NaliniK/distilbert-base-uncased-finetuned-cola
[ "transformers", "pytorch", "tensorboard", "distilbert", "text-classification", "generated_from_trainer", "dataset:glue", "license:apache-2.0", "model-index", "autotrain_compatible", "endpoints_compatible", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[]
TAGS #transformers #pytorch #tensorboard #distilbert #text-classification #generated_from_trainer #dataset-glue #license-apache-2.0 #model-index #autotrain_compatible #endpoints_compatible #region-us
distilbert-base-uncased-finetuned-cola ====================================== This model is a fine-tuned version of distilbert-base-uncased on the glue dataset. It achieves the following results on the evaluation set: * Loss: 0.8239 * Matthews Correlation: 0.5495 Model description ----------------- More information needed Intended uses & limitations --------------------------- More information needed Training and evaluation data ---------------------------- More information needed Training procedure ------------------ ### Training hyperparameters The following hyperparameters were used during training: * learning\_rate: 2e-05 * train\_batch\_size: 16 * eval\_batch\_size: 16 * seed: 42 * optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 * lr\_scheduler\_type: linear * num\_epochs: 5 ### Training results ### Framework versions * Transformers 4.12.5 * Pytorch 1.10.0+cu111 * Datasets 1.16.1 * Tokenizers 0.10.3
[ "### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 5", "### Training results", "### Framework versions\n\n\n* Transformers 4.12.5\n* Pytorch 1.10.0+cu111\n* Datasets 1.16.1\n* Tokenizers 0.10.3" ]
[ "TAGS\n#transformers #pytorch #tensorboard #distilbert #text-classification #generated_from_trainer #dataset-glue #license-apache-2.0 #model-index #autotrain_compatible #endpoints_compatible #region-us \n", "### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 5", "### Training results", "### Framework versions\n\n\n* Transformers 4.12.5\n* Pytorch 1.10.0+cu111\n* Datasets 1.16.1\n* Tokenizers 0.10.3" ]
[ 67, 98, 4, 33 ]
[ "passage: TAGS\n#transformers #pytorch #tensorboard #distilbert #text-classification #generated_from_trainer #dataset-glue #license-apache-2.0 #model-index #autotrain_compatible #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 5### Training results### Framework versions\n\n\n* Transformers 4.12.5\n* Pytorch 1.10.0+cu111\n* Datasets 1.16.1\n* Tokenizers 0.10.3" ]
[ -0.1009262204170227, 0.09821517020463943, -0.002452248940244317, 0.12103411555290222, 0.16565309464931488, 0.03463206812739372, 0.13162177801132202, 0.12641224265098572, -0.084870345890522, 0.02165844850242138, 0.12025084346532822, 0.16090701520442963, 0.023319238796830177, 0.11093048006296158, -0.04821396991610527, -0.2652606964111328, -0.014466065913438797, 0.050281018018722534, -0.05660330131649971, 0.1352313607931137, 0.09067373722791672, -0.12085840106010437, 0.0904083251953125, 0.009713292121887207, -0.19206646084785461, 0.0016439338214695454, -0.00028200578526593745, -0.05116964876651764, 0.1487489640712738, 0.026507411152124405, 0.12207092344760895, 0.004972779657691717, 0.08329641819000244, -0.201603502035141, 0.010789391584694386, 0.046952761709690094, 0.002995898248627782, 0.09303033351898193, 0.04596554487943649, 0.0027834977954626083, 0.11862150579690933, -0.07892757654190063, 0.05430908873677254, 0.024962211027741432, -0.11947300285100937, -0.21228960156440735, -0.07798811793327332, 0.03513416647911072, 0.07692676782608032, 0.10520965605974197, -0.00855263788253069, 0.11890208721160889, -0.08178935199975967, 0.09343662112951279, 0.22585289180278778, -0.2847440838813782, -0.06656396389007568, 0.04357469454407692, 0.013676149770617485, 0.04769432172179222, -0.10208132117986679, -0.03521158546209335, 0.048286229372024536, 0.05155416950583458, 0.12662191689014435, -0.027606520801782608, -0.11644978821277618, 0.005940378177911043, -0.1410057097673416, -0.03161631152033806, 0.16792289912700653, 0.041660599410533905, -0.02741817943751812, -0.055264998227357864, -0.05747583508491516, -0.14893603324890137, -0.03552395850419998, -0.015266702510416508, 0.04888223111629486, -0.023544959723949432, -0.042681433260440826, -0.009025066159665585, -0.10873216390609741, -0.06570208817720413, -0.07231958210468292, 0.11121206730604172, 0.03790200129151344, 0.0065591237507760525, -0.02954426221549511, 0.11270377784967422, -0.0062278639525175095, -0.1221345067024231, 0.025713467970490456, 0.022981993854045868, 0.012306995689868927, -0.03955002874135971, -0.05276529863476753, -0.06348111480474472, 0.010653187520802021, 0.12846051156520844, -0.05477272719144821, 0.04327063634991646, 0.05068491771817207, 0.04879467934370041, -0.09355054050683975, 0.18862763047218323, -0.03341221436858177, -0.02410215325653553, -0.0009851707145571709, 0.050963494926691055, 0.01570958085358143, -0.011945250444114208, -0.12217366695404053, 0.006293424405157566, 0.08844052255153656, 0.008752093650400639, -0.06354348361492157, 0.07395453006029129, -0.059253107756376266, -0.02447742410004139, 0.0006996869342401624, -0.09093241393566132, 0.02193671651184559, 0.0014936517691239715, -0.07127910852432251, -0.02254832349717617, 0.03551747649908066, 0.015455229207873344, -0.019791357219219208, 0.10823856294155121, -0.08746760338544846, 0.027685418725013733, -0.09293241053819656, -0.11049887537956238, 0.01627829298377037, -0.10658751428127289, 0.022449487820267677, -0.09177114814519882, -0.17900583148002625, -0.01735616661608219, 0.06069190055131912, -0.025432970374822617, -0.058068059384822845, -0.05912404507398605, -0.06735910475254059, 0.01163268368691206, -0.00765110133215785, 0.11742366850376129, -0.06415560096502304, 0.09398433566093445, 0.02602791227400303, 0.06323729455471039, -0.03992825374007225, 0.060216259211301804, -0.10190523415803909, 0.013816668651998043, -0.15147393941879272, 0.03966300189495087, -0.05195590481162071, 0.06868549436330795, -0.08297616988420486, -0.1059068962931633, 0.004367377143353224, 
-0.0025675217621028423, 0.06254179775714874, 0.09625569730997086, -0.18459992110729218, -0.08265714347362518, 0.1632440984249115, -0.07354506105184555, -0.1210056021809578, 0.12062746286392212, -0.058034878224134445, 0.06106926500797272, 0.0579228512942791, 0.17942026257514954, 0.08649908006191254, -0.07792358845472336, 0.0021959522273391485, 0.023013029247522354, 0.05133254453539848, -0.062173277139663696, 0.0674373060464859, 0.002462415723130107, 0.021003486588597298, 0.035587865859270096, -0.026888489723205566, 0.06367671489715576, -0.08849866688251495, -0.09899723529815674, -0.03835056722164154, -0.0825461745262146, 0.04694850742816925, 0.07897617667913437, 0.06829575449228287, -0.09382995963096619, -0.0781160220503807, 0.05025189369916916, 0.08149793744087219, -0.058515578508377075, 0.024353638291358948, -0.04952818900346756, 0.07210984826087952, -0.024726063013076782, -0.021842140704393387, -0.18036986887454987, -0.033578891307115555, 0.007920365780591965, 0.0022146906703710556, 0.018517937511205673, 0.03497752919793129, 0.06383121758699417, 0.06114528700709343, -0.05030279979109764, -0.02001042850315571, -0.03534060716629028, -0.0006398644763976336, -0.12574802339076996, -0.19719381630420685, -0.0294183436781168, -0.022904377430677414, 0.15880218148231506, -0.20850005745887756, 0.05171632394194603, -0.0136674540117383, 0.0696297138929367, 0.012344736605882645, -0.007073594257235527, -0.03784091770648956, 0.07549665123224258, -0.04272909834980965, -0.05051558464765549, 0.08196092396974564, 0.013747869990766048, -0.09034740924835205, -0.0498608760535717, -0.0994524136185646, 0.15740856528282166, 0.12793076038360596, -0.110394686460495, -0.07851946353912354, -0.022778555750846863, -0.06751800328493118, -0.035295549780130386, -0.04679638147354126, 0.026207400485873222, 0.18768873810768127, -0.004946404602378607, 0.14969350397586823, -0.06800218671560287, -0.04371483623981476, 0.01860167644917965, -0.03722519055008888, 0.015319224447011948, 0.13431839644908905, 0.1357617974281311, -0.060376059263944626, 0.1550784707069397, 0.14807982742786407, -0.08442892134189606, 0.15120460093021393, -0.0420568473637104, -0.06637914478778839, -0.016720743849873543, -0.029382532462477684, -0.010929594747722149, 0.09980221092700958, -0.15543542802333832, -0.0018454859964549541, 0.03030187077820301, 0.01580311544239521, 0.02537713758647442, -0.22777874767780304, -0.04106512293219566, 0.037913694977760315, -0.04436817020177841, -0.006399653851985931, -0.0068670823238790035, 0.005117787979543209, 0.100909024477005, -0.0005280202603898942, -0.08723463863134384, 0.03618548810482025, 0.0023230852093547583, -0.08415070921182632, 0.2160075157880783, -0.08078210800886154, -0.1716926097869873, -0.13076327741146088, -0.07216935604810715, -0.04595330357551575, -0.0011342167854309082, 0.06896623224020004, -0.09753651916980743, -0.027056343853473663, -0.07150696963071823, 0.027468010783195496, 0.0074178297072649, 0.023562055081129074, 0.0038045954424887896, 0.007655387278646231, 0.06409827619791031, -0.11164529621601105, -0.014485311694443226, -0.05809342488646507, -0.045136887580156326, 0.04560290649533272, 0.0281759612262249, 0.10995645076036453, 0.15278303623199463, -0.012293346226215363, 0.012210988439619541, -0.03137592226266861, 0.23656125366687775, -0.060351066291332245, -0.020402058959007263, 0.1443767100572586, -0.008785259909927845, 0.052237652242183685, 0.11500951647758484, 0.07460913807153702, -0.07836030423641205, 0.004626953508704901, 0.038622647523880005, -0.03463738411664963, -0.23279862105846405, 
-0.053224340081214905, -0.05449909716844559, 0.012308281846344471, 0.09103969484567642, 0.023448951542377472, 0.028692500665783882, 0.07016528397798538, 0.041464801877737045, 0.0740201398730278, -0.036079373210668564, 0.051197972148656845, 0.12850120663642883, 0.030704893171787262, 0.12468411028385162, -0.046320971101522446, -0.06452030688524246, 0.04067147150635719, -0.00936245359480381, 0.22532227635383606, 0.010425439104437828, 0.13101448118686676, 0.06575895100831985, 0.16217443346977234, -0.00969642773270607, 0.07628103345632553, -0.009310625493526459, -0.037796199321746826, -0.016485238447785378, -0.03933506831526756, -0.03867710754275322, 0.02539115771651268, -0.06331483274698257, 0.06259248405694962, -0.12422854453325272, 0.015808269381523132, 0.05818132683634758, 0.24913398921489716, 0.035201121121644974, -0.3179529309272766, -0.09780462086200714, 0.001421924913302064, -0.031105637550354004, -0.018362216651439667, 0.026666050776839256, 0.09514304995536804, -0.09752967208623886, 0.028844371438026428, -0.07525903731584549, 0.09557416290044785, -0.05543964356184006, 0.05150289088487625, 0.08248480409383774, 0.088840551674366, 0.01173062901943922, 0.09300586581230164, -0.2877667248249054, 0.2772248089313507, 0.000048030022298917174, 0.055661480873823166, -0.07612543553113937, 0.008348985575139523, 0.0410311184823513, 0.0623110830783844, 0.07956962287425995, -0.012533637695014477, -0.02089948207139969, -0.18772314488887787, -0.06719288975000381, 0.026715757325291634, 0.07020293176174164, -0.042571257799863815, 0.08365686982870102, -0.03167461231350899, 0.009215028025209904, 0.073407843708992, 0.0012398024555295706, -0.053288232535123825, -0.10788240283727646, -0.003987864125519991, 0.02012508735060692, -0.06141512840986252, -0.06115366891026497, -0.12177953869104385, -0.12793827056884766, 0.15645374357700348, -0.035146139562129974, -0.03870423138141632, -0.10660391300916672, 0.08407267183065414, 0.06002400070428848, -0.08979647606611252, 0.04292048513889313, 0.0030619241297245026, 0.07556089758872986, 0.0213178601115942, -0.07040246576070786, 0.1037454903125763, -0.0742620974779129, -0.15673992037773132, -0.06504939496517181, 0.10711076855659485, 0.034122783690690994, 0.06692608445882797, -0.014884852804243565, 0.004078972619026899, -0.04587982967495918, -0.0879385694861412, 0.02262205071747303, 0.0031531741842627525, 0.07648036628961563, 0.018924014642834663, -0.07613077759742737, 0.01236394140869379, -0.06465084105730057, -0.03458128124475479, 0.20516571402549744, 0.22521072626113892, -0.09937387704849243, 0.02386457473039627, 0.02543218433856964, -0.07348236441612244, -0.1981433928012848, 0.034139104187488556, 0.05614576116204262, 0.008183568716049194, 0.04424389824271202, -0.186390683054924, 0.13023509085178375, 0.10588247328996658, -0.011545839719474316, 0.10516155511140823, -0.32340407371520996, -0.12076781690120697, 0.1363218128681183, 0.13628672063350677, 0.09845349192619324, -0.13189087808132172, -0.02303147315979004, -0.017307352274656296, -0.13878008723258972, 0.11419832706451416, -0.09083757549524307, 0.1215454638004303, -0.03721027821302414, 0.07714766263961792, 0.0027360611129552126, -0.058123379945755005, 0.11953246593475342, 0.024509411305189133, 0.09427390247583389, -0.05853839963674545, -0.034566327929496765, 0.030405152589082718, -0.042428139597177505, 0.0335245281457901, -0.09818235784769058, 0.02889910899102688, -0.10219687968492508, -0.025190062820911407, -0.06875866651535034, 0.04450458288192749, -0.045834269374608994, -0.06897126138210297, -0.0363733246922493, 
0.026142295449972153, 0.04673600569367409, -0.007892302237451077, 0.12138921767473221, 0.02424238994717598, 0.14825253188610077, 0.09881183505058289, 0.07466599345207214, -0.0657680332660675, -0.08111712336540222, -0.027491822838783264, -0.01090377289801836, 0.050142787396907806, -0.13754954934120178, 0.019683336839079857, 0.15205997228622437, 0.019901322200894356, 0.15364794433116913, 0.08344452828168869, -0.022444015368819237, -0.0008529065526090562, 0.05929025635123253, -0.16524150967597961, -0.09421242028474808, -0.017291469499468803, -0.06770014017820358, -0.12142278999090195, 0.04468310996890068, 0.09404630213975906, -0.06736426055431366, -0.007040363270789385, -0.004721055272966623, 0.014436892233788967, -0.04941984638571739, 0.1857781559228897, 0.06194201111793518, 0.04897340014576912, -0.09603464603424072, 0.07241884618997574, 0.044728003442287445, -0.0724329873919487, 0.003446651389822364, 0.07367967069149017, -0.0847843736410141, -0.05503243952989578, 0.06494900584220886, 0.1905602663755417, -0.043478358536958694, -0.046595942229032516, -0.14575162529945374, -0.12333639711141586, 0.07773446291685104, 0.14070625603199005, 0.11781797558069229, 0.010350635275244713, -0.0662635788321495, 0.004165166523307562, -0.10713481158018112, 0.102436363697052, 0.0481707900762558, 0.06243141368031502, -0.14264069497585297, 0.14274820685386658, 0.01959664188325405, 0.04888249561190605, -0.018233148381114006, 0.023401951417326927, -0.1011165976524353, 0.007509634830057621, -0.0937676653265953, -0.020163772627711296, -0.029028424993157387, 0.01114876288920641, -0.005648795980960131, -0.04692848026752472, -0.05379098653793335, 0.010465285740792751, -0.10777492076158524, -0.02348729409277439, 0.029239974915981293, 0.07235087454319, -0.10887817293405533, -0.035732731223106384, 0.03084230236709118, -0.061991531401872635, 0.07474265992641449, 0.043316345661878586, 0.016156742349267006, 0.05119207501411438, -0.13714458048343658, 0.020409541204571724, 0.07358044385910034, 0.029315326362848282, 0.06089586764574051, -0.09892328828573227, -0.009040141478180885, -0.010013709776103497, 0.0403091125190258, 0.02125541865825653, 0.0754963755607605, -0.14061115682125092, 0.004063779022544622, -0.02399268187582493, -0.08452078700065613, -0.06728451699018478, 0.027385814115405083, 0.08780176192522049, 0.01818467676639557, 0.19885900616645813, -0.07666698843240738, 0.0511007159948349, -0.21860532462596893, 0.007096861023455858, -0.006582548841834068, -0.10937415808439255, -0.09972826391458511, -0.07289181649684906, 0.05480212718248367, -0.061103980988264084, 0.14802128076553345, 0.04621551185846329, 0.019307788461446762, 0.024503663182258606, -0.01150154136121273, 0.011929245665669441, 0.011170055717229843, 0.18743084371089935, 0.030771946534514427, -0.034698452800512314, 0.057695481926202774, 0.04500477388501167, 0.1030961275100708, 0.11355950683355331, 0.20160652697086334, 0.14378784596920013, -0.009556437842547894, 0.09273611009120941, 0.04347359761595726, -0.05783611163496971, -0.157438725233078, 0.05188474431633949, -0.03461603820323944, 0.10894935578107834, -0.02069985680282116, 0.2198943942785263, 0.06416405737400055, -0.16923348605632782, 0.0511995404958725, -0.05199553072452545, -0.08754231035709381, -0.11512524634599686, -0.04823906347155571, -0.07686313986778259, -0.13092930614948273, -0.0037507908418774605, -0.11577211320400238, -0.00255835079587996, 0.1250842958688736, 0.0037776476237922907, -0.02623624913394451, 0.15922318398952484, 0.014503982849419117, 0.022761913016438484, 0.059835005551576614, 
0.00865632202476263, -0.03846421465277672, -0.13900741934776306, -0.058679308742284775, -0.013003084808588028, -0.00815436989068985, 0.029894033446907997, -0.06215373054146767, -0.0453263483941555, 0.03077687695622444, -0.020426711067557335, -0.09654181450605392, 0.00580337131395936, 0.011550648137927055, 0.05264320597052574, 0.044483039528131485, 0.00890653021633625, 0.01841517724096775, -0.0033973788376897573, 0.20046915113925934, -0.07252085953950882, -0.06526394188404083, -0.10300551354885101, 0.2344929426908493, 0.03516170009970665, -0.017923330888152122, 0.03434590622782707, -0.06697957217693329, 0.0040816920809447765, 0.248405322432518, 0.21634291112422943, -0.0802062526345253, -0.005508990492671728, 0.017924746498465538, -0.0075508118607103825, -0.021194225177168846, 0.0980217382311821, 0.14326970279216766, 0.04758838564157486, -0.09315204620361328, -0.04464408755302429, -0.05805734172463417, -0.018435288220643997, -0.03343034163117409, 0.07054335623979568, 0.052037313580513, 0.010159195400774479, -0.03613045811653137, 0.056622300297021866, -0.06662857532501221, -0.08938708901405334, 0.0569210983812809, -0.21855111420154572, -0.16739429533481598, -0.016380563378334045, 0.10192067921161652, 0.0016453611897304654, 0.062449563294649124, -0.029784545302391052, -0.004447286017239094, 0.09128595143556595, -0.01890946924686432, -0.09794983267784119, -0.07239370793104172, 0.08538196235895157, -0.11384674906730652, 0.21660040318965912, -0.048202332109212875, 0.05371418222784996, 0.12561482191085815, 0.06765037775039673, -0.06423033773899078, 0.065286785364151, 0.04268036037683487, -0.04173887521028519, 0.023057803511619568, 0.0701998621225357, -0.03378850966691971, 0.06479571759700775, 0.04789621755480766, -0.14026974141597748, 0.023184796795248985, -0.04627499356865883, -0.06967426091432571, -0.042806778103113174, -0.021376337856054306, -0.05995117127895355, 0.12881097197532654, 0.21904878318309784, -0.02507568895816803, -0.009772291406989098, -0.07112479954957962, 0.009488295763731003, 0.05543633550405502, 0.02364332787692547, -0.05695940926671028, -0.21180643141269684, 0.01600048504769802, 0.0450892336666584, -0.01722017116844654, -0.25238776206970215, -0.10116346180438995, 0.0042230007238686085, -0.07256665825843811, -0.09631677716970444, 0.07274890691041946, 0.08881571888923645, 0.0542900450527668, -0.05649314820766449, -0.04837353900074959, -0.07490847259759903, 0.14946399629116058, -0.1459803432226181, -0.09109386801719666 ]
null
null
null
# Configuration

`title`: _string_
Display title for the Space

`emoji`: _string_
Space emoji (emoji-only character allowed)

`colorFrom`: _string_
Color for Thumbnail gradient (red, yellow, green, blue, indigo, purple, pink, gray)

`colorTo`: _string_
Color for Thumbnail gradient (red, yellow, green, blue, indigo, purple, pink, gray)

`sdk`: _string_
Can be either `gradio`, `streamlit`, or `static`

`sdk_version` : _string_
Only applicable for `streamlit` SDK.
See [doc](https://hf.co/docs/hub/spaces) for more info on supported versions.

`app_file`: _string_
Path to your main application file (which contains either `gradio` or `streamlit` Python code, or `static` html code).
Path is relative to the root of the repository.

`pinned`: _boolean_
Whether the Space stays on top of your list.
{"title": "Pdf Table Extractor To CSV", "emoji": ";)", "colorFrom": "yellow", "colorTo": "green", "sdk": "streamlit", "app_file": "App_For_PDF_To_Dataframe.py", "pinned": false}
null
Nalla/PDF_To_CSV
[ "region:us" ]
2022-03-02T23:29:04+00:00
[]
[]
TAGS #region-us
# Configuration 'title': _string_ Display title for the Space 'emoji': _string_ Space emoji (emoji-only character allowed) 'colorFrom': _string_ Color for Thumbnail gradient (red, yellow, green, blue, indigo, purple, pink, gray) 'colorTo': _string_ Color for Thumbnail gradient (red, yellow, green, blue, indigo, purple, pink, gray) 'sdk': _string_ Can be either 'gradio', 'streamlit', or 'static' 'sdk_version' : _string_ Only applicable for 'streamlit' SDK. See doc for more info on supported versions. 'app_file': _string_ Path to your main application file (which contains either 'gradio' or 'streamlit' Python code, or 'static' html code). Path is relative to the root of the repository. 'pinned': _boolean_ Whether the Space stays on top of your list.
[ "# Configuration\n'title': _string_\nDisplay title for the Space\n'emoji': _string_\nSpace emoji (emoji-only character allowed)\n'colorFrom': _string_\nColor for Thumbnail gradient (red, yellow, green, blue, indigo, purple, pink, gray)\n'colorTo': _string_\nColor for Thumbnail gradient (red, yellow, green, blue, indigo, purple, pink, gray)\n'sdk': _string_\nCan be either 'gradio', 'streamlit', or 'static'\n'sdk_version' : _string_\nOnly applicable for 'streamlit' SDK. \nSee doc for more info on supported versions.\n\n'app_file': _string_\nPath to your main application file (which contains either 'gradio' or 'streamlit' Python code, or 'static' html code).\nPath is relative to the root of the repository.\n\n'pinned': _boolean_\nWhether the Space stays on top of your list." ]
[ "TAGS\n#region-us \n", "# Configuration\n'title': _string_\nDisplay title for the Space\n'emoji': _string_\nSpace emoji (emoji-only character allowed)\n'colorFrom': _string_\nColor for Thumbnail gradient (red, yellow, green, blue, indigo, purple, pink, gray)\n'colorTo': _string_\nColor for Thumbnail gradient (red, yellow, green, blue, indigo, purple, pink, gray)\n'sdk': _string_\nCan be either 'gradio', 'streamlit', or 'static'\n'sdk_version' : _string_\nOnly applicable for 'streamlit' SDK. \nSee doc for more info on supported versions.\n\n'app_file': _string_\nPath to your main application file (which contains either 'gradio' or 'streamlit' Python code, or 'static' html code).\nPath is relative to the root of the repository.\n\n'pinned': _boolean_\nWhether the Space stays on top of your list." ]
[ 6, 236 ]
[ "passage: TAGS\n#region-us \n# Configuration\n'title': _string_\nDisplay title for the Space\n'emoji': _string_\nSpace emoji (emoji-only character allowed)\n'colorFrom': _string_\nColor for Thumbnail gradient (red, yellow, green, blue, indigo, purple, pink, gray)\n'colorTo': _string_\nColor for Thumbnail gradient (red, yellow, green, blue, indigo, purple, pink, gray)\n'sdk': _string_\nCan be either 'gradio', 'streamlit', or 'static'\n'sdk_version' : _string_\nOnly applicable for 'streamlit' SDK. \nSee doc for more info on supported versions.\n\n'app_file': _string_\nPath to your main application file (which contains either 'gradio' or 'streamlit' Python code, or 'static' html code).\nPath is relative to the root of the repository.\n\n'pinned': _boolean_\nWhether the Space stays on top of your list." ]
[ -0.0016679934924468398, 0.07961869984865189, -0.006852890830487013, -0.05592244118452072, 0.08224976062774658, -0.014184586703777313, 0.04152843356132507, 0.08568227291107178, 0.057040102779865265, 0.14579613506793976, 0.03415635973215103, 0.12807074189186096, 0.014497390948235989, 0.07181920111179352, -0.0008711054106242955, -0.2247031182050705, 0.05564660206437111, -0.05508072301745415, 0.09316349774599075, 0.06120958924293518, 0.08574725687503815, -0.025544194504618645, 0.07818562537431717, 0.006766223348677158, -0.1949462890625, 0.028957491740584373, -0.004847502335906029, -0.04076475277543068, 0.009146176278591156, -0.04596146196126938, 0.09512817859649658, -0.02390463650226593, -0.050891246646642685, -0.16081830859184265, 0.030390776693820953, 0.11844640970230103, 0.02946789562702179, 0.008251427672803402, 0.16262765228748322, -0.12688709795475006, 0.19197291135787964, -0.10718955099582672, 0.06843516230583191, 0.01427219808101654, -0.012607994489371777, -0.16702581942081451, -0.03151891380548477, -0.028303533792495728, 0.1456698626279831, 0.008350593037903309, 0.004071514122188091, 0.047789741307497025, -0.08879662305116653, 0.08051235973834991, 0.09665551036596298, -0.033612485975027084, -0.012519240379333496, 0.18985144793987274, 0.0834021121263504, 0.05407338589429855, -0.0997796505689621, 0.01621679775416851, -0.02825704962015152, 0.0202131737023592, -0.041812703013420105, -0.06878664344549179, -0.11232051253318787, 0.0073316143825650215, -0.07358966767787933, 0.0015628706896677613, 0.24924913048744202, 0.006568018347024918, -0.0752674862742424, -0.08693917840719223, -0.07520320266485214, -0.010112213902175426, 0.016257382929325104, 0.04433124139904976, 0.035472214221954346, 0.10150115191936493, 0.10110042244195938, -0.026742590591311455, -0.11428666114807129, -0.03583962842822075, -0.06761927157640457, 0.16576971113681793, -0.01698985882103443, 0.022403476759791374, -0.08598523586988449, 0.09791985899209976, -0.08408855646848679, -0.11500543355941772, 0.008943728171288967, -0.11227600276470184, 0.009290638379752636, 0.07034706324338913, -0.09873855859041214, -0.2444947510957718, 0.09625562280416489, 0.1748645305633545, 0.0598895289003849, 0.1122836172580719, -0.08400436490774155, 0.0590304397046566, 0.09386026114225388, 0.18071289360523224, -0.04656464233994484, 0.020864203572273254, 0.032864365726709366, -0.15453779697418213, 0.09425656497478485, -0.07760502398014069, -0.08558367937803268, 0.020493170246481895, -0.026956818997859955, 0.004241693299263716, 0.10339165478944778, 0.010300773195922375, -0.08777185529470444, -0.07267630100250244, 0.12202277034521103, -0.07550133019685745, 0.046976979821920395, 0.029870137572288513, -0.007798399310559034, 0.11224335432052612, -0.030131515115499496, 0.02246764302253723, 0.019123194739222527, 0.17038702964782715, -0.04378344118595123, -0.03596075624227524, -0.15752474963665009, -0.08719411492347717, 0.03099088929593563, -0.13087120652198792, 0.04686625301837921, -0.08080297708511353, -0.06566599756479263, -0.055793508887290955, 0.019388528540730476, 0.01710202917456627, 0.05093131586909294, 0.04424247518181801, -0.10562992841005325, 0.10327238589525223, 0.047927696257829666, -0.0853591114282608, -0.03115352801978588, 0.02938094176352024, -0.037577785551548004, 0.042727578431367874, -0.08876939862966537, 0.0049893539398908615, -0.09140482544898987, 0.050616901367902756, -0.342907190322876, 0.0519871823489666, -0.07734560966491699, 0.11273375153541565, -0.014712938107550144, -0.013752742670476437, 0.01222923118621111, 
-0.026286844164133072, -0.03578796982765198, 0.052941516041755676, -0.2101837545633316, -0.004278603009879589, 0.18610452115535736, -0.058381423354148865, -0.008978673256933689, 0.06869252026081085, 0.027811236679553986, -0.19595760107040405, 0.02147543989121914, 0.3707205653190613, 0.1285770833492279, -0.17010092735290527, -0.04049444571137428, 0.005439918953925371, -0.09599529951810837, 0.04776715859770775, 0.07918912172317505, -0.06473775207996368, 0.009394914843142033, 0.07431932538747787, -0.13212943077087402, 0.01722969114780426, 0.07016164809465408, 0.08396941423416138, -0.08188670873641968, 0.030049555003643036, 0.05296258628368378, 0.000629921443760395, -0.10903983563184738, -0.11236317455768585, -0.044130656868219376, 0.023962832987308502, 0.15470147132873535, 0.010851777158677578, 0.006224105600267649, -0.05668012052774429, 0.13945147395133972, 0.015431354753673077, -0.0277208611369133, -0.11320909857749939, -0.08815950900316238, 0.05008167028427124, 0.13177873194217682, -0.026194067671895027, -0.02990906871855259, -0.01845169998705387, 0.006123153027147055, 0.04519420117139816, -0.09490018337965012, -0.02556394413113594, -0.04268809035420418, 0.0757739245891571, -0.04546916484832764, 0.05509335175156593, -0.04471974819898605, -0.09834900498390198, -0.045419298112392426, 0.007582550868391991, 0.1677650362253189, 0.14240029454231262, 0.08917195349931717, -0.08943845331668854, 0.06856756657361984, -0.08952732384204865, -0.06659484654664993, -0.04609978199005127, -0.06570587307214737, 0.019051292911171913, 0.06376440823078156, 0.1219554916024208, -0.12166637182235718, 0.013698591850697994, 0.11932899802923203, -0.03143488988280296, 0.0562695749104023, 0.047635551542043686, -0.012373185716569424, 0.06301360577344894, -0.03970035910606384, -0.04545992240309715, 0.0390261672437191, 0.06864124536514282, -0.015572902746498585, -0.048202358186244965, 0.004013849887996912, -0.0007936290348879993, -0.10953067243099213, -0.020916223526000977, 0.010683157481253147, 0.0996255874633789, 0.038553208112716675, 0.05617387965321541, 0.0182617399841547, 0.05606566369533539, 0.19199329614639282, -0.02676405757665634, -0.05565596744418144, -0.02300594560801983, -0.00810147449374199, -0.047743406146764755, 0.04746156930923462, 0.03321653977036476, -0.0055479127913713455, 0.051181115210056305, 0.029079847037792206, -0.01900813914835453, -0.06406773626804352, -0.05241391807794571, -0.01666184514760971, 0.008693304844200611, 0.08169372379779816, 0.12122996896505356, 0.03800111263990402, 0.004503817763179541, -0.06090591102838516, 0.04792175814509392, -0.06583192944526672, -0.06175568699836731, -0.02340584248304367, 0.09432334452867508, -0.22441089153289795, -0.21170109510421753, -0.09151540696620941, -0.1509796679019928, -0.061439331620931625, 0.0903903990983963, 0.0796906054019928, -0.11056119203567505, -0.05279272794723511, -0.018825337290763855, 0.018303893506526947, -0.14194688200950623, -0.06909061968326569, -0.2053692787885666, 0.009123880416154861, -0.03519390895962715, -0.07924462109804153, -0.03302517160773277, 0.07907189428806305, 0.03325457125902176, 0.1384233981370926, 0.08013919740915298, 0.12125996500253677, 0.12180488556623459, 0.0099647780880332, -0.025816243141889572, 0.023193448781967163, 0.11213234812021255, -0.09237844496965408, 0.11276299506425858, 0.14631696045398712, 0.04362006485462189, 0.13928906619548798, 0.15749233961105347, 0.0055319746024906635, -0.10315023362636566, 0.08789330720901489, 0.047033097594976425, 0.014508186839520931, -0.20832432806491852, -0.11072947829961777, 
-0.07198183983564377, -0.0015695878537371755, -0.023135660216212273, 0.07159914821386337, 0.0017889834707602859, 0.06383324414491653, -0.014337440952658653, -0.027651634067296982, -0.131711944937706, 0.13742488622665405, 0.11295802146196365, -0.03515460342168808, 0.052875589579343796, -0.03728300705552101, 0.010768812149763107, 0.14027763903141022, -0.018796615302562714, 0.05846685916185379, -0.009307438507676125, 0.020558977499604225, 0.07990027219057083, 0.12197812646627426, 0.050562139600515366, -0.058409448713064194, -0.015885043889284134, 0.011330018751323223, -0.03480825945734978, -0.027658365666866302, -0.055147118866443634, 0.025842400267720222, 0.05163281783461571, -0.08277834206819534, 0.04691877216100693, -0.08282385766506195, 0.040562525391578674, 0.10738080739974976, -0.003821031888946891, -0.1162714883685112, 0.08838793635368347, 0.09295162558555603, 0.06786319613456726, -0.19452881813049316, 0.021345624700188637, 0.1467786580324173, -0.06640968471765518, 0.0581255778670311, 0.04304968938231468, 0.08914528042078018, -0.02477158233523369, -0.01015679631382227, -0.022868147119879723, 0.07102172076702118, 0.005599753465503454, 0.10142718255519867, -0.10348763316869736, -0.06790754944086075, -0.009983590804040432, -0.02506534941494465, 0.05124206840991974, -0.007357414346188307, 0.04565061256289482, 0.183390274643898, -0.016302280128002167, 0.0687904804944992, -0.20839986205101013, -0.07122233510017395, -0.07936394214630127, -0.008523941040039062, 0.16091138124465942, -0.11038480699062347, 0.012581518851220608, 0.0007384602795355022, -0.01493054162710905, -0.04788526892662048, -0.0987960621714592, -0.0812172219157219, -0.07951817661523819, 0.028914298862218857, -0.0009848094778135419, 0.05713600292801857, -0.07859337329864502, 0.049260497093200684, 0.08046047389507294, 0.08454842120409012, 0.0037758518010377884, -0.05158013477921486, -0.11398220807313919, -0.18214596807956696, 0.05547323450446129, -0.03984092175960541, 0.054415419697761536, -0.034627217799425125, 0.15792420506477356, 0.04333425685763359, -0.040963608771562576, 0.07841120660305023, -0.07348878681659698, 0.011220584623515606, -0.14211732149124146, 0.06367121636867523, -0.018756967037916183, -0.03696145862340927, -0.004672241397202015, 0.10013642907142639, -0.08946847170591354, -0.17697523534297943, 0.08432537317276001, 0.17489956319332123, 0.07542400807142258, 0.020440883934497833, -0.026930809020996094, 0.03742779418826103, 0.02425881288945675, -0.009392884559929371, 0.03507726266980171, 0.14037813246250153, -0.10632194578647614, 0.08208420127630234, 0.03014744445681572, -0.047160886228084564, -0.11692943423986435, 0.04826639965176582, 0.008023661561310291, 0.012125400826334953, 0.028426507487893105, -0.22722478210926056, 0.13740241527557373, -0.010228680446743965, 0.005418086890131235, 0.1830706000328064, -0.16029731929302216, -0.03791360184550285, 0.04555530101060867, 0.026517782360315323, -0.0767756775021553, -0.1364380121231079, -0.12087340652942657, -0.03312976658344269, -0.12266402691602707, 0.11031332612037659, -0.05073055252432823, 0.03704184666275978, -0.03607979044318199, 0.10562652349472046, 0.05106467753648758, -0.047691114246845245, 0.11906331777572632, -0.10654604434967041, 0.09277114272117615, -0.09996133297681808, 0.05098763108253479, 0.11066503077745438, -0.06216718629002571, 0.12192294001579285, -0.07519904524087906, 0.08304496109485626, -0.25422170758247375, 0.018176011741161346, -0.025060530751943588, 0.07241550087928772, 0.004813428968191147, -0.0800413265824318, -0.13341647386550903, 
-0.05839327722787857, -0.0025728780310600996, -0.0061315326020121574, -0.10721564292907715, -0.009787755087018013, -0.08414874970912933, -0.04301075264811516, -0.09695279598236084, 0.04819134995341301, -0.22999340295791626, -0.01984279975295067, -0.014473608694970608, 0.039811767637729645, -0.17181333899497986, -0.042568139731884, 0.03467483073472977, -0.012823354452848434, 0.07009904831647873, -0.02277529425919056, -0.06832171231508255, 0.035377077758312225, 0.08626585453748703, -0.13729454576969147, 0.059833455830812454, -0.03422945365309715, 0.19272106885910034, 0.018490402027964592, -0.09421392530202866, 0.03924950584769249, 0.07226040214300156, -0.04531124234199524, -0.0042091598734259605, 0.04600797966122627, 0.06177495792508125, -0.03279217332601547, 0.07593417167663574, -0.004554476588964462, -0.05429668724536896, 0.03687356784939766, 0.06440235674381256, -0.07483027875423431, 0.0248709823936224, 0.06620178371667862, -0.07442520558834076, -0.03389807790517807, 0.10651365667581558, 0.072269506752491, 0.08404285460710526, -0.0011330534471198916, 0.06095600500702858, -0.039803534746170044, -0.009206678718328476, -0.008032657206058502, 0.06777890771627426, 0.02548944391310215, -0.053981631994247437, -0.029832594096660614, -0.000572928402107209, 0.07788775116205215, -0.035829346626996994, 0.11161095649003983, -0.12251179665327072, -0.11836011707782745, 0.043694593012332916, -0.01309719868004322, -0.005205587018281221, -0.0596785806119442, -0.04334467649459839, -0.055275578051805496, -0.053022000938653946, 0.08659054338932037, 0.13930685818195343, 0.014214800670742989, -0.0013365037739276886, -0.01583939604461193, -0.02659052610397339, -0.019574157893657684, -0.06694404035806656, -0.07650186866521835, -0.031984247267246246, 0.07152252644300461, -0.10878902673721313, -0.06884346157312393, 0.16921208798885345, -0.055656805634498596, -0.011627559550106525, 0.0487440787255764, 0.0006298237130977213, -0.00434878608211875, -0.1365732103586197, -0.0824960321187973, 0.13261374831199646, 0.042180486023426056, -0.013930736109614372, -0.08543398231267929, 0.01805143617093563, -0.005815511103719473, 0.006975076161324978, -0.06560501456260681, 0.04295646399259567, -0.15925638377666473, -0.0038132495246827602, -0.007148658856749535, -0.2049493044614792, -0.07542858272790909, -0.041307516396045685, 0.034904636442661285, 0.10358201712369919, 0.12442594766616821, 0.022113164886832237, 0.026822833344340324, -0.01698550581932068, -0.02370511367917061, 0.013103839941322803, -0.014467629604041576, 0.05382055416703224, 0.03555084392428398, 0.009567608125507832, -0.038966234773397446, 0.11288578063249588, 0.04705914109945297, -0.11832479387521744, -0.02635950781404972, 0.13804611563682556, 0.007249304559081793, 0.015178547240793705, 0.13112062215805054, -0.004941543098539114, 0.029669109731912613, 0.07029477506875992, 0.004581797402352095, 0.06959405541419983, -0.05358295515179634, -0.00952049158513546, 0.0890050083398819, 0.0722501203417778, -0.06618741899728775, -0.06785041838884354, 0.03456345945596695, -0.2658979892730713, -0.08083812892436981, -0.009497691877186298, 0.08840549737215042, -0.002311467658728361, 0.2996513843536377, 0.13925573229789734, -0.09905702620744705, 0.06845778226852417, 0.06005021557211876, -0.03429374098777771, -0.08533226698637009, -0.1266372799873352, -0.026856251060962677, -0.15120762586593628, 0.011682004667818546, -0.10785356163978577, 0.07492760568857193, 0.021876728162169456, -0.014895454049110413, -0.020747903734445572, 0.1028154194355011, 0.018816716969013214, 
-0.1370389312505722, 0.006994020659476519, -0.024364028126001358, -0.08705741912126541, 0.09430201351642609, 0.054802387952804565, 0.01680399663746357, 0.020304381847381592, 0.08138365298509598, 0.03586159273982048, -0.010125924833118916, 0.005172157194465399, -0.12893353402614594, -0.020439494401216507, 0.04497440159320831, 0.007423239294439554, -0.10155274718999863, 0.036495089530944824, 0.06245061382651329, -0.017526991665363312, 0.004443701822310686, 0.3292556405067444, -0.015783021226525307, 0.020714491605758667, 0.012954487465322018, -0.13917276263237, -0.014494122937321663, 0.04558247700333595, -0.051947157829999924, -0.1663428694009781, -0.09708984941244125, 0.14152705669403076, 0.02412481978535652, 0.060557492077350616, -0.0026155172381550074, 0.01930144615471363, 0.012082808651030064, 0.008701903745532036, 0.09858388453722, 0.05417195335030556, 0.1687723845243454, 0.013584239408373833, 0.03628027066588402, -0.04813036322593689, -0.04724780470132828, -0.14931148290634155, -0.11631274968385696, -0.061656370759010315, -0.12735585868358612, -0.04105759784579277, 0.10341400653123856, -0.009548431262373924, 0.06381510198116302, 0.0077397082932293415, 0.04128197953104973, -0.019982794299721718, 0.0434904471039772, 0.1435697078704834, -0.02431703545153141, 0.06121787428855896, -0.010163367725908756, -0.05812311917543411, 0.12289339303970337, -0.02271151728928089, -0.11975527554750443, -0.032163988798856735, 0.000531640078406781, -0.14196565747261047, 0.15529967844486237, -0.007696673274040222, -0.008007441647350788, 0.05657072737812996, 0.0358254499733448, -0.06958840042352676, 0.08131678402423859, 0.001971188001334667, -0.08998484909534454, -0.003543559927493334, 0.1740906983613968, -0.003120606765151024, -0.012556307017803192, 0.10003197193145752, -0.021542154252529144, 0.011203630827367306, 0.004993262235075235, 0.13523298501968384, -0.14531806111335754, 0.06626521050930023, -0.19900788366794586, 0.014696606434881687, 0.021013515070080757, 0.014230088330805302, -0.005247437860816717, -0.06986316293478012, 0.022471042349934578, -0.00941240880638361, -0.05467680096626282, 0.0040011112578213215, 0.020116180181503296, -0.010607492178678513, 0.30744990706443787, 0.047996584326028824, -0.07778514176607132, -0.047017913311719894, -0.029662901535630226, 0.04738025739789009, -0.0677717849612236, 0.07456685602664948, -0.02862836793065071, -0.0015336949145421386, -0.07029690593481064, -0.14331206679344177, 0.0017641246085986495, 0.08542177826166153, -0.12073539942502975, -0.05417962372303009 ]
null
null
transformers
# Aqua from Konosuba DialoGPT Model
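The card text above is only a title. As a hedged illustration (not from the original card), a DialoGPT-style conversational checkpoint with these tags can typically be queried through the standard `transformers` causal-LM API, assuming it follows the usual DialoGPT convention of separating turns with the EOS token:

```python
# Sketch: single-turn chat with a DialoGPT-style model.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "NamPE/DialoGPT-medium-Aqua-konosuba"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)

# Encode one user turn, terminated by the EOS token as DialoGPT expects.
user_input = "Hello, who are you?"
input_ids = tokenizer.encode(user_input + tokenizer.eos_token, return_tensors="pt")

# Generate a reply and decode only the newly generated tokens.
output_ids = model.generate(
    input_ids,
    max_length=200,
    pad_token_id=tokenizer.eos_token_id,
)
reply = tokenizer.decode(output_ids[:, input_ids.shape[-1]:][0], skip_special_tokens=True)
print(reply)
```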
{"tags": ["conversational"]}
text-generation
NamPE/DialoGPT-medium-Aqua-konosuba
[ "transformers", "pytorch", "gpt2", "text-generation", "conversational", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[]
TAGS #transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
# Aqua from Konosuba DialoGPT Model
[ "# Aqua from Konosuba DialoGPT Model" ]
[ "TAGS\n#transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "# Aqua from Konosuba DialoGPT Model" ]
[ 51, 11 ]
[ "passage: TAGS\n#transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n# Aqua from Konosuba DialoGPT Model" ]
[ 0.01050625741481781, 0.0680038183927536, -0.005532193463295698, -0.03803853690624237, 0.13971324265003204, -0.014378524385392666, 0.1655605435371399, 0.09412552416324615, 0.04706801101565361, -0.00017497778753750026, 0.02582668699324131, 0.07597702741622925, 0.04033016040921211, 0.16059361398220062, -0.05319662392139435, -0.37525975704193115, 0.08737289160490036, 0.05314725637435913, 0.06381828337907791, 0.10384418815374374, 0.07340990006923676, -0.0602712444961071, 0.08990972489118576, -0.004147929139435291, -0.029661716893315315, -0.008999845013022423, -0.04246683046221733, -0.1771099716424942, 0.09657320380210876, 0.04768796265125275, 0.048574745655059814, 0.02573562040925026, -0.06505481898784637, -0.14210690557956696, 0.04061620682477951, -0.03456174582242966, -0.016493573784828186, 0.011746353469789028, 0.01653420180082321, -0.04418129473924637, 0.12923285365104675, 0.024494675919413567, 0.0288316048681736, -0.011136215180158615, -0.1611437201499939, 0.05944881960749626, -0.013093441724777222, -0.0055307247675955296, 0.15132193267345428, 0.10259252041578293, -0.023182110860943794, 0.13693541288375854, -0.08255966007709503, 0.07275673747062683, 0.03772406280040741, -0.3336787819862366, -0.034481231123209, 0.07697676122188568, 0.04140222817659378, 0.04992622137069702, -0.02794547937810421, 0.10921534895896912, -0.038261063396930695, 0.02161048725247383, -0.13441836833953857, -0.08475222438573837, -0.17527300119400024, -0.017318954691290855, -0.065327949821949, 0.015427276492118835, 0.2529680132865906, -0.017734302207827568, 0.008973428048193455, -0.0415792278945446, -0.06372536718845367, -0.006893041543662548, -0.069735586643219, 0.005820632912218571, -0.05675864964723587, 0.05842205509543419, -0.0055122654885053635, -0.07224557548761368, -0.13187846541404724, -0.002908871043473482, -0.22870834171772003, 0.26187482476234436, 0.02331571839749813, 0.04287871718406677, -0.24565564095973969, 0.1037600189447403, -0.11201062798500061, -0.10734979808330536, -0.006601274944841862, -0.07363195717334747, 0.03394787386059761, 0.0008012311300262809, -0.018744003027677536, -0.1622871309518814, 0.07452721893787384, 0.13134582340717316, 0.07083339989185333, 0.023918569087982178, 0.011996817775070667, 0.03843075782060623, 0.05089084431529045, 0.11412212252616882, 0.0013537518680095673, -0.0986531600356102, 0.07053477317094803, -0.0943513736128807, 0.010574277490377426, -0.07798323780298233, -0.1537552773952484, -0.07487804442644119, 0.03768846020102501, 0.02451474964618683, 0.029483359307050705, 0.08079420775175095, 0.05236346647143364, -0.06526604294776917, 0.04635309427976608, -0.053695548325777054, 0.006118630059063435, -0.02214771881699562, -0.020614247769117355, 0.20412717759609222, 0.018037788569927216, 0.046990614384412766, -0.07192070037126541, 0.051015350967645645, -0.03586127609014511, -0.00925951823592186, -0.008613940328359604, -0.017574729397892952, 0.018773790448904037, 0.03799132630228996, 0.03312155231833458, -0.20710007846355438, -0.15431547164916992, 0.014965429902076721, 0.01979099027812481, -0.06996007263660431, -0.108361154794693, -0.025049621239304543, 0.02333168126642704, 0.014524295926094055, -0.05894526094198227, -0.02173546701669693, -0.04412422329187393, 0.04502982646226883, 0.029808681458234787, 0.09645546972751617, -0.16063635051250458, 0.07648859173059464, -0.10783349722623825, 0.006479198578745127, -0.05665837600827217, 0.05941604822874069, 0.005605829879641533, 0.06210627034306526, -0.07860922813415527, -0.03181964531540871, -0.13297154009342194, 0.025385059416294098, 
0.04366890713572502, 0.21406526863574982, -0.06838705390691757, -0.08498269319534302, 0.2895785868167877, -0.0799282118678093, -0.09358137845993042, 0.10032770782709122, 0.03539199009537697, 0.04678718000650406, 0.13145112991333008, 0.1444084197282791, -0.024021275341510773, -0.006898525636643171, 0.06665527820587158, 0.01725205034017563, 0.0033408249728381634, -0.0023069088347256184, 0.03552420064806938, 0.03591623157262802, -0.00637021753937006, 0.08552935719490051, 0.07667457312345505, 0.12986767292022705, -0.04953143000602722, -0.03196932375431061, 0.012122053653001785, -0.000034579272323753685, 0.040551114827394485, -0.0702756717801094, 0.1453072875738144, 0.01241614855825901, -0.04052981361746788, 0.034051522612571716, 0.0365285724401474, -0.04132020100951195, 0.06122702360153198, -0.09143933653831482, 0.15963447093963623, -0.009757005609571934, 0.06795120984315872, -0.10478022694587708, -0.056724369525909424, -0.0013053041184321046, 0.06824338436126709, 0.07522345334291458, -0.010527343489229679, 0.07052560150623322, -0.030273985117673874, -0.02226843871176243, 0.061050813645124435, 0.08879881352186203, 0.002823061542585492, -0.06269370764493942, -0.06405163556337357, 0.10735666006803513, -0.023453177884221077, 0.0996740534901619, -0.10419002175331116, 0.007215957157313824, 0.09177342057228088, 0.12882359325885773, -0.03561008721590042, 0.04541584104299545, 0.04808877035975456, 0.004357482306659222, -0.031438980251550674, 0.016764316707849503, 0.06100557744503021, -0.008306306786835194, -0.08966436982154846, 0.2325141429901123, -0.1734171211719513, 0.09416655451059341, 0.15454864501953125, -0.09188272058963776, 0.023285353556275368, -0.1448976844549179, -0.014467579312622547, -0.02715900167822838, 0.06217217445373535, 0.04744616523385048, 0.2036844938993454, 0.024828359484672546, 0.1309402883052826, -0.008489955216646194, 0.04374782368540764, 0.03830816596746445, -0.0750427395105362, 0.017725376412272453, 0.07399219274520874, 0.06818706542253494, -0.1704811453819275, 0.12609240412712097, 0.039625540375709534, -0.021230436861515045, 0.16864582896232605, 0.009031384252011776, -0.04249672591686249, 0.05335452780127525, 0.027473734691739082, -0.025828799232840538, 0.03165990859270096, -0.27034792304039, -0.021767865866422653, 0.08406805992126465, 0.05799398198723793, 0.10943396389484406, -0.08459711074829102, -0.033985547721385956, 0.010434302501380444, 0.038995299488306046, 0.07376725971698761, 0.13815052807331085, 0.026654036715626717, 0.11087487637996674, -0.012517891824245453, -0.04953749105334282, 0.011530529707670212, 0.02091202884912491, -0.023731881752610207, 0.24440740048885345, -0.06856340169906616, -0.2701997756958008, -0.11125317960977554, -0.20965765416622162, -0.06083185598254204, -0.014064661227166653, 0.07598792761564255, -0.0802818313241005, -0.027834782376885414, 0.014671971090137959, 0.012622639536857605, 0.013078845106065273, 0.039303217083215714, -0.040020719170570374, -0.01229795254766941, -0.16517497599124908, -0.05460788309574127, -0.07318694144487381, -0.06016017869114876, -0.04506775736808777, 0.17418694496154785, -0.12374112010002136, 0.0352698490023613, 0.27215999364852905, -0.03325127065181732, 0.03879598528146744, 0.006373913958668709, 0.2192755490541458, -0.10529989749193192, 0.010906698182225227, 0.21089768409729004, 0.002370924223214388, 0.05589981749653816, 0.12794211506843567, -0.007650428451597691, -0.0983787253499031, 0.03957739844918251, -0.031605314463377, -0.09793643653392792, -0.25269365310668945, -0.1342308223247528, -0.11778204143047333, 
0.07966863363981247, -0.037198327481746674, 0.030101710930466652, 0.18812693655490875, 0.06305578351020813, 0.022895071655511856, 0.0612383596599102, 0.08398190885782242, 0.1092337816953659, 0.2857496738433838, -0.00818527303636074, 0.11244454979896545, 0.0025471975095570087, -0.14814865589141846, 0.06759826093912125, 0.08916588872671127, 0.09326352179050446, 0.0712767243385315, 0.04105323553085327, 0.007557317148894072, -0.08747924864292145, 0.14907662570476532, 0.056811630725860596, 0.006739362142980099, -0.04312266781926155, -0.02850760519504547, -0.0561726912856102, 0.014379359781742096, 0.04033534228801727, 0.0344008170068264, -0.1777222603559494, -0.018235301598906517, -0.006597274914383888, 0.03326061740517616, 0.012248440645635128, 0.05811838060617447, -0.13244220614433289, -0.025808265432715416, 0.05327828601002693, -0.017196813598275185, -0.13528509438037872, 0.025418953970074654, -0.010155164636671543, -0.1491447538137436, 0.03434232249855995, -0.04483458027243614, 0.11330191791057587, -0.05201009660959244, 0.03936111554503441, -0.07985924184322357, -0.057617731392383575, -0.00037248950684443116, 0.09341024607419968, -0.2022915929555893, 0.187802255153656, -0.02442953735589981, -0.040170200169086456, -0.07375319302082062, 0.006475883070379496, -0.03088534250855446, 0.1460167020559311, 0.06076383590698242, 0.04527883231639862, -0.06609433144330978, 0.003175487508997321, -0.16558215022087097, 0.05196267366409302, 0.05906374752521515, 0.0020676308777183294, -0.004024949856102467, 0.02303178235888481, -0.0063374778255820274, -0.04298895597457886, 0.047640636563301086, -0.0358353890478611, -0.16705070436000824, 0.1314149647951126, 0.03238193690776825, 0.13448987901210785, 0.027051892131567, -0.07785585522651672, 0.006168597377836704, 0.2294677495956421, 0.043840017169713974, -0.08035504072904587, -0.08753375709056854, -0.061623405665159225, 0.10473697632551193, -0.10317499190568924, 0.030649177730083466, -0.09294714778661728, 0.0035695265978574753, -0.04418807104229927, -0.14358295500278473, 0.10457189381122589, -0.03314226493239403, -0.052543770521879196, -0.030047880485653877, 0.11954131722450256, -0.016850760206580162, 0.012382199987769127, 0.031889647245407104, -0.024943120777606964, -0.07156582921743393, -0.09044244140386581, -0.045258648693561554, 0.09973230212926865, -0.014789003878831863, -0.03289588913321495, -0.008997349999845028, -0.06710916012525558, -0.08875919133424759, -0.028751350939273834, 0.2920527458190918, 0.18914279341697693, 0.026738865301012993, 0.12228690087795258, 0.14614003896713257, -0.05233025923371315, -0.18383798003196716, -0.11085536330938339, -0.10676298290491104, 0.03760266676545143, -0.1145983412861824, -0.2111017107963562, 0.08598726242780685, -0.021648360416293144, -0.052548136562108994, 0.18128202855587006, -0.2436567097902298, -0.13867954909801483, 0.1756308227777481, -0.01024862751364708, 0.42519986629486084, -0.14495781064033508, -0.06053958460688591, -0.032296109944581985, -0.25409039855003357, 0.21482905745506287, 0.051457785069942474, 0.1385280042886734, -0.025577865540981293, 0.22170796990394592, 0.028854630887508392, -0.0058471704833209515, 0.13149912655353546, -0.00988668855279684, -0.07712467759847641, -0.0587821863591671, -0.1258781999349594, 0.009957392700016499, 0.012556364759802818, 0.04440191388130188, 0.040945351123809814, -0.018328875303268433, -0.18560290336608887, -0.044678326696157455, -0.09329552948474884, 0.058649998158216476, 0.050071269273757935, -0.08749113231897354, 0.05680248886346817, -0.0019579383078962564, 
0.018281400203704834, 0.009433599188923836, 0.19400916993618011, -0.051567576825618744, 0.08390077203512192, 0.03604111447930336, 0.07225397974252701, -0.030367406085133553, 0.005586408078670502, -0.06950386613607407, -0.04008714109659195, 0.056873176246881485, -0.058121245354413986, 0.002233048202469945, 0.11508205533027649, -0.017636630684137344, 0.04180038347840309, 0.05917938053607941, -0.08563974499702454, 0.005737729370594025, 0.08748533576726913, -0.22958116233348846, 0.061494890600442886, -0.058994024991989136, 0.22169026732444763, 0.1098177507519722, 0.0702710673213005, 0.17669536173343658, -0.05403397977352142, -0.07772591710090637, 0.006573454011231661, -0.003890169318765402, -0.05977815389633179, 0.07160136103630066, 0.0056582554243505, 0.018405677750706673, -0.114081472158432, 0.041640814393758774, 0.04046301916241646, -0.122173972427845, 0.02002166025340557, 0.14430025219917297, -0.10402184724807739, -0.10071633756160736, -0.03631522133946419, 0.01642662286758423, -0.06423214823007584, -0.029495947062969208, -0.03822093456983566, -0.1325431913137436, 0.02186383120715618, 0.10358389467000961, 0.04213489964604378, 0.06229519098997116, -0.07047859579324722, -0.026585709303617477, 0.0016056725289672613, -0.02196291647851467, 0.061828482896089554, 0.010986443608999252, -0.034064460545778275, -0.018263069912791252, -0.03507477790117264, 0.09584074467420578, -0.09225871413946152, -0.14963610470294952, -0.16195395588874817, 0.008099142462015152, -0.1272367686033249, -0.04963172599673271, -0.12949202954769135, -0.07583727687597275, -0.05201326683163643, -0.03313804045319557, -0.04756539687514305, -0.03757328540086746, -0.11640685051679611, 0.04774553328752518, -0.12745600938796997, 0.004036115016788244, -0.019930195063352585, -0.011181699112057686, 0.0532081313431263, -0.04323384165763855, 0.10734480619430542, 0.17746864259243011, -0.04633466899394989, 0.04784073680639267, -0.13808126747608185, -0.04841132089495659, 0.09494806826114655, 0.020824939012527466, 0.07451499253511429, -0.004013999830931425, 0.011223159730434418, 0.030113693326711655, 0.10790186375379562, 0.028335148468613625, 0.021777601912617683, -0.10672615468502045, -0.07686634361743927, -0.05541659891605377, -0.14643074572086334, -0.006662235129624605, -0.040001820772886276, 0.016709720715880394, 0.0791269913315773, 0.027738146483898163, -0.061320867389440536, 0.10947734117507935, -0.04557003453373909, 0.019952252507209778, 0.006174206733703613, -0.1476343721151352, 0.050752680748701096, -0.09205413609743118, 0.02277514524757862, 0.03501303866505623, 0.18681633472442627, 0.06632750481367111, -0.004129917360842228, 0.006194179877638817, 0.015037558041512966, 0.05646239221096039, 0.06125001609325409, 0.15509574115276337, 0.12486554682254791, 0.0019854982383549213, -0.04990456625819206, 0.05181987211108208, 0.01190408505499363, 0.05427296459674835, 0.08405133336782455, -0.003132179845124483, 0.027127128094434738, 0.1489090770483017, 0.0019083352526649833, 0.04996398091316223, -0.15878549218177795, -0.21757298707962036, -0.10391085594892502, 0.08444420993328094, -0.07781342417001724, 0.132907435297966, 0.10162985324859619, -0.07654198259115219, 0.017771998420357704, -0.02945012040436268, -0.08718764036893845, -0.18309806287288666, -0.2686324715614319, -0.06405103206634521, -0.09138472378253937, 0.023123808205127716, -0.1181771382689476, 0.0222318172454834, 0.009552128612995148, 0.057009823620319366, -0.05751536414027214, 0.08201567828655243, 0.06754852831363678, -0.10401316732168198, 0.08977203071117401, 
-0.009901899844408035, 0.0644550696015358, 0.011523065157234669, 0.018873944878578186, -0.09341742843389511, 0.00852828286588192, -0.001179770566523075, 0.069127157330513, -0.02186894789338112, 0.022456657141447067, -0.06293126195669174, -0.10225514322519302, -0.06328529864549637, 0.013312384486198425, 0.02156241424381733, 0.06915271282196045, 0.0035688718780875206, -0.038307953625917435, 0.03235405683517456, 0.2621000111103058, -0.026062320917844772, -0.10289217531681061, -0.06293118745088577, 0.20116059482097626, 0.019675958901643753, 0.11001572757959366, -0.0536893829703331, 0.004454398062080145, -0.07737809419631958, 0.31532958149909973, 0.16159570217132568, -0.041901975870132446, 0.001429893309250474, 0.09180166572332382, 0.03117208369076252, 0.057147108018398285, 0.09341631084680557, 0.0616869293153286, 0.266505628824234, -0.08525412529706955, 0.012682212516665459, -0.0322764553129673, -0.07967081665992737, -0.0819319412112236, 0.03842921555042267, 0.07766268402338028, -0.06355452537536621, -0.06412197649478912, 0.12649327516555786, -0.3058120310306549, 0.1651422083377838, -0.13547660410404205, -0.29571807384490967, -0.1328037977218628, -0.05045419931411743, 0.08701381832361221, 0.06238221377134323, 0.11817780137062073, -0.05668351426720619, -0.0549905002117157, 0.004938533063977957, 0.05802623927593231, -0.14552737772464752, 0.023366305977106094, 0.08709312975406647, -0.042898062616586685, -0.04109587147831917, -0.0035875309258699417, 0.09119682759046555, 0.040909543633461, 0.06018509715795517, 0.01651606895029545, -0.012749437242746353, -0.016243787482380867, -0.02294992469251156, -0.002341771963983774, 0.09191668033599854, -0.014579196460545063, -0.13060450553894043, 0.07135939598083496, -0.09459826350212097, 0.03548582270741463, -0.0012261513620615005, 0.015968991443514824, -0.017454832792282104, 0.043797463178634644, -0.060368504375219345, 0.059354688972234726, 0.07188035547733307, -0.048932772129774094, -0.019176658242940903, -0.028501402586698532, 0.014363511465489864, 0.015541418455541134, -0.1035122200846672, -0.1267472803592682, -0.12897102534770966, -0.15725953876972198, 0.04257012903690338, 0.05319385975599289, -0.17189261317253113, 0.004240068141371012, -0.14025631546974182, 0.054324597120285034, -0.13811707496643066, 0.10994762927293777, 0.09573188424110413, -0.015840355306863785, -0.00799768790602684, -0.09537207335233688, 0.05260574072599411, 0.054516468197107315, -0.1678846776485443, -0.09253097325563431 ]
null
null
transformers
# Takanashi Rikka DialoGPT Model
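For multi-turn chat, the usual DialoGPT pattern is to keep appending the running conversation history to the model input. A hedged sketch (again not part of the original card), under the same assumptions as above:

```python
# Sketch: multi-turn chat loop with a DialoGPT-style model.
# Each user turn is appended to the accumulated history; the model continues it.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "NamPE/DialoGPT-medium-Takanashi-Rikka"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)

chat_history_ids = None
for turn in ["Hi there!", "What are you doing today?"]:
    new_ids = tokenizer.encode(turn + tokenizer.eos_token, return_tensors="pt")
    # Concatenate the new user turn onto the history from previous rounds.
    input_ids = new_ids if chat_history_ids is None else torch.cat([chat_history_ids, new_ids], dim=-1)
    chat_history_ids = model.generate(
        input_ids,
        max_length=500,
        pad_token_id=tokenizer.eos_token_id,
    )
    reply = tokenizer.decode(chat_history_ids[:, input_ids.shape[-1]:][0], skip_special_tokens=True)
    print(f"Bot: {reply}")
```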
{"tags": ["conversational"]}
text-generation
NamPE/DialoGPT-medium-Takanashi-Rikka
[ "transformers", "pytorch", "safetensors", "gpt2", "text-generation", "conversational", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[]
TAGS #transformers #pytorch #safetensors #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
# Takanashi Rikka DialoGPT Model
[ "# Takanashi Rikka DialoGPT Model" ]
[ "TAGS\n#transformers #pytorch #safetensors #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "# Takanashi Rikka DialoGPT Model" ]
[ 56, 11 ]
[ "passage: TAGS\n#transformers #pytorch #safetensors #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n# Takanashi Rikka DialoGPT Model" ]
[ -0.030572202056646347, 0.024083316326141357, -0.00523327337577939, 0.017822621390223503, 0.1402311474084854, -0.01696065440773964, 0.17124822735786438, 0.10694241523742676, -0.040271490812301636, -0.009478587657213211, 0.11909472942352295, 0.1488717645406723, 0.01451030746102333, 0.13955678045749664, -0.08862859755754471, -0.30844539403915405, 0.09568475186824799, 0.039634108543395996, 0.08340176194906235, 0.13766168057918549, 0.10714567452669144, -0.05918801575899124, 0.06550212949514389, 0.005491433199495077, -0.11927258223295212, -0.027027146890759468, 0.03334599360823631, -0.14239279925823212, 0.10346172004938126, 0.028545234352350235, 0.07627800852060318, 0.03639223054051399, -0.031396448612213135, -0.11756906658411026, 0.04164130613207817, -0.002047835849225521, -0.04785682633519173, 0.0439462810754776, 0.040974974632263184, -0.07900631427764893, 0.0937555655837059, 0.08719351887702942, -0.015934325754642487, 0.046444039791822433, -0.14315925538539886, 0.0069498540833592415, -0.0034025602508336306, 0.06795509159564972, 0.11347740888595581, 0.11460880190134048, -0.04910610243678093, 0.14011716842651367, -0.07015533745288849, 0.11768972128629684, 0.10411623865365982, -0.31749972701072693, -0.029752308502793312, 0.0816296935081482, 0.047878582030534744, 0.07429011166095734, -0.04480745270848274, 0.06402958184480667, 0.04927639663219452, -0.010867570526897907, -0.022213399410247803, -0.05955469235777855, -0.08145609498023987, -0.01681210659444332, -0.09442640841007233, 0.011400189250707626, 0.24065977334976196, -0.04637131094932556, 0.048547785729169846, -0.1088315099477768, -0.12102280557155609, -0.030478568747639656, -0.03816559910774231, -0.03538559004664421, -0.09459525346755981, 0.05398618429899216, 0.012634758837521076, -0.03499402478337288, -0.12515853345394135, -0.035754505544900894, -0.13848136365413666, 0.20399482548236847, 0.05327014997601509, 0.022485624998807907, -0.19947582483291626, 0.07951467484235764, 0.017920520156621933, -0.12339300662279129, 0.01491719949990511, -0.10546857118606567, 0.04566490650177002, 0.025487352162599564, -0.01217685453593731, -0.0894654244184494, 0.09835544973611832, 0.07574696093797684, -0.062458522617816925, 0.029583889991044998, -0.014660964719951153, 0.04230381175875664, 0.049303021281957626, 0.049794916063547134, -0.04283060505986214, 0.0013679820112884045, 0.04425237327814102, -0.05429193750023842, 0.03290362283587456, -0.06323467195034027, -0.12988826632499695, -0.02276579476892948, 0.09270024299621582, 0.06716936081647873, -0.007327388506382704, 0.15068422257900238, -0.005924158729612827, -0.019097715616226196, 0.053321048617362976, -0.06244077906012535, -0.04858577251434326, 0.00987253338098526, -0.004585579037666321, 0.10361611843109131, -0.0008552147774025798, 0.024180302396416664, -0.10431723296642303, 0.031187759712338448, -0.05411570146679878, -0.017579853534698486, -0.02251998893916607, -0.044846098870038986, -0.0014344512019306421, -0.07180208712816238, 0.012775138951838017, -0.18860428035259247, -0.16542857885360718, 0.0027847369201481342, -0.026222100481390953, -0.029760628938674927, -0.06627108156681061, -0.08804748207330704, -0.027865350246429443, 0.03400946781039238, -0.07280819118022919, -0.05792719125747681, -0.05926399677991867, 0.09881328046321869, -0.02887890301644802, 0.08359678089618683, -0.05553358048200607, 0.049568600952625275, -0.10846420377492905, -0.009301929734647274, -0.06993018835783005, 0.06494753062725067, -0.0034162711817771196, 0.07401628792285919, -0.017582936212420464, -0.0036502366419881582, 
-0.06631588190793991, 0.05685700848698616, -0.025368819013237953, 0.2484409213066101, -0.06624967604875565, -0.07764589041471481, 0.30654269456863403, -0.12989486753940582, -0.13859786093235016, 0.13410083949565887, 0.006914698053151369, 0.09642424434423447, 0.1216232106089592, 0.20615330338478088, 0.0013695121742784977, -0.013874399475753307, 0.040918488055467606, 0.06250898540019989, -0.09747268259525299, -0.014861795119941235, 0.007483857683837414, 0.02061718888580799, -0.09465856850147247, 0.0614037849009037, 0.07253268361091614, 0.04574412852525711, -0.06437761336565018, -0.03341913968324661, -0.02048589289188385, -0.03678518533706665, 0.07821978628635406, -0.02031608112156391, 0.12095971405506134, -0.053695663809776306, -0.06261832267045975, -0.042224492877721786, 0.04792946204543114, -0.056214574724435806, 0.023385345935821533, -0.10894089937210083, 0.10075674951076508, -0.01753690093755722, 0.05696650594472885, -0.10886354744434357, -0.05769948661327362, -0.02805759385228157, 0.14613033831119537, -0.0002398242213530466, 0.058187536895275116, 0.06813263148069382, -0.005106274504214525, -0.017607320100069046, -0.010488978587090969, 0.18317772448062897, -0.0167690422385931, -0.05064787715673447, -0.08923161774873734, 0.11496167629957199, -0.06763763725757599, 0.12542036175727844, -0.07692815363407135, 0.02272861637175083, 0.012414311990141869, 0.09777045249938965, -0.015955593436956406, 0.039864324033260345, 0.032653044909238815, -0.006652806885540485, -0.06486645340919495, 0.009589312598109245, 0.09513889998197556, 0.004239585716277361, -0.08682585507631302, 0.23307321965694427, -0.22492797672748566, 0.17860612273216248, 0.17246513068675995, -0.1602317988872528, -0.015773627907037735, -0.09664908051490784, -0.02048996463418007, 0.0068858410231769085, 0.03804470971226692, -0.0023177028633654118, 0.11283483356237411, -0.026391275227069855, 0.17589625716209412, -0.06172886863350868, -0.014840283431112766, 0.010419131256639957, -0.09169476479291916, -0.005170188378542662, 0.09750934690237045, 0.0411507710814476, -0.1604910045862198, 0.17286626994609833, 0.06649097055196762, 0.030370265245437622, 0.21927721798419952, 0.006090066395699978, -0.004586676601320505, 0.05708915367722511, 0.05089118331670761, -0.024804621934890747, -0.0528680682182312, -0.22299186885356903, -0.030070502310991287, 0.06773147732019424, 0.038351982831954956, 0.08972381800413132, -0.08895106613636017, -0.055683404207229614, -0.015029129572212696, -0.015274413861334324, 0.044678330421447754, 0.11578714102506638, 0.005689606070518494, 0.13894756138324738, -0.04842713102698326, -0.05847707763314247, 0.05362614989280701, 0.01051772478967905, -0.08660781383514404, 0.19251523911952972, -0.09929589182138443, -0.3410975933074951, -0.0906180664896965, -0.1808619350194931, -0.0553521104156971, 0.044135648757219315, 0.09849464148283005, -0.15379677712917328, -0.033417392522096634, -0.0370093397796154, 0.054088789969682693, -0.04276416823267937, 0.015752164646983147, -0.03591587767004967, 0.020367752760648727, -0.09677883237600327, -0.1012672558426857, -0.037375856190919876, -0.04337982088327408, -0.10177560895681381, 0.17156678438186646, -0.1151590496301651, 0.04890873655676842, 0.21524053812026978, 0.03020559810101986, 0.03539469465613365, -0.034644633531570435, 0.15117938816547394, -0.10597220808267593, 0.018955910578370094, 0.20230089128017426, -0.06018196791410446, 0.06514688581228256, 0.1511082947254181, -0.025489183142781258, -0.0742320716381073, 0.06110922619700432, -0.04331963136792183, -0.07427357137203217, 
-0.2422138750553131, -0.11764682084321976, -0.08009904623031616, 0.11511023342609406, -0.0006932480027899146, 0.0508381687104702, 0.1707988828420639, 0.11816740781068802, -0.047344870865345, -0.058107562363147736, 0.06982424110174179, 0.08138499408960342, 0.16560658812522888, -0.0547051802277565, 0.1553780734539032, -0.027168570086359978, -0.15491290390491486, 0.05983243137598038, 0.02531370334327221, 0.10859064757823944, 0.017778968438506126, 0.047489065676927567, 0.04323292523622513, 0.09265419095754623, 0.15143045783042908, 0.08655581623315811, 0.00994222704321146, -0.043284498155117035, -0.02398393303155899, -0.04764418303966522, -0.04141784459352493, 0.040581073611974716, -0.036279067397117615, -0.10569734871387482, -0.03247460350394249, 0.016351209953427315, 0.08729944378137589, 0.07809946686029434, 0.07522682845592499, -0.19363504648208618, -0.025367047637701035, 0.07699672877788544, -0.01772424578666687, -0.09921078383922577, 0.09539849311113358, 0.03976503014564514, -0.12488701194524765, 0.04223352670669556, -0.023095088079571724, 0.09528964012861252, -0.05766753852367401, 0.05796092748641968, -0.13218769431114197, -0.05695771425962448, -0.017853088676929474, 0.10439076274633408, -0.2711831331253052, 0.22404298186302185, -0.009915191680192947, -0.002276771003380418, -0.0979083925485611, -0.014234714210033417, 0.019455747678875923, 0.10764637589454651, 0.1495329588651657, -0.02886371873319149, -0.013108606450259686, -0.01753682643175125, -0.07116076350212097, 0.04060466215014458, 0.11083582788705826, 0.0029904809780418873, 0.0019766187760978937, -0.03753802180290222, 0.002797468798235059, -0.019406884908676147, -0.09276045858860016, -0.04383358731865883, -0.1490718275308609, 0.057956065982580185, 0.06424877047538757, 0.0826268121600151, 0.008405955508351326, -0.04389135539531708, -0.11158136278390884, 0.201458141207695, 0.0030388780869543552, -0.10750434547662735, -0.08151406049728394, -0.07496928423643112, 0.037771474570035934, -0.08679985255002975, 0.039174165576696396, -0.06739754229784012, 0.041097160428762436, -0.0651925578713417, -0.1481562852859497, 0.08903490751981735, -0.08699159324169159, -0.08164242655038834, -0.014672894030809402, 0.19377855956554413, 0.0065513430163264275, -0.01869228295981884, 0.038573265075683594, 0.006421986967325211, -0.076758474111557, -0.09225580841302872, -0.019576774910092354, 0.036579594016075134, 0.013852888718247414, 0.0030442161951214075, -0.04762021452188492, -0.12291648983955383, -0.09204591810703278, -0.04385259002447128, 0.2728738486766815, 0.19974300265312195, -0.026881489902734756, 0.13991804420948029, 0.1844230443239212, -0.03555501624941826, -0.28975069522857666, -0.129631906747818, -0.1064172089099884, -0.030529825016856194, -0.07884179055690765, -0.11089235544204712, 0.08475121110677719, 0.00969227310270071, -0.035570502281188965, 0.09379000961780548, -0.2526547908782959, -0.11592777818441391, 0.17347928881645203, 0.042091310024261475, 0.3923875391483307, -0.14174522459506989, -0.05979350581765175, -0.04264022037386894, -0.15105703473091125, 0.11407030373811722, -0.06597913801670074, 0.10665921866893768, -0.016336211934685707, 0.12010250985622406, 0.049770038574934006, -0.028550734743475914, 0.07760490477085114, -0.030502762645483017, -0.024778684601187706, -0.1408219188451767, -0.046294063329696655, 0.037945084273815155, 0.011903857812285423, 0.06426334381103516, -0.059053923934698105, 0.03089851699769497, -0.07427852600812912, -0.06375524401664734, -0.07497759163379669, 0.03561917692422867, 0.02550671249628067, 
-0.09288892149925232, -0.011004879139363766, -0.016972385346889496, -0.0016542328521609306, 0.004289044998586178, 0.15870213508605957, -0.08388298749923706, 0.17082098126411438, 0.09962966293096542, 0.1267397552728653, -0.09575632214546204, 0.07364371418952942, -0.05826573818922043, -0.0756044015288353, 0.07540355622768402, -0.10420523583889008, 0.01852058432996273, 0.07178482413291931, -0.024856852367520332, 0.07458864897489548, 0.07839018851518631, -0.01256676483899355, 0.03164582699537277, 0.11334807425737381, -0.24697187542915344, -0.09705287963151932, -0.04785506799817085, 0.023476745933294296, 0.08186466246843338, 0.1274099200963974, 0.19005069136619568, -0.03526085987687111, -0.029332375153899193, -0.017147107049822807, 0.04482920467853546, -0.03302641957998276, 0.07310186326503754, 0.013490988872945309, 0.017729394137859344, -0.1323545277118683, 0.06887000799179077, 0.0002557034604251385, -0.0849352702498436, 0.03970040753483772, 0.1221470907330513, -0.13092836737632751, -0.12301164120435715, -0.0555037260055542, 0.08674490451812744, -0.041289087384939194, -0.04737282544374466, -0.05804138630628586, -0.14401821792125702, 0.031101344153285027, 0.12412063032388687, 0.0452696830034256, 0.06100200489163399, -0.0377802811563015, -0.015119562856853008, -0.0840483009815216, 0.02361958846449852, 0.07902487367391586, -0.004079940728843212, -0.10757573693990707, 0.06157791242003441, -0.009893528185784817, 0.11218374222517014, -0.10038261860609055, -0.08338988572359085, -0.1580592393875122, 0.038671817630529404, -0.0821060985326767, -0.05258147791028023, -0.1299353837966919, -0.05539165437221527, -0.03864385560154915, -0.030283542349934578, -0.04710946977138519, -0.03449217975139618, -0.07396160811185837, 0.060775984078645706, -0.02625134587287903, -0.00009399054397363216, -0.07751239091157913, 0.009811396710574627, 0.05151722952723503, -0.03661595284938812, 0.1489076465368271, 0.1346203237771988, -0.12291070073843002, 0.10327229648828506, -0.20033401250839233, -0.02815188840031624, 0.10665174573659897, -0.008448217995464802, 0.02650708705186844, 0.041195888072252274, 0.011828000657260418, 0.07511809468269348, 0.04778595268726349, 0.05561533570289612, 0.05692398548126221, -0.0876823216676712, 0.07426726073026657, -0.04407075047492981, -0.11139474809169769, -0.06200236454606056, -0.03183037415146828, 0.0348762609064579, 0.012705483473837376, 0.1118837520480156, -0.07740536332130432, 0.0913439691066742, -0.0675632581114769, 0.03325635939836502, 0.029557745903730392, -0.17939493060112, -0.01402541995048523, -0.05918882414698601, 0.037285275757312775, -0.013125123456120491, 0.1919046938419342, 0.009775705635547638, -0.06917151808738708, 0.023123105987906456, 0.030112676322460175, 0.03283139318227768, 0.01246984489262104, 0.2282831370830536, 0.08053728193044662, -0.041401706635951996, -0.10273820906877518, 0.04818890616297722, 0.017586050555109978, -0.007494404911994934, 0.08686249703168869, 0.02568146400153637, -0.04521773383021355, 0.07924499362707138, -0.023635804653167725, 0.03864040598273277, -0.08427286893129349, -0.14266741275787354, -0.10259127616882324, 0.024230102077126503, -0.01808827929198742, 0.09952718764543533, 0.22870264947414398, 0.005670683924108744, 0.007427594158798456, -0.02566494792699814, -0.07718150317668915, -0.18426503241062164, -0.19645574688911438, -0.11098158359527588, -0.11867894977331161, 0.041148461401462555, -0.12193097174167633, 0.01668214611709118, 0.061388660222291946, 0.09195728600025177, -0.0679699257016182, 0.13417236506938934, 0.0921182632446289, 
-0.069246806204319, 0.08622390776872635, -0.023914068937301636, 0.05558593198657036, 0.0048711830750107765, -0.024045497179031372, -0.083584263920784, 0.03235133737325668, 0.0004329363873694092, 0.04467542842030525, -0.05480875074863434, 0.03470044955611229, -0.11282101273536682, -0.08993200212717056, -0.04180255904793739, 0.0791926458477974, 0.014545325189828873, 0.143326997756958, 0.016810134053230286, -0.03717712312936783, 0.02462523803114891, 0.2530100345611572, -0.03439435735344887, -0.11958079040050507, -0.07308673858642578, 0.17629583179950714, 0.018073374405503273, 0.08550749719142914, -0.00900169089436531, -0.010884176939725876, -0.04912753775715828, 0.3407406210899353, 0.2938220500946045, -0.06722709536552429, 0.03361302614212036, -0.03311456739902496, 0.042294811457395554, 0.06088634207844734, 0.1293007731437683, 0.10301060974597931, 0.28354501724243164, -0.06035320460796356, 0.01329181157052517, -0.02626890502870083, -0.02558743581175804, -0.09232905507087708, 0.05384288728237152, 0.04491667076945305, -0.057101234793663025, -0.051306288689374924, 0.09703560918569565, -0.25247064232826233, 0.12101675570011139, -0.17243878543376923, -0.13776402175426483, -0.07376258820295334, 0.00694429874420166, 0.12094949185848236, 0.018576569855213165, 0.09021507203578949, -0.016304364427924156, -0.04131123051047325, 0.03995807096362114, 0.034725937992334366, -0.13731060922145844, 0.02126513049006462, 0.05814722925424576, -0.0256795771420002, 0.03208513557910919, -0.002206335077062249, 0.040828511118888855, 0.07270201295614243, 0.02131122350692749, -0.02543632686138153, 0.11020801961421967, -0.0019148285500705242, -0.05280296504497528, 0.03689876198768616, 0.060101233422756195, 0.011193575337529182, -0.10135050117969513, 0.07637367397546768, -0.16664664447307587, 0.05430193245410919, 0.009332695044577122, -0.03210175409913063, -0.04858435317873955, 0.08407434076070786, -0.06366391479969025, 0.08860976248979568, 0.09148426353931427, -0.027830032631754875, -0.010607229545712471, -0.04303527623414993, 0.025421351194381714, -0.04535661265254021, -0.08884074538946152, -0.04867248982191086, -0.19439277052879333, -0.10947951674461365, 0.1034369245171547, 0.0238637775182724, -0.22515617311000824, 0.0267107542604208, -0.15356694161891937, 0.032854266464710236, -0.1621718406677246, 0.07554659247398376, 0.10143224895000458, 0.02144019305706024, 0.012838616967201233, -0.0048032174818217754, 0.046751461923122406, 0.08516089618206024, -0.10450492054224014, -0.10252957046031952 ]
null
null
transformers
# Satou Hina DialoGPT Model
{"tags": ["conversational"]}
text-generation
NamPE/DialoGPT-small-satouhina
[ "transformers", "pytorch", "gpt2", "text-generation", "conversational", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[]
TAGS #transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
# Satou Hina DialoGPT Model
[ "# Satou Hina DialoGPT Model" ]
[ "TAGS\n#transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "# Satou Hina DialoGPT Model" ]
[ 51, 10 ]
[ "passage: TAGS\n#transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n# Satou Hina DialoGPT Model" ]
[ -0.02379368618130684, 0.040238745510578156, -0.0051728119142353535, 0.014593357220292091, 0.15292829275131226, 0.014575047418475151, 0.16844914853572845, 0.12342324107885361, 0.015005026012659073, -0.02924446389079094, 0.11559892445802689, 0.11033757776021957, 0.046778127551078796, 0.12115103006362915, -0.06139954924583435, -0.32603874802589417, 0.06850647926330566, 0.05653581768274307, -0.004624190740287304, 0.11880907416343689, 0.10351252555847168, -0.04418912157416344, 0.09708374738693237, 0.03744697943329811, -0.15480932593345642, 0.006233994849026203, -0.004335555247962475, -0.12093932926654816, 0.10356958955526352, 0.08358292281627655, 0.048062633723020554, 0.028731191530823708, -0.03382963314652443, -0.1376836746931076, 0.042056649923324585, -0.028619827702641487, -0.050954319536685944, 0.03593821078538895, 0.007071816362440586, -0.041653551161289215, 0.07163358479738235, 0.11521980166435242, -0.035170912742614746, 0.04615726321935654, -0.14223071932792664, 0.006899279076606035, 0.005959334783256054, 0.039614975452423096, 0.05698734149336815, 0.09459207206964493, -0.051961738616228104, 0.10674832761287689, -0.07709895819425583, 0.07410421967506409, 0.06303177773952484, -0.3083428740501404, -0.02971573732793331, 0.08402817696332932, 0.007031865417957306, 0.08244967460632324, -0.04545443132519722, 0.08628582954406738, 0.041081562638282776, -0.030017226934432983, -0.07420974224805832, -0.08530408143997192, -0.02888704277575016, 0.021381700411438942, -0.06977987289428711, 0.02598905749619007, 0.2539491355419159, -0.013206145726144314, 0.05234573036432266, -0.07025329768657684, -0.08636283874511719, -0.00005326547034201212, -0.052413225173950195, -0.03232908621430397, -0.08299153298139572, 0.0680806115269661, 0.03166131675243378, -0.09392835944890976, -0.12014707922935486, -0.022625064477324486, -0.12542495131492615, 0.12577249109745026, 0.05561399459838867, 0.037838999181985855, -0.20286603271961212, 0.06570826470851898, 0.025963421911001205, -0.09458819031715393, -0.014548197388648987, -0.09661711752414703, -0.009785748086869717, 0.02783169224858284, -0.004778612405061722, -0.008851870894432068, 0.08898540586233139, 0.07712669670581818, -0.015005742199718952, 0.04997444525361061, -0.02283831313252449, 0.045344263315200806, 0.05411003902554512, 0.08840964734554291, -0.03607212379574776, -0.06543349474668503, 0.020094072446227074, -0.0769927129149437, -0.01626153290271759, -0.06672047823667526, -0.1757666915655136, -0.023518064990639687, 0.012761859223246574, 0.04262647405266762, -0.014327127486467361, 0.1259155124425888, 0.020369790494441986, -0.04375861957669258, -0.004337666556239128, -0.05088603124022484, -0.04048194736242294, 0.0010799779556691647, -0.001374181010760367, 0.17901040613651276, -0.002677086042240262, 0.01693335548043251, -0.13184183835983276, -0.017202846705913544, -0.06278394162654877, -0.006099890451878309, -0.00958729162812233, 0.005393891129642725, -0.019428450614213943, -0.07325654476881027, 0.00920186284929514, -0.1564921885728836, -0.13732659816741943, -0.005822786130011082, 0.0029369087424129248, -0.06514067202806473, -0.10680603981018066, -0.08801671862602234, 0.011991697363555431, 0.03963163122534752, -0.06449402868747711, -0.04964928328990936, -0.062036506831645966, 0.07916977256536484, -0.0023030557204037905, 0.08857399225234985, -0.08311466127634048, 0.06860543042421341, -0.07997223734855652, -0.009359647519886494, -0.06329710781574249, 0.09550821036100388, 0.015742935240268707, 0.0642928034067154, -0.028110362589359283, -0.008881173096597195, 
-0.06267938762903214, 0.08000658452510834, -0.04101523756980896, 0.24215185642242432, -0.030018437653779984, -0.10897901654243469, 0.2940950393676758, -0.07804705947637558, -0.17278344929218292, 0.1467624455690384, 0.006431377027183771, 0.12496668100357056, 0.09478957951068878, 0.17444393038749695, -0.0404653362929821, -0.01751280203461647, 0.08473985642194748, 0.10340606421232224, -0.0446048267185688, 0.0004477175243664533, 0.015316104516386986, 0.00537640368565917, -0.13009649515151978, 0.0411556102335453, 0.09391375631093979, 0.04933001473546028, -0.04773395508527756, -0.05326284468173981, 0.004287727177143097, -0.017710817977786064, 0.021025029942393303, -0.0242592953145504, 0.1344350278377533, 0.002339649247005582, -0.049594610929489136, 0.02606400102376938, 0.06502041220664978, -0.0648292526602745, 0.050598956644535065, -0.09165745973587036, 0.08826230466365814, -0.050717711448669434, 0.046577367931604385, -0.10922751575708389, 0.008564235642552376, -0.050290778279304504, 0.19122518599033356, 0.0696512833237648, 0.094301737844944, 0.05197012424468994, -0.043411508202552795, -0.029508961364626884, 0.027023136615753174, 0.16196444630622864, -0.011501488275825977, -0.052720408886671066, -0.11388995498418808, 0.11196432262659073, -0.04087119922041893, 0.18602900207042694, -0.062267106026411057, -0.003101147711277008, 0.0013400226598605514, 0.09658262878656387, -0.028781214728951454, 0.0490899533033371, 0.018646016716957092, -0.013183911330997944, -0.05334378033876419, 0.036991965025663376, 0.0800398737192154, -0.0023299138993024826, -0.11384130269289017, 0.2547444701194763, -0.16153183579444885, 0.1451665163040161, 0.1832270473241806, -0.2034083902835846, 0.0028625219129025936, -0.11446887999773026, -0.01598774455487728, 0.005402310751378536, 0.05912649631500244, 0.008830360136926174, 0.13310109078884125, -0.04467738792300224, 0.19140420854091644, -0.06292355805635452, 0.0013487281976267695, -0.013303782790899277, -0.06776756793260574, 0.00010392031981609762, 0.10353662073612213, 0.14030522108078003, -0.17234639823436737, 0.18822839856147766, 0.05983150005340576, 0.03141232579946518, 0.24871309101581573, 0.021438555791974068, -0.014327863231301308, 0.047261934727430344, 0.004289036616683006, -0.037665002048015594, -0.025988472625613213, -0.2582753896713257, -0.028670212253928185, 0.06102287769317627, 0.027066916227340698, 0.10174936801195145, -0.07482339441776276, -0.049222465604543686, -0.02308748848736286, -0.008782085962593555, 0.09112172573804855, 0.12448786199092865, 0.009708643890917301, 0.11606412380933762, -0.006263197399675846, -0.046216607093811035, 0.04108547046780586, 0.008795700967311859, -0.06421952694654465, 0.1814623475074768, -0.0948847234249115, -0.3340926766395569, -0.08274530619382858, -0.2158227264881134, -0.05438704788684845, 0.03469220921397209, 0.08696574717760086, -0.16806218028068542, -0.01912027969956398, -0.008527560159564018, 0.028877250850200653, -0.11548222601413727, 0.02577088586986065, -0.038308385759592056, 0.008247588761150837, -0.11432158201932907, -0.07598355412483215, -0.044741205871105194, -0.058037515729665756, -0.06694873422384262, 0.11048117280006409, -0.15009379386901855, 0.0350712426006794, 0.20310617983341217, 0.07476957142353058, 0.06144747510552406, -0.03169805184006691, 0.21178488433361053, -0.1495838314294815, 0.023907721042633057, 0.17689530551433563, -0.04773068055510521, 0.06081938371062279, 0.16118955612182617, -0.008378913626074791, -0.08010606467723846, 0.03270911052823067, -0.017114469781517982, -0.06544337421655655, 
-0.22237619757652283, -0.1079067513346672, -0.1424480378627777, 0.0995749831199646, 0.01798407919704914, 0.04327525943517685, 0.16583402454853058, 0.09728574007749557, -0.050536561757326126, 0.04260269179940224, 0.035567522048950195, 0.06942926347255707, 0.19593264162540436, -0.04873817786574364, 0.12601888179779053, -0.016275206580758095, -0.1622033268213272, 0.06952356547117233, 0.06807290017604828, 0.07543494552373886, 0.04491301625967026, 0.08319449424743652, 0.04653903841972351, 0.05557980760931969, 0.15167048573493958, 0.008174235932528973, 0.001672931364737451, -0.03511985018849373, -0.04678830876946449, -0.049861788749694824, -0.004138713236898184, 0.08206606656312943, 0.048701439052820206, -0.14311560988426208, -0.006072817835956812, -0.028151342645287514, 0.07849138230085373, 0.018456624820828438, 0.1023191586136818, -0.12079798430204391, -0.008815385401248932, 0.07963398098945618, -0.02493171952664852, -0.11647345870733261, 0.08791740238666534, -0.011468175798654556, -0.1531074047088623, 0.0494459867477417, -0.015337617136538029, 0.12395226210355759, -0.07800892740488052, 0.05733860656619072, -0.14316265285015106, -0.06065310165286064, -0.020696645602583885, 0.0940956249833107, -0.28752297163009644, 0.22653554379940033, -0.00003367249883012846, -0.04724070802330971, -0.08901660144329071, -0.007439826149493456, 0.04722142964601517, 0.12693364918231964, 0.10437862575054169, -0.00619296682998538, -0.019071562215685844, 0.01964530721306801, -0.03598127141594887, 0.05395651236176491, 0.08216534554958344, 0.0023790413979440928, -0.01982363872230053, -0.03920762240886688, -0.0054177576676011086, -0.040333181619644165, -0.06071973592042923, -0.02692675031721592, -0.20104476809501648, 0.07983304560184479, 0.060813359916210175, 0.08703592419624329, 0.04680850729346275, -0.027146631851792336, -0.06542900204658508, 0.2348335087299347, 0.01607593521475792, -0.08989806473255157, -0.10263926535844803, -0.015695633366703987, 0.04321562871336937, -0.07242178171873093, 0.027951126918196678, -0.08166126906871796, -0.008150937035679817, -0.04829981550574303, -0.14467404782772064, 0.08323237299919128, -0.06458546221256256, -0.04521647468209267, -0.02082495018839836, 0.16828157007694244, -0.006175006739795208, 0.003190501593053341, 0.032921578735113144, -0.011177386157214642, -0.10148696601390839, -0.08571366220712662, 0.0010420503094792366, 0.03257669508457184, 0.008978389203548431, 0.030771911144256592, -0.022771866992115974, -0.04134584963321686, -0.07160170376300812, -0.08346874266862869, 0.2706220746040344, 0.12804356217384338, -0.010326272808015347, 0.17307187616825104, 0.19329090416431427, -0.06274817883968353, -0.2684788107872009, -0.11991984397172928, -0.0934002473950386, -0.02069302834570408, -0.09965898841619492, -0.18570584058761597, 0.07854811102151871, -0.03932219371199608, -0.026510702446103096, 0.050423573702573776, -0.23303884267807007, -0.11450299620628357, 0.16032052040100098, -0.0008021509856916964, 0.4518139362335205, -0.09831462055444717, -0.0634327381849289, -0.06556781381368637, -0.17856325209140778, 0.12634281814098358, 0.03461708500981331, 0.12630827724933624, -0.0384768471121788, 0.10170850902795792, 0.036238089203834534, 0.014756767079234123, 0.11570486426353455, -0.022415505722165108, -0.06503607332706451, -0.1075907051563263, -0.01872977986931801, -0.05939646065235138, 0.029449863359332085, 0.04421132802963257, -0.09567395597696304, 0.0019792644307017326, -0.15490637719631195, -0.06477514654397964, -0.06811006367206573, 0.012967378832399845, 0.031070170924067497, 
-0.08501649647951126, -0.021455153822898865, -0.005730882752686739, 0.006490695755928755, 0.0250023752450943, 0.16074176132678986, -0.10546351969242096, 0.1352384239435196, 0.04057345539331436, 0.10966970026493073, -0.11495865136384964, 0.010317022912204266, -0.06517668813467026, -0.055793166160583496, 0.09395726025104523, -0.13599108159542084, 0.024651728570461273, 0.07484182715415955, -0.043273571878671646, 0.11000430583953857, 0.0840548723936081, -0.04136808216571808, 0.022468965500593185, 0.09851159900426865, -0.2241756021976471, -0.07952285557985306, -0.07524971663951874, -0.03637207671999931, 0.11297431588172913, 0.07360523194074631, 0.2052888423204422, -0.022923827171325684, -0.044368695467710495, 0.011942291632294655, 0.039971914142370224, -0.038429681211709976, 0.054011519998311996, 0.01599317044019699, 0.012913404032588005, -0.1327487975358963, 0.04824918508529663, 0.031809043139219284, -0.11443863064050674, 0.0461818091571331, 0.1664728969335556, -0.10790061950683594, -0.11593439429998398, -0.04955637827515602, 0.08159181475639343, -0.0809478610754013, -0.0388043150305748, -0.05641476437449455, -0.13611944019794464, 0.061402518302202225, 0.1068536639213562, 0.05152123421430588, 0.03756004571914673, -0.06006148084998131, -0.026980971917510033, -0.05143439769744873, 0.009057661518454552, 0.0510086789727211, -0.009403803385794163, -0.05239848792552948, -0.015134976245462894, -0.0028146894183009863, 0.15981432795524597, -0.08410198241472244, -0.1221405640244484, -0.18706455826759338, 0.04302584379911423, -0.13448478281497955, -0.059630393981933594, -0.09729854017496109, -0.05838989466428757, -0.020353784784674644, -0.03240235894918442, -0.05635162815451622, -0.031046394258737564, -0.10467074066400528, 0.053436171263456345, -0.03033892996609211, 0.02170417457818985, -0.046772632747888565, 0.03296119347214699, 0.08051091432571411, -0.04660402983427048, 0.15181589126586914, 0.13194671273231506, -0.12018385529518127, 0.08110108971595764, -0.1270582228899002, -0.04922521114349365, 0.06497745960950851, 0.023593364283442497, 0.06422983109951019, 0.04761548340320587, 0.017107892781496048, 0.05042482540011406, 0.06776049733161926, 0.043039292097091675, 0.09786400943994522, -0.06862140446901321, 0.046630922704935074, -0.07256623357534409, -0.12347130477428436, -0.060413435101509094, -0.012419654987752438, 0.02632773108780384, 0.038178350776433945, 0.04692704603075981, -0.0594911053776741, 0.07907073944807053, -0.057770051062107086, 0.03150354325771332, 0.01976151019334793, -0.15757545828819275, 0.04007021337747574, -0.13467174768447876, 0.031297944486141205, 0.0023390217684209347, 0.22515101730823517, 0.009408180601894855, -0.026539567857980728, 0.028776751831173897, 0.01874825544655323, 0.019586779177188873, 0.006183628458529711, 0.18063275516033173, 0.09722309559583664, -0.02628031000494957, -0.07255443185567856, 0.05212371423840523, 0.015508195385336876, 0.1387057602405548, 0.07827448844909668, -0.05027015879750252, 0.0058906893245875835, 0.10162024199962616, -0.04379726201295853, 0.0501178540289402, -0.07091628015041351, -0.1138409823179245, -0.08960382640361786, 0.04961574450135231, -0.036297380924224854, 0.09051264077425003, 0.1983163207769394, 0.011672128923237324, 0.02793794497847557, -0.04591388627886772, -0.08187572658061981, -0.19669872522354126, -0.1634284257888794, -0.0870126262307167, -0.16407833993434906, 0.03594105690717697, -0.11570288240909576, 0.03097034990787506, 0.07622521370649338, 0.10991016030311584, -0.052243854850530624, 0.0928388461470604, 0.10596374422311783, 
-0.09515488892793655, 0.0707489475607872, -0.02504509687423706, 0.07371285557746887, -0.011970605701208115, -0.0025999085046350956, -0.09244446456432343, 0.00862234178930521, -0.014961889944970608, 0.03668134659528732, -0.06526996195316315, 0.029609765857458115, -0.07068799436092377, -0.07874595373868942, -0.04847468435764313, 0.054051995277404785, 0.028724128380417824, 0.1755496859550476, 0.008636947721242905, -0.0195853840559721, 0.010256226174533367, 0.22055168449878693, -0.04336768016219139, -0.07979823648929596, -0.09806060045957565, 0.19532547891139984, 0.00752272829413414, 0.061741314828395844, -0.020661024376749992, 0.011683142744004726, -0.08026435971260071, 0.38978129625320435, 0.29515260457992554, -0.12026484310626984, 0.022151026874780655, 0.010943911038339138, 0.04096056520938873, 0.08835118263959885, 0.12660470604896545, 0.09085214138031006, 0.2777922451496124, -0.05848490074276924, -0.028147350996732712, -0.027233295142650604, -0.04977480322122574, -0.06288646906614304, 0.08409923315048218, 0.07106401026248932, -0.07162490487098694, -0.021962987259030342, 0.09712450206279755, -0.2695610225200653, 0.020010828971862793, -0.08607863634824753, -0.1715879887342453, -0.09467104077339172, -0.010202597826719284, 0.09663046896457672, 0.04405215382575989, 0.05466725677251816, -0.022904744371771812, -0.05283009633421898, 0.07724955677986145, 0.056574497371912, -0.1782708466053009, -0.02595638856291771, 0.10737179219722748, -0.07074596732854843, -0.028505777940154076, -0.010507212951779366, 0.04274776950478554, 0.0645119696855545, 0.04201458767056465, 0.023072462528944016, 0.04653497412800789, 0.00883500650525093, -0.045997146517038345, 0.009541613981127739, 0.08151888847351074, 0.008579512126743793, -0.09412923455238342, 0.08556473255157471, -0.11319908499717712, 0.03843935206532478, 0.0029404936358332634, -0.03383919224143028, -0.01223132573068142, 0.04766605794429779, -0.10161708295345306, 0.08913721144199371, 0.0979306697845459, -0.018743347376585007, -0.022058743983507156, -0.03040764294564724, 0.024525603279471397, -0.047241076827049255, -0.12199542671442032, -0.08953677117824554, -0.16919077932834625, -0.11855696886777878, 0.05609278380870819, 0.022285545244812965, -0.18598714470863342, 0.028970513492822647, -0.13650628924369812, 0.057905588299036026, -0.1441224068403244, 0.11898715794086456, 0.0813116580247879, 0.01104242354631424, 0.008340764790773392, -0.014754606410861015, 0.043347399681806564, 0.04368705675005913, -0.12443804740905762, -0.07740885764360428 ]
null
null
transformers
# Bapibot
{"tags": ["conversational"]}
text-generation
NanniKirby/DialoGPT-medium-bapi
[ "transformers", "pytorch", "gpt2", "text-generation", "conversational", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[]
TAGS #transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
# Bapibot
[ "# Bapibot" ]
[ "TAGS\n#transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "# Bapibot" ]
[ 51, 4 ]
[ "passage: TAGS\n#transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n# Bapibot" ]
[ -0.0017511481419205666, 0.011250718496739864, -0.007105078548192978, 0.014087896794080734, 0.12434225529432297, 0.004416945856064558, 0.1292487233877182, 0.1294037401676178, 0.05080842599272728, -0.00788082368671894, 0.15101216733455658, 0.17526975274085999, 0.006775634828954935, 0.17518381774425507, -0.07102128863334656, -0.26338180899620056, 0.07620087265968323, 0.05055075138807297, 0.08102144300937653, 0.13387872278690338, 0.07828658074140549, -0.04454581439495087, 0.0946480855345726, -0.038753777742385864, -0.1196356788277626, 0.022293001413345337, 0.02305344119668007, -0.12182280421257019, 0.1052849143743515, 0.04030470922589302, 0.06339428573846817, 0.011224220506846905, -0.07818161696195602, -0.14059129357337952, 0.03709583729505539, 0.015355461277067661, -0.058539580553770065, 0.0523231141269207, 0.011728175915777683, -0.09777050465345383, 0.1937159150838852, 0.09967375546693802, -0.009738285094499588, 0.07374898344278336, -0.16750361025333405, -0.014507761225104332, 0.0034449161030352116, 0.08643744885921478, 0.053061358630657196, 0.07830766588449478, -0.03413871303200722, 0.10113906115293503, -0.07362164556980133, 0.10694681853055954, 0.1385362446308136, -0.28448089957237244, -0.01919391378760338, 0.048505570739507675, 0.07070586085319519, 0.08386410772800446, -0.05857731029391289, 0.03844190388917923, 0.022003045305609703, 0.026592228561639786, -0.03245708346366882, -0.06952455639839172, -0.07897067815065384, 0.004927807953208685, -0.07912197709083557, -0.06715267896652222, 0.22136260569095612, -0.06773879379034042, 0.0384698361158371, -0.06539565324783325, -0.12327640503644943, -0.04955113306641579, -0.03788921236991882, 0.013159535825252533, -0.0887410044670105, 0.07490301132202148, 0.01743420958518982, -0.08355604857206345, -0.10982082039117813, -0.034911490976810455, -0.14221419394016266, 0.15075300633907318, 0.03118259832262993, 0.05490957945585251, -0.18879668414592743, 0.09513366967439651, 0.09063911437988281, -0.09794734418392181, 0.012371692806482315, -0.09828171133995056, 0.07523719221353531, 0.02695264294743538, -0.01942623034119606, -0.011637488380074501, 0.08099702000617981, 0.18244598805904388, -0.023470653221011162, 0.04112561419606209, -0.05866257846355438, 0.06802737712860107, 0.04330535605549812, 0.05058376491069794, 0.01016483549028635, -0.010575215332210064, 0.04313905909657478, -0.08271214365959167, 0.006649345625191927, -0.0696047991514206, -0.15257856249809265, -0.010490801185369492, 0.05910295620560646, 0.06431310623884201, 0.01385466754436493, 0.09990697354078293, -0.04820932820439339, -0.03127741441130638, 0.06686968356370926, -0.04610613361001015, 0.00025555313914082944, 0.02514314092695713, 0.022945338860154152, 0.08682192116975784, 0.03549877181649208, 0.008559205569326878, -0.1334124654531479, 0.05331631749868393, -0.07127733528614044, -0.003518803743645549, -0.023419158533215523, -0.01782975345849991, 0.014263879507780075, -0.07272767275571823, -0.006280501373112202, -0.14799964427947998, -0.15082962810993195, 0.032678522169589996, -0.008464082144200802, -0.058054763823747635, -0.05192820355296135, -0.04510946571826935, -0.0026791864074766636, 0.02507198601961136, -0.0721500962972641, -0.012633901089429855, -0.06718849390745163, 0.12627333402633667, -0.025815600529313087, 0.12141858786344528, -0.1263977587223053, 0.06454204022884369, -0.12083960324525833, -0.008531259372830391, -0.07914237678050995, 0.08761459589004517, -0.02001926302909851, 0.06751041114330292, -0.033074554055929184, -0.018282342702150345, -0.08101292699575424, 
0.05817386507987976, -0.028213150799274445, 0.2158772051334381, -0.06074972078204155, -0.11696600168943405, 0.3252049386501312, -0.05978121608495712, -0.13671088218688965, 0.11592056602239609, 0.0025376600679010153, 0.08551787585020065, 0.1094394326210022, 0.1790281981229782, -0.08567206561565399, -0.007753870915621519, 0.07438565045595169, 0.06927944719791412, -0.0604916512966156, 0.0024515960831195116, -0.0059074112214148045, -0.025102457031607628, -0.09662521630525589, 0.03701172024011612, 0.1343100666999817, 0.0607074573636055, -0.03880254924297333, -0.03515775501728058, -0.005387125536799431, -0.008894911967217922, 0.09551075100898743, -0.00016178771329578012, 0.10810365527868271, -0.04347845911979675, -0.03661409020423889, -0.024372370913624763, -0.003201813204213977, -0.011579100973904133, 0.021436933428049088, -0.03842025622725487, 0.10971415042877197, 0.004494994413107634, 0.06822986900806427, -0.13380348682403564, -0.03021402284502983, -0.016651524230837822, 0.1429925262928009, 0.05004606768488884, 0.05227561667561531, 0.06167182698845863, -0.015655608847737312, 0.0019969658460468054, 0.015711378306150436, 0.13579633831977844, -0.013892199844121933, -0.07391507923603058, -0.0713772252202034, 0.053612250834703445, -0.044055212289094925, 0.06532503664493561, -0.031174032017588615, 0.013181757181882858, 0.03382924571633339, 0.1297212839126587, -0.002855091355741024, 0.01390718761831522, 0.008473231457173824, 0.00031223968835547566, -0.06875152885913849, 0.011284063570201397, 0.09325157105922699, -0.03049524500966072, -0.05595379322767258, 0.1854894906282425, -0.14268310368061066, 0.1809924691915512, 0.18461838364601135, -0.2169969528913498, 0.006989262066781521, -0.06757842749357224, -0.04333429038524628, 0.001391394529491663, 0.06645659357309341, -0.04053378477692604, 0.16684623062610626, -0.004742867778986692, 0.19418460130691528, -0.06754010915756226, -0.033819153904914856, -0.014858691021800041, -0.06710674613714218, 0.007762156426906586, 0.07826805114746094, 0.12667177617549896, -0.1518745720386505, 0.20558227598667145, 0.1540810465812683, 0.052801452577114105, 0.25207340717315674, 0.025363245978951454, -0.01826503500342369, 0.08966691792011261, 0.025446632876992226, -0.05035993456840515, -0.044772349298000336, -0.2819068729877472, -0.02762690931558609, 0.07189897447824478, 0.003995971288532019, 0.10206298530101776, -0.0882275328040123, -0.014142680913209915, -0.03909510746598244, -0.021323394030332565, 0.03596283867955208, 0.09497835487127304, 0.05735106021165848, 0.14303937554359436, -0.0013663818826898932, -0.042738452553749084, 0.06491530686616898, 0.03594905510544777, -0.08051818609237671, 0.17025767266750336, -0.10789845138788223, -0.3303767144680023, -0.12666897475719452, -0.18723608553409576, -0.07798880338668823, 0.058363087475299835, 0.11355265229940414, -0.09560500085353851, -0.009067423641681671, 0.008312875404953957, 0.11399012058973312, -0.03743589296936989, -0.010863079689443111, -0.0522383414208889, 0.011008509434759617, -0.07872682064771652, -0.0853753611445427, -0.068702831864357, -0.03372227028012276, -0.07870011776685715, 0.14050467312335968, -0.12022721767425537, 0.03750775381922722, 0.16198845207691193, 0.04623846337199211, 0.05580172315239906, -0.03629470616579056, 0.20418009161949158, -0.09202136099338531, 0.015429901890456676, 0.17943108081817627, -0.06368906050920486, 0.05735187977552414, 0.10585559904575348, -0.008696191012859344, -0.11744841188192368, 0.02588449977338314, -0.041710980236530304, -0.08276889473199844, -0.1760098785161972, 
-0.1465461701154709, -0.12963515520095825, 0.12091411650180817, 0.07658551633358002, 0.05677100270986557, 0.1512013077735901, 0.06194658949971199, -0.06037645414471626, 0.05355751886963844, 0.03626605123281479, 0.09198823571205139, 0.16916067898273468, -0.07611466199159622, 0.1473371833562851, -0.02189563401043415, -0.15301448106765747, 0.09273351728916168, 0.09584911912679672, 0.05191873013973236, 0.05364310368895531, 0.07845684885978699, 0.03606307879090309, 0.046751443296670914, 0.11609923094511032, 0.090122289955616, 0.00010615537030389532, -0.013747138902544975, -0.0402374342083931, -0.030403416603803635, -0.11337801069021225, 0.0520600862801075, 0.04358001425862312, -0.12633073329925537, -0.012312641367316246, -0.13545677065849304, 0.08751768618822098, 0.13425107300281525, 0.07082904130220413, -0.20407675206661224, -0.0332791768014431, 0.08744991570711136, -0.04270715266466141, -0.15019790828227997, 0.07476996630430222, -0.013571611605584621, -0.1596647948026657, 0.034726981073617935, -0.02609577775001526, 0.11290229111909866, -0.06935574114322662, 0.08500628918409348, -0.10983198881149292, -0.09639723598957062, 0.01283550076186657, 0.10171009600162506, -0.3023146986961365, 0.1680195927619934, -0.014083834365010262, -0.06848939508199692, -0.1420067697763443, 0.0029088682495057583, 0.0009256050689145923, 0.04594671353697777, 0.07863319665193558, -0.0024096015840768814, 0.05605536699295044, -0.10230256617069244, -0.03957638889551163, 0.027148356661200523, 0.08965093642473221, -0.05737045034766197, -0.030302388593554497, -0.04060925915837288, -0.00220161909237504, -0.056495003402233124, -0.06687954068183899, 0.03863503411412239, -0.1757507175207138, 0.08982572704553604, 0.04320790246129036, 0.08141466230154037, 0.011262413114309311, 0.0017246654024347663, 0.039298396557569504, 0.24140101671218872, -0.09092174470424652, -0.11748185753822327, -0.09149523824453354, -0.04215710610151291, 0.03203820437192917, -0.04768017306923866, 0.04130777344107628, -0.08861560374498367, 0.0023050580639392138, -0.0888088047504425, -0.19696177542209625, 0.13487368822097778, -0.09851716458797455, -0.0831485390663147, -0.03295236825942993, 0.21626687049865723, -0.03607114031910896, -0.0030084550380706787, 0.01822637766599655, 0.005079708527773619, -0.12757980823516846, -0.0829785093665123, 0.0008442185935564339, 0.018169669434428215, 0.044842492789030075, 0.0533379465341568, -0.06072048470377922, -0.05110444501042366, -0.05154569447040558, -0.009274646639823914, 0.2839198410511017, 0.15673211216926575, -0.021633261814713478, 0.2193676233291626, 0.14408932626247406, -0.049755338579416275, -0.2991931140422821, -0.16014660894870758, -0.10840560495853424, -0.04882722347974777, -0.11073663830757141, -0.18954622745513916, 0.07294496893882751, -0.005246181972324848, -0.021851597353816032, 0.16042251884937286, -0.258932888507843, -0.09119062125682831, 0.1883215308189392, -0.023368462920188904, 0.3800354599952698, -0.13420099020004272, -0.08837100118398666, -0.0714380219578743, -0.10495177656412125, 0.10388769209384918, 0.013683130964636803, 0.1205175444483757, -0.042062122374773026, 0.1583060771226883, 0.04185004532337189, -0.0074843461625278, 0.08516961336135864, -0.0019985903054475784, -0.04917994514107704, -0.10770098865032196, -0.018119478598237038, 0.001729840412735939, 0.02218853496015072, 0.035027362406253815, -0.09693848341703415, 0.01938164047896862, -0.1535816192626953, -0.0479704886674881, -0.09718881547451019, 0.054506637156009674, 0.029848741367459297, -0.06666885316371918, -0.010430785827338696, 
-0.07095720618963242, -0.03837977349758148, 0.04168104752898216, 0.13265399634838104, -0.07359743118286133, 0.19052670896053314, 0.16120533645153046, 0.09365106374025345, -0.16593292355537415, 0.025612706318497658, -0.061363335698843, -0.061250776052474976, 0.09493084251880646, -0.07538038492202759, 0.046931520104408264, 0.09057863056659698, -0.0363500714302063, 0.09744144976139069, 0.08892326056957245, -0.01721867173910141, -0.005885282531380653, 0.10404489189386368, -0.28943848609924316, -0.0012335312785580754, -0.04141814261674881, 0.03749097138643265, 0.0706285759806633, 0.08287683874368668, 0.2002297192811966, -0.00037451853859238327, -0.06811629980802536, -0.006827784702181816, 0.051596224308013916, -0.0546976737678051, 0.06786540895700455, -0.023279303684830666, 0.02716057561337948, -0.15581510961055756, 0.04097403585910797, 0.017894990742206573, -0.09174242615699768, 0.04731019213795662, 0.1674218475818634, -0.10944526642560959, -0.13090835511684418, -0.07370325177907944, 0.08427153527736664, -0.0769939050078392, 0.014951529912650585, -0.02274724841117859, -0.12307044118642807, 0.09705822169780731, 0.14415685832500458, 0.04462552070617676, 0.06991852074861526, -0.0690818727016449, -0.01180460024625063, -0.014567393809556961, -0.007809052709490061, 0.0009489525691606104, 0.019562147557735443, -0.01028464175760746, 0.019347602501511574, -0.023796478286385536, 0.11861518025398254, -0.0945991650223732, -0.08782494813203812, -0.18878060579299927, 0.038008175790309906, -0.13181288540363312, -0.0761663168668747, -0.09996651858091354, -0.053141698241233826, 0.000979511416517198, -0.03447364643216133, -0.03376716002821922, -0.05029051750898361, -0.1077083870768547, 0.026815690100193024, -0.04647703841328621, 0.04285716637969017, -0.121421217918396, 0.03409375995397568, 0.09332330524921417, -0.058104533702135086, 0.16155703365802765, 0.15682050585746765, -0.09318212419748306, 0.07845820486545563, -0.12566114962100983, -0.0683261901140213, 0.1279447078704834, -0.0025643312837928534, 0.036462727934122086, 0.08992704004049301, 0.024318095296621323, 0.06101861596107483, 0.05099788308143616, 0.05838392674922943, 0.11023902148008347, -0.10186591744422913, 0.05606961250305176, -0.020200520753860474, -0.14553619921207428, -0.029792990535497665, -0.0397031344473362, 0.035513050854206085, 0.010734638199210167, 0.06849972158670425, -0.06679591536521912, 0.08919834345579147, -0.05105314403772354, 0.019588211551308632, 0.003242513397708535, -0.17732955515384674, -0.014198181219398975, -0.10004813969135284, 0.024431688711047173, 0.007259441539645195, 0.25050321221351624, 0.029818003997206688, -0.008812793530523777, 0.022513004019856453, 0.08151181042194366, 0.029068822041153908, 0.029033103957772255, 0.17342989146709442, 0.08309419453144073, -0.06218401715159416, -0.13393183052539825, 0.06202925741672516, 0.012779651209712029, 0.05504680052399635, 0.10654009878635406, 0.05981411039829254, -0.048437949270009995, 0.06045788154006004, -0.00927590113133192, 0.027956368401646614, -0.11255290359258652, -0.1596049666404724, -0.051056861877441406, 0.06358379125595093, -0.0397067554295063, 0.12105828523635864, 0.12218006700277328, -0.032227400690317154, 0.007712609600275755, -0.018643252551555634, -0.04922863095998764, -0.14967332780361176, -0.11896781623363495, -0.05350566282868385, -0.12205114960670471, 0.011524338275194168, -0.09071875363588333, 0.03947072848677635, 0.032331354916095734, 0.06074453145265579, -0.05948539078235626, 0.14244994521141052, 0.061188217252492905, -0.11220669001340866, 
0.0653870478272438, -0.0037791274953633547, 0.03899082541465759, -0.044437918812036514, -0.025120874866843224, -0.09214923530817032, 0.011023752391338348, 0.023526176810264587, 0.051312822848558426, -0.06791822612285614, -0.0016826913924887776, -0.14322006702423096, -0.08487700670957565, -0.0394696444272995, 0.04866306483745575, -0.05023662745952606, 0.1642676740884781, 0.009525329805910587, -0.025920147076249123, 0.03980735316872597, 0.18757548928260803, -0.06569670885801315, -0.05740028992295265, -0.061079103499650955, 0.2654319405555725, 0.003808156820014119, 0.08950404822826385, -0.04851897433400154, 0.019039388746023178, -0.10233274847269058, 0.31112176179885864, 0.2889745533466339, -0.11570275574922562, 0.019497746601700783, 0.00267986161634326, 0.05070123076438904, 0.13644829392433167, 0.1442762166261673, 0.10564230382442474, 0.30452725291252136, -0.049554493278265, -0.009600816294550896, 0.008292791433632374, -0.05423891171813011, -0.10872315615415573, 0.10191518068313599, 0.046052053570747375, -0.07097305357456207, -0.051319852471351624, 0.06611884385347366, -0.2430342584848404, 0.08742429316043854, -0.09402911365032196, -0.20991599559783936, -0.061329182237386703, 0.008695591241121292, 0.07227429747581482, 0.01855143904685974, 0.07725254446268082, 0.011252488009631634, -0.09869740903377533, 0.06569153815507889, 0.02433895133435726, -0.2159195989370346, 0.002435178030282259, 0.060930509120225906, -0.13729606568813324, 0.016866179183125496, -0.04461656138300896, 0.024356702342629433, 0.05867326632142067, 0.04714081063866615, 0.0030829021707177162, 0.032254382967948914, -0.002440518466755748, -0.02262018620967865, -0.007517869584262371, 0.07979123294353485, 0.01503628958016634, -0.09828265756368637, 0.07816318422555923, -0.17583686113357544, 0.01952325366437435, -0.007547630462795496, -0.020200328901410103, -0.001953412313014269, 0.035544175654649734, -0.05971439182758331, 0.05464267358183861, 0.0829974040389061, -0.022736750543117523, -0.011044321581721306, -0.06745599210262299, 0.004517103545367718, -0.021792719140648842, -0.09803014993667603, -0.11051607877016068, -0.1688210815191269, -0.13380204141139984, 0.037805862724781036, 0.00585087388753891, -0.23689515888690948, 0.02954542450606823, -0.13089050352573395, 0.0955151915550232, -0.17577627301216125, 0.11024092137813568, 0.03501670062541962, 0.006343763321638107, 0.006624797824770212, -0.03926124423742294, 0.053036514669656754, 0.07803836464881897, -0.1031845360994339, -0.05701852962374687 ]
null
null
transformers
# Bapibot
{"tags": ["conversational"]}
text-generation
NanniKirby/bapismall
[ "transformers", "pytorch", "gpt2", "text-generation", "conversational", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[]
TAGS #transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
# Bapibot
[ "# Bapibot" ]
[ "TAGS\n#transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "# Bapibot" ]
[ 51, 4 ]
[ "passage: TAGS\n#transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n# Bapibot" ]
[ -0.0017511481419205666, 0.011250718496739864, -0.007105078548192978, 0.014087896794080734, 0.12434225529432297, 0.004416945856064558, 0.1292487233877182, 0.1294037401676178, 0.05080842599272728, -0.00788082368671894, 0.15101216733455658, 0.17526975274085999, 0.006775634828954935, 0.17518381774425507, -0.07102128863334656, -0.26338180899620056, 0.07620087265968323, 0.05055075138807297, 0.08102144300937653, 0.13387872278690338, 0.07828658074140549, -0.04454581439495087, 0.0946480855345726, -0.038753777742385864, -0.1196356788277626, 0.022293001413345337, 0.02305344119668007, -0.12182280421257019, 0.1052849143743515, 0.04030470922589302, 0.06339428573846817, 0.011224220506846905, -0.07818161696195602, -0.14059129357337952, 0.03709583729505539, 0.015355461277067661, -0.058539580553770065, 0.0523231141269207, 0.011728175915777683, -0.09777050465345383, 0.1937159150838852, 0.09967375546693802, -0.009738285094499588, 0.07374898344278336, -0.16750361025333405, -0.014507761225104332, 0.0034449161030352116, 0.08643744885921478, 0.053061358630657196, 0.07830766588449478, -0.03413871303200722, 0.10113906115293503, -0.07362164556980133, 0.10694681853055954, 0.1385362446308136, -0.28448089957237244, -0.01919391378760338, 0.048505570739507675, 0.07070586085319519, 0.08386410772800446, -0.05857731029391289, 0.03844190388917923, 0.022003045305609703, 0.026592228561639786, -0.03245708346366882, -0.06952455639839172, -0.07897067815065384, 0.004927807953208685, -0.07912197709083557, -0.06715267896652222, 0.22136260569095612, -0.06773879379034042, 0.0384698361158371, -0.06539565324783325, -0.12327640503644943, -0.04955113306641579, -0.03788921236991882, 0.013159535825252533, -0.0887410044670105, 0.07490301132202148, 0.01743420958518982, -0.08355604857206345, -0.10982082039117813, -0.034911490976810455, -0.14221419394016266, 0.15075300633907318, 0.03118259832262993, 0.05490957945585251, -0.18879668414592743, 0.09513366967439651, 0.09063911437988281, -0.09794734418392181, 0.012371692806482315, -0.09828171133995056, 0.07523719221353531, 0.02695264294743538, -0.01942623034119606, -0.011637488380074501, 0.08099702000617981, 0.18244598805904388, -0.023470653221011162, 0.04112561419606209, -0.05866257846355438, 0.06802737712860107, 0.04330535605549812, 0.05058376491069794, 0.01016483549028635, -0.010575215332210064, 0.04313905909657478, -0.08271214365959167, 0.006649345625191927, -0.0696047991514206, -0.15257856249809265, -0.010490801185369492, 0.05910295620560646, 0.06431310623884201, 0.01385466754436493, 0.09990697354078293, -0.04820932820439339, -0.03127741441130638, 0.06686968356370926, -0.04610613361001015, 0.00025555313914082944, 0.02514314092695713, 0.022945338860154152, 0.08682192116975784, 0.03549877181649208, 0.008559205569326878, -0.1334124654531479, 0.05331631749868393, -0.07127733528614044, -0.003518803743645549, -0.023419158533215523, -0.01782975345849991, 0.014263879507780075, -0.07272767275571823, -0.006280501373112202, -0.14799964427947998, -0.15082962810993195, 0.032678522169589996, -0.008464082144200802, -0.058054763823747635, -0.05192820355296135, -0.04510946571826935, -0.0026791864074766636, 0.02507198601961136, -0.0721500962972641, -0.012633901089429855, -0.06718849390745163, 0.12627333402633667, -0.025815600529313087, 0.12141858786344528, -0.1263977587223053, 0.06454204022884369, -0.12083960324525833, -0.008531259372830391, -0.07914237678050995, 0.08761459589004517, -0.02001926302909851, 0.06751041114330292, -0.033074554055929184, -0.018282342702150345, -0.08101292699575424, 
0.05817386507987976, -0.028213150799274445, 0.2158772051334381, -0.06074972078204155, -0.11696600168943405, 0.3252049386501312, -0.05978121608495712, -0.13671088218688965, 0.11592056602239609, 0.0025376600679010153, 0.08551787585020065, 0.1094394326210022, 0.1790281981229782, -0.08567206561565399, -0.007753870915621519, 0.07438565045595169, 0.06927944719791412, -0.0604916512966156, 0.0024515960831195116, -0.0059074112214148045, -0.025102457031607628, -0.09662521630525589, 0.03701172024011612, 0.1343100666999817, 0.0607074573636055, -0.03880254924297333, -0.03515775501728058, -0.005387125536799431, -0.008894911967217922, 0.09551075100898743, -0.00016178771329578012, 0.10810365527868271, -0.04347845911979675, -0.03661409020423889, -0.024372370913624763, -0.003201813204213977, -0.011579100973904133, 0.021436933428049088, -0.03842025622725487, 0.10971415042877197, 0.004494994413107634, 0.06822986900806427, -0.13380348682403564, -0.03021402284502983, -0.016651524230837822, 0.1429925262928009, 0.05004606768488884, 0.05227561667561531, 0.06167182698845863, -0.015655608847737312, 0.0019969658460468054, 0.015711378306150436, 0.13579633831977844, -0.013892199844121933, -0.07391507923603058, -0.0713772252202034, 0.053612250834703445, -0.044055212289094925, 0.06532503664493561, -0.031174032017588615, 0.013181757181882858, 0.03382924571633339, 0.1297212839126587, -0.002855091355741024, 0.01390718761831522, 0.008473231457173824, 0.00031223968835547566, -0.06875152885913849, 0.011284063570201397, 0.09325157105922699, -0.03049524500966072, -0.05595379322767258, 0.1854894906282425, -0.14268310368061066, 0.1809924691915512, 0.18461838364601135, -0.2169969528913498, 0.006989262066781521, -0.06757842749357224, -0.04333429038524628, 0.001391394529491663, 0.06645659357309341, -0.04053378477692604, 0.16684623062610626, -0.004742867778986692, 0.19418460130691528, -0.06754010915756226, -0.033819153904914856, -0.014858691021800041, -0.06710674613714218, 0.007762156426906586, 0.07826805114746094, 0.12667177617549896, -0.1518745720386505, 0.20558227598667145, 0.1540810465812683, 0.052801452577114105, 0.25207340717315674, 0.025363245978951454, -0.01826503500342369, 0.08966691792011261, 0.025446632876992226, -0.05035993456840515, -0.044772349298000336, -0.2819068729877472, -0.02762690931558609, 0.07189897447824478, 0.003995971288532019, 0.10206298530101776, -0.0882275328040123, -0.014142680913209915, -0.03909510746598244, -0.021323394030332565, 0.03596283867955208, 0.09497835487127304, 0.05735106021165848, 0.14303937554359436, -0.0013663818826898932, -0.042738452553749084, 0.06491530686616898, 0.03594905510544777, -0.08051818609237671, 0.17025767266750336, -0.10789845138788223, -0.3303767144680023, -0.12666897475719452, -0.18723608553409576, -0.07798880338668823, 0.058363087475299835, 0.11355265229940414, -0.09560500085353851, -0.009067423641681671, 0.008312875404953957, 0.11399012058973312, -0.03743589296936989, -0.010863079689443111, -0.0522383414208889, 0.011008509434759617, -0.07872682064771652, -0.0853753611445427, -0.068702831864357, -0.03372227028012276, -0.07870011776685715, 0.14050467312335968, -0.12022721767425537, 0.03750775381922722, 0.16198845207691193, 0.04623846337199211, 0.05580172315239906, -0.03629470616579056, 0.20418009161949158, -0.09202136099338531, 0.015429901890456676, 0.17943108081817627, -0.06368906050920486, 0.05735187977552414, 0.10585559904575348, -0.008696191012859344, -0.11744841188192368, 0.02588449977338314, -0.041710980236530304, -0.08276889473199844, -0.1760098785161972, 
-0.1465461701154709, -0.12963515520095825, 0.12091411650180817, 0.07658551633358002, 0.05677100270986557, 0.1512013077735901, 0.06194658949971199, -0.06037645414471626, 0.05355751886963844, 0.03626605123281479, 0.09198823571205139, 0.16916067898273468, -0.07611466199159622, 0.1473371833562851, -0.02189563401043415, -0.15301448106765747, 0.09273351728916168, 0.09584911912679672, 0.05191873013973236, 0.05364310368895531, 0.07845684885978699, 0.03606307879090309, 0.046751443296670914, 0.11609923094511032, 0.090122289955616, 0.00010615537030389532, -0.013747138902544975, -0.0402374342083931, -0.030403416603803635, -0.11337801069021225, 0.0520600862801075, 0.04358001425862312, -0.12633073329925537, -0.012312641367316246, -0.13545677065849304, 0.08751768618822098, 0.13425107300281525, 0.07082904130220413, -0.20407675206661224, -0.0332791768014431, 0.08744991570711136, -0.04270715266466141, -0.15019790828227997, 0.07476996630430222, -0.013571611605584621, -0.1596647948026657, 0.034726981073617935, -0.02609577775001526, 0.11290229111909866, -0.06935574114322662, 0.08500628918409348, -0.10983198881149292, -0.09639723598957062, 0.01283550076186657, 0.10171009600162506, -0.3023146986961365, 0.1680195927619934, -0.014083834365010262, -0.06848939508199692, -0.1420067697763443, 0.0029088682495057583, 0.0009256050689145923, 0.04594671353697777, 0.07863319665193558, -0.0024096015840768814, 0.05605536699295044, -0.10230256617069244, -0.03957638889551163, 0.027148356661200523, 0.08965093642473221, -0.05737045034766197, -0.030302388593554497, -0.04060925915837288, -0.00220161909237504, -0.056495003402233124, -0.06687954068183899, 0.03863503411412239, -0.1757507175207138, 0.08982572704553604, 0.04320790246129036, 0.08141466230154037, 0.011262413114309311, 0.0017246654024347663, 0.039298396557569504, 0.24140101671218872, -0.09092174470424652, -0.11748185753822327, -0.09149523824453354, -0.04215710610151291, 0.03203820437192917, -0.04768017306923866, 0.04130777344107628, -0.08861560374498367, 0.0023050580639392138, -0.0888088047504425, -0.19696177542209625, 0.13487368822097778, -0.09851716458797455, -0.0831485390663147, -0.03295236825942993, 0.21626687049865723, -0.03607114031910896, -0.0030084550380706787, 0.01822637766599655, 0.005079708527773619, -0.12757980823516846, -0.0829785093665123, 0.0008442185935564339, 0.018169669434428215, 0.044842492789030075, 0.0533379465341568, -0.06072048470377922, -0.05110444501042366, -0.05154569447040558, -0.009274646639823914, 0.2839198410511017, 0.15673211216926575, -0.021633261814713478, 0.2193676233291626, 0.14408932626247406, -0.049755338579416275, -0.2991931140422821, -0.16014660894870758, -0.10840560495853424, -0.04882722347974777, -0.11073663830757141, -0.18954622745513916, 0.07294496893882751, -0.005246181972324848, -0.021851597353816032, 0.16042251884937286, -0.258932888507843, -0.09119062125682831, 0.1883215308189392, -0.023368462920188904, 0.3800354599952698, -0.13420099020004272, -0.08837100118398666, -0.0714380219578743, -0.10495177656412125, 0.10388769209384918, 0.013683130964636803, 0.1205175444483757, -0.042062122374773026, 0.1583060771226883, 0.04185004532337189, -0.0074843461625278, 0.08516961336135864, -0.0019985903054475784, -0.04917994514107704, -0.10770098865032196, -0.018119478598237038, 0.001729840412735939, 0.02218853496015072, 0.035027362406253815, -0.09693848341703415, 0.01938164047896862, -0.1535816192626953, -0.0479704886674881, -0.09718881547451019, 0.054506637156009674, 0.029848741367459297, -0.06666885316371918, -0.010430785827338696, 
-0.07095720618963242, -0.03837977349758148, 0.04168104752898216, 0.13265399634838104, -0.07359743118286133, 0.19052670896053314, 0.16120533645153046, 0.09365106374025345, -0.16593292355537415, 0.025612706318497658, -0.061363335698843, -0.061250776052474976, 0.09493084251880646, -0.07538038492202759, 0.046931520104408264, 0.09057863056659698, -0.0363500714302063, 0.09744144976139069, 0.08892326056957245, -0.01721867173910141, -0.005885282531380653, 0.10404489189386368, -0.28943848609924316, -0.0012335312785580754, -0.04141814261674881, 0.03749097138643265, 0.0706285759806633, 0.08287683874368668, 0.2002297192811966, -0.00037451853859238327, -0.06811629980802536, -0.006827784702181816, 0.051596224308013916, -0.0546976737678051, 0.06786540895700455, -0.023279303684830666, 0.02716057561337948, -0.15581510961055756, 0.04097403585910797, 0.017894990742206573, -0.09174242615699768, 0.04731019213795662, 0.1674218475818634, -0.10944526642560959, -0.13090835511684418, -0.07370325177907944, 0.08427153527736664, -0.0769939050078392, 0.014951529912650585, -0.02274724841117859, -0.12307044118642807, 0.09705822169780731, 0.14415685832500458, 0.04462552070617676, 0.06991852074861526, -0.0690818727016449, -0.01180460024625063, -0.014567393809556961, -0.007809052709490061, 0.0009489525691606104, 0.019562147557735443, -0.01028464175760746, 0.019347602501511574, -0.023796478286385536, 0.11861518025398254, -0.0945991650223732, -0.08782494813203812, -0.18878060579299927, 0.038008175790309906, -0.13181288540363312, -0.0761663168668747, -0.09996651858091354, -0.053141698241233826, 0.000979511416517198, -0.03447364643216133, -0.03376716002821922, -0.05029051750898361, -0.1077083870768547, 0.026815690100193024, -0.04647703841328621, 0.04285716637969017, -0.121421217918396, 0.03409375995397568, 0.09332330524921417, -0.058104533702135086, 0.16155703365802765, 0.15682050585746765, -0.09318212419748306, 0.07845820486545563, -0.12566114962100983, -0.0683261901140213, 0.1279447078704834, -0.0025643312837928534, 0.036462727934122086, 0.08992704004049301, 0.024318095296621323, 0.06101861596107483, 0.05099788308143616, 0.05838392674922943, 0.11023902148008347, -0.10186591744422913, 0.05606961250305176, -0.020200520753860474, -0.14553619921207428, -0.029792990535497665, -0.0397031344473362, 0.035513050854206085, 0.010734638199210167, 0.06849972158670425, -0.06679591536521912, 0.08919834345579147, -0.05105314403772354, 0.019588211551308632, 0.003242513397708535, -0.17732955515384674, -0.014198181219398975, -0.10004813969135284, 0.024431688711047173, 0.007259441539645195, 0.25050321221351624, 0.029818003997206688, -0.008812793530523777, 0.022513004019856453, 0.08151181042194366, 0.029068822041153908, 0.029033103957772255, 0.17342989146709442, 0.08309419453144073, -0.06218401715159416, -0.13393183052539825, 0.06202925741672516, 0.012779651209712029, 0.05504680052399635, 0.10654009878635406, 0.05981411039829254, -0.048437949270009995, 0.06045788154006004, -0.00927590113133192, 0.027956368401646614, -0.11255290359258652, -0.1596049666404724, -0.051056861877441406, 0.06358379125595093, -0.0397067554295063, 0.12105828523635864, 0.12218006700277328, -0.032227400690317154, 0.007712609600275755, -0.018643252551555634, -0.04922863095998764, -0.14967332780361176, -0.11896781623363495, -0.05350566282868385, -0.12205114960670471, 0.011524338275194168, -0.09071875363588333, 0.03947072848677635, 0.032331354916095734, 0.06074453145265579, -0.05948539078235626, 0.14244994521141052, 0.061188217252492905, -0.11220669001340866, 
0.0653870478272438, -0.0037791274953633547, 0.03899082541465759, -0.044437918812036514, -0.025120874866843224, -0.09214923530817032, 0.011023752391338348, 0.023526176810264587, 0.051312822848558426, -0.06791822612285614, -0.0016826913924887776, -0.14322006702423096, -0.08487700670957565, -0.0394696444272995, 0.04866306483745575, -0.05023662745952606, 0.1642676740884781, 0.009525329805910587, -0.025920147076249123, 0.03980735316872597, 0.18757548928260803, -0.06569670885801315, -0.05740028992295265, -0.061079103499650955, 0.2654319405555725, 0.003808156820014119, 0.08950404822826385, -0.04851897433400154, 0.019039388746023178, -0.10233274847269058, 0.31112176179885864, 0.2889745533466339, -0.11570275574922562, 0.019497746601700783, 0.00267986161634326, 0.05070123076438904, 0.13644829392433167, 0.1442762166261673, 0.10564230382442474, 0.30452725291252136, -0.049554493278265, -0.009600816294550896, 0.008292791433632374, -0.05423891171813011, -0.10872315615415573, 0.10191518068313599, 0.046052053570747375, -0.07097305357456207, -0.051319852471351624, 0.06611884385347366, -0.2430342584848404, 0.08742429316043854, -0.09402911365032196, -0.20991599559783936, -0.061329182237386703, 0.008695591241121292, 0.07227429747581482, 0.01855143904685974, 0.07725254446268082, 0.011252488009631634, -0.09869740903377533, 0.06569153815507889, 0.02433895133435726, -0.2159195989370346, 0.002435178030282259, 0.060930509120225906, -0.13729606568813324, 0.016866179183125496, -0.04461656138300896, 0.024356702342629433, 0.05867326632142067, 0.04714081063866615, 0.0030829021707177162, 0.032254382967948914, -0.002440518466755748, -0.02262018620967865, -0.007517869584262371, 0.07979123294353485, 0.01503628958016634, -0.09828265756368637, 0.07816318422555923, -0.17583686113357544, 0.01952325366437435, -0.007547630462795496, -0.020200328901410103, -0.001953412313014269, 0.035544175654649734, -0.05971439182758331, 0.05464267358183861, 0.0829974040389061, -0.022736750543117523, -0.011044321581721306, -0.06745599210262299, 0.004517103545367718, -0.021792719140648842, -0.09803014993667603, -0.11051607877016068, -0.1688210815191269, -0.13380204141139984, 0.037805862724781036, 0.00585087388753891, -0.23689515888690948, 0.02954542450606823, -0.13089050352573395, 0.0955151915550232, -0.17577627301216125, 0.11024092137813568, 0.03501670062541962, 0.006343763321638107, 0.006624797824770212, -0.03926124423742294, 0.053036514669656754, 0.07803836464881897, -0.1031845360994339, -0.05701852962374687 ]
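The two NanniKirby/Bapibot rows above register DialoGPT-style conversational GPT-2 checkpoints (tags: conversational, text-generation) but carry no usage text beyond the "# Bapibot" heading. As a hedged illustration only, the sketch below applies the standard DialoGPT multi-turn chat loop to the first checkpoint id; the checkpoint name is taken from the row's id field, while the turn count and generation settings are assumptions rather than documented behaviour of these models.

```python
# Hypothetical chat loop for a DialoGPT-style checkpoint such as
# NanniKirby/DialoGPT-medium-bapi (id taken from the dataset row above).
# Assumption: the checkpoint follows the standard DialoGPT usage pattern.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

ckpt = "NanniKirby/DialoGPT-medium-bapi"
tokenizer = AutoTokenizer.from_pretrained(ckpt)
model = AutoModelForCausalLM.from_pretrained(ckpt)

chat_history_ids = None
for step in range(3):  # three demo turns; purely illustrative
    user_input = input(">> User: ")
    new_ids = tokenizer.encode(user_input + tokenizer.eos_token, return_tensors="pt")
    # Append the new user turn to the running conversation history.
    bot_input_ids = new_ids if chat_history_ids is None else torch.cat([chat_history_ids, new_ids], dim=-1)
    chat_history_ids = model.generate(
        bot_input_ids,
        max_length=1000,
        pad_token_id=tokenizer.eos_token_id,
    )
    # Decode only the newly generated tokens (the bot's reply).
    reply = tokenizer.decode(chat_history_ids[:, bot_input_ids.shape[-1]:][0], skip_special_tokens=True)
    print("Bot:", reply)
```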
null
null
transformers
# Spanish RoBERTa2RoBERTa (roberta-base-bne) fine-tuned on MLSUM ES for summarization

## Model

[BSC-TeMU/roberta-base-bne](https://huggingface.co/BSC-TeMU/roberta-base-bne) (RoBERTa Checkpoint)

## Dataset

**MLSUM** is the first large-scale MultiLingual SUMmarization dataset. Obtained from online newspapers, it contains 1.5M+ article/summary pairs in five different languages -- namely, French, German, **Spanish**, Russian, and Turkish. Together with English newspapers from the popular CNN/Daily mail dataset, the collected data form a large-scale multilingual dataset which can enable new research directions for the text summarization community. We report cross-lingual comparative analyses based on state-of-the-art systems. These highlight existing biases which motivate the use of a multilingual dataset.

[MLSUM es](https://huggingface.co/datasets/viewer/?dataset=mlsum)

## Results

|Set|Metric|Value|
|----|------|------|
| Test | Rouge2 - mid - precision | 11.42 |
| Test | Rouge2 - mid - recall | 10.58 |
| Test | Rouge2 - mid - fmeasure | 10.69 |
| Test | Rouge1 - fmeasure | 28.83 |
| Test | RougeL - fmeasure | 23.15 |

Raw metrics using HF/metrics `rouge`:

```python
rouge = datasets.load_metric("rouge")
rouge.compute(predictions=results["pred_summary"], references=results["summary"])

{'rouge1': AggregateScore(low=Score(precision=0.30393366820245, recall=0.27905239591639935, fmeasure=0.283148902808752), mid=Score(precision=0.3068521142101569, recall=0.2817252494122592, fmeasure=0.28560373425206464), high=Score(precision=0.30972608774202665, recall=0.28458152325781716, fmeasure=0.2883786700591887)),
 'rougeL': AggregateScore(low=Score(precision=0.24184668819794716, recall=0.22401171380621518, fmeasure=0.22624104698839514), mid=Score(precision=0.24470388406868163, recall=0.22665793214539162, fmeasure=0.2289118878817394), high=Score(precision=0.2476594458951327, recall=0.22932683203591905, fmeasure=0.23153001570662513))}

rouge.compute(predictions=results["pred_summary"], references=results["summary"], rouge_types=["rouge2"])["rouge2"].mid

Score(precision=0.11423200347113865, recall=0.10588038944902506, fmeasure=0.1069921217219595)
```

## Usage

```python
import torch
from transformers import RobertaTokenizerFast, EncoderDecoderModel

device = 'cuda' if torch.cuda.is_available() else 'cpu'
ckpt = 'Narrativa/bsc_roberta2roberta_shared-spanish-finetuned-mlsum-summarization'
tokenizer = RobertaTokenizerFast.from_pretrained(ckpt)
model = EncoderDecoderModel.from_pretrained(ckpt).to(device)

def generate_summary(text):
    inputs = tokenizer([text], padding="max_length", truncation=True, max_length=512, return_tensors="pt")
    input_ids = inputs.input_ids.to(device)
    attention_mask = inputs.attention_mask.to(device)
    output = model.generate(input_ids, attention_mask=attention_mask)
    return tokenizer.decode(output[0], skip_special_tokens=True)

text = "Your text here..."
generate_summary(text)
```

Created by: [Narrativa](https://www.narrativa.com/)

About Narrativa: Natural Language Generation (NLG) | Gabriele, our machine learning-based platform, builds and deploys natural language solutions. #NLG #AI
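The Results table above reports ROUGE scores on the MLSUM-es test set, but the card does not show the loop that produces `results["pred_summary"]`. The following is a minimal sketch of how such an evaluation could be run, assuming the `mlsum` dataset's `es` configuration exposes `text` and `summary` columns and that the `generate_summary` helper from the Usage section is already defined; the batch size and mapping details are illustrative, not taken from the original card.

```python
# Hypothetical evaluation sketch: generate summaries for the MLSUM-es test split
# and score them with the same HF `rouge` metric shown in the card above.
import datasets

# Assumption: the "es" config provides "text" (article) and "summary" columns.
test_ds = datasets.load_dataset("mlsum", "es", split="test")

def add_prediction(batch):
    # Reuses the generate_summary() helper defined in the Usage section.
    batch["pred_summary"] = [generate_summary(t) for t in batch["text"]]
    return batch

results = test_ds.map(add_prediction, batched=True, batch_size=8)

rouge = datasets.load_metric("rouge")
scores = rouge.compute(predictions=results["pred_summary"],
                       references=results["summary"],
                       rouge_types=["rouge2"])
print(scores["rouge2"].mid)  # precision / recall / fmeasure, as in the Results table
```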
{"language": "es", "tags": ["summarization", "news"], "datasets": ["mlsum"], "widget": [{"text": "Al filo de las 22.00 horas del jueves, la Asamblea de Madrid vive un momento sorprendente: Vox decide no apoyar una propuesta del PP en favor del blindaje fiscal de la Comunidad. Se ha roto la unidad de los tres partidos de derechas. Es un hecho excepcional. Desde que arranc\u00f3 la legislatura, PP, Cs y Vox han votado en bloque casi el 75% de las veces en el pleno de la C\u00e1mara. Juntos decidieron la composici\u00f3n de la Mesa de la Asamblea. Juntos invistieron presidenta a Isabel D\u00edaz Ayuso. Y juntos han votado la mayor\u00eda de proposiciones no de ley, incluida la que ha marcado el esprint final de la campa\u00f1a para las elecciones generales: acaban de instar al Gobierno de Espa\u00f1a a \"la ilegalizaci\u00f3n inmediata\" de los partidos separatistas \"que atenten contra la unidad de la Naci\u00f3n\". Los cr\u00edticos de Cs no comparten el apoyo al texto de Vox contra el secesionisimo Ese balance retrata una necesidad antes que una complicidad, seg\u00fan fuentes del PP con predicamento en la direcci\u00f3n regional y nacional. Tras casi 15 a\u00f1os gobernando con mayor\u00eda absoluta, la formaci\u00f3n conservadora vivi\u00f3 como una tortura la pasada legislatura, en la que dependi\u00f3 de Cs para sacar adelante sus iniciativas. El problema se agudiz\u00f3 tras las elecciones auton\u00f3micas de mayo. El PP ha tenido que formar con Cs el primer gobierno de coalici\u00f3n de la historia de la regi\u00f3n, y ni siquiera con eso le basta para ganar las votaciones de la C\u00e1mara. Los dos socios gubernamentales necesitan a Vox, la menos predecible de las tres formaciones. \"Tenemos que trabajar juntos defendiendo la unidad del pa\u00eds, por eso no quisimos dejar a Vox solo\", dijo ayer D\u00edaz Ayuso para justificar el apoyo de PP y Cs a la proposici\u00f3n de la extrema derecha sobre Catalu\u00f1a. \"Despu\u00e9s nosotros llev\u00e1bamos otra proposici\u00f3n para defender el blindaje fiscal de Madrid, y ah\u00ed Vox nos dej\u00f3 atr\u00e1s. No permiti\u00f3 que esto saliera. Es un grave error por su parte\", prosigui\u00f3, recalcando el enfado del PP. \"Demuestra que est\u00e1 m\u00e1s en cuestiones electoralistas\", subray\u00f3. \"Los que pensamos, con nuestras inmensas diferencias, que tenemos cosas en com\u00fan que nos unen como partidos que queremos Comunidades libres, con bajos impuestos, en las que se viva con seguridad y en paz, tenemos que estar unidos\", argument\u00f3. \"Y por lo menos nosotros de nuestra l\u00ednea no nos separamos\". Al contrario de lo que est\u00e1 ocurriendo el Ayuntamiento de Madrid, donde el PP y Cs ya han defendido posiciones de voto distintas, pese a compartir el Gobierno, en la Asamblea los partidos de D\u00edaz Ayuso e Ignacio Aguado est\u00e1n actuando con la m\u00e1xima lealtad en las votaciones del pleno. Otra cosa son las comisiones. Y el caso Avalmadrid. Es en ese terreno donde Cs y Vox est\u00e1n buscando el margen de maniobra necesario para separarse del PP en plena campa\u00f1a electoral, abandonando a su suerte a su socio para distinguirse ante los electores. 
\u2014\"Usted me ha dejado tirada\", le espet\u00f3 la presidenta de la Comunidad de Madrid a Roc\u00edo Monasterio tras saber que Vox permitir\u00eda que la izquierda tuviera mayor\u00eda en la comisi\u00f3n parlamentaria que investigar\u00e1 los avales concedidos por la empresa semip\u00fablica entre 2007 y 2018, lo que podr\u00eda incluir el de 400.000 euros aprobado en 2011, y nunca devuelto al completo, para una empresa participada por el padre de Isabel D\u00edaz Ayuso. \"Monasterio no es de fiar. Dice una cosa y hace la contraria\", dice una fuente popular sobre las negociaciones mantenidas para repartirse los puestos de las diferentes comisiones, que Vox no cumpli\u00f3 tras buscar un segundo pacto con otras formaciones (que no lleg\u00f3 a buen puerto). Ilegalizaci\u00f3n de Vox Los tres partidos de derechas tambi\u00e9n se han enfrentado por la ubicaci\u00f3n de Vox en el pleno. Las largas negociaciones para la investidura de D\u00edaz Ayuso dejaron heridas abiertas. Y los diputados de Cs no desaprovechan la oportunidad de lanzar dardos contra los de Vox, pero luego coinciden con ellos en la mayor\u00eda de votaciones. Ocurri\u00f3, por ejemplo, el jueves, cuando se debat\u00eda la pol\u00e9mica proposici\u00f3n para instar al Gobierno nacional a ilegalizar a los partidos separatistas que atenten contra la unidad de Espa\u00f1a. \u2014\"Mostrar nuestra sorpresa ante la presentaci\u00f3n por parte de Vox de esta propuesta\", lanz\u00f3 Araceli G\u00f3mez, diputada de la formaci\u00f3n de Aguado. \"Sorprende que planteen ustedes este asunto cuando est\u00e1 tambi\u00e9n sobre la mesa el debate de su propia ilegalizaci\u00f3n por atentar contra el ordenamiento jur\u00eddico o contra valores constitucionales como la igualdad o la no discriminaci\u00f3n\". Luego de esa descalificaci\u00f3n, y ante la incredulidad de los diputados de los partidos de izquierdas, Cs uni\u00f3 sus votos a los de Vox y a los del PP. La decisi\u00f3n ha provocado pol\u00e9mica interna, como demuestra que Albert Rivera no la apoyara ayer expl\u00edcitamente. Tampoco ha sido bien acogida por el sector cr\u00edtico de la formaci\u00f3n. Pero ha demostrado una cosa: en Madrid hay tres partidos que casi siempre votan como uno."}]}
summarization
Narrativa/bsc_roberta2roberta_shared-spanish-finetuned-mlsum-summarization
[ "transformers", "pytorch", "encoder-decoder", "text2text-generation", "summarization", "news", "es", "dataset:mlsum", "autotrain_compatible", "endpoints_compatible", "has_space", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[ "es" ]
TAGS #transformers #pytorch #encoder-decoder #text2text-generation #summarization #news #es #dataset-mlsum #autotrain_compatible #endpoints_compatible #has_space #region-us
Spanish RoBERTa2RoBERTa (roberta-base-bne) fine-tuned on MLSUM ES for summarization =================================================================================== Model ----- BSC-TeMU/roberta-base-bne (RoBERTa Checkpoint) Dataset ------- MLSUM is the first large-scale MultiLingual SUMmarization dataset. Obtained from online newspapers, it contains 1.5M+ article/summary pairs in five different languages -- namely, French, German, Spanish, Russian, Turkish. Together with English newspapers from the popular CNN/Daily mail dataset, the collected data form a large-scale multilingual dataset which can enable new research directions for the text summarization community. We report cross-lingual comparative analyses based on state-of-the-art systems. These highlight existing biases which motivate the use of a multi-lingual dataset. MLSUM es Results ------- Set: Test, Metric: Rouge2 - mid - precision, Value: 11.42 Set: Test, Metric: Rouge2 - mid - recall, Value: 10.58 Set: Test, Metric: Rouge2 - mid - fmeasure, Value: 10.69 Set: Test, Metric: Rouge1 - fmeasure, Value: 28.83 Set: Test, Metric: RougeL - fmeasure, Value: 23.15 Raw metrics using HF/metrics 'rouge': Usage ----- Created by: Narrativa About Narrativa: Natural Language Generation (NLG) | Gabriele, our machine learning-based platform, builds and deploys natural language solutions. #NLG #AI
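The Usage section of the card above lost its code in this processed field. The sketch below shows one plausible way to load this encoder-decoder checkpoint for summarization; it assumes the standard Hugging Face transformers API (AutoTokenizer / EncoderDecoderModel, matching the "encoder-decoder" tag), and the generation settings are illustrative defaults, not the authors' values.

```python
# Minimal usage sketch, assuming the standard transformers API.
# Generation parameters are illustrative, not the authors' settings.
from transformers import AutoTokenizer, EncoderDecoderModel

ckpt = "Narrativa/bsc_roberta2roberta_shared-spanish-finetuned-mlsum-summarization"
tokenizer = AutoTokenizer.from_pretrained(ckpt)
model = EncoderDecoderModel.from_pretrained(ckpt)

def summarize(article: str, max_length: int = 64) -> str:
    # Truncate long articles to the encoder's maximum input length.
    inputs = tokenizer([article], truncation=True, max_length=512, return_tensors="pt")
    # Beam search usually yields more fluent abstractive summaries than greedy decoding.
    summary_ids = model.generate(
        inputs.input_ids,
        attention_mask=inputs.attention_mask,
        max_length=max_length,
        num_beams=4,
    )
    return tokenizer.decode(summary_ids[0], skip_special_tokens=True)

# Example with the (truncated) widget text from the metadata above.
print(summarize("Al filo de las 22.00 horas del jueves, la Asamblea de Madrid vive un momento sorprendente: ..."))
```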
[]
[ "TAGS\n#transformers #pytorch #encoder-decoder #text2text-generation #summarization #news #es #dataset-mlsum #autotrain_compatible #endpoints_compatible #has_space #region-us \n" ]
[ 60 ]
[ "passage: TAGS\n#transformers #pytorch #encoder-decoder #text2text-generation #summarization #news #es #dataset-mlsum #autotrain_compatible #endpoints_compatible #has_space #region-us \n" ]
[ -0.02062350884079933, 0.06749396026134491, -0.004990261979401112, -0.022293534129858017, 0.16329210996627808, 0.021201778203248978, 0.04116610437631607, 0.13796637952327728, -0.06073347106575966, 0.009822838939726353, 0.12187223136425018, 0.16030822694301605, -0.027028080075979233, 0.15601977705955505, -0.09924864768981934, -0.23675349354743958, 0.0909116268157959, 0.05455796420574188, 0.032586514949798584, 0.10034254938364029, 0.09887459874153137, -0.09324098378419876, 0.07725448161363602, -0.0577564537525177, -0.1114296019077301, 0.061672892421483994, -0.01009431853890419, -0.10388921201229095, 0.07148031890392303, 0.023000843822956085, 0.05822219327092171, 0.03663986176252365, -0.05737869441509247, -0.1320265382528305, 0.04079848900437355, 0.026522796601057053, -0.06143345683813095, 0.048813868314027786, 0.046654798090457916, -0.06754361093044281, 0.10782229900360107, -0.1125657856464386, -0.04647894203662872, 0.014894222840666771, -0.13623714447021484, -0.07268206775188446, -0.011357893235981464, -0.06363972276449203, 0.05390214920043945, 0.11128762364387512, -0.0437852181494236, 0.11652152985334396, -0.10699187964200974, 0.11338485777378082, 0.09628458321094513, -0.2571932375431061, 0.006216590758413076, 0.12243400514125824, 0.0869344100356102, 0.040347836911678314, -0.023707838729023933, 0.07489431649446487, 0.05121522769331932, 0.026985900476574898, 0.008979735895991325, -0.02400599978864193, -0.13287264108657837, 0.04558130353689194, -0.0806879848241806, -0.08915682882070541, 0.25518471002578735, -0.04628694802522659, 0.11088138818740845, -0.08513970673084259, -0.09475021064281464, -0.05946723371744156, -0.02425527758896351, 0.09127558767795563, -0.045294228941202164, 0.08037718385457993, -0.0014699199236929417, -0.01213367935270071, -0.13546091318130493, 0.06063629686832428, -0.21826517581939697, 0.12339833378791809, -0.013220211490988731, 0.06201945245265961, -0.1807039976119995, 0.07111742347478867, 0.10042359679937363, -0.15437932312488556, 0.06802187860012054, -0.06176931411027908, 0.037612948566675186, 0.02149164490401745, -0.11075641959905624, -0.12814736366271973, 0.08031100034713745, 0.0012101419270038605, -0.12130618095397949, 0.009212103672325611, 0.012866770848631859, 0.10995802283287048, 0.09865880757570267, 0.030380483716726303, -0.08670446276664734, -0.06675166636705399, -0.012538650073111057, -0.07012380659580231, 0.0372970886528492, -0.06928382813930511, -0.10576577484607697, 0.00920836254954338, -0.0057390108704566956, 0.0574861615896225, 0.0933041200041771, 0.10451824963092804, -0.05834481865167618, 0.0199299156665802, 0.001963132992386818, -0.08983570337295532, 0.00937785767018795, -0.007409390062093735, 0.020237687975168228, 0.11692459881305695, -0.011690046638250351, 0.02099495381116867, -0.09425928443670273, 0.084893137216568, -0.09262730181217194, -0.002388940192759037, -0.05101344734430313, -0.08152774721384048, 0.04689186066389084, -0.13088348507881165, 0.006351994350552559, -0.12042681127786636, -0.12245576083660126, -0.020884159952402115, -0.005989029537886381, -0.029010821133852005, -0.04276370257139206, -0.03222163766622543, -0.05986907333135605, 0.10095664858818054, -0.07026174664497375, 0.01543695293366909, -0.11390959471464157, 0.09290292859077454, -0.07604975253343582, 0.10818766057491302, -0.16001197695732117, 0.07468774914741516, -0.12690742313861847, -0.017232857644557953, -0.016180645674467087, 0.12493211030960083, -0.015249188058078289, 0.11796602606773376, 0.006263690069317818, -0.02979886718094349, -0.11997488141059875, 
0.08252662420272827, -0.01598464325070381, 0.17629702389240265, -0.19211861491203308, -0.06430958956480026, 0.16602864861488342, -0.06717052310705185, -0.05924982950091362, 0.06943601369857788, -0.0070848241448402405, 0.010265602730214596, 0.058367885649204254, 0.19900324940681458, -0.024846311658620834, 0.026485491544008255, -0.05066533386707306, 0.13966599106788635, -0.06496768444776535, -0.05339103192090988, 0.03262794390320778, 0.026505595073103905, -0.034892503172159195, 0.05274202302098274, 0.11225324869155884, 0.0651891827583313, -0.06556223332881927, -0.0531221404671669, -0.03285817801952362, 0.019997593015432358, 0.0892510786652565, 0.01899264194071293, 0.08023948222398758, -0.07756341993808746, -0.03981959447264671, 0.024991653859615326, 0.007030392065644264, 0.0039429436437785625, 0.04407532513141632, -0.03738376498222351, 0.12200570106506348, -0.06427138298749924, 0.0544242337346077, -0.20509690046310425, -0.04723560810089111, -0.06743412464857101, 0.10536926984786987, -0.04741501063108444, 0.127165287733078, 0.0006620599888265133, -0.08738534152507782, -0.018074024468660355, -0.006045693065971136, 0.12453398108482361, 0.028737872838974, -0.06351568549871445, -0.07667355239391327, 0.04834563285112381, -0.09049415588378906, 0.013079166412353516, -0.05005192384123802, 0.021799206733703613, 0.07937674224376678, 0.1563122570514679, -0.008920237421989441, 0.033942073583602905, 0.037529200315475464, 0.061627358198165894, -0.06597117334604263, 0.01841428503394127, 0.08711130172014236, 0.0028878580778837204, -0.0599449947476387, 0.2332114577293396, -0.15617609024047852, 0.22099387645721436, 0.2152053713798523, -0.266069233417511, 0.042930759489536285, 0.04490368068218231, -0.03079826757311821, 0.03542739152908325, -0.04532510042190552, -0.060454871505498886, 0.020796999335289, -0.039260461926460266, 0.15597085654735565, -0.03967614844441414, -0.03553002327680588, -0.01120118610560894, -0.021269693970680237, -0.07303367555141449, 0.062681183218956, 0.042085133492946625, -0.15876667201519012, 0.1916213184595108, 0.2650844156742096, 0.05270477384328842, 0.24901039898395538, -0.021486572921276093, -0.012023838236927986, 0.06785774230957031, -0.07843545079231262, -0.11089910566806793, -0.04990091919898987, -0.1950559914112091, 0.006318383850157261, 0.08873821794986725, 0.03839690610766411, 0.11271191388368607, -0.07142486423254013, -0.038008976727724075, -0.005861950106918812, 0.017703715711832047, -0.0795949250459671, 0.11495745182037354, 0.08701591193675995, 0.1171896681189537, -0.008669920265674591, 0.02878386527299881, 0.08433959633111954, -0.014638332650065422, -0.049754466861486435, 0.16645008325576782, -0.145323246717453, -0.3774828016757965, -0.14360588788986206, -0.0677764043211937, -0.018976449966430664, 0.018092438578605652, 0.1249643936753273, -0.08353186398744583, -0.03346732631325722, -0.07394737005233765, 0.019326113164424896, -0.022247470915317535, 0.020837290212512016, 0.03807377070188522, 0.0547078475356102, -0.02685938961803913, -0.1106194257736206, -0.028209347277879715, -0.015167484991252422, -0.0012078029103577137, 0.12352176755666733, -0.056733138859272, 0.09502232074737549, 0.13677628338336945, 0.007668986450880766, 0.05107806250452995, -0.027683140709996223, 0.1836276650428772, -0.09054699540138245, -0.03460170328617096, 0.18299008905887604, -0.023452920839190483, 0.05550982058048248, 0.16056133806705475, -0.0005943840369582176, -0.055351536720991135, 0.03219189494848251, -0.0007686661556363106, -0.08229415863752365, -0.17645445466041565, -0.18009936809539795, 
-0.1286998987197876, 0.04647090286016464, -0.007730988319963217, 0.07047823071479797, -0.03259183466434479, 0.02800844982266426, -0.035759180784225464, -0.04813050478696823, -0.05103612691164017, 0.02265872433781624, 0.21599078178405762, -0.05768715590238571, 0.13997066020965576, -0.07501278817653656, -0.10790292173624039, 0.1031361073255539, 0.01762252300977707, 0.038381949067115784, 0.002874972764402628, -0.009789986535906792, 0.03972604125738144, 0.010465212166309357, 0.11412015557289124, 0.11706273257732391, 0.015578266233205795, -0.010601313784718513, -0.03306072950363159, -0.030209418386220932, -0.03234840929508209, 0.018139807507395744, 0.06828233599662781, -0.13506783545017242, -0.048269838094711304, -0.05828667804598808, 0.12431036680936813, 0.09526233375072479, 0.12297508120536804, -0.23038801550865173, 0.012573447078466415, 0.09359331429004669, -0.014695651829242706, -0.10589326918125153, 0.062124744057655334, 0.13203445076942444, -0.04014866054058075, 0.09636257588863373, 0.035587385296821594, 0.11630764603614807, -0.05133797973394394, 0.09868597984313965, -0.04242473095655441, -0.14387351274490356, 0.02275613322854042, 0.11096156388521194, -0.2090534120798111, 0.20127682387828827, -0.009257237426936626, -0.07145745307207108, -0.0722958892583847, -0.022532163187861443, 0.01381121575832367, 0.11982265114784241, 0.07867570221424103, 0.029956795275211334, -0.10122296214103699, -0.11367161571979523, -0.06147633492946625, 0.013814444653689861, 0.08770614117383957, 0.014092298224568367, 0.01771477609872818, -0.019281236454844475, 0.0020204689353704453, -0.010262339375913143, 0.0023011150769889355, 0.00805247388780117, -0.20155012607574463, 0.08559036999940872, 0.12459345161914825, 0.0654841810464859, -0.00706426240503788, -0.06142445281147957, -0.08942373842000961, 0.14520150423049927, 0.008688535541296005, -0.08340665698051453, -0.13093893229961395, 0.007059882394969463, 0.06323514878749847, -0.025952843949198723, 0.021863557398319244, -0.022663168609142303, 0.07128533720970154, -0.0502818264067173, -0.1877209097146988, 0.10828467458486557, -0.08619582653045654, -0.044582776725292206, -0.039105791598558426, 0.11181313544511795, -0.0959053784608841, 0.01479516550898552, 0.031920306384563446, 0.06635732203722, -0.10636648535728455, -0.07891087234020233, -0.0029926386196166277, 0.027905425056815147, 0.09323929250240326, 0.14207547903060913, -0.04408052936196327, -0.13873013854026794, 0.06009417027235031, -0.03045092523097992, 0.29202041029930115, 0.12552928924560547, -0.09556873142719269, 0.12709301710128784, 0.08303278684616089, -0.06180606037378311, -0.405356764793396, -0.10083110630512238, -0.11380107700824738, 0.0011445162817835808, 0.020693624392151833, -0.09325266629457474, 0.08295749127864838, -0.006396034266799688, -0.01879974640905857, 0.02522565796971321, -0.28145673871040344, -0.04632295295596123, 0.0819048136472702, -0.10080811381340027, 0.33799290657043457, -0.09841646254062653, -0.06748628616333008, -0.056045033037662506, -0.13783477246761322, 0.13530667126178741, -0.12574683129787445, 0.10265061259269714, -0.015428246930241585, 0.03687842935323715, 0.04717669636011124, -0.052589140832424164, 0.11445924639701843, 0.011016299948096275, -0.003008779836818576, -0.10847464203834534, -0.04437294602394104, 0.10826623439788818, -0.023998957127332687, 0.04718887433409691, -0.1275467574596405, 0.049819473177194595, -0.2191942036151886, 0.0030488241463899612, -0.09027886390686035, 0.05123313516378403, 0.026657752692699432, -0.001988207921385765, -0.022193504497408867, 
-0.03192169591784477, 0.06284718960523605, 0.00532767828553915, 0.26325854659080505, -0.06484551727771759, 0.13438886404037476, 0.15051358938217163, 0.08329471200704575, -0.20992928743362427, 0.032979629933834076, -0.017533838748931885, -0.04612239450216293, 0.06125131994485855, -0.14773862063884735, 0.05874091014266014, 0.09887538105249405, -0.05089832469820976, 0.03766708821058273, 0.11458179354667664, 0.04014384001493454, 0.043246060609817505, 0.1913607120513916, -0.22942771017551422, -0.011620806530117989, -0.05274838209152222, -0.0025222264230251312, 0.0236833356320858, 0.02176549658179283, 0.18053142726421356, 0.06156858801841736, -0.0032348004169762135, 0.0053929234854876995, 0.04225802794098854, -0.04493581876158714, 0.08180893957614899, -0.017929807305336, 0.04365206137299538, -0.15429705381393433, 0.0904805064201355, 0.0504913330078125, -0.19715917110443115, 0.017834527418017387, 0.14782926440238953, -0.1320439577102661, -0.1544504463672638, -0.07423841208219528, 0.08569707721471786, -0.05651300400495529, -0.04985235631465912, -0.061696622520685196, -0.13171586394309998, 0.09222216159105301, 0.13408982753753662, 0.0790066123008728, 0.09096844494342804, -0.03926846385002136, -0.0842273086309433, -0.0019151573069393635, 0.029683850705623627, 0.07001407444477081, 0.01042888406664133, -0.07786215841770172, 0.038954958319664, -0.046139370650053024, 0.16674381494522095, -0.10859490931034088, -0.04075069725513458, -0.10270407050848007, 0.025173934176564217, -0.10798872262239456, -0.0515098012983799, -0.0945911556482315, -0.07138695567846298, -0.0262984037399292, -0.035729873925447464, -0.06830136477947235, -0.04838258773088455, -0.09321906417608261, -0.012759487144649029, -0.0577537938952446, 0.07015581429004669, -0.07953298091888428, -0.01302928477525711, 0.042210035026073456, -0.031647372990846634, 0.12253217399120331, 0.06311191618442535, -0.09455797076225281, 0.054419539868831635, -0.07415197789669037, -0.16284166276454926, 0.14574933052062988, 0.015577117912471294, 0.03326111286878586, 0.12080862373113632, 0.018262185156345367, 0.0899227112531662, 0.015954380854964256, 0.05694098025560379, 0.031236156821250916, -0.11948081105947495, 0.06076252833008766, -0.07575413584709167, -0.0762113556265831, -0.03055560775101185, -0.04580047354102135, 0.10970067977905273, 0.028945691883563995, 0.1751648187637329, -0.06379909068346024, 0.04819115996360779, -0.05419548228383064, 0.028753023594617844, -0.02534998580813408, -0.1996580958366394, -0.03371434658765793, -0.03975096344947815, 0.06295237690210342, -0.032893724739551544, 0.2924470901489258, 0.09635927528142929, -0.062059253454208374, 0.052884433418512344, 0.09443685412406921, -0.0031643547117710114, 0.01501914020627737, 0.20204761624336243, 0.08674094080924988, -0.05230630934238434, -0.10230500251054764, 0.03972753509879112, 0.04700247198343277, 0.06489704549312592, 0.09566953778266907, 0.10339601337909698, 0.11533346772193909, 0.11633886396884918, 0.004272878170013428, -0.0023662475869059563, -0.116527259349823, -0.08501167595386505, -0.07136277854442596, 0.1045774519443512, -0.02152254804968834, -0.019716883078217506, 0.1115979254245758, -0.024493180215358734, 0.0403645783662796, -0.03296685591340065, -0.010773194953799248, -0.11948518455028534, -0.05583646520972252, -0.06935139745473862, -0.11962020397186279, -0.028254374861717224, -0.11524824053049088, 0.05633893236517906, 0.1574464589357376, 0.03424578905105591, -0.04876772686839104, 0.09834612160921097, 0.027990013360977173, -0.0816364660859108, 0.11535216867923737, 
-0.025227507576346397, 0.06758558750152588, -0.04195790737867355, -0.0061003342270851135, -0.04703301191329956, 0.055090632289648056, -0.021140169352293015, 0.08024253696203232, -0.0400153212249279, 0.0302140973508358, -0.1656874418258667, -0.11238857358694077, -0.049150459468364716, 0.06028628349304199, -0.0404137447476387, 0.08304738998413086, 0.03718050569295883, -0.006020142696797848, 0.040604911744594574, 0.2115340232849121, -0.08505047112703323, -0.08016197383403778, -0.028104547411203384, 0.09411701560020447, 0.07761885225772858, 0.11476434767246246, -0.026894282549619675, -0.06421126425266266, -0.1269533783197403, 0.2380894422531128, 0.32148438692092896, -0.07754987478256226, 0.03774797543883324, 0.0030174083076417446, 0.0383269339799881, 0.0857466459274292, 0.07383377104997635, 0.11244623363018036, 0.23682719469070435, -0.06823810935020447, -0.030694520100951195, -0.05922456085681915, -0.023479973897337914, -0.1047314703464508, 0.03157477825880051, 0.029663261026144028, -0.07036898285150528, -0.034544847905635834, 0.08600566536188126, -0.19703355431556702, 0.10636825859546661, -0.06092917174100876, -0.22765541076660156, -0.054570890963077545, -0.0013719944981858134, 0.16067388653755188, 0.00421682745218277, 0.05851620435714722, -0.007814029231667519, -0.04679800570011139, 0.04668491706252098, -0.009665277786552906, -0.15137028694152832, 0.002450793981552124, 0.010325541719794273, -0.14240598678588867, 0.002047039568424225, -0.043230313807725906, -0.011771433055400848, 0.07623866200447083, 0.05121132731437683, -0.048743315041065216, 0.07985813915729523, -0.008498559705913067, 0.00949828140437603, 0.038778506219387054, 0.01518886536359787, -0.006380358710885048, -0.03145628795027733, 0.10694582760334015, -0.14651796221733093, 0.04921320825815201, -0.07307179272174835, -0.03280574828386307, 0.00391410430893302, -0.03506224602460861, -0.0044145602732896805, 0.06791945546865463, 0.10134441405534744, -0.013581912964582443, -0.00024316180497407913, -0.013579165562987328, -0.023058973252773285, -0.0343741700053215, -0.10270974040031433, -0.08859270066022873, -0.12801721692085266, -0.07393291592597961, 0.06586703658103943, 0.041152507066726685, -0.16387313604354858, 0.03222880885004997, -0.07558935880661011, 0.04983057826757431, -0.08862859010696411, 0.08253467082977295, 0.1029319167137146, -0.030415335670113564, -0.032607853412628174, -0.09693356603384018, 0.08313300460577011, 0.08909052610397339, -0.12115451693534851, -0.08273006230592728 ]
null
null
transformers
# ByT5-base fine-tuned for Question Answering (on Tweets) [ByT5](https://huggingface.co/google/byt5-base) base fine-tuned on [TweetQA](https://huggingface.co/datasets/tweet_qa) dataset for **Question Answering** downstream task. # Details of ByT5 - Base 🧠 ByT5 is a tokenizer-free version of [Google's T5](https://ai.googleblog.com/2020/02/exploring-transfer-learning-with-t5.html) and generally follows the architecture of [MT5](https://huggingface.co/google/mt5-base). ByT5 was only pre-trained on [mC4](https://www.tensorflow.org/datasets/catalog/c4#c4multilingual) excluding any supervised training with an average span-mask of 20 UTF-8 characters. Therefore, this model has to be fine-tuned before it is usable on a downstream task. ByT5 works especially well on noisy text data,*e.g.*, `google/byt5-base` significantly outperforms [mt5-base](https://huggingface.co/google/mt5-base) on [TweetQA](https://arxiv.org/abs/1907.06292). Paper: [ByT5: Towards a token-free future with pre-trained byte-to-byte models](https://arxiv.org/pdf/1910.10683.pdf) Authors: *Linting Xue, Aditya Barua, Noah Constant, Rami Al-Rfou, Sharan Narang, Mihir Kale, Adam Roberts, Colin Raffel* ## Details of the downstream task (Question Answering) - Dataset 📚 [TweetQA](https://huggingface.co/datasets/tweet_qa) With social media becoming increasingly more popular, lots of news and real-time events are being covered. Developing automated question answering systems is critical to the effectiveness of many applications that rely on real-time knowledge. While previous question answering (QA) datasets have focused on formal text such as news and Wikipedia, we present the first large-scale dataset for QA over social media data. To make sure that the tweets are meaningful and contain interesting information, we gather tweets used by journalists to write news articles. We then ask human annotators to write questions and answers upon these tweets. Unlike other QA datasets like SQuAD (in which the answers are extractive), we allow the answers to be abstractive. The task requires the model to read a short tweet and a question and outputs a text phrase (does not need to be in the tweet) as the answer. - Data Instances: Sample ```json { "Question": "who is the tallest host?", "Answer": ["sam bee","sam bee"], "Tweet": "Don't believe @ConanOBrien's height lies. Sam Bee is the tallest host in late night. 
#alternativefacts\u2014 Full Frontal (@FullFrontalSamB) January 22, 2017", "qid": "3554ee17d86b678be34c4dc2c04e334f" } ``` - Data Fields: *Question*: a question based on information from a tweet *Answer*: list of possible answers from the tweet *Tweet*: source tweet *qid*: question id ## Model in Action 🚀 ```sh git clone https://github.com/huggingface/transformers.git pip install -q ./transformers ``` ```python from transformers import AutoTokenizer, T5ForConditionalGeneration ckpt = 'Narrativa/byt5-base-finetuned-tweet-qa' tokenizer = AutoTokenizer.from_pretrained(ckpt) model = T5ForConditionalGeneration.from_pretrained(ckpt).to('cuda') def get_answer(question, context): input_text = 'question: %s context: %s' % (question, context) inputs = tokenizer([input_text], return_tensors='pt') input_ids = inputs.input_ids.to('cuda') attention_mask = inputs.attention_mask.to('cuda') output = model.generate(input_ids, attention_mask=attention_mask) return tokenizer.decode(output[0], skip_special_tokens=True) context = "MONSTARS BASKETBALL @M0NSTARSBBALLWiggins answers Kemba's floater with a three! game tied 106-106. 8.9 to play. CHA ball!12/4/2016, 2:26:30 AM" question = 'who answered kemba\'s "floater"?' get_answer(question, context) # wiggins ``` Created by: [Narrativa](https://www.narrativa.com/) About Narrativa: Natural Language Generation (NLG) | Gabriele, our machine learning-based platform, builds and deploys natural language solutions. #NLG #AI
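The card above stresses that ByT5 is tokenizer-free. The short sketch below (not part of the original card) illustrates what that means in practice: the checkpoint's tokenizer maps raw UTF-8 bytes to ids one-for-one, so accented characters, emoji, or noisy tweet text never fall back to an unknown token. It assumes the checkpoint ships the standard ByT5 tokenizer accessible via AutoTokenizer.

```python
# Illustrative sketch of ByT5's byte-level tokenization (assumption: the
# checkpoint exposes the standard ByT5 tokenizer through AutoTokenizer).
from transformers import AutoTokenizer

tok = AutoTokenizer.from_pretrained("Narrativa/byt5-base-finetuned-tweet-qa")

text = "héllo @user!"
ids = tok(text).input_ids
# One id per UTF-8 byte plus a trailing end-of-sequence token, so the
# accented "é" contributes two ids instead of mapping to an <unk> token.
print(len(text.encode("utf-8")), len(ids))
```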
{"language": "en", "tags": ["qa", "Question Answering"], "datasets": ["tweet_qa"], "widget": [{"text": "question: how far away was the putt context: GET THE CIGAR READY! Miguel aces the 15th from 174 yards, and celebrates as only he knows how! The European Tour (@EuropeanTour) January, 15 2015"}]}
text2text-generation
Narrativa/byt5-base-finetuned-tweet-qa
[ "transformers", "pytorch", "t5", "text2text-generation", "qa", "Question Answering", "en", "dataset:tweet_qa", "arxiv:1907.06292", "arxiv:1910.10683", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:04+00:00
[ "1907.06292", "1910.10683" ]
[ "en" ]
TAGS #transformers #pytorch #t5 #text2text-generation #qa #Question Answering #en #dataset-tweet_qa #arxiv-1907.06292 #arxiv-1910.10683 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
# ByT5-base fine-tuned for Question Answering (on Tweets) ByT5 base fine-tuned on TweetQA dataset for Question Answering downstream task. # Details of ByT5 - Base ByT5 is a tokenizer-free version of Google's T5 and generally follows the architecture of MT5. ByT5 was only pre-trained on mC4 excluding any supervised training with an average span-mask of 20 UTF-8 characters. Therefore, this model has to be fine-tuned before it is usable on a downstream task. ByT5 works especially well on noisy text data,*e.g.*, 'google/byt5-base' significantly outperforms mt5-base on TweetQA. Paper: ByT5: Towards a token-free future with pre-trained byte-to-byte models Authors: *Linting Xue, Aditya Barua, Noah Constant, Rami Al-Rfou, Sharan Narang, Mihir Kale, Adam Roberts, Colin Raffel* ## Details of the downstream task (Question Answering) - Dataset TweetQA With social media becoming increasingly more popular, lots of news and real-time events are being covered. Developing automated question answering systems is critical to the effectiveness of many applications that rely on real-time knowledge. While previous question answering (QA) datasets have focused on formal text such as news and Wikipedia, we present the first large-scale dataset for QA over social media data. To make sure that the tweets are meaningful and contain interesting information, we gather tweets used by journalists to write news articles. We then ask human annotators to write questions and answers upon these tweets. Unlike other QA datasets like SQuAD (in which the answers are extractive), we allow the answers to be abstractive. The task requires the model to read a short tweet and a question and outputs a text phrase (does not need to be in the tweet) as the answer. - Data Instances: Sample - Data Fields: *Question*: a question based on information from a tweet *Answer*: list of possible answers from the tweet *Tweet*: source tweet *qid*: question id ## Model in Action Created by: Narrativa About Narrativa: Natural Language Generation (NLG) | Gabriele, our machine learning-based platform, builds and deploys natural language solutions. #NLG #AI
[ "# ByT5-base fine-tuned for Question Answering (on Tweets)\nByT5 base fine-tuned on TweetQA dataset for Question Answering downstream task.", "# Details of ByT5 - Base \n\nByT5 is a tokenizer-free version of Google's T5 and generally follows the architecture of MT5.\nByT5 was only pre-trained on mC4 excluding any supervised training with an average span-mask of 20 UTF-8 characters. Therefore, this model has to be fine-tuned before it is usable on a downstream task.\nByT5 works especially well on noisy text data,*e.g.*, 'google/byt5-base' significantly outperforms mt5-base on TweetQA.\nPaper: ByT5: Towards a token-free future with pre-trained byte-to-byte models\nAuthors: *Linting Xue, Aditya Barua, Noah Constant, Rami Al-Rfou, Sharan Narang, Mihir Kale, Adam Roberts, Colin Raffel*", "## Details of the downstream task (Question Answering) - Dataset \n\nTweetQA\n\n\nWith social media becoming increasingly more popular, lots of news and real-time events are being covered. Developing automated question answering systems is critical to the effectiveness of many applications that rely on real-time knowledge. While previous question answering (QA) datasets have focused on formal text such as news and Wikipedia, we present the first large-scale dataset for QA over social media data. To make sure that the tweets are meaningful and contain interesting information, we gather tweets used by journalists to write news articles. We then ask human annotators to write questions and answers upon these tweets. Unlike other QA datasets like SQuAD (in which the answers are extractive), we allow the answers to be abstractive. The task requires the model to read a short tweet and a question and outputs a text phrase (does not need to be in the tweet) as the answer.\n\n- Data Instances:\n\nSample\n\n\n- Data Fields:\n \n*Question*: a question based on information from a tweet \n\n*Answer*: list of possible answers from the tweet \n\n*Tweet*: source tweet \n\n*qid*: question id", "## Model in Action \n\n\n\n\n\nCreated by: Narrativa\n\nAbout Narrativa: Natural Language Generation (NLG) | Gabriele, our machine learning-based platform, builds and deploys natural language solutions. #NLG #AI" ]
[ "TAGS\n#transformers #pytorch #t5 #text2text-generation #qa #Question Answering #en #dataset-tweet_qa #arxiv-1907.06292 #arxiv-1910.10683 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "# ByT5-base fine-tuned for Question Answering (on Tweets)\nByT5 base fine-tuned on TweetQA dataset for Question Answering downstream task.", "# Details of ByT5 - Base \n\nByT5 is a tokenizer-free version of Google's T5 and generally follows the architecture of MT5.\nByT5 was only pre-trained on mC4 excluding any supervised training with an average span-mask of 20 UTF-8 characters. Therefore, this model has to be fine-tuned before it is usable on a downstream task.\nByT5 works especially well on noisy text data,*e.g.*, 'google/byt5-base' significantly outperforms mt5-base on TweetQA.\nPaper: ByT5: Towards a token-free future with pre-trained byte-to-byte models\nAuthors: *Linting Xue, Aditya Barua, Noah Constant, Rami Al-Rfou, Sharan Narang, Mihir Kale, Adam Roberts, Colin Raffel*", "## Details of the downstream task (Question Answering) - Dataset \n\nTweetQA\n\n\nWith social media becoming increasingly more popular, lots of news and real-time events are being covered. Developing automated question answering systems is critical to the effectiveness of many applications that rely on real-time knowledge. While previous question answering (QA) datasets have focused on formal text such as news and Wikipedia, we present the first large-scale dataset for QA over social media data. To make sure that the tweets are meaningful and contain interesting information, we gather tweets used by journalists to write news articles. We then ask human annotators to write questions and answers upon these tweets. Unlike other QA datasets like SQuAD (in which the answers are extractive), we allow the answers to be abstractive. The task requires the model to read a short tweet and a question and outputs a text phrase (does not need to be in the tweet) as the answer.\n\n- Data Instances:\n\nSample\n\n\n- Data Fields:\n \n*Question*: a question based on information from a tweet \n\n*Answer*: list of possible answers from the tweet \n\n*Tweet*: source tweet \n\n*qid*: question id", "## Model in Action \n\n\n\n\n\nCreated by: Narrativa\n\nAbout Narrativa: Natural Language Generation (NLG) | Gabriele, our machine learning-based platform, builds and deploys natural language solutions. #NLG #AI" ]
[ 82, 39, 201, 273, 50 ]
[ "passage: TAGS\n#transformers #pytorch #t5 #text2text-generation #qa #Question Answering #en #dataset-tweet_qa #arxiv-1907.06292 #arxiv-1910.10683 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n# ByT5-base fine-tuned for Question Answering (on Tweets)\nByT5 base fine-tuned on TweetQA dataset for Question Answering downstream task.# Details of ByT5 - Base \n\nByT5 is a tokenizer-free version of Google's T5 and generally follows the architecture of MT5.\nByT5 was only pre-trained on mC4 excluding any supervised training with an average span-mask of 20 UTF-8 characters. Therefore, this model has to be fine-tuned before it is usable on a downstream task.\nByT5 works especially well on noisy text data,*e.g.*, 'google/byt5-base' significantly outperforms mt5-base on TweetQA.\nPaper: ByT5: Towards a token-free future with pre-trained byte-to-byte models\nAuthors: *Linting Xue, Aditya Barua, Noah Constant, Rami Al-Rfou, Sharan Narang, Mihir Kale, Adam Roberts, Colin Raffel*" ]
[ -0.06349073350429535, 0.012839020229876041, -0.0034507564269006252, 0.06807922571897507, 0.05624622106552124, -0.003646589582785964, 0.08897527307271957, 0.12161720544099808, -0.05654538795351982, 0.07286380976438522, 0.07176171988248825, 0.00579628674313426, 0.03695342317223549, 0.19966371357440948, 0.027797242626547813, -0.10489628463983536, -0.005892928224056959, -0.040848154574632645, 0.06157553195953369, 0.10214696824550629, 0.0536515973508358, -0.053525783121585846, 0.05793178454041481, -0.06550101935863495, -0.005601812154054642, 0.013083158060908318, 0.013041449710726738, -0.05145977437496185, 0.022269340232014656, 0.09152431041002274, -0.03271842747926712, 0.06015154346823692, -0.0778239518404007, -0.11084770411252975, 0.054958488792181015, 0.09654130786657333, -0.038640573620796204, 0.04909345880150795, 0.07581622153520584, 0.005020879674702883, 0.11544095724821091, 0.0226593017578125, -0.054530564695596695, 0.05001018941402435, -0.07051420956850052, 0.0027160842437297106, -0.021314313635230064, 0.06559697538614273, 0.08998841047286987, 0.07379286736249924, -0.06123492494225502, 0.13728317618370056, -0.03353611379861832, 0.10843580961227417, 0.07031714916229248, -0.21724997460842133, -0.09275847673416138, 0.008038871921598911, -0.04240066558122635, 0.11908829212188721, -0.07781156152486801, 0.002524708863347769, 0.03551029786467552, -0.00982889998704195, -0.0013836701400578022, -0.012862333096563816, 0.01442436221987009, -0.0013761522714048624, -0.055086128413677216, 0.015273651108145714, 0.16517363488674164, 0.031903013586997986, -0.06770674884319305, -0.12859788537025452, -0.12208277732133865, 0.024535510689020157, -0.004091065376996994, -0.04085807502269745, -0.00702606700360775, 0.04129859432578087, -0.01835644617676735, -0.07396472990512848, -0.100595623254776, -0.04216708615422249, -0.06597331911325455, 0.0017855778569355607, 0.024844035506248474, -0.014169803820550442, -0.05274207890033722, 0.07384131103754044, -0.033718641847372055, -0.1416264772415161, -0.054783813655376434, -0.0792575553059578, -0.07428707182407379, 0.0011951568303629756, 0.006250984035432339, -0.10821521282196045, 0.04254572093486786, 0.09644503146409988, 0.041194502264261246, 0.123292937874794, -0.04344340041279793, -0.056092146784067154, -0.013513418845832348, 0.06922093778848648, -0.033001404255628586, -0.14556053280830383, 0.131025031208992, 0.04776638001203537, 0.011960998177528381, -0.047306351363658905, -0.019717121496796608, -0.031663864850997925, -0.009504438377916813, 0.0014100683620199561, 0.08122443407773972, 0.05739400535821915, -0.041467469185590744, -0.05480194836854935, 0.13241827487945557, -0.07945332676172256, -0.025795921683311462, 0.028385832905769348, -0.021138468757271767, -0.02449985221028328, 0.004530767444521189, -0.048178136348724365, -0.10646715760231018, -0.025046456605196, -0.05251314491033554, -0.06431657075881958, -0.04639380797743797, -0.03931446373462677, -0.0028778889682143927, 0.03264656662940979, -0.05530262738466263, -0.17128688097000122, -0.18276773393154144, -0.031244546175003052, 0.002271332312375307, -0.06052067503333092, -0.018487397581338882, -0.04392879828810692, -0.10168159753084183, 0.055075306445360184, -0.02550245076417923, 0.08595488220453262, -0.057348158210515976, 0.0727074146270752, 0.08032584935426712, 0.06256371736526489, 0.03499535843729973, -0.0071135577745735645, -0.09407522529363632, -0.04037310555577278, -0.09413272142410278, 0.16188131272792816, -0.042114246636629105, -0.01033029519021511, -0.10578188300132751, 0.027524661272764206, 
-0.14472462236881256, -0.028501352295279503, 0.05860771983861923, 0.19232888519763947, -0.17140348255634308, -0.029954006895422935, 0.12682346999645233, -0.09309687465429306, -0.06983785331249237, 0.1568359136581421, 0.015303588472306728, 0.03183826059103012, 0.11850181221961975, 0.06917428970336914, 0.10332273691892624, -0.06810621917247772, -0.0832752138376236, -0.01756824366748333, -0.02381628379225731, 0.060542602092027664, 0.015851391479372978, 0.01848316378891468, -0.0422087199985981, 0.014083698391914368, 0.024017181247472763, 0.029528263956308365, -0.019133182242512703, -0.05918237194418907, -0.02980950102210045, -0.05229252949357033, -0.002919555176049471, 0.01402740366756916, 0.0303556639701128, -0.03823734447360039, -0.12212219834327698, -0.1063498854637146, 0.0673329085111618, -0.019348036497831345, -0.010998450219631195, -0.10147197544574738, 0.07261345535516739, -0.11964868009090424, 0.05885792523622513, -0.12297193706035614, -0.08123914152383804, 0.026954088360071182, 0.01711106300354004, 0.08166517317295074, -0.07547502964735031, 0.08308190852403641, 0.03121676854789257, -0.06483951956033707, -0.04834228381514549, 0.00850632879883051, -0.027891060337424278, -0.07333410531282425, -0.12252825498580933, 0.06539016216993332, -0.011466330848634243, 0.043323174118995667, -0.1158335953950882, -0.001379842171445489, 0.04502939060330391, 0.07333733141422272, 0.062611423432827, -0.043828997761011124, 0.06892482191324234, 0.008035092614591122, -0.020682446658611298, -0.030796393752098083, 0.009668944403529167, 0.010229827836155891, -0.03671018034219742, 0.14326533675193787, -0.11143366992473602, -0.040681738406419754, 0.11769899725914001, 0.12298694252967834, -0.08761557191610336, 0.12998971343040466, 0.008290967904031277, -0.06216764077544212, -0.02980312705039978, -0.027678750455379486, 0.1666889488697052, -0.024074990302324295, 0.10905903577804565, -0.08067739009857178, -0.05507030338048935, 0.03847527503967285, 0.003839636454358697, -0.010574502870440483, 0.09758670628070831, 0.011674393899738789, -0.0652041956782341, 0.026807010173797607, 0.09946074336767197, 0.013570627197623253, 0.23532624542713165, -0.011866682209074497, -0.10022719204425812, -0.013425955548882484, 0.06175258383154869, 0.013400554656982422, 0.03780582174658775, -0.08690790086984634, 0.030652090907096863, 0.024541527032852173, 0.01567191071808338, 0.03422955423593521, -0.04829024896025658, 0.058019425719976425, -0.01502450555562973, -0.06678962707519531, -0.06018918752670288, 0.037254735827445984, -0.007003591861575842, 0.09157758951187134, -0.028918178752064705, 0.049273017793893814, 0.0030982275493443012, -0.023661382496356964, -0.11110353469848633, 0.12288042902946472, -0.05949067324399948, -0.16615799069404602, -0.07425915449857712, -0.013463927432894707, -0.07094870507717133, 0.007499528117477894, 0.05481250211596489, -0.08492406457662582, -0.05447498336434364, -0.11189848184585571, 0.06980514526367188, 0.031938523054122925, 0.019845567643642426, -0.07327809929847717, -0.018998807296156883, 0.06137571111321449, -0.11020443588495255, 0.04702141880989075, 0.0071880691684782505, -0.1075286716222763, 0.07426640391349792, -0.047801557928323746, 0.0605824775993824, 0.06392242759466171, -0.01481375191360712, -0.004906115122139454, -0.045448772609233856, 0.23173782229423523, -0.08451749384403229, 0.10156465321779251, 0.1294926255941391, 0.07354260981082916, 0.053305234760046005, 0.11914798617362976, -0.04215667396783829, -0.06257126480340958, 0.08495732396841049, 0.043951600790023804, -0.03737747296690941, 
-0.19491688907146454, 0.008131922222673893, -0.09783541411161423, 0.02807440422475338, 0.016453450545668602, 0.08085299283266068, 0.0027844137512147427, 0.03006977215409279, -0.07795557379722595, 0.004345125053077936, 0.04109562560915947, 0.05783987417817116, 0.021034590899944305, 0.06920235604047775, 0.09423083811998367, -0.06914417445659637, 0.012883736751973629, 0.15104471147060394, -0.031855303794145584, 0.15230803191661835, -0.08062590658664703, 0.11233055591583252, 0.02583186887204647, 0.11869971454143524, 0.128350168466568, 0.004758524242788553, -0.06225420907139778, 0.024696648120880127, -0.020736398175358772, -0.07640670239925385, -0.03378994017839432, 0.0178595669567585, 0.027564488351345062, -0.036163076758384705, 0.021053412929177284, 0.041613683104515076, 0.07732424139976501, 0.20829035341739655, 0.004480159841477871, -0.1481001377105713, -0.09557393193244934, 0.007839132100343704, -0.07836626470088959, -0.052928268909454346, 0.02395130880177021, 0.11805260181427002, -0.052855100482702255, 0.03531327843666077, 0.015664540231227875, 0.09397947043180466, -0.061220813542604446, -0.020476708188652992, 0.013639578595757484, 0.06067083403468132, -0.019110893830657005, 0.04706043004989624, -0.2318086177110672, 0.047330666333436966, 0.008076677098870277, 0.12844237685203552, -0.02952607348561287, 0.02533051371574402, 0.04392111301422119, -0.11562831699848175, 0.08856645971536636, 0.007494375109672546, 0.022348083555698395, -0.03917611762881279, -0.1774846762418747, 0.0453750379383564, 0.08732040971517563, -0.017570510506629944, 0.056624289602041245, -0.010306530632078648, -0.031026571989059448, -0.006128378212451935, 0.06320959329605103, -0.11297355592250824, -0.1374931037425995, 0.026705466210842133, 0.0360042080283165, 0.026932543143630028, -0.01697220467031002, -0.0426257885992527, -0.04163604974746704, 0.08899785578250885, -0.11990811675786972, -0.060209836810827255, -0.07455282658338547, -0.019822372123599052, 0.06847652047872543, -0.06765507906675339, 0.01637706533074379, -0.04309862479567528, -0.013176138512790203, 0.04703958332538605, -0.1188330352306366, 0.05614246055483818, -0.08086560666561127, -0.14208032190799713, -0.014518139883875847, 0.1523144543170929, 0.015173793770372868, 0.018743740394711494, 0.0017791386926546693, 0.0011405162513256073, -0.07602851837873459, -0.10052225738763809, 0.022595088928937912, 0.020515868440270424, -0.0006190211279317737, 0.017315084114670753, 0.0056476108729839325, -0.20069491863250732, -0.05711563676595688, -0.06271592527627945, 0.01366423536092043, 0.23995529115200043, -0.015891248360276222, 0.15208154916763306, 0.16000010073184967, -0.03951944038271904, -0.16864365339279175, -0.08426479995250702, 0.05532124266028404, 0.030093664303421974, -0.03882094845175743, -0.16245175898075104, 0.11527953296899796, 0.01463147159665823, 0.012069514952600002, 0.04663383960723877, -0.20500293374061584, -0.1373651921749115, 0.027347296476364136, 0.013057893142104149, 0.23330844938755035, -0.11267118155956268, -0.05240511894226074, 0.01175567228347063, -0.04409200698137283, 0.06332956999540329, -0.10420306771993637, 0.12120658904314041, 0.016286786645650864, -0.04146905988454819, 0.014883579686284065, -0.027702802792191505, 0.04939740523695946, -0.024906277656555176, 0.01830141991376877, -0.052206553518772125, 0.07350664585828781, 0.11750568449497223, -0.027070794254541397, 0.05181092396378517, 0.01800176501274109, 0.09721781313419342, -0.14299294352531433, -0.04875499755144119, -0.033400487154722214, 0.0033214327413588762, 0.002304329304024577, 
-0.005984185729175806, -0.0011796778999269009, 0.03569761663675308, 0.08928810060024261, -0.02269500307738781, -0.01953132450580597, -0.07181121408939362, 0.013218616135418415, 0.11344771832227707, 0.1373736709356308, -0.10321719199419022, -0.011434085667133331, -0.0362592414021492, -0.03890877217054367, 0.060825545340776443, -0.15050312876701355, 0.09969072043895721, 0.0494539812207222, 0.0014896809589117765, 0.039423227310180664, 0.006255556363612413, -0.04746811091899872, 0.07693181931972504, 0.10681644827127457, -0.18850058317184448, -0.1177486702799797, -0.04086373746395111, -0.15562888979911804, -0.03946800157427788, -0.01483799610286951, 0.13093699514865875, -0.07796592265367508, 0.0020061370451003313, 0.026706291362643242, 0.006345979403704405, 0.06257720291614532, 0.18714243173599243, -0.002039890270680189, 0.007434619124978781, -0.09646222740411758, 0.1018691137433052, 0.07248351722955704, 0.01230190135538578, 0.05835122987627983, 0.11947467923164368, -0.12816107273101807, -0.052408792078495026, -0.06411170214414597, 0.046676892787218094, 0.10178934782743454, -0.021627552807331085, -0.01850106753408909, 0.033613212406635284, 0.02628704160451889, 0.05684717372059822, -0.0038770935498178005, 0.03567669913172722, -0.001123036490753293, 0.030031295493245125, -0.012540425173938274, 0.12828245759010315, 0.08404199033975601, 0.021917937323451042, -0.08386681228876114, 0.005584171507507563, -0.0089785261079669, 0.00982121005654335, -0.054000724107027054, -0.04616408795118332, -0.07456611096858978, -0.030098581686615944, -0.06631343066692352, 0.07012519985437393, -0.04449532553553581, 0.0005711818230338395, -0.04305768013000488, -0.01999182254076004, -0.05351432412862778, 0.033794451504945755, -0.014145590364933014, -0.01038105133920908, -0.025992603972554207, 0.04518358036875725, -0.11841157078742981, 0.042837902903556824, 0.03804474323987961, -0.06917186826467514, 0.16510431468486786, 0.05839390680193901, -0.054120924323797226, 0.010335695929825306, -0.0905051901936531, -0.02246847376227379, -0.0285852812230587, 0.059524569660425186, -0.03599463030695915, -0.08411536365747452, 0.05088168382644653, 0.037634965032339096, -0.004790312610566616, -0.025805698707699776, 0.11456210166215897, -0.09829578548669815, 0.009209060110151768, -0.06534800678491592, 0.015815142542123795, -0.08470427989959717, 0.029628559947013855, 0.05357975885272026, 0.0419684574007988, 0.16407297551631927, -0.09878718852996826, -0.03561229631304741, -0.13666048645973206, -0.011593637987971306, 0.04177100211381912, -0.07989448308944702, -0.15676729381084442, -0.03423232212662697, 0.10427950322628021, -0.08799957484006882, 0.042977094650268555, -0.06743072718381882, -0.01457031536847353, 0.02533038891851902, -0.05602739378809929, -0.03333986923098564, -0.03887972608208656, 0.15993192791938782, 0.0494176410138607, -0.027264852076768875, 0.03071899712085724, -0.001355635584332049, -0.023578060790896416, 0.15699878334999084, 0.10697069764137268, 0.11446890980005264, 0.05401784926652908, 0.05508268252015114, -0.023583196103572845, -0.010096973739564419, -0.09658060222864151, 0.1322154849767685, -0.1765316277742386, 0.07268029451370239, -0.05959521234035492, 0.02670179307460785, 0.16314160823822021, -0.0337192639708519, 0.041471485048532486, -0.034160785377025604, -0.03684462234377861, -0.0786348357796669, -0.19366593658924103, -0.10237658768892288, -0.08527278155088425, 0.023767562583088875, -0.11584438383579254, 0.029617801308631897, 0.09264147281646729, 0.06811437755823135, -0.024472322314977646, 0.164337620139122, 
0.026059694588184357, -0.1144149899482727, 0.11666551977396011, -0.05211685597896576, 0.007007300388067961, 0.07218942791223526, -0.028308892622590065, 0.005064865108579397, -0.0002953128714580089, 0.03506298363208771, 0.057825930416584015, -0.008525975979864597, 0.018276100978255272, -0.10213245451450348, -0.10704760998487473, -0.0291912779211998, 0.05765619874000549, 0.01677015796303749, 0.1310465931892395, 0.0860423818230629, -0.06794316321611404, 0.026783045381307602, 0.2499503791332245, -0.02169356867671013, -0.133205384016037, -0.15441004931926727, 0.06834626197814941, 0.020681193098425865, -0.010034624487161636, 0.019696703180670738, -0.14200864732265472, 0.04587038233876228, 0.2108050435781479, 0.09678937494754791, -0.07275402545928955, 0.004983516875654459, 0.034237757325172424, 0.008283198811113834, 0.030304934829473495, 0.1324799358844757, 0.07537291944026947, 0.21223609149456024, -0.010776701383292675, 0.026440609246492386, -0.004797269124537706, -0.02199234627187252, -0.030516717582941055, 0.23649080097675323, -0.03160065412521362, -0.03467582166194916, -0.057231541723012924, 0.055198073387145996, -0.07263807207345963, -0.18923848867416382, -0.07173097878694534, -0.08117392659187317, -0.09932832419872284, 0.0060136751271784306, -0.005455329082906246, 0.04989855736494064, 0.011965853162109852, -0.008218095637857914, 0.03433996066451073, -0.0211559496819973, 0.022020023316144943, -0.06729663163423538, -0.027672763913869858, 0.055468663573265076, -0.06963928788900375, 0.07444219291210175, 0.02063729614019394, 0.05152212455868721, 0.08208074420690536, 0.009272349067032337, -0.1480940729379654, 0.13507987558841705, 0.029175354167819023, -0.15449711680412292, 0.037996239960193634, 0.15480796992778778, 0.032239533960819244, 0.10870976746082306, 0.09986110031604767, 0.10922861099243164, 0.008720668964087963, 0.025698620826005936, 0.012130520306527615, -0.11213178932666779, 0.061625540256500244, -0.06722889095544815, 0.11671917140483856, 0.09840869158506393, -0.03591127321124077, -0.00949860829859972, -0.087328240275383, 0.023928381502628326, -0.0727221816778183, 0.02649528905749321, -0.02073674649000168, -0.1863224357366562, -0.012348643504083157, 0.029648365452885628, 0.10969746112823486, -0.16378900408744812, -0.03245587646961212, -0.05494900047779083, 0.01826985366642475, -0.06727160513401031, 0.16594164073467255, 0.1133333370089531, 0.018468886613845825, -0.017966246232390404, -0.005638344679027796, -0.004025065805763006, 0.08931887894868851, -0.0951315313577652, -0.08365161716938019 ]
null
null
transformers
# ByT5-base fine-tuned for Hate Speech Detection (on Tweets) [ByT5](https://huggingface.co/google/byt5-base) base fine-tuned on the [tweets hate speech detection](https://huggingface.co/datasets/tweets_hate_speech_detection) dataset for the **Sequence Classification** downstream task. # Details of ByT5 - Base 🧠 ByT5 is a tokenizer-free version of [Google's T5](https://ai.googleblog.com/2020/02/exploring-transfer-learning-with-t5.html) and generally follows the architecture of [MT5](https://huggingface.co/google/mt5-base). ByT5 was only pre-trained on [mC4](https://www.tensorflow.org/datasets/catalog/c4#c4multilingual) excluding any supervised training with an average span-mask of 20 UTF-8 characters. Therefore, this model has to be fine-tuned before it is usable on a downstream task. ByT5 works especially well on noisy text data, *e.g.*, `google/byt5-base` significantly outperforms [mt5-base](https://huggingface.co/google/mt5-base) on [TweetQA](https://arxiv.org/abs/1907.06292). Paper: [ByT5: Towards a token-free future with pre-trained byte-to-byte models](https://arxiv.org/pdf/1910.10683.pdf) Authors: *Linting Xue, Aditya Barua, Noah Constant, Rami Al-Rfou, Sharan Narang, Mihir Kale, Adam Roberts, Colin Raffel* ## Details of the downstream task (Sequence Classification as Text generation) - Dataset 📚 [tweets_hate_speech_detection](https://huggingface.co/datasets/tweets_hate_speech_detection) The objective of this task is to detect hate speech in tweets. For the sake of simplicity, we say a tweet contains hate speech if it has a racist or sexist sentiment associated with it. So, the task is to classify racist or sexist tweets from other tweets. Formally, given a training sample of tweets and labels, where label ‘1’ denotes the tweet is racist/sexist and label ‘0’ denotes the tweet is not racist/sexist, your objective is to predict the labels on the given test dataset. - Data Instances: The dataset contains a label denoting whether the tweet is hate speech or not ```json {'label': 0, # not a hate speech 'tweet': ' @user when a father is dysfunctional and is so selfish he drags his kids into his dysfunction. #run'} ``` - Data Fields: **label**: 1 - it is a hate speech, 0 - not a hate speech **tweet**: content of the tweet as a string - Data Splits: The data contains training data with **31962** entries ## Test set metrics 🧾 We created a representative test set with 5% of the entries. The dataset is quite imbalanced, and we obtained an **F1 score of 79.8**. ## Model in Action 🚀 ```sh git clone https://github.com/huggingface/transformers.git pip install -q ./transformers ``` ```python from transformers import AutoTokenizer, T5ForConditionalGeneration ckpt = 'Narrativa/byt5-base-tweet-hate-detection' tokenizer = AutoTokenizer.from_pretrained(ckpt) model = T5ForConditionalGeneration.from_pretrained(ckpt).to("cuda") def classify_tweet(tweet): inputs = tokenizer([tweet], padding='max_length', truncation=True, max_length=512, return_tensors='pt') input_ids = inputs.input_ids.to('cuda') attention_mask = inputs.attention_mask.to('cuda') output = model.generate(input_ids, attention_mask=attention_mask) return tokenizer.decode(output[0], skip_special_tokens=True) classify_tweet('here goes your tweet...') ``` Created by: [Narrativa](https://www.narrativa.com/) About Narrativa: Natural Language Generation (NLG) | Gabriele, our machine learning-based platform, builds and deploys natural language solutions. #NLG #AI
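The test-set figure above (F1 of 79.8 on a representative 5% split) is reported without the evaluation code. The sketch below is one plausible way to reproduce that kind of measurement, not the authors' script: the stratified split, the random seed, and the assumption that the model generates the literal strings '0' / '1' are all illustrative, and `classify_tweet` refers to the function defined in the "Model in Action" snippet above.

```python
# Hypothetical evaluation sketch (assumptions: a stratified 5% hold-out split
# and that the fine-tuned model emits the label as the text '0' or '1').
from datasets import load_dataset
from sklearn.model_selection import train_test_split
from sklearn.metrics import f1_score

ds = load_dataset("tweets_hate_speech_detection", split="train")

# Stratify so the held-out 5% keeps the original class imbalance.
_, test_tweets, _, test_labels = train_test_split(
    ds["tweet"], ds["label"], test_size=0.05, stratify=ds["label"], random_state=42
)

# classify_tweet is the generation-based classifier from the card above.
preds = [int(classify_tweet(t)) for t in test_tweets]
print("F1:", f1_score(test_labels, preds))
```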
{"language": "en", "tags": ["hate", "speech"], "datasets": ["tweets_hate_speech_detection"], "widget": [{"text": "@user black lives really matter?"}]}
text2text-generation
Narrativa/byt5-base-tweet-hate-detection
[ "transformers", "pytorch", "jax", "t5", "text2text-generation", "hate", "speech", "en", "dataset:tweets_hate_speech_detection", "arxiv:1907.06292", "arxiv:1910.10683", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:04+00:00
[ "1907.06292", "1910.10683" ]
[ "en" ]
TAGS #transformers #pytorch #jax #t5 #text2text-generation #hate #speech #en #dataset-tweets_hate_speech_detection #arxiv-1907.06292 #arxiv-1910.10683 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
# ByT5-base fine-tuned for Hate Speech Detection (on Tweets) ByT5 base fine-tuned on tweets hate speech detection dataset for Sequence Classification downstream task. # Details of ByT5 - Base ByT5 is a tokenizer-free version of Google's T5 and generally follows the architecture of MT5. ByT5 was only pre-trained on mC4 excluding any supervised training with an average span-mask of 20 UTF-8 characters. Therefore, this model has to be fine-tuned before it is useable on a downstream task. ByT5 works especially well on noisy text data,*e.g.*, 'google/byt5-base' significantly outperforms mt5-base on TweetQA. Paper: ByT5: Towards a token-free future with pre-trained byte-to-byte models Authors: *Linting Xue, Aditya Barua, Noah Constant, Rami Al-Rfou, Sharan Narang, Mihir Kale, Adam Roberts, Colin Raffel* ## Details of the downstream task (Sequence Classification as Text generation) - Dataset tweets_hate_speech_detection The objective of this task is to detect hate speech in tweets. For the sake of simplicity, we say a tweet contains hate speech if it has a racist or sexist sentiment associated with it. So, the task is to classify racist or sexist tweets from other tweets. Formally, given a training sample of tweets and labels, where label ‘1’ denotes the tweet is racist/sexist and label ‘0’ denotes the tweet is not racist/sexist, your objective is to predict the labels on the given test dataset. - Data Instances: The dataset contains a label denoting is the tweet a hate speech or not - Data Fields: label: 1 - it is a hate speech, 0 - not a hate speech tweet: content of the tweet as a string - Data Splits: The data contains training data with 31962 entries ## Test set metrics We created a representative test set with the 5% of the entries. The dataset is so imbalanced and we got a F1 score of 79.8 ## Model in Action Created by: Narrativa About Narrativa: Natural Language Generation (NLG) | Gabriele, our machine learning-based platform, builds and deploys natural language solutions. #NLG #AI
[ "# ByT5-base fine-tuned for Hate Speech Detection (on Tweets)\nByT5 base fine-tuned on tweets hate speech detection dataset for Sequence Classification downstream task.", "# Details of ByT5 - Base \n\nByT5 is a tokenizer-free version of Google's T5 and generally follows the architecture of MT5.\nByT5 was only pre-trained on mC4 excluding any supervised training with an average span-mask of 20 UTF-8 characters. Therefore, this model has to be fine-tuned before it is useable on a downstream task.\nByT5 works especially well on noisy text data,*e.g.*, 'google/byt5-base' significantly outperforms mt5-base on TweetQA.\nPaper: ByT5: Towards a token-free future with pre-trained byte-to-byte models\nAuthors: *Linting Xue, Aditya Barua, Noah Constant, Rami Al-Rfou, Sharan Narang, Mihir Kale, Adam Roberts, Colin Raffel*", "## Details of the downstream task (Sequence Classification as Text generation) - Dataset \n\ntweets_hate_speech_detection\n\n\nThe objective of this task is to detect hate speech in tweets. For the sake of simplicity, we say a tweet contains hate speech if it has a racist or sexist sentiment associated with it. So, the task is to classify racist or sexist tweets from other tweets.\n\nFormally, given a training sample of tweets and labels, where label ‘1’ denotes the tweet is racist/sexist and label ‘0’ denotes the tweet is not racist/sexist, your objective is to predict the labels on the given test dataset.\n\n- Data Instances:\n\nThe dataset contains a label denoting is the tweet a hate speech or not\n\n\n- Data Fields:\n \nlabel: 1 - it is a hate speech, 0 - not a hate speech\n\ntweet: content of the tweet as a string\n\n- Data Splits:\n\nThe data contains training data with 31962 entries", "## Test set metrics \n\nWe created a representative test set with the 5% of the entries.\n\nThe dataset is so imbalanced and we got a F1 score of 79.8", "## Model in Action \n\n\n\n\n\nCreated by: Narrativa\n\nAbout Narrativa: Natural Language Generation (NLG) | Gabriele, our machine learning-based platform, builds and deploys natural language solutions. #NLG #AI" ]
[ "TAGS\n#transformers #pytorch #jax #t5 #text2text-generation #hate #speech #en #dataset-tweets_hate_speech_detection #arxiv-1907.06292 #arxiv-1910.10683 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "# ByT5-base fine-tuned for Hate Speech Detection (on Tweets)\nByT5 base fine-tuned on tweets hate speech detection dataset for Sequence Classification downstream task.", "# Details of ByT5 - Base \n\nByT5 is a tokenizer-free version of Google's T5 and generally follows the architecture of MT5.\nByT5 was only pre-trained on mC4 excluding any supervised training with an average span-mask of 20 UTF-8 characters. Therefore, this model has to be fine-tuned before it is useable on a downstream task.\nByT5 works especially well on noisy text data,*e.g.*, 'google/byt5-base' significantly outperforms mt5-base on TweetQA.\nPaper: ByT5: Towards a token-free future with pre-trained byte-to-byte models\nAuthors: *Linting Xue, Aditya Barua, Noah Constant, Rami Al-Rfou, Sharan Narang, Mihir Kale, Adam Roberts, Colin Raffel*", "## Details of the downstream task (Sequence Classification as Text generation) - Dataset \n\ntweets_hate_speech_detection\n\n\nThe objective of this task is to detect hate speech in tweets. For the sake of simplicity, we say a tweet contains hate speech if it has a racist or sexist sentiment associated with it. So, the task is to classify racist or sexist tweets from other tweets.\n\nFormally, given a training sample of tweets and labels, where label ‘1’ denotes the tweet is racist/sexist and label ‘0’ denotes the tweet is not racist/sexist, your objective is to predict the labels on the given test dataset.\n\n- Data Instances:\n\nThe dataset contains a label denoting is the tweet a hate speech or not\n\n\n- Data Fields:\n \nlabel: 1 - it is a hate speech, 0 - not a hate speech\n\ntweet: content of the tweet as a string\n\n- Data Splits:\n\nThe data contains training data with 31962 entries", "## Test set metrics \n\nWe created a representative test set with the 5% of the entries.\n\nThe dataset is so imbalanced and we got a F1 score of 79.8", "## Model in Action \n\n\n\n\n\nCreated by: Narrativa\n\nAbout Narrativa: Natural Language Generation (NLG) | Gabriele, our machine learning-based platform, builds and deploys natural language solutions. #NLG #AI" ]
[ 92, 47, 201, 223, 39, 50 ]
[ "passage: TAGS\n#transformers #pytorch #jax #t5 #text2text-generation #hate #speech #en #dataset-tweets_hate_speech_detection #arxiv-1907.06292 #arxiv-1910.10683 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n# ByT5-base fine-tuned for Hate Speech Detection (on Tweets)\nByT5 base fine-tuned on tweets hate speech detection dataset for Sequence Classification downstream task.# Details of ByT5 - Base \n\nByT5 is a tokenizer-free version of Google's T5 and generally follows the architecture of MT5.\nByT5 was only pre-trained on mC4 excluding any supervised training with an average span-mask of 20 UTF-8 characters. Therefore, this model has to be fine-tuned before it is useable on a downstream task.\nByT5 works especially well on noisy text data,*e.g.*, 'google/byt5-base' significantly outperforms mt5-base on TweetQA.\nPaper: ByT5: Towards a token-free future with pre-trained byte-to-byte models\nAuthors: *Linting Xue, Aditya Barua, Noah Constant, Rami Al-Rfou, Sharan Narang, Mihir Kale, Adam Roberts, Colin Raffel*" ]
[ -0.036555737257003784, 0.056267671287059784, -0.004469295963644981, 0.03829711303114891, 0.049556586891412735, -0.018896201625466347, 0.09611382335424423, 0.13461734354496002, -0.07433335483074188, 0.05410075560212135, 0.08474728465080261, 0.0319073349237442, 0.039766017347574234, 0.17079958319664001, 0.009480681270360947, -0.17263448238372803, -0.009973045438528061, -0.04786207899451256, 0.062170617282390594, 0.11472434550523758, 0.08466260135173798, -0.04678699001669884, 0.06514435261487961, -0.03664974868297577, -0.04410841688513756, 0.011356677860021591, 0.03780827298760414, -0.06941431015729904, 0.022277988493442535, 0.07451494038105011, -0.025740906596183777, 0.05164762958884239, -0.04281829297542572, -0.10290762782096863, 0.04851539060473442, 0.12901288270950317, -0.01626896858215332, 0.04011541232466698, 0.061124470084905624, -0.06753461807966232, 0.18668274581432343, -0.011223413050174713, -0.017405753955245018, 0.07365784794092178, -0.10080479830503464, -0.029872093349695206, -0.03982837125658989, 0.12775108218193054, 0.13139638304710388, 0.07985405623912811, -0.05957331880927086, 0.12165087461471558, 0.005214962176978588, 0.1162680834531784, 0.09032026678323746, -0.16945169866085052, -0.0970810204744339, -0.04231208935379982, -0.05566202849149704, 0.10488859564065933, -0.06825467199087143, 0.017698410898447037, 0.03629986196756363, -0.007558934856206179, 0.03839174285531044, -0.03614538908004761, 0.03720097616314888, -0.017151255160570145, -0.1250300109386444, 0.0019907525274902582, 0.0728442594408989, 0.01470215618610382, -0.07620713114738464, -0.18678905069828033, -0.09793087840080261, 0.009630962274968624, -0.009681159630417824, -0.05708300694823265, -0.0116368243470788, 0.03600120171904564, 0.03707928583025932, -0.12761807441711426, -0.0870739221572876, -0.036124687641859055, -0.021374277770519257, 0.049994129687547684, 0.004810422658920288, -0.002451003296300769, -0.05342594534158707, 0.0774657130241394, -0.004595826379954815, -0.12621352076530457, -0.032454777508974075, -0.08179132640361786, -0.05003054812550545, -0.016821755096316338, 0.033625826239585876, -0.13710717856884003, 0.01011211983859539, 0.06440635025501251, 0.03884211182594299, 0.07480036467313766, -0.06854666024446487, -0.019153660163283348, 0.022665925323963165, 0.057830069214105606, -0.05870755761861801, -0.09208838641643524, 0.12362764030694962, 0.058099836111068726, 0.03810832276940346, -0.03011990338563919, -0.015928030014038086, -0.039218734949827194, -0.02014569751918316, 0.05999055504798889, 0.12510591745376587, 0.06626348942518234, -0.025229526683688164, -0.06918752193450928, 0.1679695099592209, -0.10928396880626678, -0.017121626064181328, 0.04414373263716698, -0.043059881776571274, 0.01046551764011383, 0.011746866628527641, -0.03839771822094917, -0.12621448934078217, 0.015822453424334526, -0.05332476645708084, -0.040311288088560104, -0.04408413916826248, -0.07894681394100189, 0.015036452561616898, -0.020119085907936096, -0.07008649408817291, -0.14746689796447754, -0.15943172574043274, -0.031009800732135773, 0.0047386703081429005, -0.05278389900922775, -0.001338086323812604, -0.06092866510152817, -0.08995262533426285, 0.02943899668753147, -0.027253389358520508, 0.014677598141133785, -0.04822773113846779, 0.06650117039680481, 0.009065375663340092, 0.0718582272529602, 0.038473840802907944, -0.009219682775437832, -0.1019969955086708, -0.027432242408394814, -0.19756370782852173, 0.14580245316028595, -0.043472770601511, 0.035175859928131104, -0.12552955746650696, -0.009877803735435009, 
-0.11967075616121292, -0.005283703561872244, 0.02851477824151516, 0.2128501683473587, -0.18617594242095947, -0.04827708378434181, 0.1386159509420395, -0.12685680389404297, -0.03764939308166504, 0.14042659103870392, 0.017165658995509148, 0.0545576736330986, 0.10626331716775894, 0.1258222907781601, 0.07865826785564423, -0.09184810519218445, -0.08981068432331085, -0.0007755953702144325, -0.11246300488710403, 0.06664246320724487, 0.0024520540609955788, -0.01774619333446026, 0.035952139645814896, -0.013972505927085876, 0.008032824844121933, 0.047000110149383545, 0.004782326519489288, -0.041017718613147736, 0.0004416255105752498, -0.03735993802547455, 0.01823258213698864, -0.02415735460817814, -0.014851544983685017, -0.03568081930279732, -0.13099975883960724, -0.0374726802110672, 0.07324934750795364, -0.032623764127492905, 0.030540816485881805, -0.10830730944871902, 0.014985212124884129, -0.04038739576935768, 0.060983769595623016, -0.14101848006248474, -0.009839803911745548, -0.001830446533858776, -0.03587784245610237, 0.10743981599807739, -0.053902704268693924, 0.06063437461853027, 0.002335798228159547, -0.03073381818830967, -0.04272967949509621, 0.03431639075279236, -0.022327998653054237, -0.07072197645902634, -0.1387658715248108, 0.05584659054875374, -0.0077684237621724606, 0.04449118673801422, -0.1636570394039154, 0.030633896589279175, 0.08965456485748291, 0.06510338932275772, 0.0723210945725441, -0.035954203456640244, 0.08959002792835236, 0.009336238726973534, -0.008112042210996151, -0.05282796919345856, 0.0069892448373138905, 0.000540614768397063, -0.03465883433818817, 0.16303567588329315, -0.13220065832138062, -0.11107504367828369, 0.12430793046951294, 0.049807533621788025, -0.09004300087690353, 0.17409247159957886, -0.015993928536772728, -0.04009842500090599, -0.005835891235619783, -0.03142237290740013, 0.14918233454227448, 0.0026875741314142942, 0.12208035588264465, -0.0888385996222496, -0.06593749672174454, 0.03897072374820709, -0.01979360543191433, -0.04480549320578575, 0.09214229136705399, -0.024197667837142944, -0.12298475205898285, 0.03443901613354683, 0.12370472401380539, -0.0029080905951559544, 0.17902988195419312, 0.015609141439199448, -0.07124830037355423, 0.011408649384975433, 0.04922270402312279, 0.02774062752723694, 0.005114066880196333, -0.07745613902807236, 0.006665696389973164, 0.004380042664706707, 0.021554255858063698, 0.032055966556072235, -0.041057538241147995, 0.0753624439239502, -0.014321205206215382, -0.07554109394550323, -0.03486262261867523, 0.06255213916301727, -0.011175304651260376, 0.08910475671291351, -0.005333464592695236, 0.049694694578647614, -0.018955819308757782, -0.027639463543891907, -0.14023154973983765, 0.10867372900247574, -0.07781078666448593, -0.20431412756443024, -0.0649641752243042, -0.006091617047786713, -0.06612107902765274, -0.0032835903111845255, 0.05897485464811325, -0.06733899563550949, -0.049085505306720734, -0.13028885424137115, 0.037398453801870346, 0.00014325838128570467, 0.023504698649048805, -0.07564931362867355, 0.030281774699687958, 0.04363684728741646, -0.10335972905158997, 0.04531911760568619, -0.007970208302140236, -0.09021854400634766, 0.034693215042352676, -0.007152040489017963, 0.023912595584988594, 0.10380309075117111, -0.008039956912398338, -0.001821233076043427, -0.07299486547708511, 0.21986165642738342, -0.08723966777324677, 0.10345443338155746, 0.1771533489227295, -0.005591976922005415, 0.05075226351618767, 0.08484520018100739, -0.03432078659534454, -0.0743197351694107, 0.06930454820394516, 0.03984253481030464, 
-0.018627559766173363, -0.18357278406620026, -0.013158713467419147, -0.053423598408699036, 0.08011892437934875, 0.051397185772657394, 0.0945194810628891, -0.014275064691901207, -0.019426444545388222, -0.06485527008771896, -0.026471808552742004, 0.05845740810036659, 0.08874141424894333, 0.02582026645541191, 0.0018544032936915755, 0.08022338896989822, -0.06651411205530167, 0.03489778935909271, 0.14690452814102173, -0.09154555201530457, 0.13000714778900146, -0.06089075282216072, 0.16149401664733887, 0.0491141714155674, 0.09044177085161209, 0.12513121962547302, -0.019015561789274216, -0.06560385227203369, 0.034718554466962814, -0.0014884939882904291, -0.09135973453521729, -0.031108174473047256, 0.01731838658452034, -0.004408631473779678, -0.005793874152004719, 0.03798181563615799, -0.003467013593763113, 0.10353358834981918, 0.28109505772590637, 0.010433701798319817, -0.12442111223936081, -0.07449547946453094, -0.0005663278279826045, -0.09777405858039856, -0.033578112721443176, 0.009076745249330997, 0.092631034553051, -0.04296520724892616, 0.047197531908750534, 0.010218039155006409, 0.08396551758050919, -0.12136872112751007, -0.00419896375387907, 0.0037710501346737146, 0.04326358437538147, -0.03179137781262398, 0.03117191605269909, -0.18898554146289825, 0.06606324762105942, 0.030300604179501534, 0.13338160514831543, -0.0432780496776104, 0.011382797732949257, 0.028747037053108215, -0.0200846865773201, 0.12847980856895447, 0.0026784336660057306, 0.011599141173064709, -0.04404830560088158, -0.17315901815891266, -0.01886129565536976, 0.12390556931495667, -0.02982119657099247, 0.04329841956496239, -0.013695362955331802, -0.044809889048337936, 0.0016454980941489339, -0.02703525871038437, -0.1227840930223465, -0.10966570675373077, 0.020133204758167267, 0.03230512887239456, 0.01933654397726059, -0.0075882975943386555, -0.0176993478089571, 0.027699293568730354, 0.11213300377130508, -0.1812574863433838, -0.050578244030475616, -0.10186157375574112, -0.00958019495010376, 0.026402480900287628, -0.06178548187017441, 0.00475967675447464, -0.015117109753191471, 0.019271910190582275, 0.016940459609031677, -0.10475056618452072, 0.06674962490797043, -0.0907377079129219, -0.13841848075389862, 0.002934585092589259, 0.13644666969776154, 0.048346951603889465, 0.016724372282624245, -0.0009919627336785197, 0.0019272651989012957, -0.03703406825661659, -0.10512703657150269, 0.023648234084248543, 0.0714704841375351, -0.011046032421290874, 0.0073769292794167995, 0.005164843052625656, -0.23831643164157867, -0.07677514106035233, -0.02456086128950119, 0.023638656362891197, 0.24909542500972748, -0.01383891049772501, 0.14621645212173462, 0.17934580147266388, -0.07354596257209778, -0.2198392003774643, -0.027162320911884308, 0.043681394308805466, 0.011965098790824413, -0.013130273669958115, -0.14319093525409698, 0.08448963612318039, 0.023129569366574287, 0.007554995361715555, 0.08680520206689835, -0.2127930372953415, -0.14498314261436462, 0.04786849021911621, 0.009266985580325127, 0.1563364714384079, -0.08028464764356613, -0.03910507634282112, 0.010425879620015621, 0.0328378789126873, 0.06509757786989212, -0.10825032740831375, 0.0877801924943924, 0.032786235213279724, -0.08132780343294144, 0.01850515604019165, -0.022389620542526245, 0.004817164037376642, -0.03601212799549103, 0.04951194301247597, -0.06653517484664917, -0.003910573199391365, 0.12804380059242249, -0.039423998445272446, 0.07795718312263489, 0.009157046675682068, 0.07890256494283676, -0.12229322642087936, -0.06143674626946449, -0.03939821571111679, 
0.04101140424609184, 0.001859121723100543, -0.021947549656033516, -0.008763344027101994, 0.03981788083910942, 0.04017959535121918, -0.007627745158970356, -0.013233823701739311, -0.1287887841463089, -0.04045115038752556, 0.18275569379329681, 0.16827832162380219, -0.07172664999961853, 0.02175622247159481, -0.001249567256309092, -0.044387467205524445, 0.05233084782958031, -0.1870659440755844, 0.05486849695444107, 0.010058484971523285, 0.025691179558634758, 0.03781091049313545, -0.007810783106833696, -0.07520271092653275, 0.07045242935419083, 0.08782151341438293, -0.19307658076286316, -0.11986493319272995, -0.04288223013281822, -0.16985836625099182, -0.06567845493555069, -0.009831267409026623, 0.16643038392066956, -0.07251708954572678, 0.011743239127099514, 0.04086751863360405, 0.009219987317919731, 0.017270678654313087, 0.08822702616453171, -0.004304044414311647, -0.014421531930565834, -0.05790608003735542, 0.09446507692337036, 0.09016967564821243, -0.04869961738586426, 0.07095261663198471, 0.1646023988723755, -0.13892677426338196, -0.06954833120107651, -0.12438790500164032, 0.034111037850379944, 0.14532946050167084, -0.004600156098604202, 0.002896002260968089, 0.030197976157069206, 0.01253578718751669, 0.12193000316619873, 0.0007525973487645388, 0.045643776655197144, -0.017246069386601448, 0.03754164278507233, -0.03194370120763779, 0.12213071435689926, 0.0014335300074890256, -0.010356453247368336, -0.06798970699310303, 0.04411141946911812, 0.018783750012516975, 0.01849517412483692, -0.041503697633743286, -0.04821223393082619, -0.07070724666118622, -0.01521778479218483, 0.014755534939467907, 0.034644726663827896, -0.05886084586381912, 0.0008788043051026762, -0.022405853495001793, -0.017910407856106758, -0.038260865956544876, 0.015866829082369804, -0.0021396856755018234, -0.022751711308956146, -0.017106860876083374, 0.04372722655534744, -0.1281071901321411, 0.04120873659849167, 0.00853960495442152, -0.05599929019808769, 0.16234634816646576, 0.07456517219543457, -0.07130897045135498, 0.040828801691532135, -0.03373308107256889, -0.01770261861383915, -0.002089823130518198, 0.03666068613529205, -0.0035256154369562864, -0.06880607455968857, 0.028664980083703995, 0.025984719395637512, -0.02007983811199665, -0.002826340263709426, 0.08840242028236389, -0.10930059105157852, 0.0706431120634079, 0.01404805388301611, 0.022350508719682693, -0.06966863572597504, 0.03916159272193909, 0.08351320773363113, 0.04156232997775078, 0.15919847786426544, -0.08644598722457886, -0.007983195595443249, -0.13376067578792572, -0.009412449784576893, 0.03082212433218956, -0.0564788393676281, -0.09101340174674988, -0.009434576146304607, 0.09030792117118835, -0.08998694270849228, 0.012289654463529587, -0.017237260937690735, -0.04463857412338257, 0.07788267731666565, -0.09367713332176208, -0.09228074550628662, 0.003271344816312194, 0.13550615310668945, 0.026263747364282608, -0.037792980670928955, 0.01336644683033228, -0.001540712546557188, 0.0004769944935105741, 0.08085685223340988, 0.10723788291215897, 0.1333586424589157, 0.06425219029188156, 0.061927009373903275, 0.0038865217939019203, 0.004777104593813419, -0.1406673640012741, 0.13704080879688263, -0.15591931343078613, 0.09067907184362411, -0.06835798919200897, 0.06519949436187744, 0.21179461479187012, -0.025319892913103104, 0.01945400983095169, 0.015905478969216347, -0.03650245815515518, -0.11329389363527298, -0.26085951924324036, -0.05690164491534233, -0.06877123564481735, 0.00042235886212438345, -0.07787637412548065, 0.05229313299059868, 0.01553268451243639, 
0.0788160115480423, -0.013176852837204933, 0.13031987845897675, 0.03424974903464317, -0.101188063621521, 0.15077997744083405, -0.049830127507448196, -0.0029476832132786512, 0.024767613038420677, -0.000824775779619813, 0.01911098323762417, -0.01475189346820116, 0.04420418664813042, 0.0583680160343647, -0.024291420355439186, 0.03766311705112457, -0.13801412284374237, -0.11272066831588745, 0.01268682349473238, 0.0663759633898735, 0.011343255639076233, 0.12664099037647247, 0.11280880123376846, -0.08448905497789383, 0.018872220069169998, 0.22606255114078522, -0.0333988182246685, -0.13133381307125092, -0.12763682007789612, 0.09770549833774567, 0.034331243485212326, -0.0005351402214728296, -0.030497165396809578, -0.12865793704986572, 0.03377151116728783, 0.20372037589550018, 0.15500985085964203, -0.052678368985652924, 0.036984384059906006, -0.013757888227701187, 0.014283129945397377, 0.011722578667104244, 0.10226045548915863, 0.06378906220197678, 0.22437718510627747, -0.0025688258465379477, 0.05075344443321228, 0.005821924656629562, -0.03598586469888687, -0.06660222262144089, 0.20239126682281494, -0.016862226650118828, -0.022620845586061478, -0.047619253396987915, 0.09054888039827347, -0.062398310750722885, -0.17888404428958893, -0.06421659141778946, -0.0653311237692833, -0.08475081622600555, 0.01749255694448948, -0.028211981058120728, 0.0390905886888504, 0.04883468896150589, 0.015340061858296394, -0.01696019247174263, 0.04043152555823326, 0.0062537179328501225, -0.0559169203042984, -0.05766190588474274, 0.05162602290511131, -0.09164845943450928, 0.048299577087163925, 0.006446320563554764, 0.05227423831820488, 0.08660732954740524, 0.0015413227956742048, -0.1673264056444168, 0.10291735827922821, -0.021807169541716576, -0.12166354060173035, 0.024220354855060577, 0.16381122171878815, 0.013216083869338036, 0.11271991580724716, 0.07246846705675125, 0.06384957581758499, 0.02097027748823166, -0.07941960543394089, -0.016814518719911575, -0.10993730276823044, 0.05995805934071541, -0.03963721916079521, 0.10977929085493088, 0.0872625932097435, -0.033560626208782196, 0.0038743182085454464, -0.05611804500222206, -0.026786111295223236, -0.048069264739751816, 0.03685830906033516, 0.001436965772882104, -0.1816125512123108, -0.019843433052301407, 0.04672800004482269, 0.10445509105920792, -0.15814733505249023, -0.005769785027951002, -0.03855879232287407, 0.01862587220966816, -0.04726383835077286, 0.12054028362035751, 0.07281732559204102, 0.02145908586680889, -0.00783743616193533, 0.009273868054151535, 0.023907389491796494, 0.11418598890304565, -0.09898199886083603, -0.08795056492090225 ]
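The ByT5 hate-speech record above frames classification as text generation but its "Model in Action" section is empty. The following is only a hedged sketch of how such a generate-the-label checkpoint is typically queried with the `transformers` API; the repository id is a placeholder because the actual id is not shown in this excerpt, and the decoding settings are illustrative rather than taken from the card.

```python
# Hedged sketch only: the checkpoint id is a placeholder, and the decoding
# settings are illustrative rather than documented in the card.
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_id = "<byt5-base-tweets-hate-speech-checkpoint>"  # placeholder, not the real repo name
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

tweet = "what a wonderful day with friends and family"
inputs = tokenizer(tweet, return_tensors="pt")

# Classification is framed as text generation: the model is expected to emit
# the label as text, '1' for a racist/sexist tweet and '0' otherwise.
output_ids = model.generate(**inputs, max_new_tokens=2)
label = tokenizer.decode(output_ids[0], skip_special_tokens=True)
print(label)
```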
null
null
transformers
# distilRoberta-stereotype

This model is a fine-tuned version of [distilroberta-base](https://huggingface.co/distilroberta-base) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0651
- Accuracy: 0.9892

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 5

### Training results

| Training Loss | Epoch | Step  | Validation Loss | Accuracy |
|:-------------:|:-----:|:-----:|:---------------:|:--------:|
| 0.0783        | 1.0   | 5615  | 0.0703          | 0.9847   |
| 0.0468        | 2.0   | 11230 | 0.0573          | 0.9863   |
| 0.0316        | 3.0   | 16845 | 0.0580          | 0.9882   |
| 0.0172        | 4.0   | 22460 | 0.0591          | 0.9885   |
| 0.0098        | 5.0   | 28075 | 0.0651          | 0.9892   |

### Framework versions

- Transformers 4.10.2
- Pytorch 1.9.0+cu102
- Datasets 1.11.0
- Tokenizers 0.10.3

Created by: [Narrativa](https://www.narrativa.com/)

About Narrativa: Natural Language Generation (NLG) | Gabriele, our machine learning-based platform, builds and deploys natural language solutions. #NLG #AI
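The card above has no usage section, so the following is a minimal, hedged sketch of querying the published checkpoint (the repository id comes from this record) through the `transformers` pipeline API. The example sentence is the widget text from the card's metadata; the label names returned depend on the model's config and are not documented in the card.

```python
# Minimal usage sketch, not part of the original card.
from transformers import pipeline

classifier = pipeline(
    "text-classification",
    model="Narrativa/distilroberta-finetuned-stereotype-detection",
)

text = (
    "Cauterize is not just for fans of the guitarist or his other projects, "
    "but those that love music that is both aggressive and infectious and "
    "gave the album 4 out of 5 stars ."
)
print(classifier(text))  # e.g. [{'label': ..., 'score': ...}]
```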
{"license": "apache-2.0", "tags": ["generated_from_trainer", "stereotype", "gender", "gender_bias"], "metrics": ["accuracy"], "widget": [{"text": "Cauterize is not just for fans of the guitarist or his other projects, but those that love music that is both aggressive and infectious and gave the album 4 out of 5 stars ."}]}
text-classification
Narrativa/distilroberta-finetuned-stereotype-detection
[ "transformers", "pytorch", "tensorboard", "roberta", "text-classification", "generated_from_trainer", "stereotype", "gender", "gender_bias", "license:apache-2.0", "model-index", "autotrain_compatible", "endpoints_compatible", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[]
TAGS #transformers #pytorch #tensorboard #roberta #text-classification #generated_from_trainer #stereotype #gender #gender_bias #license-apache-2.0 #model-index #autotrain_compatible #endpoints_compatible #region-us
distilRoberta-stereotype ======================== This model is a fine-tuned version of distilroberta-base on an unknown dataset. It achieves the following results on the evaluation set: * Loss: 0.0651 * Accuracy: 0.9892 Model description ----------------- More information needed Intended uses & limitations --------------------------- More information needed Training and evaluation data ---------------------------- More information needed Training procedure ------------------ ### Training hyperparameters The following hyperparameters were used during training: * learning\_rate: 2e-05 * train\_batch\_size: 16 * eval\_batch\_size: 16 * seed: 42 * optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 * lr\_scheduler\_type: linear * num\_epochs: 5 ### Training results ### Framework versions * Transformers 4.10.2 * Pytorch 1.9.0+cu102 * Datasets 1.11.0 * Tokenizers 0.10.3 Created by: Narrativa About Narrativa: Natural Language Generation (NLG) | Gabriele, our machine learning-based platform, builds and deploys natural language solutions. #NLG #AI
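For reference, the training hyperparameters listed in this card map directly onto Hugging Face `TrainingArguments`. The sketch below is an assumption-laden reconstruction only: the output directory is hypothetical, and the dataset, preprocessing, and Trainer wiring used for the original run are not documented in the card.

```python
# Sketch of the listed hyperparameters expressed as TrainingArguments;
# dataset loading and the Trainer itself are intentionally omitted.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="distilroberta-stereotype",  # hypothetical output path
    learning_rate=2e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=5,
)
```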
[ "### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 5", "### Training results", "### Framework versions\n\n\n* Transformers 4.10.2\n* Pytorch 1.9.0+cu102\n* Datasets 1.11.0\n* Tokenizers 0.10.3\n\n\nCreated by: Narrativa\n\n\nAbout Narrativa: Natural Language Generation (NLG) | Gabriele, our machine learning-based platform, builds and deploys natural language solutions. #NLG #AI" ]
[ "TAGS\n#transformers #pytorch #tensorboard #roberta #text-classification #generated_from_trainer #stereotype #gender #gender_bias #license-apache-2.0 #model-index #autotrain_compatible #endpoints_compatible #region-us \n", "### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 5", "### Training results", "### Framework versions\n\n\n* Transformers 4.10.2\n* Pytorch 1.9.0+cu102\n* Datasets 1.11.0\n* Tokenizers 0.10.3\n\n\nCreated by: Narrativa\n\n\nAbout Narrativa: Natural Language Generation (NLG) | Gabriele, our machine learning-based platform, builds and deploys natural language solutions. #NLG #AI" ]
[ 73, 98, 4, 80 ]
[ "passage: TAGS\n#transformers #pytorch #tensorboard #roberta #text-classification #generated_from_trainer #stereotype #gender #gender_bias #license-apache-2.0 #model-index #autotrain_compatible #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 5### Training results### Framework versions\n\n\n* Transformers 4.10.2\n* Pytorch 1.9.0+cu102\n* Datasets 1.11.0\n* Tokenizers 0.10.3\n\n\nCreated by: Narrativa\n\n\nAbout Narrativa: Natural Language Generation (NLG) | Gabriele, our machine learning-based platform, builds and deploys natural language solutions. #NLG #AI" ]
[ -0.03298669308423996, 0.27376335859298706, -0.0062947869300842285, 0.04763844981789589, 0.10138124972581863, 0.005608066916465759, 0.10709284991025925, 0.13202673196792603, -0.032159630209207535, 0.03374427556991577, 0.12013562768697739, 0.0813235193490982, 0.0336514487862587, 0.1139831393957138, -0.01607326976954937, -0.2917729318141937, -0.006346062757074833, 0.008290833793580532, -0.048453401774168015, 0.12820352613925934, 0.10701003670692444, -0.06028221547603607, 0.0808819904923439, 0.0245403703302145, -0.0318855457007885, -0.030497292056679726, 0.0017765748780220747, -0.04590465500950813, 0.07066814601421356, 0.06415960937738419, 0.08289274573326111, -0.017383968457579613, 0.01110733300447464, -0.2840806841850281, 0.015398395247757435, 0.0069162845611572266, 0.010934586636722088, 0.07008593529462814, 0.060027386993169785, -0.08457782119512558, 0.24666641652584076, -0.09777849167585373, 0.052249595522880554, 0.06603123992681503, -0.1393633633852005, -0.15906772017478943, -0.04130333289504051, -0.00610274588689208, 0.0007944059325382113, 0.10457602143287659, -0.053346969187259674, 0.05307193100452423, -0.10672245919704437, 0.06758338212966919, 0.15200753509998322, -0.20413467288017273, -0.07348108291625977, 0.038345661014318466, 0.07331588864326477, 0.08201068639755249, -0.11451863497495651, 0.035649266093969345, 0.03432678058743477, 0.0212092287838459, 0.0620700940489769, -0.034611910581588745, 0.05685430392622948, -0.029717445373535156, -0.1233213022351265, -0.05684405937790871, 0.1405993103981018, 0.04355774074792862, -0.009995222091674805, -0.11936791986227036, -0.003271023742854595, -0.06063845008611679, -0.02691572904586792, 0.01828586868941784, -0.01569645293056965, -0.013075641356408596, -0.01895599253475666, -0.10848450660705566, -0.05691203102469444, 0.003428931115195155, 0.03324344381690025, 0.13112039864063263, 0.047434546053409576, 0.009611458517611027, -0.05585848540067673, 0.03718336671590805, 0.08150482922792435, -0.11726831644773483, -0.0039788903668522835, -0.0022202543914318085, 0.02253691852092743, 0.003823883831501007, -0.015080698765814304, 0.04308542236685753, 0.07787088304758072, 0.1318097561597824, -0.09357018023729324, 0.075788214802742, -0.014403189532458782, 0.02601286582648754, -0.019915562123060226, 0.13608603179454803, -0.007710264530032873, -0.10676293820142746, 0.0034928149543702602, 0.02060772106051445, -0.012940258719027042, -0.06780213862657547, -0.07522356510162354, -0.0011111112544313073, 0.04411185160279274, 0.1251494288444519, 0.03381330892443657, 0.06468617171049118, -0.03260847553610802, -0.052588608115911484, -0.03661602362990379, -0.08703357726335526, 0.04633776471018791, 0.03768395259976387, -0.046651143580675125, 0.033514462411403656, -0.017970480024814606, -0.009985210373997688, -0.07827243953943253, -0.06509800255298615, -0.07536327838897705, -0.019878214225172997, -0.09037849307060242, -0.0512981042265892, 0.04173694923520088, -0.03899070993065834, 0.0012996364384889603, -0.10255105048418045, -0.14684438705444336, -0.03549196943640709, 0.03706183284521103, -0.08734506368637085, -0.08324672281742096, -0.10786605626344681, -0.056535325944423676, 0.04881379380822182, -0.005249231122434139, 0.010144012048840523, -0.05435321480035782, 0.08100974559783936, -0.022635286673903465, 0.02349836751818657, 0.006505904719233513, 0.03344625234603882, -0.08403943479061127, 0.028987793251872063, -0.067438043653965, 0.1159050241112709, -0.05466669425368309, 0.015488640405237675, -0.101421058177948, -0.09733632951974869, 0.007541170343756676, 
0.03251894563436508, -0.0017245119670405984, 0.10276240855455399, -0.11467625200748444, -0.049769770354032516, 0.169244647026062, -0.05008469521999359, -0.01830819807946682, 0.12053752690553665, -0.025978947058320045, 0.02056434191763401, 0.06023520231246948, 0.19746559858322144, 0.10408362746238708, 0.021525021642446518, 0.06719979643821716, -0.04380490630865097, 0.06137586757540703, 0.0699910819530487, 0.027280928567051888, -0.02248210646212101, 0.0069431122392416, 0.010149705223739147, -0.05501263961195946, 0.06298123300075531, -0.066045381128788, -0.13042323291301727, 0.0027663519140332937, -0.08466456830501556, 0.08293111622333527, 0.03277766332030296, 0.005225654691457748, -0.07162263244390488, -0.08265390992164612, -0.0033817330840975046, 0.08152391761541367, -0.08645367622375488, 0.006295091938227415, -0.1199600026011467, 0.11864107847213745, 0.03732764720916748, 0.007508367300033569, -0.13588860630989075, -0.10171922296285629, 0.02751249633729458, -0.11185894906520844, 0.10290558636188507, 0.03798944503068924, 0.059315722435712814, 0.03233296796679497, -0.068575918674469, -0.00630745617672801, -0.02626487985253334, -0.018903806805610657, -0.018174687400460243, -0.19935186207294464, 0.05681841820478439, -0.029234647750854492, 0.12609490752220154, -0.31226861476898193, 0.02742689847946167, 0.045862846076488495, 0.09170183539390564, 0.031922776252031326, -0.04516513645648956, 0.003700629575178027, 0.042481325566768646, -0.03177511692047119, -0.043713562190532684, 0.059667788445949554, -0.036353982985019684, -0.12385241687297821, 0.07462753355503082, -0.1559045910835266, 0.05726920813322067, 0.08797340095043182, -0.13228225708007812, -0.05413484573364258, 0.020435674116015434, -0.057565852999687195, -0.0018035579705610871, 0.010039902292191982, -0.027134155854582787, 0.2203596830368042, -0.02540643699467182, 0.0948532298207283, -0.07402095943689346, -0.04596862196922302, -0.014261766336858273, -0.05467350780963898, -0.047027818858623505, 0.15559124946594238, 0.020106090232729912, -0.08660102635622025, 0.1363024115562439, 0.11989099532365799, -0.04164176434278488, 0.2849656939506531, -0.008709842339158058, -0.03754095733165741, -0.007472129072993994, 0.01854715123772621, -0.010100211016833782, 0.028733156621456146, -0.1667478084564209, -0.028329143300652504, 0.025945819914340973, 0.014207019470632076, 0.03987312316894531, -0.13592112064361572, -0.03909824416041374, -0.016959523782134056, -0.08620911836624146, 0.030748041346669197, 0.03197193145751953, 0.013792240060865879, 0.1281813383102417, 0.016675712540745735, -0.04890067130327225, 0.009022281505167484, -0.04190026596188545, -0.08308640867471695, 0.17858386039733887, -0.0741172805428505, -0.23523738980293274, -0.14645503461360931, -0.02921346202492714, -0.08274123817682266, 0.0056550693698227406, 0.012951808981597424, -0.09278654307126999, -0.05557030439376831, -0.06434013694524765, 0.10773289948701859, -0.05341142788529396, -0.03510873019695282, -0.044404447078704834, 0.034743428230285645, 0.058776866644620895, -0.12724827229976654, -0.00045960996067151427, 0.035770658403635025, -0.09260838478803635, 0.04323175176978111, -0.01983591914176941, 0.10723377764225006, 0.1507486253976822, -0.020586693659424782, -0.02077031135559082, -0.009432016871869564, 0.27900001406669617, -0.15171457827091217, 0.021281491965055466, 0.14296060800552368, -0.014311636798083782, 0.045688945800065994, 0.09882034361362457, 0.0474972277879715, -0.09974899142980576, 0.01909933239221573, 0.08390843123197556, -0.06185004487633705, -0.27753549814224243, 
-0.09831155091524124, -0.0644163116812706, -0.0030289827845990658, 0.08330658823251724, 0.07404544949531555, 0.1164332777261734, 0.007782553788274527, -0.05705007165670395, 0.009669261053204536, 0.07626110315322876, 0.07369499653577805, 0.14889483153820038, 0.0019757936242967844, 0.10614585131406784, -0.01878178119659424, -0.06846088916063309, 0.026117580011487007, 0.01129826344549656, 0.19497281312942505, 0.05037439614534378, 0.181592658162117, 0.07152858376502991, 0.09939689189195633, 0.03407508507370949, -0.057525645941495895, 0.043289974331855774, 0.01866423338651657, -0.028988836333155632, -0.06492798775434494, -0.01095041073858738, 0.05419471487402916, 0.05179949104785919, -0.023100027814507484, -0.003239857265725732, -0.0920267105102539, 0.07016412168741226, 0.18238037824630737, 0.06509852409362793, -0.2102225124835968, -0.02105218917131424, 0.09124735742807388, -0.07572135329246521, -0.07791054993867874, 0.05558360368013382, 0.06356478482484818, -0.14059041440486908, 0.08607180416584015, -0.016959981992840767, 0.10693541914224625, -0.1577509492635727, 0.0379083976149559, -0.00631615798920393, 0.04395415261387825, 0.019980160519480705, 0.12268972396850586, -0.3094022870063782, 0.2405371218919754, 0.021075166761875153, 0.03356330841779709, -0.06713010370731354, -0.025263143703341484, 0.024364938959479332, -0.032128315418958664, 0.08214664459228516, 0.0326983667910099, -0.027495579794049263, -0.12522459030151367, -0.01740899123251438, 0.024950899183750153, 0.1131269633769989, -0.041339755058288574, 0.07205816358327866, -0.04656054452061653, 0.011876609176397324, 0.003918321803212166, -0.11925313621759415, -0.14560240507125854, -0.13245521485805511, 0.008887462317943573, 0.013230379670858383, 0.021466504782438278, -0.04549673944711685, -0.06691566854715347, -0.04862675815820694, 0.17891326546669006, -0.08448100835084915, -0.06227889657020569, -0.08169835805892944, 0.030204910784959793, 0.07450839132070541, -0.07598951458930969, 0.012484000064432621, 0.016023794189095497, 0.1502881795167923, 0.005179017782211304, -0.04292147606611252, 0.054882049560546875, -0.029624367132782936, -0.2129867821931839, -0.03681015595793724, 0.16224777698516846, 0.09307924658060074, 0.07886873930692673, -0.0016394122503697872, 0.0015232200967147946, -0.010223068296909332, -0.12608115375041962, 0.033644240349531174, 0.0935693234205246, -0.08376838266849518, 0.0534796342253685, -0.008386987261474133, -0.01199040375649929, -0.12932822108268738, -0.07287739962339401, 0.1600838452577591, 0.21439452469348907, -0.039143677800893784, 0.04869093373417854, 0.12751175463199615, -0.11636193841695786, -0.22559796273708344, -0.0037563631776720285, 0.015754710882902145, -0.026439227163791656, 0.0167249646037817, -0.21952050924301147, 0.025409622117877007, 0.04586617648601532, 0.006996259093284607, 0.10383442044258118, -0.2678736746311188, -0.11148810386657715, 0.07672500610351562, 0.1222183033823967, 0.05960909277200699, -0.1372465342283249, -0.059835031628608704, -0.021466605365276337, -0.02240123599767685, 0.15575723350048065, -0.04517329856753349, 0.08502533286809921, -0.00910975132137537, 0.14500215649604797, 0.02553544193506241, -0.012505529448390007, 0.1572255641222, -0.04885680601000786, 0.04253171756863594, -0.09334743022918701, 0.015519949607551098, 0.07977242022752762, -0.0097972322255373, 0.03014674410223961, -0.11810395121574402, -0.026965150609612465, -0.11289052665233612, -0.04892773553729057, -0.09663461148738861, 0.038675665855407715, -0.059912361204624176, -0.05520237237215042, -0.06568795442581177, 
0.05516918748617172, 0.06734753400087357, 0.021338751539587975, 0.141275092959404, -0.074591726064682, 0.11318960040807724, 0.051730625331401825, 0.16520029306411743, 0.008468124084174633, -0.021908393129706383, -0.029002545401453972, -0.006366616580635309, 0.04454079270362854, -0.12022823095321655, 0.028611965477466583, 0.1281791776418686, -0.0005862314137630165, 0.16713237762451172, 0.03498714044690132, -0.08564092218875885, 0.00959313940256834, 0.05103171616792679, -0.08994858711957932, -0.20265910029411316, -0.03129947930574417, -0.03413444757461548, -0.08819270879030228, -0.06674674898386002, 0.15108756721019745, -0.0357254222035408, -0.02404726855456829, 0.004471085499972105, 0.04222937673330307, -0.0255606509745121, 0.12006346881389618, 0.06269991397857666, 0.04470186308026314, -0.07913610339164734, 0.0475173145532608, 0.09761238098144531, -0.03188209608197212, 0.06833051145076752, 0.1420893371105194, -0.024950388818979263, -0.07421733438968658, 0.01027399767190218, 0.1504669338464737, -0.06848777085542679, -0.051993079483509064, -0.052016481757164, -0.1489439457654953, 0.06140923500061035, 0.12287568300962448, 0.02116517908871174, 0.03247583657503128, -0.03655961900949478, -0.005718809086829424, -0.040597956627607346, 0.1052941381931305, -0.0092791011556983, -0.019745802506804466, -0.0892924889922142, 0.0788598507642746, 0.03290539234876633, 0.009779826737940311, -0.0224615428596735, -0.03201381862163544, -0.13763609528541565, 0.03543495759367943, -0.08766797930002213, 0.025784319266676903, -0.08382007479667664, 0.04592634737491608, -0.04363953694701195, -0.06077086552977562, -0.07168404012918472, -0.011804362758994102, -0.08766107261180878, -0.036119308322668076, 0.014423361048102379, 0.09411894530057907, -0.07871773093938828, -0.021541988477110863, 0.04071091115474701, -0.056992921978235245, 0.06720791012048721, -0.055019646883010864, -0.011039931327104568, 0.08622480928897858, -0.1775878220796585, 0.09166549891233444, 0.023229634389281273, 0.03552244231104851, -0.013452505692839622, -0.1231742724776268, -0.038533762097358704, -0.007886054925620556, 0.04706493392586708, 0.04794740676879883, -0.017132487148046494, -0.10239161550998688, 0.04259021580219269, -0.0028173758182674646, -0.15180674195289612, -0.040045060217380524, 0.07603146880865097, 0.06785957515239716, -0.03753833472728729, 0.13181999325752258, -0.06497050821781158, 0.052609678357839584, -0.14940230548381805, 0.039242591708898544, 0.004953369498252869, -0.04097349941730499, -0.051433637738227844, -0.05146542936563492, 0.057838261127471924, -0.05600305274128914, 0.17083801329135895, 0.028318408876657486, 0.025140702724456787, 0.030119476839900017, 0.00733612896874547, -0.022609760984778404, 0.03314148634672165, 0.08775913715362549, 0.040705230087041855, -0.05030927062034607, 0.019950438290834427, -0.0031271944753825665, 0.02716068923473358, 0.14977669715881348, 0.14140602946281433, 0.1114451140165329, 0.09364117681980133, 0.055169928818941116, 0.01915830746293068, -0.032636530697345734, -0.1394616812467575, 0.08854477107524872, 0.06424537301063538, 0.06536564975976944, -0.07373280823230743, 0.1901404857635498, 0.1544618308544159, -0.17886210978031158, 0.07007396221160889, -0.04695625230669975, -0.08166331052780151, -0.12630009651184082, -0.2819335162639618, -0.06693066656589508, -0.05577758699655533, -0.010541352443397045, -0.1304510086774826, 0.04225888475775719, -0.0012964836787432432, 0.02138388529419899, -0.08270955085754395, 0.04918193444609642, 0.03886552155017853, -0.04533982276916504, 0.057966649532318115, 
0.04129720851778984, 0.07057622820138931, -0.04295552149415016, -0.005215494427829981, -0.02638942189514637, 0.028608446940779686, 0.03040052391588688, 0.012016099877655506, -0.08075420558452606, 0.03158523514866829, -0.09243191033601761, -0.08804672956466675, 0.01678485795855522, 0.02595374546945095, 0.06178569793701172, 0.15419995784759521, 0.028211692348122597, -0.03058697283267975, 0.009780758060514927, 0.23052333295345306, -0.00560449855402112, -0.020008841529488564, -0.09781508892774582, 0.18543589115142822, 0.00934192817658186, 0.02408492937684059, -0.008122415281832218, -0.061475835740566254, -0.00781239802017808, 0.2388186752796173, 0.23622256517410278, -0.032713305205106735, 0.009619086980819702, -0.012100876308977604, 0.02227720059454441, 0.028018049895763397, 0.06692874431610107, 0.10019777715206146, 0.1632649451494217, -0.047985561192035675, -0.027017692103981972, -0.040647588670253754, 0.01248221006244421, -0.0582893081009388, 0.11530879884958267, 0.025563709437847137, 0.0022747789043933153, -0.06779485195875168, 0.07770475745201111, -0.18829602003097534, -0.04531998932361603, -0.015530718490481377, -0.1928943693637848, -0.13985097408294678, -0.016006935387849808, 0.0031691736076027155, 0.025375062599778175, 0.06255783140659332, 0.003297339426353574, -0.05586693435907364, 0.023799212649464607, 0.00893994327634573, -0.07483979314565659, -0.06730353832244873, 0.09392198175191879, -0.03244530037045479, 0.2215951681137085, -0.025523044168949127, 0.016929293051362038, 0.137663334608078, 0.011820674873888493, -0.06854353845119476, 0.05092192441225052, 0.0665878877043724, -0.00042403064435347915, 0.016639841720461845, 0.20253215730190277, -0.02641722746193409, 0.12148799747228622, 0.1498108208179474, -0.019604971632361412, 0.01801856979727745, -0.046414814889431, -0.0606294721364975, -0.03032797947525978, 0.032261621206998825, -0.1119832918047905, 0.08061563968658447, 0.2044256329536438, -0.03743909299373627, -0.034116726368665695, -0.047522641718387604, -0.005292915273457766, -0.007863630540668964, 0.07492829859256744, -0.04999881237745285, -0.19523558020591736, -0.03711835294961929, 0.034131959080696106, 0.05217081680893898, -0.20245692133903503, -0.04803391173481941, -0.04135030508041382, -0.004156441893428564, -0.07401133328676224, 0.12714965641498566, 0.05583282932639122, -0.002261974848806858, -0.031936727464199066, -0.1566610485315323, -0.03273750841617584, 0.14365260303020477, -0.12256868183612823, -0.024377936497330666 ]
null
null
transformers
# mT5-base fine-tuned on TyDiQA for multilingual Question Generation 🗺📖❓

[Google's mT5-base](https://huggingface.co/google/mt5-base) fine-tuned on [TyDi QA](https://huggingface.co/nlp/viewer/?dataset=tydiqa&config=secondary_task) (secondary task) for the **multilingual Question Generation** downstream task (by answer prepending).

## Details of mT5

[Google's mT5](https://github.com/google-research/multilingual-t5)

mT5 is pretrained on the [mC4](https://www.tensorflow.org/datasets/catalog/c4#c4multilingual) corpus, covering 101 languages: Afrikaans, Albanian, Amharic, Arabic, Armenian, Azerbaijani, Basque, Belarusian, Bengali, Bulgarian, Burmese, Catalan, Cebuano, Chichewa, Chinese, Corsican, Czech, Danish, Dutch, English, Esperanto, Estonian, Filipino, Finnish, French, Galician, Georgian, German, Greek, Gujarati, Haitian Creole, Hausa, Hawaiian, Hebrew, Hindi, Hmong, Hungarian, Icelandic, Igbo, Indonesian, Irish, Italian, Japanese, Javanese, Kannada, Kazakh, Khmer, Korean, Kurdish, Kyrgyz, Lao, Latin, Latvian, Lithuanian, Luxembourgish, Macedonian, Malagasy, Malay, Malayalam, Maltese, Maori, Marathi, Mongolian, Nepali, Norwegian, Pashto, Persian, Polish, Portuguese, Punjabi, Romanian, Russian, Samoan, Scottish Gaelic, Serbian, Shona, Sindhi, Sinhala, Slovak, Slovenian, Somali, Sotho, Spanish, Sundanese, Swahili, Swedish, Tajik, Tamil, Telugu, Thai, Turkish, Ukrainian, Urdu, Uzbek, Vietnamese, Welsh, West Frisian, Xhosa, Yiddish, Yoruba, Zulu.

**Note**: mT5 was only pre-trained on mC4 excluding any supervised training. Therefore, this model has to be fine-tuned before it is usable on a downstream task.

Pretraining Dataset: [mC4](https://www.tensorflow.org/datasets/catalog/c4#c4multilingual)

Other Community Checkpoints: [here](https://huggingface.co/models?search=mt5)

Paper: [mT5: A massively multilingual pre-trained text-to-text transformer](https://arxiv.org/abs/2010.11934)

Authors: *Linting Xue, Noah Constant, Adam Roberts, Mihir Kale, Rami Al-Rfou, Aditya Siddhant, Aditya Barua, Colin Raffel*

## Details of the dataset 📚

**TyDi QA** is a question answering dataset covering 11 typologically diverse languages with 204K question-answer pairs. The languages of TyDi QA are diverse with regard to their typology -- the set of linguistic features that each language expresses -- such that we expect models performing well on this set to generalize across a large number of the languages in the world. It contains language phenomena that would not be found in English-only corpora. To provide a realistic information-seeking task and avoid priming effects, questions are written by people who want to know the answer but don’t know the answer yet (unlike SQuAD and its descendants), and the data is collected directly in each language without the use of translation (unlike MLQA and XQuAD).

| Dataset | Task  | Split | # samples |
| ------- | ----- | ----- | --------- |
| TyDi QA | GoldP | train | 49881     |
| TyDi QA | GoldP | valid | 5077      |

## Results on validation dataset 📝

### WIP

## Model in Action 🚀

### WIP

Created by: [Narrativa](https://www.narrativa.com/)

About Narrativa: Natural Language Generation (NLG) | Gabriele, our machine learning-based platform, builds and deploys natural language solutions. #NLG #AI
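The "Model in Action" section above is still marked WIP. The following is only a hedged sketch of how the checkpoint can be queried: the answer-prepended input format mirrors the widget example in the card's metadata, and the generation settings are illustrative rather than taken from the card.

```python
# Hedged sketch for the WIP "Model in Action" section; input format follows
# the card's widget example, generation settings are illustrative.
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_id = "Narrativa/mT5-base-finetuned-tydiQA-question-generation"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

answer = "monitoring and managing PR strategy including relations with the media and journalists"
context = (
    "Sofía has a degree in Communications and public relations agency experience "
    "where she was in charge of monitoring and managing PR strategy including "
    "relations with the media and journalists."
)

inputs = tokenizer(f"answer: {answer} context: {context}", return_tensors="pt")
output_ids = model.generate(**inputs, max_new_tokens=48, num_beams=4)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```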
{"language": "multilingual", "datasets": ["tydiqa"], "widget": [{"text": "answer: monitoring and managing PR strategy including relations with the media and journalists context: Sof\u00eda has a degree in Communications and public relations agency experience where she was in charge of monitoring and managing PR strategy including relations with the media and journalists."}]}
text2text-generation
Narrativa/mT5-base-finetuned-tydiQA-question-generation
[ "transformers", "pytorch", "mt5", "text2text-generation", "multilingual", "dataset:tydiqa", "arxiv:2010.11934", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:04+00:00
[ "2010.11934" ]
[ "multilingual" ]
TAGS #transformers #pytorch #mt5 #text2text-generation #multilingual #dataset-tydiqa #arxiv-2010.11934 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
mT5-base fine-tuned on TyDiQA for multilingual Question Generation ================================================================== Google's mT5-base fine-tuned on TyDi QA (secondary task) for the multilingual Question Generation downstream task (by answer prepending). Details of mT5 -------------- Google's mT5 mT5 is pretrained on the mC4 corpus, covering 101 languages: Afrikaans, Albanian, Amharic, Arabic, Armenian, Azerbaijani, Basque, Belarusian, Bengali, Bulgarian, Burmese, Catalan, Cebuano, Chichewa, Chinese, Corsican, Czech, Danish, Dutch, English, Esperanto, Estonian, Filipino, Finnish, French, Galician, Georgian, German, Greek, Gujarati, Haitian Creole, Hausa, Hawaiian, Hebrew, Hindi, Hmong, Hungarian, Icelandic, Igbo, Indonesian, Irish, Italian, Japanese, Javanese, Kannada, Kazakh, Khmer, Korean, Kurdish, Kyrgyz, Lao, Latin, Latvian, Lithuanian, Luxembourgish, Macedonian, Malagasy, Malay, Malayalam, Maltese, Maori, Marathi, Mongolian, Nepali, Norwegian, Pashto, Persian, Polish, Portuguese, Punjabi, Romanian, Russian, Samoan, Scottish Gaelic, Serbian, Shona, Sindhi, Sinhala, Slovak, Slovenian, Somali, Sotho, Spanish, Sundanese, Swahili, Swedish, Tajik, Tamil, Telugu, Thai, Turkish, Ukrainian, Urdu, Uzbek, Vietnamese, Welsh, West Frisian, Xhosa, Yiddish, Yoruba, Zulu. Note: mT5 was only pre-trained on mC4 excluding any supervised training. Therefore, this model has to be fine-tuned before it is usable on a downstream task. Pretraining Dataset: mC4 Other Community Checkpoints: here Paper: mT5: A massively multilingual pre-trained text-to-text transformer Authors: *Linting Xue, Noah Constant, Adam Roberts, Mihir Kale, Rami Al-Rfou, Aditya Siddhant, Aditya Barua, Colin Raffel* Details of the dataset ---------------------- TyDi QA is a question answering dataset covering 11 typologically diverse languages with 204K question-answer pairs. The languages of TyDi QA are diverse with regard to their typology -- the set of linguistic features that each language expresses -- such that we expect models performing well on this set to generalize across a large number of the languages in the world. It contains language phenomena that would not be found in English-only corpora. To provide a realistic information-seeking task and avoid priming effects, questions are written by people who want to know the answer but don’t know the answer yet (unlike SQuAD and its descendants), and the data is collected directly in each language without the use of translation (unlike MLQA and XQuAD). Results on validation dataset ----------------------------- ### WIP Model in Action --------------- ### WIP Created by: Narrativa About Narrativa: Natural Language Generation (NLG) | Gabriele, our machine learning-based platform, builds and deploys natural language solutions. #NLG #AI
[ "### WIP\n\n\nModel in Action\n---------------", "### WIP\n\n\nCreated by: Narrativa\n\n\nAbout Narrativa: Natural Language Generation (NLG) | Gabriele, our machine learning-based platform, builds and deploys natural language solutions. #NLG #AI" ]
[ "TAGS\n#transformers #pytorch #mt5 #text2text-generation #multilingual #dataset-tydiqa #arxiv-2010.11934 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "### WIP\n\n\nModel in Action\n---------------", "### WIP\n\n\nCreated by: Narrativa\n\n\nAbout Narrativa: Natural Language Generation (NLG) | Gabriele, our machine learning-based platform, builds and deploys natural language solutions. #NLG #AI" ]
[ 68, 10, 50 ]
[ "passage: TAGS\n#transformers #pytorch #mt5 #text2text-generation #multilingual #dataset-tydiqa #arxiv-2010.11934 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n### WIP\n\n\nModel in Action\n---------------### WIP\n\n\nCreated by: Narrativa\n\n\nAbout Narrativa: Natural Language Generation (NLG) | Gabriele, our machine learning-based platform, builds and deploys natural language solutions. #NLG #AI" ]
[ -0.055299390107393265, 0.1464882344007492, -0.0030029781628400087, 0.05672416090965271, 0.09390322118997574, -0.02160263992846012, 0.09456703811883926, 0.09943564981222153, -0.008880897425115108, -0.0447075180709362, 0.18595856428146362, 0.15399274230003357, 0.008797118440270424, 0.15042969584465027, -0.007036611437797546, -0.2938555181026459, 0.03132271021604538, 0.02502824366092682, 0.08604028820991516, 0.12431814521551132, 0.12035372108221054, 0.003025940852239728, 0.07575487345457077, 0.0523664616048336, -0.13883541524410248, 0.016215700656175613, 0.011074238456785679, -0.1285858005285263, 0.10568919777870178, 0.09641177952289581, 0.01952316425740719, -0.059459853917360306, 0.021539298817515373, -0.14917749166488647, 0.03674237057566643, -0.023983286693692207, -0.006338877137750387, 0.03689679503440857, 0.02813159115612507, -0.085692398250103, 0.1945105791091919, 0.010483179241418839, -0.042407065629959106, 0.0415523424744606, -0.13290004432201385, 0.010870441794395447, 0.024726852774620056, 0.026473747566342354, -0.013455569744110107, 0.12332843244075775, -0.04489799588918686, 0.07172287255525589, -0.06026477739214897, 0.09531789273023605, 0.040578797459602356, -0.320441871881485, -0.07548419386148453, -0.009663304314017296, 0.054131004959344864, -0.055253803730010986, -0.03109489381313324, 0.11199050396680832, 0.09859844297170639, 0.04819831997156143, -0.05647694319486618, -0.03851087763905525, -0.03205046430230141, -0.03725799173116684, -0.10340383648872375, 0.011491980403661728, 0.27975741028785706, -0.0230014119297266, 0.0077337599359452724, -0.08006719499826431, -0.027451705187559128, 0.07745230197906494, -0.007478257641196251, -0.007331607397645712, -0.0831155851483345, 0.03613073751330376, 0.021955491974949837, -0.05242887884378433, -0.07785965502262115, 0.04723912104964256, -0.09605172276496887, 0.2017335742712021, 0.04716194421052933, 0.04168745130300522, -0.17407476902008057, 0.0671723261475563, 0.10396591573953629, -0.08211202174425125, 0.015119591727852821, -0.07108251005411148, 0.08380737900733948, 0.0398663766682148, -0.023726075887680054, 0.07977929711341858, 0.07599027454853058, 0.02430744096636772, 0.048434335738420486, 0.010996600612998009, -0.002816039603203535, 0.05042683333158493, 0.07261991500854492, 0.069516621530056, -0.04602014645934105, -0.18173930048942566, 0.042517248541116714, -0.08623102307319641, -0.002224391559138894, -0.056815993040800095, -0.09789373725652695, -0.0023066282738000154, -0.030786022543907166, 0.05534274876117706, 0.07227837294340134, 0.10982361435890198, -0.025200217962265015, -0.0423835851252079, -0.019731875509023666, -0.07576649636030197, -0.00994313508272171, -0.028124410659074783, -0.014140543527901173, 0.26872387528419495, 0.026159480214118958, -0.010045349597930908, -0.18350231647491455, -0.03546706214547157, -0.04601746052503586, -0.022617226466536522, -0.06040450185537338, -0.022286489605903625, 0.038666047155857086, 0.004241215996444225, 0.06464036554098129, -0.16362085938453674, -0.1622585654258728, -0.01325245201587677, 0.05850087106227875, -0.048779815435409546, -0.134623184800148, -0.132626473903656, -0.005311891436576843, 0.06877107173204422, -0.05746854096651077, -0.05979299545288086, -0.07123729586601257, 0.07668952643871307, -0.0598960816860199, 0.04953493922948837, -0.168851837515831, 0.032526515424251556, -0.06132687255740166, -0.016421843320131302, 0.050775762647390366, 0.11810170114040375, 0.023650119081139565, 0.021015239879488945, -0.024445215240120888, -0.019716273993253708, 0.002594488672912121, 
0.0933937057852745, -0.02207280322909355, 0.19787417352199554, -0.09430186450481415, -0.041011519730091095, 0.18365055322647095, -0.06304903328418732, -0.05402303859591484, 0.16131876409053802, 0.007370691746473312, 0.17275723814964294, 0.15747208893299103, 0.2264857441186905, 0.026122134178876877, -0.03151614964008331, 0.014507127925753593, 0.023170743137598038, -0.037136949598789215, 0.020881617441773415, -0.0034103256184607744, 0.09633985906839371, -0.13972176611423492, 0.042939454317092896, 0.11641200631856918, 0.064008928835392, -0.036975812166929245, -0.05911673605442047, 0.00862930528819561, -0.06713178753852844, 0.08162732422351837, -0.035726968199014664, 0.04277922958135605, -0.05322207510471344, -0.043486688286066055, 0.007679889444261789, 0.08386310189962387, -0.037356507033109665, 0.036932263523340225, -0.16741003096103668, 0.08888662606477737, -0.03823864459991455, 0.05697545409202576, -0.07021446526050568, -0.042948976159095764, -0.04075577110052109, 0.06748805195093155, 0.10179948061704636, 0.0529782809317112, 0.013888479210436344, -0.004039916675537825, -0.03390105068683624, 0.0512169785797596, 0.07250239700078964, -0.02616572566330433, -0.0488344244658947, -0.16023996472358704, 0.09188634157180786, -0.0549442321062088, 0.024704700335860252, -0.20456697046756744, -0.009106017649173737, -0.023096174001693726, 0.06306732445955276, -0.03711318224668503, 0.02295737899839878, -0.020939458161592484, 0.01225536409765482, -0.07530683279037476, 0.015827447175979614, 0.08464916795492172, -0.004173944238573313, -0.09530212730169296, 0.22581924498081207, -0.10726553946733475, 0.1506171077489853, 0.1608985811471939, -0.23377877473831177, 0.01287816185504198, 0.008614796213805676, -0.014521683566272259, -0.03015204519033432, 0.0033809724263846874, -0.009710454382002354, 0.19797144830226898, -0.04970013350248337, 0.1114783063530922, -0.0581052340567112, -0.04695684090256691, -0.018512358888983727, -0.06485779583454132, -0.02045600861310959, 0.13154909014701843, -0.0038431682623922825, -0.19771981239318848, 0.13150063157081604, 0.16174547374248505, 0.03659093379974365, 0.30229058861732483, 0.0775640681385994, -0.00410444987937808, 0.03265709429979324, -0.023300079628825188, -0.04139367863535881, 0.04068230837583542, -0.20364801585674286, -0.02840079739689827, 0.037531185895204544, 0.029650172218680382, 0.08932173997163773, -0.0868382453918457, -0.06524596363306046, -0.04163097217679024, -0.025142444297671318, -0.02000368945300579, 0.06353525072336197, -0.04788927361369133, 0.11579306423664093, 0.008648056536912918, -0.03339715301990509, 0.05418282002210617, 0.015069874003529549, -0.06815018504858017, 0.15242606401443481, -0.07595603913068771, -0.33133718371391296, -0.12093255668878555, -0.2144317328929901, -0.08162251114845276, 0.013308227062225342, 0.061297670006752014, -0.10857855528593063, -0.014649887569248676, 0.018490910530090332, 0.059095337986946106, -0.110198013484478, -0.06694669276475906, 0.024600287899374962, -0.014454202726483345, -0.034707434475421906, -0.10822185128927231, -0.028954287990927696, 0.024877332150936127, -0.02824542112648487, 0.07480547577142715, -0.12360508739948273, 0.09592747688293457, 0.15989314019680023, 0.03514496982097626, 0.02972155250608921, -0.01439413707703352, 0.2220238298177719, -0.15612372756004333, 0.024138081818819046, 0.16288259625434875, -0.1053844541311264, 0.029976028949022293, 0.02709798514842987, 0.016196761280298233, -0.12452404946088791, 0.028160136193037033, 0.003669421188533306, -0.07081558555364609, -0.25917723774909973, 
-0.1651945263147354, -0.06763705611228943, 0.12138903886079788, 0.060300618410110474, 0.053499653935432434, 0.01681523025035858, 0.08095870912075043, -0.03803602233529091, 0.06472605466842651, 0.01817001774907112, 0.08789891004562378, 0.14247217774391174, -0.08268655836582184, 0.12269481271505356, -0.03461961820721626, -0.1257907599210739, 0.07610590010881424, 0.0903254896402359, 0.057905733585357666, 0.07155507057905197, -0.017040876671671867, 0.03601624071598053, 0.0691390112042427, 0.08972624689340591, 0.11002928018569946, 0.03161554038524628, -0.010805381461977959, -0.04542998597025871, -0.05711150914430618, -0.01717321388423443, 0.0641416385769844, 0.004675312899053097, -0.052438098937273026, 0.020608654245734215, -0.03661816567182541, 0.07373046875, 0.15285234153270721, 0.12615565955638885, -0.16739481687545776, -0.03943297639489174, 0.07974075525999069, -0.09758245944976807, -0.1629529595375061, 0.14540055394172668, 0.07294225692749023, -0.12729676067829132, 0.12854275107383728, 0.041215505450963974, 0.12315844744443893, -0.16138547658920288, 0.050879042595624924, -0.10705597698688507, -0.06708084046840668, 0.02404169738292694, 0.11814776808023453, -0.2645731270313263, 0.20127207040786743, 0.007823856547474861, 0.005455163307487965, -0.11878147721290588, -0.045518193393945694, 0.012255615554749966, 0.09054410457611084, 0.1554328352212906, 0.03959907591342926, 0.061156194657087326, -0.011202871799468994, -0.08053470402956009, 0.06011543422937393, 0.08438003808259964, 0.06224367395043373, 0.024388495832681656, -0.0055951643735170364, 0.028331030160188675, -0.05852911248803139, -0.068637415766716, -0.13105939328670502, -0.19243352115154266, 0.04333828017115593, 0.11834654957056046, 0.13071703910827637, 0.04846825450658798, -0.07234716415405273, -0.09148009866476059, 0.26170405745506287, 0.003123384667560458, -0.05876118689775467, -0.09255960583686829, -0.02402229979634285, -0.008583180606365204, -0.046954020857810974, -0.018976321443915367, 0.0037597143091261387, 0.02287447080016136, -0.06538219749927521, -0.11111924052238464, 0.08372686803340912, -0.0908132791519165, -0.0374494232237339, -0.026585988700389862, 0.11261720955371857, 0.0883554145693779, 0.02112150751054287, 0.0579037219285965, -0.054209042340517044, -0.07229425013065338, -0.12502522766590118, -0.010416872799396515, 0.07769419252872467, -0.11839860677719116, 0.035169728100299835, -0.05331706628203392, -0.02479434572160244, -0.10853631049394608, -0.11107856780290604, 0.22557370364665985, 0.06703044474124908, -0.012743215076625347, 0.11557013541460037, 0.17450062930583954, -0.11129710078239441, -0.32157617807388306, -0.10179151594638824, -0.0752619132399559, 0.011549821123480797, -0.1095360741019249, -0.16136333346366882, 0.1306302845478058, -0.01909053511917591, 0.0007861590129323304, 0.1561235636472702, -0.19988486170768738, -0.08687851577997208, 0.10297726094722748, 0.03586264327168465, 0.3764137029647827, -0.06690698862075806, -0.012319876812398434, -0.11544080078601837, -0.020248351618647575, 0.16486750543117523, -0.12586912512779236, 0.08791448175907135, -0.0013406203361228108, 0.1917470097541809, 0.01870638132095337, 0.03461659327149391, 0.11643005162477493, -0.05372004583477974, -0.056993041187524796, -0.09112624824047089, 0.013484690338373184, 0.006329908035695553, 0.04509961977601051, 0.054365359246730804, -0.08254527300596237, -0.054399654269218445, -0.13654941320419312, -0.08539154380559921, -0.07934630662202835, 0.037380386143922806, -0.007908975705504417, -0.07420415431261063, 0.011132842861115932, 
-0.005377465393394232, 0.042414065450429916, 0.04318706691265106, 0.06469722092151642, -0.1066889613866806, 0.06324952840805054, -0.043264467269182205, 0.19265496730804443, -0.09678641706705093, 0.05826054885983467, -0.06121945381164551, -0.0664762631058693, 0.09213853627443314, -0.14037655293941498, -0.00873784814029932, 0.10439381003379822, -0.02943749725818634, 0.052730705589056015, 0.05104566365480423, -0.06645601242780685, 0.011584230698645115, 0.0836583599448204, -0.15337447822093964, -0.07062128931283951, -0.06828739494085312, 0.042005475610494614, 0.09285595268011093, 0.03747542202472687, 0.25017601251602173, -0.06801917403936386, -0.03156418725848198, -0.009457484818994999, 0.011016257107257843, -0.0876360833644867, 0.10503745824098587, -0.002632593037560582, -0.013234744779765606, -0.13996072113513947, 0.051536764949560165, 0.033621642738580704, -0.07784196734428406, 0.07899448275566101, 0.1520293951034546, -0.04936722293496132, -0.11853136867284775, -0.0892489105463028, 0.08802109956741333, -0.12382025271654129, -0.10443970561027527, 0.0233562383800745, -0.14689795672893524, 0.09014659374952316, 0.11951512098312378, 0.03739273175597191, 0.019663769751787186, -0.08708357810974121, -0.04835446551442146, -0.03637281805276871, 0.015896238386631012, 0.06989173591136932, -0.033542998135089874, -0.10034848749637604, 0.027027595788240433, 0.013498755171895027, 0.15719757974147797, -0.07694555073976517, -0.06446715444326401, -0.12383928149938583, 0.05728278309106827, -0.16591094434261322, 0.0011216814164072275, -0.11825297027826309, 0.032365716993808746, -0.02807929739356041, -0.03379738703370094, -0.05411703139543533, -0.029519904404878616, -0.08882828801870346, 0.02059851586818695, 0.0305540282279253, 0.02439870685338974, 0.012949966825544834, 0.015770351514220238, 0.045065250247716904, -0.031459275633096695, 0.10257613658905029, 0.04395825415849686, -0.10572396218776703, 0.09594058245420456, -0.16718851029872894, 0.0243721604347229, 0.08649507164955139, 0.025386197492480278, 0.023127565160393715, -0.012171615846455097, -0.024211682379245758, 0.05889882519841194, 0.0832858681678772, 0.061571426689624786, 0.09752506762742996, -0.08197855949401855, 0.018246592953801155, 0.02212442457675934, -0.14909835159778595, -0.01853000931441784, -0.00047251718933694065, 0.00010467565152794123, -0.02013469859957695, 0.0680348128080368, -0.08954502642154694, 0.05658141151070595, -0.015935448929667473, 0.06908971071243286, 0.019741369411349297, -0.11875680834054947, -0.007297221105545759, -0.07513438165187836, 0.04306609928607941, -0.007790565490722656, 0.19883552193641663, -0.06369297951459885, -0.061461806297302246, -0.007239711005240679, -0.04242660105228424, -0.15285582840442657, -0.037383172661066055, 0.09059058129787445, 0.045537836849689484, -0.0037316016387194395, -0.10893944650888443, 0.028764495626091957, 0.023830726742744446, 0.16795147955417633, 0.05237517133355141, -0.00888282060623169, 0.09455029666423798, 0.11248301714658737, -0.04771130532026291, 0.048358574509620667, -0.12891823053359985, -0.09868883341550827, -0.02967391163110733, 0.074916772544384, -0.02896038442850113, 0.05319321155548096, 0.13401475548744202, 0.007509782910346985, 0.0067626736126840115, -0.03231741860508919, -0.05312833562493324, -0.16891518235206604, -0.3058330714702606, -0.049342602491378784, -0.023747142404317856, -0.012816782109439373, -0.1450205147266388, 0.07275371253490448, -0.027529027312994003, 0.10249993950128555, -0.10276704281568527, 0.05802207812666893, 0.03569164127111435, -0.11117011308670044, 
0.10367687791585922, 0.004614887293428183, 0.09436098486185074, -0.02787654474377632, 0.03937435895204544, -0.1321129947900772, 0.07186228781938553, 0.014819705858826637, 0.03422273322939873, -0.05360934138298035, 0.04247221350669861, -0.0715639516711235, -0.01976795494556427, -0.06207742914557457, 0.052810560911893845, 0.07088099420070648, 0.11923481523990631, 0.04432256147265434, -0.06760779023170471, 0.00625526811927557, 0.15338176488876343, -0.03304978832602501, -0.1495674103498459, -0.10977120697498322, 0.22407998144626617, -0.05737387761473656, 0.06448284536600113, -0.07817624509334564, 0.05211440473794937, -0.060327257961034775, 0.396340548992157, 0.2915027439594269, -0.10974928736686707, -0.016764994710683823, -0.036484964191913605, 0.04214556887745857, 0.04369736462831497, 0.0717419683933258, 0.10374800115823746, 0.27022963762283325, -0.03833841532468796, -0.06419660151004791, -0.09090223908424377, -0.006194273941218853, -0.08295971155166626, 0.08545484393835068, 0.03398752212524414, -0.05244741961359978, -0.057628631591796875, 0.07962069660425186, -0.23114022612571716, 0.13612468540668488, -0.13553567230701447, -0.12723369896411896, -0.12338022142648697, -0.00044523237738758326, -0.030917175114154816, 0.048098985105752945, 0.07906288653612137, -0.005890006199479103, -0.03966010734438896, 0.018268806859850883, 0.022865522652864456, -0.16752952337265015, 0.04804971069097519, 0.06908846646547318, -0.050446998327970505, 0.07036619633436203, 0.012078836560249329, 0.022045115008950233, 0.07083769142627716, 0.06379044055938721, -0.05918325111269951, 0.023572945967316628, -0.020809320732951164, -0.029581008478999138, 0.027243204414844513, 0.08438995480537415, -0.006298207212239504, 0.019959980621933937, 0.11507157236337662, -0.02445347234606743, 0.022534364834427834, 0.11890189349651337, 0.007716676685959101, -0.000490346341393888, 0.05660264194011688, -0.10158862918615341, 0.10238111764192581, 0.16640916466712952, -0.0325278677046299, -0.039217155426740646, -0.050867270678281784, 0.027655065059661865, -0.07975082844495773, -0.08281566202640533, -0.05778732895851135, -0.15621298551559448, -0.09100310504436493, -0.052203383296728134, 0.03505898267030716, -0.14568300545215607, 0.028211684897542, -0.09174241125583649, 0.003982142545282841, -0.12115749716758728, 0.09929691255092621, 0.08559928834438324, -0.01297822967171669, 0.007074408698827028, -0.08856479078531265, 0.050204090774059296, 0.06483017653226852, -0.06167801097035408, -0.10843013972043991 ]
null
null
transformers
# mT5-base fine-tuned on TyDiQA for multilingual QA 🗺📖❓ [Google's mT5-base](https://huggingface.co/google/mt5-base) fine-tuned on [TyDi QA](https://huggingface.co/nlp/viewer/?dataset=tydiqa&config=secondary_task) (secondary task) for the **multilingual Q&A** downstream task. ## Details of mT5 [Google's mT5](https://github.com/google-research/multilingual-t5) is pretrained on the [mC4](https://www.tensorflow.org/datasets/catalog/c4#c4multilingual) corpus, covering 101 languages: Afrikaans, Albanian, Amharic, Arabic, Armenian, Azerbaijani, Basque, Belarusian, Bengali, Bulgarian, Burmese, Catalan, Cebuano, Chichewa, Chinese, Corsican, Czech, Danish, Dutch, English, Esperanto, Estonian, Filipino, Finnish, French, Galician, Georgian, German, Greek, Gujarati, Haitian Creole, Hausa, Hawaiian, Hebrew, Hindi, Hmong, Hungarian, Icelandic, Igbo, Indonesian, Irish, Italian, Japanese, Javanese, Kannada, Kazakh, Khmer, Korean, Kurdish, Kyrgyz, Lao, Latin, Latvian, Lithuanian, Luxembourgish, Macedonian, Malagasy, Malay, Malayalam, Maltese, Maori, Marathi, Mongolian, Nepali, Norwegian, Pashto, Persian, Polish, Portuguese, Punjabi, Romanian, Russian, Samoan, Scottish Gaelic, Serbian, Shona, Sindhi, Sinhala, Slovak, Slovenian, Somali, Sotho, Spanish, Sundanese, Swahili, Swedish, Tajik, Tamil, Telugu, Thai, Turkish, Ukrainian, Urdu, Uzbek, Vietnamese, Welsh, West Frisian, Xhosa, Yiddish, Yoruba, Zulu. **Note**: mT5 was only pre-trained on mC4, excluding any supervised training. Therefore, this model has to be fine-tuned before it is usable on a downstream task. Pretraining Dataset: [mC4](https://www.tensorflow.org/datasets/catalog/c4#c4multilingual) Other Community Checkpoints: [here](https://huggingface.co/models?search=mt5) Paper: [mT5: A massively multilingual pre-trained text-to-text transformer](https://arxiv.org/abs/2010.11934) Authors: *Linting Xue, Noah Constant, Adam Roberts, Mihir Kale, Rami Al-Rfou, Aditya Siddhant, Aditya Barua, Colin Raffel* ## Details of the dataset 📚 **TyDi QA** is a question answering dataset covering 11 typologically diverse languages with 204K question-answer pairs. The languages of TyDi QA are diverse with regard to their typology -- the set of linguistic features that each language expresses -- such that we expect models performing well on this set to generalize across a large number of the languages in the world. It contains language phenomena that would not be found in English-only corpora. To provide a realistic information-seeking task and avoid priming effects, questions are written by people who want to know the answer, but don’t know the answer yet (unlike SQuAD and its descendants), and the data is collected directly in each language without the use of translation (unlike MLQA and XQuAD). 
| Dataset | Task | Split | # samples | | -------- | ----- |------| --------- | | TyDi QA | GoldP | train| 49881 | | TyDi QA | GoldP | valid| 5077 | ## Results on validation dataset 📝 | Metric | # Value | | ------ | --------- | | **EM** | **60.88** | ## Model in Action 🚀 ```python from transformers import AutoModelForSeq2SeqLM, AutoTokenizer import torch device = torch.device('cuda' if torch.cuda.is_available() else 'cpu') tokenizer = AutoTokenizer.from_pretrained("Narrativa/mT5-base-finetuned-tydiQA-xqa") model = AutoModelForSeq2SeqLM.from_pretrained("Narrativa/mT5-base-finetuned-tydiQA-xqa").to(device) def get_response(question, context, max_length=32): input_text = 'question: %s context: %s' % (question, context) features = tokenizer([input_text], return_tensors='pt') output = model.generate(input_ids=features['input_ids'].to(device), attention_mask=features['attention_mask'].to(device), max_length=max_length) return tokenizer.decode(output[0]) # Some examples in different languages context = 'HuggingFace won the best Demo paper at EMNLP2020.' question = 'What did HuggingFace win?' get_response(question, context) context = 'HuggingFace ganó la mejor demostración con su paper en la EMNLP2020.' question = 'Qué ganó HuggingFace?' get_response(question, context) context = 'HuggingFace выиграл лучшую демонстрационную работу на EMNLP2020.' question = 'Что победило в HuggingFace?' get_response(question, context) ``` Created by: [Narrativa](https://www.narrativa.com/) About Narrativa: Natural Language Generation (NLG) | Gabriele, our machine learning-based platform, builds and deploys natural language solutions. #NLG #AI
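For quick experiments, the same checkpoint can also be driven through the high-level `pipeline` API. The snippet below is a minimal sketch, assuming the `text2text-generation` pipeline task (which matches this card's declared pipeline tag); it wraps the same tokenize–generate–decode steps as the `get_response` helper above.

```python
from transformers import pipeline

# Minimal sketch: the text2text-generation pipeline handles tokenization,
# generation and decoding for the same fine-tuned checkpoint.
qa = pipeline("text2text-generation",
              model="Narrativa/mT5-base-finetuned-tydiQA-xqa")

context = "HuggingFace won the best Demo paper at EMNLP2020."
question = "What did HuggingFace win?"

# Inputs follow the "question: ... context: ..." format used during fine-tuning.
print(qa(f"question: {question} context: {context}", max_length=32)[0]["generated_text"])
```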
{"language": "multilingual", "datasets": ["tydiqa"], "widget": [{"text": "question: what does she do? context: Sof\u00eda has a degree in Communications and public relations agency experience where she was in charge of monitoring and managing PR strategy including relations with the media and journalists."}]}
text2text-generation
Narrativa/mT5-base-finetuned-tydiQA-xqa
[ "transformers", "pytorch", "tensorboard", "mt5", "text2text-generation", "multilingual", "dataset:tydiqa", "arxiv:2010.11934", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:04+00:00
[ "2010.11934" ]
[ "multilingual" ]
TAGS #transformers #pytorch #tensorboard #mt5 #text2text-generation #multilingual #dataset-tydiqa #arxiv-2010.11934 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
mT5-base fine-tuned on TyDiQA for multilingual QA ================================================= Google's mT5-base fine-tuned on TyDi QA (secondary task) for multingual Q&A downstream task. Details of mT5 -------------- Google's mT5 mT5 is pretrained on the mC4 corpus, covering 101 languages: Afrikaans, Albanian, Amharic, Arabic, Armenian, Azerbaijani, Basque, Belarusian, Bengali, Bulgarian, Burmese, Catalan, Cebuano, Chichewa, Chinese, Corsican, Czech, Danish, Dutch, English, Esperanto, Estonian, Filipino, Finnish, French, Galician, Georgian, German, Greek, Gujarati, Haitian Creole, Hausa, Hawaiian, Hebrew, Hindi, Hmong, Hungarian, Icelandic, Igbo, Indonesian, Irish, Italian, Japanese, Javanese, Kannada, Kazakh, Khmer, Korean, Kurdish, Kyrgyz, Lao, Latin, Latvian, Lithuanian, Luxembourgish, Macedonian, Malagasy, Malay, Malayalam, Maltese, Maori, Marathi, Mongolian, Nepali, Norwegian, Pashto, Persian, Polish, Portuguese, Punjabi, Romanian, Russian, Samoan, Scottish Gaelic, Serbian, Shona, Sindhi, Sinhala, Slovak, Slovenian, Somali, Sotho, Spanish, Sundanese, Swahili, Swedish, Tajik, Tamil, Telugu, Thai, Turkish, Ukrainian, Urdu, Uzbek, Vietnamese, Welsh, West Frisian, Xhosa, Yiddish, Yoruba, Zulu. Note: mT5 was only pre-trained on mC4 excluding any supervised training. Therefore, this model has to be fine-tuned before it is useable on a downstream task. Pretraining Dataset: mC4 Other Community Checkpoints: here Paper: mT5: A massively multilingual pre-trained text-to-text transformer Authors: *Linting Xue, Noah Constant, Adam Roberts, Mihir Kale, Rami Al-Rfou, Aditya Siddhant, Aditya Barua, Colin Raffel* Details of the dataset ---------------------- TyDi QA is a question answering dataset covering 11 typologically diverse languages with 204K question-answer pairs. The languages of TyDi QA are diverse with regard to their typology -- the set of linguistic features that each language expresses -- such that we expect models performing well on this set to generalize across a large number of the languages in the world. It contains language phenomena that would not be found in English-only corpora. To provide a realistic information-seeking task and avoid priming effects, questions are written by people who want to know the answer, but don’t know the answer yet, (unlike SQuAD and its descendents) and the data is collected directly in each language without the use of translation (unlike MLQA and XQuAD). Results on validation dataset ----------------------------- Model in Action --------------- Created by: Narrativa About Narrativa: Natural Language Generation (NLG) | Gabriele, our machine learning-based platform, builds and deploys natural language solutions. #NLG #AI
[]
[ "TAGS\n#transformers #pytorch #tensorboard #mt5 #text2text-generation #multilingual #dataset-tydiqa #arxiv-2010.11934 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 72 ]
[ "passage: TAGS\n#transformers #pytorch #tensorboard #mt5 #text2text-generation #multilingual #dataset-tydiqa #arxiv-2010.11934 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ -0.08051328361034393, 0.06721025705337524, -0.005313674453645945, 0.04198123514652252, 0.17268793284893036, 0.026791686192154884, 0.15227575600147247, 0.12731611728668213, -0.007090440019965172, -0.004299451597034931, 0.13031478226184845, 0.21525339782238007, 0.016936445608735085, 0.0712476372718811, -0.09032614529132843, -0.2511366009712219, 0.0049511645920574665, 0.062343791127204895, -0.037267059087753296, 0.11903445422649384, 0.08308901637792587, -0.0702652633190155, 0.08275630325078964, -0.03531765192747116, -0.16893163323402405, 0.045370589941740036, 0.03942909464240074, -0.13413529098033905, 0.11070253700017929, 0.07224925607442856, 0.1295255869626999, 0.04119342938065529, -0.028968343511223793, -0.09804918617010117, 0.04035211354494095, 0.0256823543459177, -0.06223139539361, 0.07913092523813248, 0.10675843060016632, -0.09278552979230881, 0.09183184057474136, 0.011419828049838543, -0.017790867015719414, 0.05179707333445549, -0.12581123411655426, 0.026463426649570465, -0.0521492175757885, 0.011306402273476124, 0.043035004287958145, 0.05424637719988823, -0.0033503558952361345, 0.15419989824295044, -0.0343569777905941, 0.1307327300310135, 0.10116822272539139, -0.3360098600387573, -0.02479483187198639, 0.0824059247970581, 0.06241721287369728, 0.04972922056913376, -0.02108382247388363, 0.0659959614276886, 0.05509741231799126, 0.030836578458547592, -0.008443407714366913, -0.08714661747217178, -0.1715400069952011, 0.03759283944964409, -0.09205110371112823, 0.016512572765350342, 0.26591482758522034, -0.04884630814194679, 0.07308244705200195, -0.011616028845310211, -0.1006854772567749, -0.018092280253767967, -0.01010748092085123, 0.003460397943854332, -0.06414615362882614, 0.030917322263121605, 0.010546686127781868, -0.06386207789182663, -0.14334499835968018, 0.0372113399207592, -0.23131641745567322, 0.08597180247306824, 0.013600021600723267, 0.04059240221977234, -0.2008168250322342, 0.04417489096522331, 0.036888059228658676, -0.10101668536663055, 0.0839684009552002, -0.08004120737314224, -0.00928207766264677, -0.01597309485077858, -0.022557128220796585, -0.21872387826442719, 0.08425650000572205, -0.014753923751413822, 0.06203489005565643, 0.06294766813516617, -0.09100030362606049, 0.08372806012630463, 0.03485694155097008, 0.06658913940191269, -0.06944096088409424, -0.0969480648636818, 0.04354392737150192, -0.08386416733264923, 0.027170533314347267, -0.06551119685173035, -0.15989866852760315, -0.04543048143386841, 0.02226700447499752, 0.08821120113134384, 0.0410890132188797, 0.09001041948795319, -0.03824485465884209, -0.03688250482082367, 0.012693582102656364, -0.08924774080514908, 0.024428633973002434, -0.0006677400670014322, 0.010231474414467812, 0.15363118052482605, 0.010695434175431728, 0.015661006793379784, -0.13104656338691711, 0.036288440227508545, -0.07068648934364319, 0.019312839955091476, -0.04493246600031853, -0.10642144829034805, 0.04624151438474655, -0.11891140788793564, 0.0032344579230993986, -0.13589055836200714, -0.13780272006988525, 0.002301981672644615, 0.018259143456816673, -0.04269801452755928, -0.016087358817458153, -0.04550846666097641, -0.030396057292819023, 0.06318879127502441, -0.04551004618406296, 0.04528414458036423, -0.04660378769040108, 0.07535290718078613, -0.04842616245150566, 0.07280701398849487, -0.1343759447336197, 0.05638692155480385, -0.059247661381959915, -0.018081551417708397, -0.05308973044157028, 0.08163902163505554, 0.0009710400481708348, 0.09194204211235046, -0.049144111573696136, -0.020201770588755608, -0.07931108772754669, 0.04500159993767738, 
0.017382433637976646, 0.18287402391433716, -0.19469864666461945, -0.08106527477502823, 0.21515168249607086, -0.0729614719748497, -0.14831386506557465, 0.11295371502637863, -0.006618432234972715, 0.05802059918642044, 0.09253624826669693, 0.15971814095973969, 0.040720134973526, -0.04328152909874916, 0.03319227695465088, 0.08792807906866074, -0.07076278328895569, -0.11856929957866669, -0.006936042103916407, 0.0018021479481831193, -0.06351400166749954, 0.029850497841835022, 0.1669357717037201, 0.06216012313961983, -0.06044004485011101, -0.04781137779355049, -0.029548274353146553, -0.03289882838726044, 0.08416860550642014, 0.03198853135108948, 0.12418904900550842, -0.07818801701068878, 0.010607978329062462, 0.038761772215366364, 0.013370554894208908, -0.036286305636167526, 0.03127497062087059, -0.06755254417657852, 0.09464487433433533, -0.09645671397447586, 0.02520824782550335, -0.18554970622062683, -0.08575990051031113, -0.0247099120169878, 0.1105508953332901, 0.035157352685928345, 0.10110347718000412, 0.06198767572641373, -0.016608120873570442, -0.03640338033437729, 0.0324256531894207, 0.1899305135011673, 0.007655180525034666, -0.10270527005195618, -0.12389582395553589, 0.08709757775068283, -0.08738218992948532, -0.008211949840188026, -0.11602389067411423, 0.014622080139815807, 0.015103211626410484, 0.11376843601465225, 0.018871311098337173, 0.05865756422281265, 0.01650085113942623, 0.044124357402324677, -0.09271478652954102, -0.00222101341933012, 0.0984177365899086, -0.004811280407011509, -0.08774515241384506, 0.19621551036834717, -0.1601085215806961, 0.19306744635105133, 0.18973536789417267, -0.22658345103263855, 0.007192160002887249, -0.06590995192527771, 0.004315352067351341, -0.01559466402977705, 0.05149811878800392, -0.009769083000719547, 0.06259830296039581, 0.002650610404089093, 0.17915289103984833, -0.051307909190654755, -0.02730070985853672, 0.010402446612715721, -0.04910883679986, -0.05757627263665199, 0.11398186534643173, 0.09660317748785019, -0.23152215778827667, 0.1621057689189911, 0.1806182861328125, 0.009769490920007229, 0.1978120356798172, 0.0037646160926669836, -0.033771857619285583, 0.03906864672899246, -0.011097106151282787, -0.014046547003090382, -0.03127259016036987, -0.1914629191160202, -0.031673599034547806, 0.06964607536792755, 0.02405742183327675, 0.07997006922960281, -0.11835551261901855, -0.029097333550453186, -0.04124978929758072, 0.012089488096535206, -0.015751373022794724, 0.08072608709335327, 0.057349663227796555, 0.1492301970720291, -0.04241986572742462, -0.012781335040926933, 0.084661103785038, -0.0016092838486656547, -0.08776500076055527, 0.19231726229190826, -0.15657784044742584, -0.3277990520000458, -0.12735144793987274, -0.1833619773387909, -0.07664551585912704, 0.025662049651145935, 0.07042112201452255, -0.10222344845533371, -0.02876332961022854, -0.0016944996314123273, 0.05805961787700653, -0.08335679769515991, 0.011470971629023552, -0.03405177593231201, 0.044600408524274826, -0.06354447454214096, -0.08994053304195404, -0.019243815913796425, -0.02106483466923237, -0.011819489300251007, 0.13907311856746674, -0.10722173005342484, 0.07546000182628632, 0.1992521733045578, 0.006360770668834448, 0.04047932103276253, -0.04002479463815689, 0.09311388432979584, -0.07793537527322769, 0.02018432319164276, 0.17758755385875702, -0.03961708024144173, 0.07307949662208557, 0.12310037016868591, 0.02402384579181671, -0.07025537639856339, 0.012911939062178135, -0.0039163692854344845, -0.07891879230737686, -0.2974897623062134, -0.11739037185907364, -0.13567091524600983, 
0.05849554017186165, 0.03123031184077263, 0.044397901743650436, 0.06860530376434326, 0.08312907814979553, 0.01018213015049696, 0.0284863468259573, -0.010646737180650234, 0.04630514234304428, 0.198222815990448, -0.0214772317558527, 0.139516681432724, -0.08873259276151657, -0.11317049711942673, 0.09899961203336716, 0.10240296274423599, 0.13386952877044678, 0.040964215993881226, 0.06438038498163223, 0.023476457223296165, 0.07622063905000687, 0.10486012697219849, 0.12391017377376556, 0.028277607634663582, -0.026646655052900314, -0.014376543462276459, -0.02096819505095482, 0.0008403807878494263, 0.01511758379638195, 0.05653980001807213, -0.12051098048686981, -0.020266840234398842, -0.03592358157038689, 0.07833132892847061, 0.06911227852106094, 0.08837059140205383, -0.2389381378889084, 0.017269257456064224, 0.07157491892576218, -0.020961908623576164, -0.12863615155220032, 0.06468052417039871, 0.050154972821474075, -0.08695846796035767, 0.11860229074954987, -0.07351407408714294, 0.11294107884168625, -0.07269902527332306, 0.04153759032487869, -0.06950873136520386, -0.030460583046078682, 0.0021483420860022306, 0.10404107719659805, -0.345279335975647, 0.2146795392036438, 0.027653120458126068, -0.04117795079946518, -0.10549835115671158, -0.010913652367889881, 0.0161727424710989, 0.10348411649465561, 0.0901331827044487, -0.007207466755062342, 0.012678971514105797, -0.038480352610349655, -0.05058520659804344, 0.021409397944808006, 0.10983901470899582, 0.050671227276325226, -0.008064934983849525, -0.012306434102356434, -0.012917080894112587, -0.007326191291213036, -0.05723809450864792, -0.021360844373703003, -0.19358845055103302, 0.0778544694185257, 0.04445368051528931, -0.03354205936193466, 0.04469635710120201, -0.08050795644521713, -0.15469242632389069, 0.21374841034412384, -0.042119015008211136, -0.07152184098958969, -0.11790204048156738, -0.025500861927866936, 0.08749816566705704, -0.07307788729667664, 0.018481148406863213, -0.06302272528409958, 0.002143964171409607, -0.05470583215355873, -0.1962868869304657, 0.12572266161441803, -0.09142369031906128, -0.014426414854824543, -0.07377196103334427, 0.16154469549655914, -0.051538173109292984, 0.009533241391181946, 0.015335055068135262, 0.0009005079627968371, -0.07247072458267212, -0.06957513839006424, 0.0006805943557992578, -0.0059520769864320755, 0.10662343353033066, 0.047107402235269547, -0.08027716726064682, -0.08794714510440826, -0.032843027263879776, -0.03204116225242615, 0.2999255657196045, 0.12337004393339157, -0.05311448872089386, 0.13916777074337006, 0.11874309182167053, -0.09402231872081757, -0.30884742736816406, -0.04191835969686508, -0.07176055759191513, 0.010155525989830494, -0.035370975732803345, -0.16059380769729614, 0.07856468111276627, -0.006477271672338247, -0.011174969375133514, 0.13625916838645935, -0.2673639953136444, -0.0967768058180809, 0.14426849782466888, 0.03284836933016777, 0.29333552718162537, -0.1523400992155075, -0.09049039334058762, -0.05577006936073303, -0.06489111483097076, 0.18574388325214386, -0.09280816465616226, 0.10017731040716171, -0.006444351281970739, 0.09260933846235275, 0.04134135693311691, -0.0552305169403553, 0.087934710085392, -0.0017967059975489974, -0.012351357378065586, -0.10347459465265274, -0.04936530068516731, 0.09802040457725525, 0.007044315338134766, 0.01788610778748989, -0.035581983625888824, 0.016030043363571167, -0.10752463340759277, -0.025825541466474533, -0.06535778939723969, 0.06414421647787094, 0.02443418651819229, -0.049083560705184937, 0.005058459471911192, -0.046833399683237076, 
0.0009142642957158387, -0.01569284498691559, 0.22057926654815674, -0.02687223069369793, 0.14059875905513763, 0.12400210648775101, 0.17775003612041473, -0.10436658561229706, 0.05742919445037842, -0.06701722741127014, -0.06324249505996704, 0.04677899554371834, -0.11209020018577576, 0.05256778746843338, 0.1267659068107605, -0.03367136791348457, 0.06965555995702744, 0.0836653783917427, -0.004969305358827114, 0.018433358520269394, 0.14522553980350494, -0.1887938380241394, 0.000013811513781547546, -0.030860327184200287, -0.05214580148458481, 0.03327987715601921, 0.11431843042373657, 0.18191267549991608, 0.03785686567425728, -0.021800309419631958, 0.005157358478754759, 0.019536323845386505, -0.02107965387403965, 0.12117304652929306, 0.05857320874929428, 0.005960852839052677, -0.13959567248821259, 0.09368348866701126, 0.03669710457324982, -0.16180314123630524, 0.017301728948950768, 0.17109368741512299, -0.10281617939472198, -0.12248287349939346, 0.007771121803671122, 0.13989825546741486, -0.11520744860172272, -0.05106060579419136, -0.05029815062880516, -0.10371407866477966, 0.0704481229186058, 0.20103420317173004, 0.04000439494848251, 0.05505204200744629, -0.07834786176681519, -0.06581325083971024, -0.05059641972184181, 0.02829291671514511, 0.03935990110039711, 0.02735912799835205, -0.11655717343091965, 0.06868546456098557, -0.031525563448667526, 0.15913093090057373, -0.08567605167627335, -0.0377161018550396, -0.11844120174646378, 0.017646342515945435, -0.1585385948419571, -0.021088549867272377, -0.08919218927621841, -0.04380227252840996, -0.01883043721318245, -0.059310536831617355, -0.06596893817186356, -0.05384774133563042, -0.10428573936223984, 0.04242327809333801, -0.027690714225172997, 0.04899786040186882, -0.06851599365472794, -0.024966903030872345, 0.03528739884495735, -0.0097616296261549, 0.1282053291797638, 0.09871283173561096, -0.0931868925690651, 0.11003623157739639, -0.13846056163311005, -0.06572076678276062, 0.11659381538629532, 0.047788310796022415, 0.06235114485025406, 0.06078824773430824, 0.009982251562178135, 0.08428698033094406, 0.055987678468227386, 0.05242673307657242, 0.06547930091619492, -0.08025702089071274, 0.023852920159697533, -0.09962978214025497, -0.1330520659685135, -0.05506284534931183, 0.013157171197235584, 0.03232777491211891, 0.028460104018449783, 0.10059341043233871, -0.0668318048119545, 0.07466688007116318, -0.07266757637262344, 0.02387436106801033, -0.00663341348990798, -0.16423247754573822, -0.01973032020032406, -0.055691130459308624, 0.05975889042019844, -0.012339858338236809, 0.2002401053905487, 0.0033616020809859037, 0.007049271371215582, 0.00715596741065383, 0.0641401931643486, -0.06463663280010223, 0.016916600987315178, 0.17257289588451385, 0.04622003436088562, -0.03969947621226311, -0.10456595569849014, 0.07981351763010025, 0.019872527569532394, 0.09800729900598526, 0.1312236338853836, 0.027524681761860847, -0.0212135948240757, 0.09519454091787338, 0.001518258941359818, -0.0027745929546654224, -0.07973918318748474, -0.06043755263090134, -0.07441379129886627, 0.07315025478601456, -0.010459906421601772, 0.08965393900871277, 0.13422393798828125, -0.022410377860069275, 0.017666468396782875, -0.05342554301023483, -0.08217579126358032, -0.15623964369297028, -0.18367941677570343, -0.09309981763362885, -0.0551089309155941, -0.006929390598088503, -0.11091828346252441, 0.058050502091646194, 0.06955647468566895, 0.09335814416408539, -0.05808793753385544, 0.07134494185447693, 0.06509023159742355, -0.09277945011854172, 0.05650898069143295, 0.007902033627033234, 
0.08185476809740067, -0.016927242279052734, -0.0004285456379875541, -0.10442929714918137, 0.025788575410842896, -0.022866208106279373, 0.05007996782660484, -0.005114417988806963, 0.032834719866514206, -0.13874505460262299, -0.09866025298833847, -0.044499471783638, 0.07119446247816086, 0.03436776623129845, 0.12192612886428833, 0.03614635020494461, -0.02627542056143284, 0.030102020129561424, 0.19782356917858124, -0.062033288180828094, -0.059955745935440063, -0.06012539565563202, 0.15030354261398315, 0.002429852494969964, 0.07769055664539337, -0.019444238394498825, -0.009270239621400833, -0.059491828083992004, 0.32499226927757263, 0.3280152380466461, -0.10468223690986633, 0.007098811212927103, 0.012009045109152794, 0.027413781732320786, 0.07222495973110199, 0.12214338034391403, 0.07653671503067017, 0.23804059624671936, -0.06507861614227295, -0.03914859518408775, -0.043573733419179916, 0.01530381292104721, -0.09758597612380981, 0.13443514704704285, 0.019567525014281273, -0.06630068272352219, 0.00012222168152220547, 0.12056777626276016, -0.16430430114269257, 0.044658347964286804, -0.08434730023145676, -0.20086728036403656, -0.11168307065963745, 0.0004209016333334148, 0.10503076016902924, 0.006880833767354488, 0.05256611481308937, -0.022016489878296852, -0.045519426465034485, 0.0026354603469371796, 0.017172547057271004, -0.210942342877388, 0.03844384104013443, 0.04024379327893257, -0.12072634696960449, -0.02220993861556053, -0.008859091438353062, 0.054278962314128876, 0.09297504276037216, 0.0649549663066864, -0.03225334733724594, 0.026859521865844727, 0.00372802815400064, 0.016800452023744583, 0.06775476038455963, 0.024775484576821327, 0.01482111494988203, -0.08591089397668839, 0.08715075254440308, -0.05973463132977486, 0.040979932993650436, -0.0653398334980011, -0.03675669804215431, -0.00515773193910718, 0.0474637895822525, -0.07020116597414017, 0.09214184433221817, 0.0980323851108551, 0.001873140106908977, -0.0008076200028881431, -0.06368963420391083, -0.03891117870807648, -0.008907032199203968, -0.061702921986579895, -0.09729445725679398, -0.14353212714195251, -0.09244818240404129, 0.03289917856454849, 0.03512318804860115, -0.2099444568157196, 0.017244499176740646, -0.10908322781324387, 0.015088883228600025, -0.18521668016910553, 0.08136054873466492, 0.08945294469594955, -0.010541512630879879, -0.021918201819062233, -0.012816283851861954, 0.04408836364746094, 0.08242587000131607, -0.12894634902477264, -0.07617179304361343 ]
null
null
transformers
# mBART-large-50 fine-tuned on opus100 and opusbook for English to Portuguese translation. [mBART-50](https://huggingface.co/facebook/mbart-large-50/) large fine-tuned on the [opus100](https://huggingface.co/datasets/viewer/?dataset=opus100) dataset for the **NMT** downstream task. # Details of mBART-50 🧠 mBART-50 is a multilingual Sequence-to-Sequence model pre-trained using the "Multilingual Denoising Pretraining" objective. It was introduced in the [Multilingual Translation with Extensible Multilingual Pretraining and Finetuning](https://arxiv.org/abs/2008.00401) paper. mBART-50 is a multilingual Sequence-to-Sequence model. It was created to show that multilingual translation models can be created through multilingual fine-tuning. Instead of fine-tuning on one direction, a pre-trained model is fine-tuned in many directions simultaneously. mBART-50 is created using the original mBART model and extended to add an extra 25 languages to support multilingual machine translation models of 50 languages. The pre-training objective is explained below. **Multilingual Denoising Pretraining**: The model incorporates N languages by concatenating data: `D = {D1, ..., DN }` where each Di is a collection of monolingual documents in language `i`. The source documents are noised using two schemes, first randomly shuffling the original sentences' order, and second a novel in-filling scheme, where spans of text are replaced with a single mask token. The model is then tasked to reconstruct the original text. 35% of each instance's words are masked by randomly sampling a span length according to a Poisson distribution `(λ = 3.5)`. The decoder input is the original text with one position offset. A language id symbol `LID` is used as the initial token to predict the sentence. ## Details of the downstream task (NMT) - Dataset 📚 - **Homepage:** [Link](http://opus.nlpl.eu/opus-100.php) - **Repository:** [GitHub](https://github.com/EdinburghNLP/opus-100-corpus) - **Paper:** [ARXIV](https://arxiv.org/abs/2004.11867) ### Dataset Summary OPUS-100 is English-centric, meaning that all training pairs include English on either the source or target side. The corpus covers 100 languages (including English). Languages were selected based on the volume of parallel data available in OPUS. ### Languages OPUS-100 contains approximately 55M sentence pairs. Of the 99 language pairs, 44 have 1M sentence pairs of training data, 73 have at least 100k, and 95 have at least 10k. ## Dataset Structure ### Data Fields - `src_tag`: `string` text in source language - `tgt_tag`: `string` translation of source language in target language ### Data Splits The dataset is split into training, development, and test portions. Data was prepared by randomly sampling up to 1M sentence pairs per language pair for training and up to 2000 each for development and test. To ensure that there was no overlap (at the monolingual sentence level) between the training and development/test data, they applied a filter during sampling to exclude sentences that had already been sampled. Note that this was done cross-lingually so that, for instance, an English sentence in the Portuguese-English portion of the training data could not occur in the Hindi-English test set. 
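To make the noising procedure described above concrete, the sketch below toy-implements the two schemes: shuffling sentence order and replacing spans (with lengths drawn from a Poisson(3.5) distribution) by a single mask token until roughly 35% of the words are masked. It is an illustrative re-implementation for intuition only, not the actual mBART preprocessing code; the function name and the `<mask>` placeholder string are made up for this example.

```python
import random
import numpy as np

def noise_document(sentences, mask_ratio=0.35, poisson_lambda=3.5, seed=0):
    """Toy sketch of multilingual denoising noise: shuffle + span in-filling."""
    rng = np.random.default_rng(seed)
    random.Random(seed).shuffle(sentences)          # scheme 1: permute sentence order (in place)
    words = " ".join(sentences).split()
    budget = int(len(words) * mask_ratio)           # roughly 35% of the words get masked
    out, masked, i = [], 0, 0
    while i < len(words):
        if masked < budget and rng.random() < mask_ratio:
            span = max(1, int(rng.poisson(poisson_lambda)))  # scheme 2: span length ~ Poisson(3.5)
            out.append("<mask>")                    # the whole span becomes a single mask token
            i += span
            masked += span
        else:
            out.append(words[i])
            i += 1
    return " ".join(out)

print(noise_document(["I love reading.", "Books are great.", "So are films."]))
```

The model would then be trained to reconstruct the original (unshuffled, unmasked) text from this noised input, with the decoder input shifted by one position and prefixed with the language id token.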
## Test set metrics 🧾 We got a **BLEU score of 20.61** ## Model in Action 🚀 ```sh git clone https://github.com/huggingface/transformers.git pip install -q ./transformers ``` ```python from transformers import MBart50TokenizerFast, MBartForConditionalGeneration ckpt = 'Narrativa/mbart-large-50-finetuned-opus-en-pt-translation' tokenizer = MBart50TokenizerFast.from_pretrained(ckpt) model = MBartForConditionalGeneration.from_pretrained(ckpt).to("cuda") tokenizer.src_lang = 'en_XX' def translate(text): inputs = tokenizer(text, return_tensors='pt') input_ids = inputs.input_ids.to('cuda') attention_mask = inputs.attention_mask.to('cuda') output = model.generate(input_ids, attention_mask=attention_mask, forced_bos_token_id=tokenizer.lang_code_to_id['pt_XX']) return tokenizer.decode(output[0], skip_special_tokens=True) translate('here your English text to be translated to Portuguese...') ``` Created by: [Narrativa](https://www.narrativa.com/) About Narrativa: Natural Language Generation (NLG) | Gabriele, our machine learning-based platform, builds and deploys natural language solutions. #NLG #AI
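As a small follow-up usage example (a sketch that assumes the same checkpoint as above and runs on CPU for simplicity), several sentences can be translated in one batched `generate()` call instead of one string at a time:

```python
from transformers import MBart50TokenizerFast, MBartForConditionalGeneration

ckpt = 'Narrativa/mbart-large-50-finetuned-opus-en-pt-translation'
tokenizer = MBart50TokenizerFast.from_pretrained(ckpt)
model = MBartForConditionalGeneration.from_pretrained(ckpt)
tokenizer.src_lang = 'en_XX'

# Batch translation: pad the inputs to a common length and force the decoder
# to start with the Portuguese language code, as in the translate() helper above.
texts = ["The book is on the table.", "Machine translation saves a lot of time."]
batch = tokenizer(texts, return_tensors='pt', padding=True)
generated = model.generate(**batch,
                           forced_bos_token_id=tokenizer.lang_code_to_id['pt_XX'])
print(tokenizer.batch_decode(generated, skip_special_tokens=True))
```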
{"language": ["en", "pt"], "tags": ["translation"], "datasets": ["opus100", "opusbook"], "metrics": ["bleu"]}
translation
Narrativa/mbart-large-50-finetuned-opus-en-pt-translation
[ "transformers", "pytorch", "mbart", "text2text-generation", "translation", "en", "pt", "dataset:opus100", "dataset:opusbook", "arxiv:2008.00401", "arxiv:2004.11867", "autotrain_compatible", "endpoints_compatible", "has_space", "region:us" ]
2022-03-02T23:29:04+00:00
[ "2008.00401", "2004.11867" ]
[ "en", "pt" ]
TAGS #transformers #pytorch #mbart #text2text-generation #translation #en #pt #dataset-opus100 #dataset-opusbook #arxiv-2008.00401 #arxiv-2004.11867 #autotrain_compatible #endpoints_compatible #has_space #region-us
# mBART-large-50 fine-tuned onpus100 and opusbook for English to Portuguese translation. mBART-50 large fine-tuned on opus100 dataset for NMT downstream task. # Details of mBART-50 mBART-50 is a multilingual Sequence-to-Sequence model pre-trained using the "Multilingual Denoising Pretraining" objective. It was introduced in Multilingual Translation with Extensible Multilingual Pretraining and Finetuning paper. mBART-50 is a multilingual Sequence-to-Sequence model. It was created to show that multilingual translation models can be created through multilingual fine-tuning. Instead of fine-tuning on one direction, a pre-trained model is fine-tuned many directions simultaneously. mBART-50 is created using the original mBART model and extended to add extra 25 languages to support multilingual machine translation models of 50 languages. The pre-training objective is explained below. Multilingual Denoising Pretraining: The model incorporates N languages by concatenating data: 'D = {D1, ..., DN }' where each Di is a collection of monolingual documents in language 'i'. The source documents are noised using two schemes, first randomly shuffling the original sentences' order, and second a novel in-filling scheme, where spans of text are replaced with a single mask token. The model is then tasked to reconstruct the original text. 35% of each instance's words are masked by random sampling a span length according to a Poisson distribution '(λ = 3.5)'. The decoder input is the original text with one position offset. A language id symbol 'LID' is used as the initial token to predict the sentence. ## Details of the downstream task (NMT) - Dataset - Homepage: Link - Repository: GitHub - Paper: ARXIV ### Dataset Summary OPUS-100 is English-centric, meaning that all training pairs include English on either the source or target side. The corpus covers 100 languages (including English). Languages were selected based on the volume of parallel data available in OPUS. ### Languages OPUS-100 contains approximately 55M sentence pairs. Of the 99 language pairs, 44 have 1M sentence pairs of training data, 73 have at least 100k, and 95 have at least 10k. ## Dataset Structure ### Data Fields - 'src_tag': 'string' text in source language - 'tgt_tag': 'string' translation of source language in target language ### Data Splits The dataset is split into training, development, and test portions. Data was prepared by randomly sampled up to 1M sentence pairs per language pair for training and up to 2000 each for development and test. To ensure that there was no overlap (at the monolingual sentence level) between the training and development/test data, they applied a filter during sampling to exclude sentences that had already been sampled. Note that this was done cross-lingually so that, for instance, an English sentence in the Portuguese-English portion of the training data could not occur in the Hindi-English test set. ## Test set metrics We got a BLEU score of 20.61 ## Model in Action Created by: Narrativa About Narrativa: Natural Language Generation (NLG) | Gabriele, our machine learning-based platform, builds and deploys natural language solutions. #NLG #AI
[ "# mBART-large-50 fine-tuned onpus100 and opusbook for English to Portuguese translation.\nmBART-50 large fine-tuned on opus100 dataset for NMT downstream task.", "# Details of mBART-50 \n\nmBART-50 is a multilingual Sequence-to-Sequence model pre-trained using the \"Multilingual Denoising Pretraining\" objective. It was introduced in Multilingual Translation with Extensible Multilingual Pretraining and Finetuning paper.\n\n\nmBART-50 is a multilingual Sequence-to-Sequence model. It was created to show that multilingual translation models can be created through multilingual fine-tuning. \nInstead of fine-tuning on one direction, a pre-trained model is fine-tuned many directions simultaneously. mBART-50 is created using the original mBART model and extended to add extra 25 languages to support multilingual machine translation models of 50 languages. The pre-training objective is explained below.\nMultilingual Denoising Pretraining: The model incorporates N languages by concatenating data: \n'D = {D1, ..., DN }' where each Di is a collection of monolingual documents in language 'i'. The source documents are noised using two schemes, \nfirst randomly shuffling the original sentences' order, and second a novel in-filling scheme, \nwhere spans of text are replaced with a single mask token. The model is then tasked to reconstruct the original text. \n35% of each instance's words are masked by random sampling a span length according to a Poisson distribution '(λ = 3.5)'.\nThe decoder input is the original text with one position offset. A language id symbol 'LID' is used as the initial token to predict the sentence.", "## Details of the downstream task (NMT) - Dataset \n\n- Homepage: Link \n- Repository: GitHub\n- Paper: ARXIV", "### Dataset Summary\n\nOPUS-100 is English-centric, meaning that all training pairs include English on either the source or target side. The corpus covers 100 languages (including English). Languages were selected based on the volume of parallel data available in OPUS.", "### Languages\n\nOPUS-100 contains approximately 55M sentence pairs. Of the 99 language pairs, 44 have 1M sentence pairs of training data, 73 have at least 100k, and 95 have at least 10k.", "## Dataset Structure", "### Data Fields\n\n- 'src_tag': 'string' text in source language\n- 'tgt_tag': 'string' translation of source language in target language", "### Data Splits\n\nThe dataset is split into training, development, and test portions. Data was prepared by randomly sampled up to 1M sentence pairs per language pair for training and up to 2000 each for development and test. To ensure that there was no overlap (at the monolingual sentence level) between the training and development/test data, they applied a filter during sampling to exclude sentences that had already been sampled. Note that this was done cross-lingually so that, for instance, an English sentence in the Portuguese-English portion of the training data could not occur in the Hindi-English test set.", "## Test set metrics \n\nWe got a BLEU score of 20.61", "## Model in Action \n\n\n\n\n\nCreated by: Narrativa\n\nAbout Narrativa: Natural Language Generation (NLG) | Gabriele, our machine learning-based platform, builds and deploys natural language solutions. #NLG #AI" ]
[ "TAGS\n#transformers #pytorch #mbart #text2text-generation #translation #en #pt #dataset-opus100 #dataset-opusbook #arxiv-2008.00401 #arxiv-2004.11867 #autotrain_compatible #endpoints_compatible #has_space #region-us \n", "# mBART-large-50 fine-tuned onpus100 and opusbook for English to Portuguese translation.\nmBART-50 large fine-tuned on opus100 dataset for NMT downstream task.", "# Details of mBART-50 \n\nmBART-50 is a multilingual Sequence-to-Sequence model pre-trained using the \"Multilingual Denoising Pretraining\" objective. It was introduced in Multilingual Translation with Extensible Multilingual Pretraining and Finetuning paper.\n\n\nmBART-50 is a multilingual Sequence-to-Sequence model. It was created to show that multilingual translation models can be created through multilingual fine-tuning. \nInstead of fine-tuning on one direction, a pre-trained model is fine-tuned many directions simultaneously. mBART-50 is created using the original mBART model and extended to add extra 25 languages to support multilingual machine translation models of 50 languages. The pre-training objective is explained below.\nMultilingual Denoising Pretraining: The model incorporates N languages by concatenating data: \n'D = {D1, ..., DN }' where each Di is a collection of monolingual documents in language 'i'. The source documents are noised using two schemes, \nfirst randomly shuffling the original sentences' order, and second a novel in-filling scheme, \nwhere spans of text are replaced with a single mask token. The model is then tasked to reconstruct the original text. \n35% of each instance's words are masked by random sampling a span length according to a Poisson distribution '(λ = 3.5)'.\nThe decoder input is the original text with one position offset. A language id symbol 'LID' is used as the initial token to predict the sentence.", "## Details of the downstream task (NMT) - Dataset \n\n- Homepage: Link \n- Repository: GitHub\n- Paper: ARXIV", "### Dataset Summary\n\nOPUS-100 is English-centric, meaning that all training pairs include English on either the source or target side. The corpus covers 100 languages (including English). Languages were selected based on the volume of parallel data available in OPUS.", "### Languages\n\nOPUS-100 contains approximately 55M sentence pairs. Of the 99 language pairs, 44 have 1M sentence pairs of training data, 73 have at least 100k, and 95 have at least 10k.", "## Dataset Structure", "### Data Fields\n\n- 'src_tag': 'string' text in source language\n- 'tgt_tag': 'string' translation of source language in target language", "### Data Splits\n\nThe dataset is split into training, development, and test portions. Data was prepared by randomly sampled up to 1M sentence pairs per language pair for training and up to 2000 each for development and test. To ensure that there was no overlap (at the monolingual sentence level) between the training and development/test data, they applied a filter during sampling to exclude sentences that had already been sampled. Note that this was done cross-lingually so that, for instance, an English sentence in the Portuguese-English portion of the training data could not occur in the Hindi-English test set.", "## Test set metrics \n\nWe got a BLEU score of 20.61", "## Model in Action \n\n\n\n\n\nCreated by: Narrativa\n\nAbout Narrativa: Natural Language Generation (NLG) | Gabriele, our machine learning-based platform, builds and deploys natural language solutions. #NLG #AI" ]
[ 81, 45, 360, 32, 58, 49, 6, 38, 137, 14, 50 ]
[ "passage: TAGS\n#transformers #pytorch #mbart #text2text-generation #translation #en #pt #dataset-opus100 #dataset-opusbook #arxiv-2008.00401 #arxiv-2004.11867 #autotrain_compatible #endpoints_compatible #has_space #region-us \n# mBART-large-50 fine-tuned onpus100 and opusbook for English to Portuguese translation.\nmBART-50 large fine-tuned on opus100 dataset for NMT downstream task.# Details of mBART-50 \n\nmBART-50 is a multilingual Sequence-to-Sequence model pre-trained using the \"Multilingual Denoising Pretraining\" objective. It was introduced in Multilingual Translation with Extensible Multilingual Pretraining and Finetuning paper.\n\n\nmBART-50 is a multilingual Sequence-to-Sequence model. It was created to show that multilingual translation models can be created through multilingual fine-tuning. \nInstead of fine-tuning on one direction, a pre-trained model is fine-tuned many directions simultaneously. mBART-50 is created using the original mBART model and extended to add extra 25 languages to support multilingual machine translation models of 50 languages. The pre-training objective is explained below.\nMultilingual Denoising Pretraining: The model incorporates N languages by concatenating data: \n'D = {D1, ..., DN }' where each Di is a collection of monolingual documents in language 'i'. The source documents are noised using two schemes, \nfirst randomly shuffling the original sentences' order, and second a novel in-filling scheme, \nwhere spans of text are replaced with a single mask token. The model is then tasked to reconstruct the original text. \n35% of each instance's words are masked by random sampling a span length according to a Poisson distribution '(λ = 3.5)'.\nThe decoder input is the original text with one position offset. A language id symbol 'LID' is used as the initial token to predict the sentence." ]
[ -0.08404651284217834, 0.02088739536702633, -0.005764995235949755, 0.025416923686861992, 0.04961054399609566, 0.014721700921654701, 0.15724527835845947, -0.004931475035846233, -0.1392369419336319, 0.0570317879319191, 0.06972867995500565, 0.01989283226430416, 0.0664738342165947, 0.1681586056947708, 0.12480784952640533, -0.2806839942932129, 0.06875678896903992, -0.044397469609975815, 0.05821144953370094, 0.06494858115911484, 0.03639021888375282, -0.00416572205722332, 0.06971386075019836, -0.01123468205332756, -0.09970315545797348, 0.0060163093730807304, -0.01163939293473959, -0.020824551582336426, 0.03842546045780182, 0.1032581701874733, 0.028470996767282486, 0.007283514365553856, 0.06376717984676361, -0.11938337981700897, 0.007529927417635918, 0.03335727006196976, -0.005389712750911713, 0.041442856192588806, 0.05821491777896881, 0.01415990386158228, 0.20859938859939575, 0.026441702619194984, 0.007276270538568497, 0.04636180028319359, -0.09372961521148682, -0.012023421004414558, -0.037867628037929535, 0.004059934057295322, -0.01284998469054699, 0.06727112829685211, -0.047075092792510986, 0.04906385764479637, 0.010387416929006577, 0.05494128540158272, 0.03471021726727486, -0.2658005654811859, -0.005884476471692324, 0.14425690472126007, 0.0473543182015419, 0.08957147598266602, 0.04145507141947746, 0.011714215390384197, 0.03272275999188423, 0.03574908897280693, 0.010932770557701588, -0.06279927492141724, 0.18066169321537018, -0.13748076558113098, -0.13476307690143585, 0.005083051044493914, 0.055051058530807495, -0.03419362008571625, -0.11912821233272552, -0.0719638466835022, -0.025586014613509178, 0.015375684015452862, -0.022589080035686493, -0.0005444269045256078, 0.011882967315614223, 0.02507076971232891, 0.12829402089118958, -0.09077267348766327, -0.04385331645607948, -0.022434180602431297, -0.07705451548099518, 0.0945427268743515, 0.020560014992952347, 0.026151509955525398, -0.05992267280817032, 0.06893733888864517, -0.14182651042938232, -0.025230146944522858, -0.043262552469968796, -0.09096486866474152, -0.06409554928541183, 0.05344681069254875, -0.049487125128507614, -0.11434745043516159, -0.0235887598246336, 0.08613770455121994, 0.0030792071484029293, 0.07810801267623901, -0.06644739955663681, 0.04462258517742157, 0.015533124096691608, 0.0825595036149025, -0.0941612720489502, -0.07979977875947952, 0.028764966875314713, -0.009152205660939217, 0.00570877268910408, -0.03342731297016144, -0.1283922791481018, -0.04397319257259369, -0.004823063965886831, 0.05207451060414314, -0.0026188294868916273, 0.09037643671035767, 0.050348952412605286, -0.0489385724067688, 0.049517806619405746, -0.122564397752285, -0.014443004503846169, -0.02895350567996502, -0.036766715347766876, 0.04910999909043312, 0.06791820377111435, -0.012707446701824665, -0.10839856415987015, -0.03458743169903755, -0.020937319844961166, -0.0019082158105447888, -0.13609188795089722, -0.1496732085943222, -0.018333666026592255, -0.0392141230404377, -0.04701516404747963, -0.09007813781499863, -0.15073318779468536, -0.07801666855812073, 0.032350093126297, -0.031455907970666885, 0.02193637378513813, -0.06729935109615326, 0.032703328877687454, -0.02441805973649025, -0.018126776441931725, 0.038579754531383514, 0.030811259523034096, 0.0001884156372398138, -0.04076755791902542, 0.0640934407711029, -0.16441504657268524, 0.0361735038459301, -0.09282361716032028, 0.0541585348546505, -0.15848994255065918, 0.190424844622612, 0.030382126569747925, -0.03141656517982483, -0.07121478766202927, -0.029775548726320267, -0.0879492312669754, 
0.08145692944526672, 0.061054736375808716, 0.1099189817905426, -0.2267136424779892, -0.021229013800621033, 0.15264584124088287, -0.12911929190158844, 0.027488943189382553, 0.021499183028936386, -0.023284807801246643, 0.19176523387432098, 0.09987512230873108, 0.05743243917822838, 0.06623327732086182, 0.0849197506904602, 0.06670934706926346, -0.045609064400196075, -0.05456385016441345, 0.054722487926483154, 0.025896340608596802, -0.011009139940142632, -0.06583385914564133, 0.06477993726730347, -0.04117385298013687, 0.022339841350913048, 0.003014608984813094, -0.014416825026273727, 0.04159330204129219, -0.015369627624750137, -0.03207375481724739, -0.04249229282140732, 0.06448762118816376, 0.05906607210636139, -0.036246683448553085, 0.050665441900491714, 0.02827196568250656, -0.06197918578982353, 0.052592478692531586, -0.06154550611972809, -0.05748007819056511, -0.04516719654202461, 0.04356027767062187, -0.20046663284301758, -0.03614633530378342, 0.016869714483618736, -0.0950714498758316, 0.18185031414031982, 0.11018121242523193, 0.022170163691043854, 0.11111443489789963, 0.0007780936430208385, 0.03233266994357109, 0.055062517523765564, -0.03697636350989342, -0.03655150160193443, -0.0907139927148819, -0.04296993464231491, -0.06514851748943329, 0.010276641696691513, -0.07914374023675919, 0.03862912207841873, -0.16315717995166779, -0.006887068971991539, -0.008231495507061481, -0.015326508320868015, 0.018867256119847298, 0.05105581507086754, -0.025701912119984627, -0.027321597561240196, 0.06167083606123924, -0.007622243836522102, -0.09485181421041489, 0.10852886736392975, -0.2321508377790451, -0.07023625075817108, 0.07132025063037872, 0.062144067138433456, -0.062487028539180756, -0.11221276968717575, 0.016492681577801704, -0.030309190973639488, 0.05772620067000389, -0.004877057392150164, 0.10996266454458237, 0.015737593173980713, 0.08545739203691483, -0.09930451214313507, 0.05855154991149902, -0.004197408445179462, -0.004172314424067736, -0.04788381978869438, 0.07277519255876541, 0.0949404239654541, -0.17637014389038086, 0.027463063597679138, 0.01997368410229683, 0.05518488213419914, 0.18396344780921936, 0.04362492635846138, -0.05594480782747269, -0.03856022283434868, 0.011262333951890469, 0.0064011611975729465, -0.0047295414842665195, -0.008340776897966862, -0.021019376814365387, 0.01661764830350876, 0.06541245430707932, 0.07601234316825867, -0.035921502858400345, 0.041386928409338, 0.005128307733684778, -0.02692774496972561, -0.006682684179395437, 0.017477067187428474, -0.03809984773397446, 0.07421482354402542, -0.016780685633420944, -0.0012955496786162257, 0.0031507210806012154, -0.011557010002434254, -0.11563412100076675, 0.14203621447086334, -0.1392412930727005, -0.2168574184179306, -0.17456749081611633, -0.06347589194774628, -0.10000398010015488, 0.014852428808808327, -0.019665293395519257, -0.010159910656511784, -0.016461730003356934, -0.12322193384170532, 0.1849180907011032, -0.09132833778858185, -0.050147004425525665, -0.09062334895133972, -0.005768730770796537, -0.012835046276450157, -0.10650776326656342, 0.029859058558940887, -0.04957069829106331, -0.08280477672815323, 0.00391757395118475, -0.06834255158901215, 0.007580975070595741, 0.13824740052223206, 0.009585943073034286, -0.02844935841858387, -0.0021903894376009703, 0.21266965568065643, 0.0037265631835907698, 0.07954809814691544, 0.059743572026491165, -0.042194850742816925, 0.01690404675900936, 0.14114613831043243, 0.017094062641263008, -0.0932939276099205, 0.019712576642632484, -0.02984415553510189, -0.09958567470312119, 
-0.1338798701763153, -0.0714619979262352, -0.027339238673448563, 0.012012495659291744, 0.04779096320271492, 0.01819898560643196, 0.023590480908751488, 0.06197630614042282, -0.009266155771911144, 0.06097288429737091, 0.06778768450021744, 0.04928238317370415, 0.0095534548163414, -0.05021326616406441, 0.08367981016635895, -0.012937105260789394, 0.01961693726480007, 0.021252285689115524, 0.008750972338020802, 0.16109491884708405, 0.010887140408158302, 0.05342443659901619, 0.055565908551216125, 0.032094940543174744, 0.08322783559560776, 0.1565549373626709, -0.02816842496395111, 0.043662477284669876, -0.04209663346409798, -0.0536416657269001, -0.056060247123241425, 0.06553023308515549, 0.018485482782125473, -0.022847790271043777, -0.016649385914206505, 0.03220434486865997, -0.05635007098317146, 0.036667659878730774, 0.03442441299557686, -0.17571642994880676, -0.005151484161615372, 0.008588578552007675, 0.004203724209219217, -0.08362787961959839, 0.024014165624976158, 0.07662273943424225, -0.06248750910162926, 0.06615955382585526, -0.026485878974199295, 0.08616998046636581, -0.0535833016037941, 0.02225983701646328, -0.04855726659297943, 0.12244386970996857, -0.007123030722141266, 0.050565414130687714, -0.27761104702949524, 0.09814770519733429, 0.03491115942597389, 0.015257188118994236, -0.06210706755518913, 0.016691820695996284, -0.04691610112786293, 0.07674738019704819, 0.07649465650320053, 0.011662794277071953, -0.16455847024917603, -0.04108978062868118, -0.052994053810834885, 0.0054070912301540375, 0.08522289246320724, -0.011980313807725906, 0.06905817985534668, 0.08016463369131088, 0.03417055308818817, -0.05595466494560242, 0.04661458730697632, -0.19468048214912415, -0.17469361424446106, 0.07907669991254807, -0.01513525191694498, 0.05266086384654045, -0.014169533737003803, -0.006012544501572847, 0.04555600881576538, 0.1276381015777588, -0.10039296001195908, -0.053132567554712296, -0.0648077204823494, 0.06796146184206009, 0.15415218472480774, -0.0650627613067627, 0.03429596498608589, -0.03992922976613045, 0.039334725588560104, -0.06902464479207993, -0.13372424244880676, 0.06412164121866226, -0.05240606889128685, -0.0013098124181851745, -0.026724165305495262, 0.07493337243795395, 0.09150471538305283, 0.015522017143666744, 0.006051783449947834, -0.031939487904310226, 0.07902512699365616, -0.10459695011377335, -0.0710059180855751, 0.1628497987985611, -0.025321658700704575, 0.06741788983345032, -0.18363898992538452, -0.12540081143379211, -0.055224936455488205, 0.005839132703840733, 0.0872192233800888, 0.1010822057723999, -0.04992547258734703, 0.11190640926361084, 0.1830415427684784, -0.13254055380821228, -0.15742866694927216, -0.018317274749279022, 0.050014790147542953, 0.07909618318080902, 0.0064873844385147095, -0.22098615765571594, -0.027442635968327522, 0.002901339204981923, 0.05356910079717636, 0.07307972013950348, -0.2627214789390564, -0.1297854781150818, 0.08015498518943787, 0.03178907185792923, 0.17929038405418396, -0.06594815105199814, -0.07622652500867844, -0.06938225775957108, 0.03166796267032623, 0.10844901204109192, 0.022251160815358162, 0.08616650104522705, 0.027891980484128, 0.032902974635362625, 0.03616443648934364, 0.013621068559587002, 0.08144596964120865, 0.07847267389297485, 0.007659049239009619, -0.08863565325737, 0.06434676051139832, 0.02073718048632145, 0.0004425405932124704, 0.14442875981330872, 0.011164035648107529, -0.01524390559643507, -0.00934745091944933, -0.08344276249408722, -0.07503736764192581, 0.007711267564445734, -0.050700221210718155, -0.048701897263526917, 
-0.018017830327153206, 0.07484210282564163, -0.010107027366757393, 0.012873989529907703, -0.09877897053956985, -0.121018186211586, -0.03942861035466194, 0.16957056522369385, 0.14335916936397552, 0.005942639894783497, 0.035056065768003464, 0.016439512372016907, 0.022447306662797928, 0.08496864140033722, 0.012808216735720634, -0.0057935272343456745, 0.09599652886390686, -0.05145403370261192, 0.07935060560703278, 0.007510972209274769, -0.16750453412532806, -0.0020262240432202816, 0.05667264014482498, -0.014136355370283127, -0.16442306339740753, 0.033712081611156464, 0.06600786000490189, 0.039285819977521896, 0.04078294709324837, 0.19351854920387268, -0.07138034701347351, -0.020769432187080383, -0.0011279573664069176, 0.05069255456328392, -0.06094567850232124, 0.11829625070095062, -0.032462771981954575, -0.007721577305346727, -0.07550891488790512, 0.0886315256357193, 0.0339050330221653, -0.024857934564352036, -0.005332484375685453, 0.11879058927297592, -0.10073131322860718, -0.0690450444817543, -0.057375069707632065, 0.0800970122218132, -0.11583779752254486, -0.05145999789237976, 0.09139897674322128, -0.09393125027418137, 0.041483648121356964, 0.08467097580432892, -0.018531838431954384, 0.036354273557662964, -0.07359025627374649, -0.03336552157998085, 0.005978765431791544, -0.03852200135588646, 0.025233875960111618, 0.005698699038475752, 0.014434975571930408, 0.11512401700019836, 0.05767816677689552, 0.03469245135784149, -0.027055276557803154, -0.04719141870737076, -0.05740178003907204, 0.006558573804795742, -0.11329346150159836, -0.03923332691192627, -0.1185910701751709, -0.03721478953957558, 0.008914054371416569, 0.007978270761668682, 0.020878126844763756, -0.011313400231301785, -0.06021977588534355, -0.009110291488468647, -0.11125693470239639, -0.0010730456560850143, -0.006243614479899406, 0.029870066791772842, -0.04060273990035057, -0.029809892177581787, 0.07268958538770676, 0.0038234994281083345, -0.050899919122457504, 0.03563650697469711, -0.03254980221390724, 0.08880466967821121, -0.03465648740530014, -0.01073138415813446, -0.03211065009236336, -0.03331973776221275, -0.013292304240167141, 0.03903919830918312, 0.03609737381339073, -0.02094479463994503, 0.04289097711443901, -0.07348771393299103, 0.07017049938440323, 0.03554727882146835, 0.005386856850236654, -0.04550822079181671, 0.04793801158666611, -0.02503959834575653, 0.03901071473956108, 0.051485076546669006, -0.021720675751566887, 0.06274205446243286, -0.07148825377225876, -0.028545258566737175, 0.015218624845147133, 0.01856718771159649, 0.10358089208602905, -0.09028945863246918, 0.02819773182272911, 0.04281022772192955, 0.1352909952402115, 0.037679679691791534, -0.014017083682119846, -0.04522233456373215, -0.041869696229696274, -0.014495261013507843, 0.048398278653621674, 0.07633024454116821, 0.04633762314915657, 0.03038623556494713, -0.05902279540896416, 0.09274520725011826, -0.04150788486003876, 0.10641561448574066, 0.14860180020332336, 0.033694904297590256, 0.16517776250839233, 0.10621844232082367, 0.04713562875986099, -0.00018209034169558436, -0.011437703855335712, 0.05513530597090721, 0.020022906363010406, 0.04495682194828987, -0.049720991402864456, 0.07011403888463974, 0.08799563348293304, -0.06307575106620789, 0.09261243045330048, 0.08175207674503326, -0.0725003331899643, -0.08554685115814209, -0.2743624448776245, -0.003815184812992811, -0.032388996332883835, -0.05088698863983154, -0.08030351996421814, 0.005840929225087166, -0.013933870010077953, 0.06373327970504761, 0.00005968333061900921, 0.07794629782438278, 
-0.09439021348953247, -0.1483539193868637, 0.034484826028347015, 0.00045027318992652, 0.06695779412984848, 0.07539521902799606, 0.00789347942918539, 0.01551967952400446, 0.036595020443201065, 0.0336170569062233, 0.07596053928136826, 0.0560338981449604, 0.05604570358991623, -0.04932592064142227, 0.004185072146356106, -0.04901781305670738, 0.0010691978968679905, 0.0702294334769249, 0.20008252561092377, 0.028268685564398766, -0.14908428490161896, -0.019985856488347054, 0.04215450584888458, 0.037318453192710876, -0.10517606884241104, -0.1024286076426506, 0.17287768423557281, -0.019393811002373695, 0.07851914316415787, -0.09896355122327805, -0.030453406274318695, -0.02003009058535099, 0.20555749535560608, 0.23767304420471191, -0.06976135820150375, -0.05796634778380394, 0.030377475544810295, 0.002493214560672641, 0.005724527407437563, 0.15757322311401367, -0.0007856198935769498, 0.27090248465538025, -0.0009183923248201609, 0.07996927201747894, -0.06004931777715683, -0.022080671042203903, -0.00941835343837738, 0.08822347968816757, -0.040097858756780624, -0.011094723828136921, -0.02259073406457901, 0.07314886897802353, -0.10117628425359726, -0.1516946703195572, -0.01709538698196411, 0.0407593809068203, -0.04381246492266655, 0.0014796864707022905, -0.023125363513827324, 0.04624234512448311, 0.05899110063910484, -0.022169217467308044, 0.003895387053489685, 0.07567029446363449, -0.02486725151538849, -0.08703497797250748, -0.12024874985218048, 0.027275284752249718, 0.006405720021575689, 0.10004289448261261, 0.00637858547270298, 0.05884704366326332, 0.021658508107066154, 0.031752120703458786, -0.09753186255693436, 0.07035684585571289, -0.030349964275956154, 0.08072216808795929, 0.0520537905395031, 0.08433248102664948, -0.0015237766783684492, 0.08452993631362915, -0.009376912377774715, -0.11629906296730042, 0.08173967897891998, 0.0495615191757679, -0.05331099405884743, -0.01749817654490471, 0.07659412175416946, -0.0972737967967987, 0.16016292572021484, 0.14017446339130402, 0.006543587893247604, -0.0012682030210271478, -0.016525158658623695, 0.04301411285996437, -0.10364581644535065, 0.11207254976034164, -0.014880476519465446, -0.1398332118988037, -0.01789352111518383, -0.07393304258584976, 0.007123123854398727, -0.22010794281959534, -0.03251495212316513, -0.00699458597227931, -0.019933586940169334, -0.04672514647245407, 0.11219143122434616, 0.04641542211174965, -0.0003955320571549237, -0.04373342543840408, -0.16009224951267242, 0.03650601953268051, 0.06816516071557999, -0.09602634608745575, -0.08987078070640564 ]
null
null
transformers
# mBART-large-50 fine-tuned on opus100 and opusbook for Portuguese to English translation.

[mBART-50](https://huggingface.co/facebook/mbart-large-50/) large fine-tuned on the [opus100](https://huggingface.co/datasets/viewer/?dataset=opus100) dataset for the **NMT** downstream task.

# Details of mBART-50 🧠

mBART-50 is a multilingual Sequence-to-Sequence model pre-trained using the "Multilingual Denoising Pretraining" objective. It was introduced in the [Multilingual Translation with Extensible Multilingual Pretraining and Finetuning](https://arxiv.org/abs/2008.00401) paper.

mBART-50 is a multilingual Sequence-to-Sequence model. It was created to show that multilingual translation models can be built through multilingual fine-tuning: instead of fine-tuning on one direction, a pre-trained model is fine-tuned on many directions simultaneously. mBART-50 is created from the original mBART model, extended with an extra 25 languages to support multilingual machine translation models covering 50 languages. The pre-training objective is explained below.

**Multilingual Denoising Pretraining**: The model incorporates N languages by concatenating data: `D = {D1, ..., DN}`, where each `Di` is a collection of monolingual documents in language `i`. The source documents are noised using two schemes: first, randomly shuffling the order of the original sentences, and second, a novel in-filling scheme in which spans of text are replaced with a single mask token. The model is then tasked with reconstructing the original text. 35% of each instance's words are masked by randomly sampling span lengths according to a Poisson distribution `(λ = 3.5)`. The decoder input is the original text offset by one position. A language id symbol `LID` is used as the initial token to predict the sentence.

## Details of the downstream task (NMT) - Dataset 📚

- **Homepage:** [Link](http://opus.nlpl.eu/opus-100.php)
- **Repository:** [GitHub](https://github.com/EdinburghNLP/opus-100-corpus)
- **Paper:** [ARXIV](https://arxiv.org/abs/2004.11867)

### Dataset Summary

OPUS-100 is English-centric, meaning that all training pairs include English on either the source or target side. The corpus covers 100 languages (including English). Languages were selected based on the volume of parallel data available in OPUS.

### Languages

OPUS-100 contains approximately 55M sentence pairs. Of the 99 language pairs, 44 have 1M sentence pairs of training data, 73 have at least 100k, and 95 have at least 10k.

## Dataset Structure

### Data Fields

- `src_tag`: `string` text in the source language
- `tgt_tag`: `string` translation of the source text in the target language

### Data Splits

The dataset is split into training, development, and test portions. Data was prepared by randomly sampling up to 1M sentence pairs per language pair for training and up to 2000 each for development and test. To ensure that there was no overlap (at the monolingual sentence level) between the training and development/test data, a filter was applied during sampling to exclude sentences that had already been sampled. Note that this was done cross-lingually so that, for instance, an English sentence in the Portuguese-English portion of the training data could not occur in the Hindi-English test set.
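For reference, the Portuguese-English portion of OPUS-100 can be pulled straight from the Hub. A minimal sketch (not part of the original card), assuming the hosted copy exposes an `en-pt` configuration whose examples carry a `translation` dict:

```python
from datasets import load_dataset

# Assumption: the Hub copy of OPUS-100 names the Portuguese-English pair "en-pt"
# and stores each example as {"translation": {"en": ..., "pt": ...}}.
dataset = load_dataset("opus100", "en-pt")

sample = dataset["train"][0]["translation"]
print(sample["pt"], "->", sample["en"])
```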
## Test set metrics 🧾

We got a **BLEU score of 26.12**

## Model in Action 🚀

```sh
git clone https://github.com/huggingface/transformers.git
pip install -q ./transformers
```

```python
from transformers import MBart50TokenizerFast, MBartForConditionalGeneration

ckpt = 'Narrativa/mbart-large-50-finetuned-opus-pt-en-translation'

tokenizer = MBart50TokenizerFast.from_pretrained(ckpt)
model = MBartForConditionalGeneration.from_pretrained(ckpt).to("cuda")

tokenizer.src_lang = 'pt_XX'

def translate(text):
    inputs = tokenizer(text, return_tensors='pt')
    input_ids = inputs.input_ids.to('cuda')
    attention_mask = inputs.attention_mask.to('cuda')
    output = model.generate(input_ids,
                            attention_mask=attention_mask,
                            forced_bos_token_id=tokenizer.lang_code_to_id['en_XX'])
    return tokenizer.decode(output[0], skip_special_tokens=True)

translate('here your Portuguese text to be translated to English...')
```

Created by: [Narrativa](https://www.narrativa.com/)

About Narrativa: Natural Language Generation (NLG) | Gabriele, our machine learning-based platform, builds and deploys natural language solutions. #NLG #AI
{"language": ["pt", "en"], "tags": ["translation"], "datasets": ["opus100", "opusbook"], "metrics": ["bleu"]}
translation
Narrativa/mbart-large-50-finetuned-opus-pt-en-translation
[ "transformers", "pytorch", "mbart", "text2text-generation", "translation", "pt", "en", "dataset:opus100", "dataset:opusbook", "arxiv:2008.00401", "arxiv:2004.11867", "autotrain_compatible", "endpoints_compatible", "has_space", "region:us" ]
2022-03-02T23:29:04+00:00
[ "2008.00401", "2004.11867" ]
[ "pt", "en" ]
TAGS #transformers #pytorch #mbart #text2text-generation #translation #pt #en #dataset-opus100 #dataset-opusbook #arxiv-2008.00401 #arxiv-2004.11867 #autotrain_compatible #endpoints_compatible #has_space #region-us
# mBART-large-50 fine-tuned onpus100 and opusbook for Portuguese to English translation. mBART-50 large fine-tuned on opus100 dataset for NMT downstream task. # Details of mBART-50 mBART-50 is a multilingual Sequence-to-Sequence model pre-trained using the "Multilingual Denoising Pretraining" objective. It was introduced in Multilingual Translation with Extensible Multilingual Pretraining and Finetuning paper. mBART-50 is a multilingual Sequence-to-Sequence model. It was created to show that multilingual translation models can be created through multilingual fine-tuning. Instead of fine-tuning on one direction, a pre-trained model is fine-tuned many directions simultaneously. mBART-50 is created using the original mBART model and extended to add extra 25 languages to support multilingual machine translation models of 50 languages. The pre-training objective is explained below. Multilingual Denoising Pretraining: The model incorporates N languages by concatenating data: 'D = {D1, ..., DN }' where each Di is a collection of monolingual documents in language 'i'. The source documents are noised using two schemes, first randomly shuffling the original sentences' order, and second a novel in-filling scheme, where spans of text are replaced with a single mask token. The model is then tasked to reconstruct the original text. 35% of each instance's words are masked by random sampling a span length according to a Poisson distribution '(λ = 3.5)'. The decoder input is the original text with one position offset. A language id symbol 'LID' is used as the initial token to predict the sentence. ## Details of the downstream task (NMT) - Dataset - Homepage: Link - Repository: GitHub - Paper: ARXIV ### Dataset Summary OPUS-100 is English-centric, meaning that all training pairs include English on either the source or target side. The corpus covers 100 languages (including English). Languages were selected based on the volume of parallel data available in OPUS. ### Languages OPUS-100 contains approximately 55M sentence pairs. Of the 99 language pairs, 44 have 1M sentence pairs of training data, 73 have at least 100k, and 95 have at least 10k. ## Dataset Structure ### Data Fields - 'src_tag': 'string' text in source language - 'tgt_tag': 'string' translation of source language in target language ### Data Splits The dataset is split into training, development, and test portions. Data was prepared by randomly sampled up to 1M sentence pairs per language pair for training and up to 2000 each for development and test. To ensure that there was no overlap (at the monolingual sentence level) between the training and development/test data, they applied a filter during sampling to exclude sentences that had already been sampled. Note that this was done cross-lingually so that, for instance, an English sentence in the Portuguese-English portion of the training data could not occur in the Hindi-English test set. ## Test set metrics We got a BLEU score of 26.12 ## Model in Action Created by: Narrativa About Narrativa: Natural Language Generation (NLG) | Gabriele, our machine learning-based platform, builds and deploys natural language solutions. #NLG #AI
[ "# mBART-large-50 fine-tuned onpus100 and opusbook for Portuguese to English translation.\nmBART-50 large fine-tuned on opus100 dataset for NMT downstream task.", "# Details of mBART-50 \n\nmBART-50 is a multilingual Sequence-to-Sequence model pre-trained using the \"Multilingual Denoising Pretraining\" objective. It was introduced in Multilingual Translation with Extensible Multilingual Pretraining and Finetuning paper.\n\n\nmBART-50 is a multilingual Sequence-to-Sequence model. It was created to show that multilingual translation models can be created through multilingual fine-tuning. \nInstead of fine-tuning on one direction, a pre-trained model is fine-tuned many directions simultaneously. mBART-50 is created using the original mBART model and extended to add extra 25 languages to support multilingual machine translation models of 50 languages. The pre-training objective is explained below.\nMultilingual Denoising Pretraining: The model incorporates N languages by concatenating data: \n'D = {D1, ..., DN }' where each Di is a collection of monolingual documents in language 'i'. The source documents are noised using two schemes, \nfirst randomly shuffling the original sentences' order, and second a novel in-filling scheme, \nwhere spans of text are replaced with a single mask token. The model is then tasked to reconstruct the original text. \n35% of each instance's words are masked by random sampling a span length according to a Poisson distribution '(λ = 3.5)'.\nThe decoder input is the original text with one position offset. A language id symbol 'LID' is used as the initial token to predict the sentence.", "## Details of the downstream task (NMT) - Dataset \n\n- Homepage: Link \n- Repository: GitHub\n- Paper: ARXIV", "### Dataset Summary\n\nOPUS-100 is English-centric, meaning that all training pairs include English on either the source or target side. The corpus covers 100 languages (including English). Languages were selected based on the volume of parallel data available in OPUS.", "### Languages\n\nOPUS-100 contains approximately 55M sentence pairs. Of the 99 language pairs, 44 have 1M sentence pairs of training data, 73 have at least 100k, and 95 have at least 10k.", "## Dataset Structure", "### Data Fields\n\n- 'src_tag': 'string' text in source language\n- 'tgt_tag': 'string' translation of source language in target language", "### Data Splits\n\nThe dataset is split into training, development, and test portions. Data was prepared by randomly sampled up to 1M sentence pairs per language pair for training and up to 2000 each for development and test. To ensure that there was no overlap (at the monolingual sentence level) between the training and development/test data, they applied a filter during sampling to exclude sentences that had already been sampled. Note that this was done cross-lingually so that, for instance, an English sentence in the Portuguese-English portion of the training data could not occur in the Hindi-English test set.", "## Test set metrics \n\nWe got a BLEU score of 26.12", "## Model in Action \n\n\n\n\n\nCreated by: Narrativa\n\nAbout Narrativa: Natural Language Generation (NLG) | Gabriele, our machine learning-based platform, builds and deploys natural language solutions. #NLG #AI" ]
[ "TAGS\n#transformers #pytorch #mbart #text2text-generation #translation #pt #en #dataset-opus100 #dataset-opusbook #arxiv-2008.00401 #arxiv-2004.11867 #autotrain_compatible #endpoints_compatible #has_space #region-us \n", "# mBART-large-50 fine-tuned onpus100 and opusbook for Portuguese to English translation.\nmBART-50 large fine-tuned on opus100 dataset for NMT downstream task.", "# Details of mBART-50 \n\nmBART-50 is a multilingual Sequence-to-Sequence model pre-trained using the \"Multilingual Denoising Pretraining\" objective. It was introduced in Multilingual Translation with Extensible Multilingual Pretraining and Finetuning paper.\n\n\nmBART-50 is a multilingual Sequence-to-Sequence model. It was created to show that multilingual translation models can be created through multilingual fine-tuning. \nInstead of fine-tuning on one direction, a pre-trained model is fine-tuned many directions simultaneously. mBART-50 is created using the original mBART model and extended to add extra 25 languages to support multilingual machine translation models of 50 languages. The pre-training objective is explained below.\nMultilingual Denoising Pretraining: The model incorporates N languages by concatenating data: \n'D = {D1, ..., DN }' where each Di is a collection of monolingual documents in language 'i'. The source documents are noised using two schemes, \nfirst randomly shuffling the original sentences' order, and second a novel in-filling scheme, \nwhere spans of text are replaced with a single mask token. The model is then tasked to reconstruct the original text. \n35% of each instance's words are masked by random sampling a span length according to a Poisson distribution '(λ = 3.5)'.\nThe decoder input is the original text with one position offset. A language id symbol 'LID' is used as the initial token to predict the sentence.", "## Details of the downstream task (NMT) - Dataset \n\n- Homepage: Link \n- Repository: GitHub\n- Paper: ARXIV", "### Dataset Summary\n\nOPUS-100 is English-centric, meaning that all training pairs include English on either the source or target side. The corpus covers 100 languages (including English). Languages were selected based on the volume of parallel data available in OPUS.", "### Languages\n\nOPUS-100 contains approximately 55M sentence pairs. Of the 99 language pairs, 44 have 1M sentence pairs of training data, 73 have at least 100k, and 95 have at least 10k.", "## Dataset Structure", "### Data Fields\n\n- 'src_tag': 'string' text in source language\n- 'tgt_tag': 'string' translation of source language in target language", "### Data Splits\n\nThe dataset is split into training, development, and test portions. Data was prepared by randomly sampled up to 1M sentence pairs per language pair for training and up to 2000 each for development and test. To ensure that there was no overlap (at the monolingual sentence level) between the training and development/test data, they applied a filter during sampling to exclude sentences that had already been sampled. Note that this was done cross-lingually so that, for instance, an English sentence in the Portuguese-English portion of the training data could not occur in the Hindi-English test set.", "## Test set metrics \n\nWe got a BLEU score of 26.12", "## Model in Action \n\n\n\n\n\nCreated by: Narrativa\n\nAbout Narrativa: Natural Language Generation (NLG) | Gabriele, our machine learning-based platform, builds and deploys natural language solutions. #NLG #AI" ]
[ 81, 45, 360, 32, 58, 49, 6, 38, 137, 14, 50 ]
[ "passage: TAGS\n#transformers #pytorch #mbart #text2text-generation #translation #pt #en #dataset-opus100 #dataset-opusbook #arxiv-2008.00401 #arxiv-2004.11867 #autotrain_compatible #endpoints_compatible #has_space #region-us \n# mBART-large-50 fine-tuned onpus100 and opusbook for Portuguese to English translation.\nmBART-50 large fine-tuned on opus100 dataset for NMT downstream task.# Details of mBART-50 \n\nmBART-50 is a multilingual Sequence-to-Sequence model pre-trained using the \"Multilingual Denoising Pretraining\" objective. It was introduced in Multilingual Translation with Extensible Multilingual Pretraining and Finetuning paper.\n\n\nmBART-50 is a multilingual Sequence-to-Sequence model. It was created to show that multilingual translation models can be created through multilingual fine-tuning. \nInstead of fine-tuning on one direction, a pre-trained model is fine-tuned many directions simultaneously. mBART-50 is created using the original mBART model and extended to add extra 25 languages to support multilingual machine translation models of 50 languages. The pre-training objective is explained below.\nMultilingual Denoising Pretraining: The model incorporates N languages by concatenating data: \n'D = {D1, ..., DN }' where each Di is a collection of monolingual documents in language 'i'. The source documents are noised using two schemes, \nfirst randomly shuffling the original sentences' order, and second a novel in-filling scheme, \nwhere spans of text are replaced with a single mask token. The model is then tasked to reconstruct the original text. \n35% of each instance's words are masked by random sampling a span length according to a Poisson distribution '(λ = 3.5)'.\nThe decoder input is the original text with one position offset. A language id symbol 'LID' is used as the initial token to predict the sentence." ]
[ -0.08405263721942902, 0.01962992176413536, -0.005752241238951683, 0.024454955011606216, 0.04934582859277725, 0.014763426966965199, 0.15795782208442688, -0.00429284293204546, -0.13849325478076935, 0.057851143181324005, 0.06907513737678528, 0.01927267201244831, 0.06669939309358597, 0.16882115602493286, 0.12521453201770782, -0.28071972727775574, 0.06911409646272659, -0.04453369230031967, 0.06025966629385948, 0.06514281034469604, 0.03601684793829918, -0.004628318361938, 0.06939494609832764, -0.011214811354875565, -0.09901208430528641, 0.004778802860528231, -0.01113130059093237, -0.020005475729703903, 0.039017677307128906, 0.10247690975666046, 0.027492038905620575, 0.007478624116629362, 0.06364811211824417, -0.11995816230773926, 0.007774241268634796, 0.03382430598139763, -0.004714251030236483, 0.041388124227523804, 0.05879223346710205, 0.014473301358520985, 0.20940370857715607, 0.0284868236631155, 0.007374325767159462, 0.04653214290738106, -0.09312261641025543, -0.012807034887373447, -0.03754722326993942, 0.0057391999289393425, -0.012834870256483555, 0.0672101303935051, -0.04755571484565735, 0.05058066546916962, 0.010316756553947926, 0.05513777583837509, 0.035011306405067444, -0.26628991961479187, -0.005805885419249535, 0.14448240399360657, 0.04714934527873993, 0.09049035608768463, 0.04066507890820503, 0.011848091147840023, 0.03162387013435364, 0.03570331260561943, 0.010180170647799969, -0.06252630054950714, 0.18161143362522125, -0.1376393884420395, -0.13518458604812622, 0.005017114803195, 0.05447893589735031, -0.034269899129867554, -0.11925049126148224, -0.07225366681814194, -0.025461306795477867, 0.014723744243383408, -0.023431474342942238, -0.0008488845196552575, 0.01176452450454235, 0.025219270959496498, 0.1280636340379715, -0.09010031074285507, -0.04397624358534813, -0.02257828414440155, -0.07642281800508499, 0.09143419563770294, 0.020532574504613876, 0.026338424533605576, -0.05972275137901306, 0.06899125128984451, -0.14360560476779938, -0.024057963863015175, -0.04351635277271271, -0.091222383081913, -0.06431430578231812, 0.052946947515010834, -0.04951447620987892, -0.11307578533887863, -0.02409982495009899, 0.08729034662246704, 0.004406516440212727, 0.07814183086156845, -0.06725388020277023, 0.04436146467924118, 0.0158360805362463, 0.08245564997196198, -0.09494626522064209, -0.07904604077339172, 0.028458887711167336, -0.008085583336651325, 0.00497246440500021, -0.032533880323171616, -0.1272735595703125, -0.044114816933870316, -0.002265028189867735, 0.05112922191619873, -0.002677574520930648, 0.09091895073652267, 0.05070841312408447, -0.049022357910871506, 0.05083014816045761, -0.12177113443613052, -0.015261311084032059, -0.028537405654788017, -0.036212243139743805, 0.04963133856654167, 0.06878142058849335, -0.011951843276619911, -0.10766224563121796, -0.03420073539018631, -0.021165957674384117, -0.0019500034395605326, -0.13485419750213623, -0.14917582273483276, -0.01881343312561512, -0.04026460275053978, -0.04737933352589607, -0.09022750705480576, -0.15023306012153625, -0.07797231525182724, 0.0321664996445179, -0.030866798013448715, 0.02147681452333927, -0.06637950241565704, 0.03280201554298401, -0.024559414014220238, -0.018629953265190125, 0.03939226269721985, 0.031077323481440544, 0.00016471341950818896, -0.041305940598249435, 0.06390661746263504, -0.16356442868709564, 0.0364186055958271, -0.09333779662847519, 0.05419379100203514, -0.1600855439901352, 0.1895734816789627, 0.029873983934521675, -0.031552914530038834, -0.07180095463991165, -0.02924112230539322, -0.0881035327911377, 
0.0811028853058815, 0.060571879148483276, 0.10965787619352341, -0.22608113288879395, -0.02113199606537819, 0.15224991738796234, -0.12953677773475647, 0.026612112298607826, 0.021876323968172073, -0.02294103242456913, 0.19180285930633545, 0.10042787343263626, 0.058883994817733765, 0.06681577116250992, 0.08359632641077042, 0.06658929586410522, -0.04517816752195358, -0.05415033549070358, 0.05419524759054184, 0.02584892511367798, -0.011486240662634373, -0.06534979492425919, 0.06513325124979019, -0.042299285531044006, 0.02185725048184395, 0.002624229295179248, -0.014601475559175014, 0.041462965309619904, -0.01647842489182949, -0.03326918184757233, -0.041767705231904984, 0.06448482722043991, 0.05984837934374809, -0.03596055135130882, 0.05079856142401695, 0.028545282781124115, -0.06155502796173096, 0.05255034938454628, -0.06111651659011841, -0.0573049895465374, -0.045041780918836594, 0.043317392468452454, -0.2006581425666809, -0.035840705037117004, 0.017210578545928, -0.09504707157611847, 0.18158309161663055, 0.11042246222496033, 0.0226865503937006, 0.11038027703762054, 0.0006100262980908155, 0.03252790495753288, 0.05559983476996422, -0.03686714172363281, -0.0378132089972496, -0.08983638882637024, -0.04292958974838257, -0.06510688364505768, 0.010041613131761551, -0.07881355285644531, 0.03850259631872177, -0.16369983553886414, -0.007242536637932062, -0.008153839036822319, -0.01638549380004406, 0.018956944346427917, 0.05006198585033417, -0.025297878310084343, -0.026655858382582664, 0.06191950663924217, -0.007842130959033966, -0.09451966732740402, 0.10946433991193771, -0.2322450578212738, -0.0702856257557869, 0.07148654013872147, 0.06312285363674164, -0.06261663883924484, -0.11231035739183426, 0.015643812716007233, -0.03063126839697361, 0.0580466128885746, -0.005185400601476431, 0.10913108289241791, 0.015879683196544647, 0.0844690203666687, -0.09901294112205505, 0.05869021639227867, -0.003674399806186557, -0.0037590761203318834, -0.047780830413103104, 0.07295285910367966, 0.09509031474590302, -0.1747058480978012, 0.026673423126339912, 0.019687596708536148, 0.05611777305603027, 0.18295082449913025, 0.043097734451293945, -0.056417353451251984, -0.03841535747051239, 0.011054969392716885, 0.005954706575721502, -0.004853369202464819, -0.00903892982751131, -0.02115458995103836, 0.017102299258112907, 0.06608638912439346, 0.07605179399251938, -0.035481348633766174, 0.041145313531160355, 0.004445360042154789, -0.026949558407068253, -0.006549889221787453, 0.017980404198169708, -0.038100797683000565, 0.07451009005308151, -0.01644911989569664, -0.00129437237046659, 0.0025077962782233953, -0.01131842378526926, -0.11632874608039856, 0.14180167019367218, -0.13867256045341492, -0.21545834839344025, -0.17345023155212402, -0.061498284339904785, -0.09924232214689255, 0.014720932580530643, -0.02023738995194435, -0.00926192943006754, -0.016611648723483086, -0.12379062920808792, 0.18516340851783752, -0.09184426814317703, -0.04978269338607788, -0.09223427623510361, -0.006726869381964207, -0.012228158302605152, -0.105864018201828, 0.029699282720685005, -0.0500061959028244, -0.08398713171482086, 0.0037392384838312864, -0.06902973353862762, 0.006590418983250856, 0.13838890194892883, 0.010002835653722286, -0.028869353234767914, -0.0022046470548957586, 0.21284550428390503, 0.003774441545829177, 0.08004147559404373, 0.05983968451619148, -0.041900742799043655, 0.01655881479382515, 0.14037980139255524, 0.01629071682691574, -0.09323962777853012, 0.020398061722517014, -0.029032807797193527, -0.09948103874921799, 
-0.13299983739852905, -0.07102172076702118, -0.026687778532505035, 0.013696509413421154, 0.04752371832728386, 0.018208015710115433, 0.02311459742486477, 0.06220010295510292, -0.009653310291469097, 0.06009063497185707, 0.06775076687335968, 0.04888348653912544, 0.00784530583769083, -0.05048557370901108, 0.08447200059890747, -0.012387700378894806, 0.020872628316283226, 0.021060507744550705, 0.008123405277729034, 0.1607278287410736, 0.010987083427608013, 0.052457697689533234, 0.05591273307800293, 0.032408036291599274, 0.08356354385614395, 0.15661107003688812, -0.028512394055724144, 0.04353176802396774, -0.041694603860378265, -0.05326342210173607, -0.05629636347293854, 0.06533177196979523, 0.01929018832743168, -0.023700395599007607, -0.016678337007761, 0.03181115537881851, -0.05642915889620781, 0.03779064491391182, 0.03458395600318909, -0.1749344766139984, -0.005424435716122389, 0.008103087544441223, 0.00363414385356009, -0.08337203413248062, 0.024374892935156822, 0.07511006295681, -0.06363610923290253, 0.06530222296714783, -0.026621483266353607, 0.08601967990398407, -0.054111894220113754, 0.022433914244174957, -0.048646099865436554, 0.12042038142681122, -0.0069487434811890125, 0.0501461997628212, -0.2776070237159729, 0.0977652370929718, 0.03462182730436325, 0.014875843189656734, -0.06190815195441246, 0.015678646042943, -0.04676734656095505, 0.07662703096866608, 0.07683055102825165, 0.011737039312720299, -0.16258062422275543, -0.041151661425828934, -0.05385560169816017, 0.005435854662209749, 0.08545947819948196, -0.013449430465698242, 0.06873854249715805, 0.07983838766813278, 0.03429871425032616, -0.055707503110170364, 0.047565098851919174, -0.19520024955272675, -0.17298674583435059, 0.07916755229234695, -0.0157049298286438, 0.05224718526005745, -0.014214815571904182, -0.005908983759582043, 0.04379008710384369, 0.12633386254310608, -0.10011927038431168, -0.05285167321562767, -0.06466569006443024, 0.06726884096860886, 0.15393106639385223, -0.06484656035900116, 0.03394336625933647, -0.04032529890537262, 0.03870527446269989, -0.06934183835983276, -0.13406048715114594, 0.06452859193086624, -0.0519566684961319, -0.0023972189519554377, -0.026017358526587486, 0.0755246952176094, 0.09218817949295044, 0.015775129199028015, 0.005661142524331808, -0.03115704469382763, 0.07867274433374405, -0.10399769246578217, -0.07035597413778305, 0.16276401281356812, -0.025973202660679817, 0.066936194896698, -0.18357686698436737, -0.12467379868030548, -0.05555151402950287, 0.006184334866702557, 0.08579631894826889, 0.10173975676298141, -0.049979064613580704, 0.11280293017625809, 0.18268723785877228, -0.13176217675209045, -0.15842388570308685, -0.017766878008842468, 0.05011884868144989, 0.07854260504245758, 0.006943253334611654, -0.21987314522266388, -0.025994418188929558, 0.003242981154471636, 0.05289023742079735, 0.07348079234361649, -0.2621113061904907, -0.1304921954870224, 0.08181675523519516, 0.032018136233091354, 0.17824102938175201, -0.06701523065567017, -0.075838603079319, -0.0686580240726471, 0.03273475542664528, 0.10806547105312347, 0.021394208073616028, 0.08619984984397888, 0.027526969090104103, 0.03345769643783569, 0.036009885370731354, 0.013360964134335518, 0.08080004900693893, 0.08018989861011505, 0.007879563607275486, -0.08846750110387802, 0.06438343226909637, 0.021105296909809113, 0.0008393341558985412, 0.14484010636806488, 0.011992723681032658, -0.015111267566680908, -0.010341617278754711, -0.08344746381044388, -0.0758771225810051, 0.007496913895010948, -0.05097070708870888, -0.04909886419773102, 
-0.018297918140888214, 0.07551582902669907, -0.010760307312011719, 0.012931343168020248, -0.0989256277680397, -0.12221197783946991, -0.03914399445056915, 0.17154382169246674, 0.14362004399299622, 0.007832298055291176, 0.034698486328125, 0.01598566398024559, 0.022080834954977036, 0.0848039761185646, 0.01336269173771143, -0.005956602282822132, 0.09622648358345032, -0.05143589526414871, 0.07917334139347076, 0.007075225468724966, -0.1681637465953827, -0.0024831437040120363, 0.05584072321653366, -0.014726323075592518, -0.16592712700366974, 0.034407760947942734, 0.0646955817937851, 0.03809814900159836, 0.04131262004375458, 0.19325922429561615, -0.07180579006671906, -0.020671719685196877, -0.0011714721331372857, 0.0508430190384388, -0.06097984313964844, 0.11903971433639526, -0.0323476679623127, -0.00748586468398571, -0.07553020864725113, 0.08902425318956375, 0.03324059769511223, -0.02491890639066696, -0.0055302721448242664, 0.11792854964733124, -0.10147703438997269, -0.06898307055234909, -0.057103481143713, 0.07956275343894958, -0.11529134958982468, -0.050095826387405396, 0.09217333048582077, -0.09359002858400345, 0.04162413999438286, 0.08481437712907791, -0.0185176283121109, 0.03689894825220108, -0.07349427044391632, -0.03290384262800217, 0.005985807627439499, -0.03804173320531845, 0.02452882006764412, 0.00582814821973443, 0.014763114042580128, 0.11588939279317856, 0.056871164590120316, 0.03453870490193367, -0.02709958516061306, -0.04718095436692238, -0.057255275547504425, 0.006555148400366306, -0.11337084323167801, -0.04009396582841873, -0.11788997799158096, -0.03757483512163162, 0.008868427947163582, 0.008087349124252796, 0.0212847962975502, -0.011955531314015388, -0.06014446169137955, -0.009074964560568333, -0.1114063560962677, -0.0012873136438429356, -0.006734819617122412, 0.03008110448718071, -0.04063290357589722, -0.02934214472770691, 0.07285827398300171, 0.004907525144517422, -0.05018739402294159, 0.035558320581912994, -0.031210871413350105, 0.08953370898962021, -0.03569132089614868, -0.011201940476894379, -0.03267303854227066, -0.03259113430976868, -0.01419149897992611, 0.03851047903299332, 0.03664223849773407, -0.020372731611132622, 0.04375923424959183, -0.07352998852729797, 0.07011901587247849, 0.03548750653862953, 0.006579570006579161, -0.04583723843097687, 0.04726652801036835, -0.025934286415576935, 0.040109142661094666, 0.05072420462965965, -0.022107934579253197, 0.06271437555551529, -0.07204950600862503, -0.028521979227662086, 0.015176958404481411, 0.01858719252049923, 0.10402774065732956, -0.08969743549823761, 0.028369329869747162, 0.04254462197422981, 0.13480906188488007, 0.037280093878507614, -0.014868664555251598, -0.044586069881916046, -0.04187760502099991, -0.014507237821817398, 0.048253707587718964, 0.07661978900432587, 0.04711654782295227, 0.030205586925148964, -0.05815485864877701, 0.09304116666316986, -0.04255605861544609, 0.10661224275827408, 0.14911819994449615, 0.03517545759677887, 0.16357018053531647, 0.10551772266626358, 0.04772749915719032, -0.0003916826390195638, -0.01083238236606121, 0.05462414398789406, 0.01882501319050789, 0.0451580211520195, -0.05031508579850197, 0.07078082859516144, 0.08884643018245697, -0.06353246420621872, 0.0922195166349411, 0.08131908625364304, -0.0726245641708374, -0.0861048474907875, -0.274623841047287, -0.003804772859439254, -0.03188681602478027, -0.05092044547200203, -0.07999373227357864, 0.004772172775119543, -0.012889637611806393, 0.0642220601439476, -0.00022694018844049424, 0.07876620441675186, -0.09520740061998367, 
-0.14804823696613312, 0.03539993241429329, 0.00009400577255291864, 0.06548930704593658, 0.0753849446773529, 0.0075804502703249454, 0.016523992642760277, 0.035502009093761444, 0.03392884135246277, 0.07528313249349594, 0.05507534742355347, 0.05592041090130806, -0.04929123818874359, 0.0034443100448697805, -0.049068015068769455, 0.0009258450008928776, 0.06936509162187576, 0.1999020129442215, 0.028178516775369644, -0.1489686369895935, -0.020503176376223564, 0.04232863709330559, 0.03719494119286537, -0.1038045734167099, -0.1019001379609108, 0.17441895604133606, -0.02045108564198017, 0.0782921314239502, -0.09878194332122803, -0.030035236850380898, -0.019668828696012497, 0.20663386583328247, 0.23827417194843292, -0.07059258967638016, -0.057903774082660675, 0.030661286786198616, 0.0023266912903636694, 0.005933293607085943, 0.1579132080078125, -0.0005799765349365771, 0.2694544792175293, -0.00102421292103827, 0.08118020743131638, -0.05923711508512497, -0.02305203303694725, -0.00883910059928894, 0.08806674182415009, -0.039737626910209656, -0.009985929355025291, -0.02356615662574768, 0.07308676838874817, -0.1014716625213623, -0.15184298157691956, -0.01797918602824211, 0.04183203727006912, -0.04322059825062752, 0.0011976322857663035, -0.02523106150329113, 0.04597834497690201, 0.058774370700120926, -0.02218453213572502, 0.004980922676622868, 0.07515424489974976, -0.025123469531536102, -0.08709905296564102, -0.12055385112762451, 0.027691278606653214, 0.00747323501855135, 0.09898170828819275, 0.006235811859369278, 0.05797012895345688, 0.021344084292650223, 0.031468041241168976, -0.0977325588464737, 0.07016459852457047, -0.03021072782576084, 0.07989451289176941, 0.05073823034763336, 0.08552038669586182, -0.0012707936111837626, 0.08462851494550705, -0.009309368208050728, -0.11717360466718674, 0.08154943585395813, 0.04851469770073891, -0.05356042459607124, -0.018260236829519272, 0.07725079357624054, -0.0972200259566307, 0.16113919019699097, 0.13978002965450287, 0.006412897724658251, -0.0013817813014611602, -0.01653357595205307, 0.042109277099370956, -0.10456686466932297, 0.11144930869340897, -0.014632991515100002, -0.1396796554327011, -0.018036354333162308, -0.07445716112852097, 0.0065373945981264114, -0.22179636359214783, -0.03207241743803024, -0.007017516065388918, -0.019821124151349068, -0.04643186181783676, 0.11225827783346176, 0.04539862647652626, -0.0006757454248145223, -0.043956272304058075, -0.16035526990890503, 0.036965303122997284, 0.06897362321615219, -0.09617381542921066, -0.09024439007043839 ]
null
null
transformers
# Spanish GPT-2 trained on [Spanish RAP Lyrics](https://www.kaggle.com/smunoz3801/9325-letras-de-rap-en-espaol)

Created by: [Narrativa](https://www.narrativa.com/)

About Narrativa: Natural Language Generation (NLG) | Gabriele, our machine learning-based platform, builds and deploys natural language solutions. #NLG #AI
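A quick generation sketch (not part of the original card; decoding parameters are illustrative), reusing the prompt from the model's widget:

```python
from transformers import pipeline

# Illustrative decoding settings; tune max_length / top_p for your use case.
generator = pipeline("text-generation", model="Narrativa/spanish-gpt2-finetuned-rap-lyrics")

prompt = "Déjame contarte lo importante que es buscarte un plan"
print(generator(prompt, max_length=100, do_sample=True, top_p=0.95)[0]["generated_text"])
```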
{"language": "es", "license": "mit", "tags": ["GPT-2", "Rap", "Lyrics", "Songs"], "datasets": ["large_spanish_corpus"], "widget": [{"text": "D\u00e9jame contarte lo importante que es buscarte un plan\nNo para golpearles o ganarles, sino para darles paz\n"}]}
text-generation
Narrativa/spanish-gpt2-finetuned-rap-lyrics
[ "transformers", "pytorch", "gpt2", "text-generation", "GPT-2", "Rap", "Lyrics", "Songs", "es", "dataset:large_spanish_corpus", "license:mit", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[ "es" ]
TAGS #transformers #pytorch #gpt2 #text-generation #GPT-2 #Rap #Lyrics #Songs #es #dataset-large_spanish_corpus #license-mit #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
# Spanish GPT-2 trained on Spanish RAP Lyrics Created by: Narrativa About Narrativa: Natural Language Generation (NLG) | Gabriele, our machine learning-based platform, builds and deploys natural language solutions. #NLG #AI
[ "# Spanish GPT-2 trained on Spanish RAP Lyrics\n\n\nCreated by: Narrativa\n\nAbout Narrativa: Natural Language Generation (NLG) | Gabriele, our machine learning-based platform, builds and deploys natural language solutions. #NLG #AI" ]
[ "TAGS\n#transformers #pytorch #gpt2 #text-generation #GPT-2 #Rap #Lyrics #Songs #es #dataset-large_spanish_corpus #license-mit #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "# Spanish GPT-2 trained on Spanish RAP Lyrics\n\n\nCreated by: Narrativa\n\nAbout Narrativa: Natural Language Generation (NLG) | Gabriele, our machine learning-based platform, builds and deploys natural language solutions. #NLG #AI" ]
[ 81, 58 ]
[ "passage: TAGS\n#transformers #pytorch #gpt2 #text-generation #GPT-2 #Rap #Lyrics #Songs #es #dataset-large_spanish_corpus #license-mit #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n# Spanish GPT-2 trained on Spanish RAP Lyrics\n\n\nCreated by: Narrativa\n\nAbout Narrativa: Natural Language Generation (NLG) | Gabriele, our machine learning-based platform, builds and deploys natural language solutions. #NLG #AI" ]
[ -0.05180478096008301, 0.18334558606147766, -0.006929789204150438, 0.09555666893720627, 0.08684762567281723, -0.023759638890624046, 0.01119435578584671, 0.10323698073625565, -0.0043423413299024105, 0.00795979518443346, 0.09983271360397339, 0.21428900957107544, 0.016670290380716324, 0.013849158771336079, 0.06039422005414963, -0.2603137493133545, 0.019678430631756783, 0.012603550218045712, 0.034084271639585495, 0.09417928010225296, 0.07325660437345505, 0.0656866654753685, 0.0664774477481842, 0.043651994317770004, -0.16410115361213684, 0.012572051957249641, 0.03767700865864754, -0.10337209701538086, 0.0627998411655426, 0.055079273879528046, -0.05835700035095215, 0.05502092093229294, -0.004662609659135342, -0.09889291971921921, 0.010294939391314983, -0.02802182175219059, -0.05576005578041077, -0.0019992580637335777, 0.009104949422180653, -0.15001164376735687, 0.19092842936515808, -0.028824876993894577, -0.016495415940880775, 0.049725960940122604, -0.23751766979694366, -0.15724986791610718, -0.01528344489634037, -0.07182653248310089, 0.03931409493088722, 0.09000276774168015, -0.039313510060310364, 0.0004844027862418443, -0.05825100839138031, -0.029412703588604927, 0.09820277243852615, -0.31483194231987, -0.06666862219572067, 0.10747013986110687, 0.13631810247898102, -0.014540446922183037, -0.029380451887845993, 0.1298561543226242, 0.05302027240395546, 0.02423737943172455, -0.016868209466338158, -0.09893134981393814, -0.018409471958875656, -0.023546190932393074, -0.07095698267221451, -0.04613780975341797, 0.1443951427936554, -0.07110647857189178, -0.02811615914106369, -0.11420227587223053, 0.023623283952474594, -0.006068699527531862, -0.021120605990290642, -0.04174027219414711, -0.0304681695997715, 0.09175612777471542, -0.045234061777591705, -0.08168346434831619, -0.04018483683466911, -0.038404323160648346, -0.0700785219669342, 0.12858782708644867, 0.00815651100128889, -0.04106768220663071, -0.023170799016952515, 0.08165955543518066, 0.04575330391526222, -0.056001704186201096, 0.002771102823317051, -0.029033860191702843, 0.04828691482543945, 0.05558421090245247, -0.0352284200489521, 0.06036405637860298, 0.086005300283432, 0.08720236271619797, 0.007473282050341368, -0.06416217237710953, 0.04104112833738327, 0.055757492780685425, 0.066675566136837, 0.055416546761989594, 0.05382104218006134, -0.10945041477680206, 0.05508815497159958, -0.12207387387752533, 0.04969106242060661, -0.07169567793607712, -0.20225994288921356, -0.011418445967137814, -0.04089592397212982, 0.08908803015947342, 0.08525605499744415, 0.04949355497956276, -0.12310374528169632, -0.028687182813882828, 0.03683997318148613, -0.023593127727508545, 0.03797627240419388, -0.05505521595478058, 0.05907108634710312, 0.057598214596509933, -0.062036141753196716, 0.028083311393857002, -0.07244125753641129, -0.14564068615436554, -0.07130764424800873, -0.06543779373168945, -0.023261696100234985, 0.046518515795469284, 0.09308776259422302, 0.00022646576690021902, 0.090711809694767, -0.17139741778373718, -0.13873015344142914, -0.04098905250430107, 0.033504895865917206, -0.12134182453155518, -0.17646701633930206, -0.15650378167629242, 0.04491148516535759, 0.029995884746313095, -0.07104673236608505, -0.1168353334069252, -0.041112080216407776, 0.11847016215324402, -0.023623740300536156, 0.10762345790863037, -0.041678622364997864, 0.04955225810408592, -0.1518266797065735, -0.004741286858916283, -0.06982454657554626, 0.15520508587360382, 0.02409091778099537, -0.033160608261823654, -0.07090704888105392, -0.021698186174035072, -0.014356969855725765, 
0.11503443866968155, -0.05458369106054306, 0.16747473180294037, -0.07614418864250183, -0.09573303163051605, 0.2744065523147583, -0.022069046273827553, -0.0009002686711028218, 0.15061062574386597, 0.04850408062338829, 0.1442195326089859, 0.1509665697813034, 0.2582489550113678, -0.09421177953481674, 0.03623940795660019, 0.061180662363767624, 0.03325891122221947, -0.015508081763982773, 0.003240677062422037, 0.08330157399177551, 0.012882961891591549, -0.07309231907129288, 0.009714719839394093, 0.05171946436166763, 0.08403007686138153, -0.0068145752884447575, -0.055062148720026016, 0.027249058708548546, 0.01814590021967888, 0.05689419060945511, -0.025761792436242104, -0.02071467787027359, -0.01380921620875597, -0.1238170862197876, -0.01329891849309206, 0.029242608696222305, 0.0018416038947179914, 0.03372076526284218, -0.13879162073135376, 0.16374365985393524, -0.043461304157972336, 0.032383471727371216, -0.052624721080064774, 0.013376949355006218, -0.04781094565987587, -0.014848937280476093, 0.14316527545452118, -0.06475221365690231, -0.03431478142738342, -0.0517168752849102, -0.036094773560762405, 0.07934194803237915, 0.019002465531229973, -0.025590334087610245, -0.005186471622437239, -0.1587647646665573, 0.14678727090358734, -0.020952781662344933, 0.03698965162038803, -0.12282830476760864, -0.02780655212700367, 0.10346254706382751, 0.0004412771377246827, 0.007456641178578138, -0.01878848299384117, 0.024089179933071136, 0.08578019589185715, -0.06898634880781174, -0.03446489945054054, 0.08792412281036377, 0.045106999576091766, -0.09477918595075607, 0.2170921415090561, -0.11036136746406555, 0.025252725929021835, 0.13124093413352966, -0.1419527530670166, -0.024136759340763092, 0.023577027022838593, 0.00623501930385828, 0.014095076359808445, 0.028725633397698402, -0.01310284435749054, 0.22774630784988403, -0.0973348319530487, 0.09499018639326096, -0.10844551771879196, -0.035892337560653687, 0.010605393908917904, -0.11047040671110153, -0.04190913960337639, 0.11135854572057724, 0.030138852074742317, -0.04384634643793106, 0.21619103848934174, 0.2024969756603241, 0.052857305854558945, 0.36100292205810547, 0.060920123010873795, 0.019666265696287155, -0.027247780933976173, -0.026063743978738785, 0.007825473323464394, 0.03093619830906391, -0.23267535865306854, 0.001211860217154026, -0.012927238829433918, 0.017812836915254593, 0.08147387951612473, -0.12335164099931717, -0.1553291231393814, -0.03185471519827843, -0.03803076222538948, 0.003372407518327236, 0.1090930625796318, -0.07636100798845291, 0.04909168928861618, 0.022824756801128387, -0.09441215544939041, 0.11855780333280563, 0.050764016807079315, -0.042204849421978, 0.10822753608226776, -0.11345815658569336, -0.3471548855304718, -0.08081940561532974, -0.14085885882377625, -0.03772016242146492, 0.06305915862321854, 0.10501184314489365, -0.11973296850919724, 0.06022214516997337, 0.060906048864126205, 0.16956405341625214, -0.07092733681201935, -0.1060803085565567, -0.018088985234498978, 0.009580415673553944, -0.10217957943677902, -0.03445471450686455, -0.0423535518348217, 0.05850314348936081, -0.20653067529201508, 0.06590230762958527, -0.16297601163387299, 0.07891775667667389, 0.21857166290283203, 0.13181424140930176, -0.022611070424318314, -0.043612606823444366, 0.21972498297691345, -0.20870450139045715, 0.015888968482613564, 0.1556245982646942, 0.048345837742090225, 0.007670571096241474, 0.025012554600834846, 0.028902290388941765, -0.1298884004354477, -0.002757174661383033, -0.009835012257099152, -0.09814808517694473, -0.21626970171928406, 
-0.1665540486574173, -0.07667016983032227, 0.058615945279598236, -0.013454895466566086, 0.052039459347724915, 0.09569735825061798, 0.057399723678827286, 0.0023726867511868477, 0.07683321833610535, 0.005163266323506832, 0.08071134984493256, 0.2051095813512802, -0.0675714910030365, 0.06739359349012375, -0.007402835879474878, -0.15066181123256683, 0.10607385635375977, 0.12082023918628693, 0.026783378794789314, 0.13042722642421722, 0.06249209865927696, 0.05676838383078575, 0.09386225044727325, 0.023134684190154076, -0.05952474847435951, 0.03021162934601307, -0.017069444060325623, -0.08058086782693863, -0.05522782728075981, -0.029199689626693726, -0.00023050032905302942, -0.0024933419190347195, -0.13041774928569794, -0.001438615843653679, -0.09616273641586304, 0.046036724001169205, 0.025669977068901062, 0.09198110550642014, -0.11722898483276367, 0.00466173654422164, 0.10908625274896622, 0.017252318561077118, -0.08472266048192978, 0.13279663026332855, 0.03951916843652725, -0.12968222796916962, 0.08383461833000183, 0.022696999832987785, 0.049681130796670914, -0.09733626246452332, 0.05205296352505684, -0.0736827552318573, -0.02675948478281498, 0.054241180419921875, 0.1101512759923935, -0.2197214961051941, 0.1996610313653946, 0.01881178840994835, 0.023649748414754868, -0.07454885542392731, -0.011221845634281635, -0.039056140929460526, 0.02123081497848034, 0.26297706365585327, 0.06573769450187683, -0.09266912937164307, -0.032011378556489944, -0.07438449561595917, 0.03497850149869919, 0.029689446091651917, -0.0664634108543396, -0.027543749660253525, -0.0038938364014029503, 0.031692732125520706, -0.07504381239414215, -0.048883359879255295, -0.0719408392906189, -0.21271872520446777, 0.050544723868370056, 0.04157296568155289, 0.13998520374298096, 0.006783042568713427, -0.04523326829075813, -0.038270190358161926, 0.21178793907165527, 0.009008346125483513, -0.057061415165662766, -0.06699269264936447, -0.07113951444625854, -0.02488909848034382, -0.03520744666457176, -0.00012886669719591737, 0.02383439801633358, -0.020664766430854797, -0.06177288666367531, -0.06339312344789505, 0.17639340460300446, -0.07121865451335907, -0.07827846705913544, -0.05272496119141579, 0.15469315648078918, 0.08511590957641602, 0.08551879227161407, 0.12139584124088287, -0.0011205325135961175, 0.009491329081356525, -0.11342990398406982, 0.05843237414956093, -0.0072725070640444756, -0.1302732229232788, 0.004037892911583185, -0.06768779456615448, -0.00861500483006239, -0.013194049708545208, -0.1605248600244522, 0.20325185358524323, 0.15260876715183258, -0.007408587262034416, 0.18645264208316803, 0.1709248423576355, -0.15073561668395996, -0.2956337630748749, -0.10061513632535934, -0.07214918732643127, 0.007178418338298798, -0.0028337077237665653, -0.3136685788631439, -0.019001541659235954, 0.06905547529459, -0.03573928028345108, 0.048206131905317307, -0.4167558550834656, -0.035478025674819946, 0.041539374738931656, -0.035495344549417496, 0.3630264401435852, -0.06916168332099915, -0.08823592215776443, -0.08610942959785461, 0.004823440220206976, 0.2059565931558609, -0.010019375011324883, 0.14167794585227966, 0.003211213508620858, 0.1252061277627945, 0.04211171716451645, 0.08264252543449402, 0.10239473730325699, 0.013425542041659355, -0.02416810765862465, -0.04890664666891098, 0.0018782588886097074, 0.08704089373350143, 0.06925483047962189, -0.05550219118595123, -0.08977331966161728, -0.078044094145298, -0.12832945585250854, -0.11804836988449097, -0.0922362431883812, 0.027932312339544296, -0.024777818471193314, -0.08415470272302628, 
-0.008935466408729553, 0.09078410267829895, 0.005832915659993887, -0.01846856251358986, 0.07444111257791519, -0.11016751080751419, 0.061845190823078156, -0.1151711493730545, 0.10543181747198105, -0.024927718564867973, -0.02499859780073166, -0.04453955963253975, -0.060554783791303635, 0.07616859674453735, -0.10941968858242035, 0.029005391523241997, 0.05911887809634209, -0.01386763621121645, 0.037495553493499756, 0.03130168467760086, -0.11240263283252716, 0.05902357026934624, 0.15214574337005615, -0.08598478138446808, -0.1557413935661316, -0.10014805197715759, -0.0019487704848870635, 0.20573683083057404, -0.03938112035393715, 0.17803503572940826, 0.03406362980604172, -0.11035839468240738, 0.016627689823508263, 0.02139178104698658, -0.07233070582151413, 0.03261866047978401, -0.04428375884890556, -0.06130698695778847, -0.10278890281915665, 0.05538208782672882, 0.06198205426335335, -0.11323587596416473, 0.05084458738565445, 0.09935131669044495, -0.02529192343354225, -0.07658576965332031, -0.06057695299386978, 0.008736073970794678, -0.2135823667049408, -0.004476476460695267, 0.011045405641198158, -0.12087568640708923, 0.014469853602349758, 0.03682643920183182, 0.03038298338651657, 0.051402147859334946, -0.0185870174318552, -0.0143992118537426, 0.01788424886763096, -0.027572225779294968, 0.027599865570664406, -0.024405011907219887, -0.037337224930524826, 0.016247229650616646, 0.03147191181778908, 0.09385395050048828, -0.08411315083503723, -0.059373170137405396, -0.16255491971969604, 0.023518750444054604, -0.04583977162837982, -0.05002069100737572, -0.12242003530263901, -0.011435849592089653, -0.017612207680940628, 0.0329558290541172, -0.055683355778455734, -0.046592727303504944, -0.07396206259727478, -0.017097048461437225, -0.009388653561472893, 0.06721010059118271, 0.009163388051092625, -0.009824941866099834, 0.04472190886735916, -0.0143874641507864, 0.1051526814699173, -0.026649286970496178, 0.0027054259553551674, 0.08102475106716156, -0.28954657912254333, 0.06529339402914047, 0.08043442666530609, 0.050273582339286804, 0.04044795036315918, -0.06670138984918594, 0.0162054356187582, 0.05698199197649956, 0.03535058721899986, 0.06717850267887115, -0.022370392456650734, -0.12978802621364594, 0.027964238077402115, 0.0615244098007679, -0.18357403576374054, 0.024509551003575325, 0.05317443236708641, 0.07890038192272186, -0.06478890031576157, 0.04827011376619339, -0.0998755469918251, 0.029950682073831558, -0.022381655871868134, 0.04004594311118126, -0.03162418678402901, -0.038310855627059937, -0.06916455924510956, -0.01726183108985424, 0.013503522612154484, 0.039803870022296906, 0.23304708302021027, 0.09833886474370956, -0.08269953727722168, -0.03106757067143917, 0.07711955904960632, 0.08764617145061493, -0.05097711458802223, 0.056475844234228134, 0.09588213264942169, -0.015412784181535244, -0.09839636832475662, 0.0659039318561554, 0.02105853520333767, 0.1375683695077896, -0.03428274020552635, -0.030275383964180946, 0.218927800655365, 0.0757819339632988, 0.030965248122811317, 0.022743958979845047, -0.13327041268348694, -0.021237174049019814, 0.052356231957674026, 0.03953048959374428, -0.028115656226873398, 0.09310421347618103, 0.07677892595529556, -0.026682620868086815, 0.05890960991382599, -0.002458931179717183, -0.030756274238228798, -0.17573665082454681, -0.3870011866092682, -0.014536661095917225, -0.11453963071107864, -0.025348765775561333, -0.10453987121582031, 0.06172092631459236, -0.031735386699438095, 0.02387472800910473, -0.07826156169176102, 0.0053330580703914165, 0.012315102852880955, 
-0.1352432370185852, 0.07714247703552246, 0.008011539466679096, 0.1365976184606552, -0.03754526376724243, 0.08793789893388748, -0.06395966559648514, 0.07446720451116562, 0.05678746476769447, 0.05817246437072754, 0.010688704438507557, 0.0449942946434021, -0.15703818202018738, -0.003462978173047304, -0.05423351004719734, 0.042578425258398056, 0.017626607790589333, 0.10544781386852264, 0.07543941587209702, -0.0323961079120636, 0.04507014900445938, 0.206455796957016, 0.03843022137880325, 0.007504864130169153, -0.050183117389678955, 0.21192651987075806, -0.07114891707897186, 0.09692764282226562, -0.03415640816092491, 0.01253161858767271, 0.01904992200434208, 0.24880211055278778, 0.23801998794078827, -0.06310001760721207, -0.04449599236249924, -0.015065995045006275, 0.029137704521417618, 0.05287717282772064, 0.051852427423000336, 0.0647052451968193, 0.34812483191490173, -0.03726797178387642, -0.09451739490032196, -0.11729243397712708, 0.014341541565954685, -0.0875910297036171, 0.06342381983995438, 0.037988584488630295, -0.044053949415683746, 0.012403151020407677, 0.14817026257514954, -0.26006048917770386, 0.09396791458129883, -0.07672230154275894, -0.16136042773723602, -0.14156393706798553, -0.008786287158727646, -0.03849264979362488, 0.10107120871543884, 0.1391601413488388, 0.018220342695713043, -0.07652567327022552, -0.015186687000095844, 0.037855494767427444, -0.1918470412492752, 0.016800062730908394, 0.022527538239955902, 0.03174049034714699, 0.20218627154827118, -0.02797367237508297, 0.04570356011390686, 0.07584085315465927, 0.07788453251123428, -0.02497979812324047, 0.049113817512989044, 0.0038430648855865, 0.018429972231388092, 0.05797647684812546, 0.06721707433462143, 0.011264572851359844, 0.011107244528830051, 0.1343744546175003, 0.11700799316167831, 0.06242623180150986, 0.10218960791826248, -0.012466625310480595, -0.019853997975587845, 0.0496995784342289, -0.14250516891479492, 0.02374991588294506, 0.0878431648015976, -0.013703468255698681, -0.049912240356206894, -0.020149080082774162, 0.02505275420844555, 0.017163602635264397, 0.09767685830593109, -0.010104586370289326, -0.11042320728302002, -0.053035181015729904, 0.05424097552895546, -0.006160257384181023, -0.09821971505880356, 0.011023437604308128, -0.09074851870536804, 0.035527270287275314, -0.15419712662696838, 0.09593354165554047, 0.009495030157268047, 0.029009545221924782, 0.03797585889697075, -0.06630061566829681, 0.010255318135023117, 0.02566419169306755, -0.061100900173187256, -0.06075168773531914 ]
null
null
transformers
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. -->

# DeBERTa v3 small fine-tuned on hate_speech18 dataset for Hate Speech Detection

This model is a fine-tuned version of [microsoft/deberta-v3-small](https://huggingface.co/microsoft/deberta-v3-small) on the hate_speech18 dataset.
It achieves the following results on the evaluation set:
- Loss: 0.2922
- Accuracy: 0.9161

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 5

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.4147        | 1.0   | 650  | 0.3910          | 0.8832   |
| 0.2975        | 2.0   | 1300 | 0.2922          | 0.9161   |
| 0.2575        | 3.0   | 1950 | 0.3555          | 0.9051   |
| 0.1553        | 4.0   | 2600 | 0.4263          | 0.9124   |
| 0.1267        | 5.0   | 3250 | 0.4238          | 0.9161   |

### Framework versions

- Transformers 4.12.5
- Pytorch 1.10.0+cu111
- Datasets 1.16.1
- Tokenizers 0.10.3
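The card above does not include a usage snippet, so here is a minimal inference sketch, assuming the Hub id listed in this record (`Narrativaai/deberta-v3-small-finetuned-hate_speech18`); the returned label names depend on how the hate_speech18 labels were mapped during fine-tuning and are not documented in this record.

```python
from transformers import pipeline

# Hedged sketch: the model id comes from this record's `id` field; the label
# strings in the output are whatever the fine-tuned config defines.
classifier = pipeline(
    "text-classification",
    model="Narrativaai/deberta-v3-small-finetuned-hate_speech18",
)

result = classifier("You should replace this with the text you want to score.")
print(result)  # e.g. [{'label': '...', 'score': 0.99}]
```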
{"license": "mit", "tags": ["generated_from_trainer"], "datasets": ["hate_speech18"], "metrics": ["accuracy"], "widget": [{"text": "ok, so do we need to kill them too or are the slavs okay ? for some reason whenever i hear the word slav , the word slobber comes to mind and i picture a slobbering half breed creature like the humpback of notre dame or Igor haha"}], "model-index": [{"name": "deberta-v3-small-hate-speech", "results": [{"task": {"type": "text-classification", "name": "Text Classification"}, "dataset": {"name": "hate_speech18", "type": "hate_speech18", "args": "default"}, "metrics": [{"type": "accuracy", "value": 0.916058394160584, "name": "Accuracy"}]}]}]}
text-classification
Narrativaai/deberta-v3-small-finetuned-hate_speech18
[ "transformers", "pytorch", "tensorboard", "deberta-v2", "text-classification", "generated_from_trainer", "dataset:hate_speech18", "license:mit", "model-index", "autotrain_compatible", "endpoints_compatible", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[]
TAGS #transformers #pytorch #tensorboard #deberta-v2 #text-classification #generated_from_trainer #dataset-hate_speech18 #license-mit #model-index #autotrain_compatible #endpoints_compatible #region-us
DeBERTa v3 small fine-tuned on hate\_speech18 dataset for Hate Speech Detection
===============================================================================

This model is a fine-tuned version of microsoft/deberta-v3-small on the hate\_speech18 dataset.
It achieves the following results on the evaluation set:

* Loss: 0.2922
* Accuracy: 0.9161

Model description
-----------------

More information needed

Intended uses & limitations
---------------------------

More information needed

Training and evaluation data
----------------------------

More information needed

Training procedure
------------------

### Training hyperparameters

The following hyperparameters were used during training:

* learning\_rate: 2e-05
* train\_batch\_size: 16
* eval\_batch\_size: 16
* seed: 42
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: linear
* num\_epochs: 5

### Training results

### Framework versions

* Transformers 4.12.5
* Pytorch 1.10.0+cu111
* Datasets 1.16.1
* Tokenizers 0.10.3
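To make the listed hyperparameters concrete, a hedged sketch of how they could be expressed with the `TrainingArguments` API follows; the `output_dir` and anything not listed above (dataset loading, tokenization, the `Trainer` loop itself) are assumptions, not taken from the original training script.

```python
from transformers import TrainingArguments

# Hedged sketch: mirrors only the hyperparameters listed above; output_dir is
# a hypothetical path and all other settings fall back to library defaults.
training_args = TrainingArguments(
    output_dir="deberta-v3-small-finetuned-hate_speech18",  # hypothetical
    learning_rate=2e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=5,
)
```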
[ "### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 5", "### Training results", "### Framework versions\n\n\n* Transformers 4.12.5\n* Pytorch 1.10.0+cu111\n* Datasets 1.16.1\n* Tokenizers 0.10.3" ]
[ "TAGS\n#transformers #pytorch #tensorboard #deberta-v2 #text-classification #generated_from_trainer #dataset-hate_speech18 #license-mit #model-index #autotrain_compatible #endpoints_compatible #region-us \n", "### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 5", "### Training results", "### Framework versions\n\n\n* Transformers 4.12.5\n* Pytorch 1.10.0+cu111\n* Datasets 1.16.1\n* Tokenizers 0.10.3" ]
[ 71, 98, 4, 33 ]
[ "passage: TAGS\n#transformers #pytorch #tensorboard #deberta-v2 #text-classification #generated_from_trainer #dataset-hate_speech18 #license-mit #model-index #autotrain_compatible #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 5### Training results### Framework versions\n\n\n* Transformers 4.12.5\n* Pytorch 1.10.0+cu111\n* Datasets 1.16.1\n* Tokenizers 0.10.3" ]
[ -0.09231627732515335, 0.08070342242717743, -0.0031916217412799597, 0.11744385212659836, 0.164606511592865, 0.03613976016640663, 0.15093444287776947, 0.11785999685525894, -0.06909452378749847, 0.028093038126826286, 0.11239027976989746, 0.162613645195961, 0.018996449187397957, 0.10451482981443405, -0.06748143583536148, -0.2743294835090637, -0.016848327592015266, 0.05435045808553696, -0.040696218609809875, 0.13067933917045593, 0.11718053370714188, -0.12112495303153992, 0.08818081766366959, 0.0011462157126516104, -0.1896449625492096, 0.013497350737452507, 0.026190293952822685, -0.058087144047021866, 0.14977999031543732, 0.03176373615860939, 0.1315889060497284, 0.014924982562661171, 0.07719042897224426, -0.1781531125307083, 0.017699923366308212, 0.05075713247060776, -0.0027125668711960316, 0.08123220503330231, 0.037832457572221756, -0.020990749821066856, 0.13376815617084503, -0.08689659088850021, 0.06278839707374573, 0.03041653148829937, -0.13880321383476257, -0.231834277510643, -0.06627518683671951, 0.03713668882846832, 0.08149515092372894, 0.10192608833312988, -0.026636820286512375, 0.1257738173007965, -0.08390159159898758, 0.10094960778951645, 0.2146240770816803, -0.2566060721874237, -0.056558527052402496, 0.021031655371189117, 0.013907164335250854, 0.08508963137865067, -0.09996771812438965, -0.019300613552331924, 0.06006966158747673, 0.04562125355005264, 0.11184907704591751, -0.03254403918981552, -0.07047535479068756, 0.00693496223539114, -0.1420370191335678, -0.03144055977463722, 0.1481592208147049, 0.04537958651781082, -0.030401553958654404, -0.053074054419994354, -0.05087610334157944, -0.15171295404434204, -0.026950247585773468, -0.03194662556052208, 0.04635147005319595, -0.03291350603103638, -0.0799938440322876, -0.008883737958967686, -0.10927995294332504, -0.061943307518959045, -0.06615466624498367, 0.14046257734298706, 0.030243780463933945, 0.001559797441586852, -0.03581345081329346, 0.08995230495929718, -0.020965632051229477, -0.13223619759082794, 0.02812139317393303, 0.036217279732227325, 0.012847080826759338, -0.04502321034669876, -0.06337229162454605, -0.1053844541311264, -0.005020402371883392, 0.08588118106126785, -0.056244704872369766, 0.042294129729270935, 0.01224528718739748, 0.04217986389994621, -0.06409048289060593, 0.19186216592788696, -0.03444480523467064, -0.023229442536830902, 0.005193048156797886, 0.07241106033325195, 0.036321137100458145, -0.01585816591978073, -0.13423152267932892, -0.0023617225233465433, 0.10488270223140717, 0.027873963117599487, -0.0628681406378746, 0.07202845811843872, -0.0735582560300827, -0.03053644299507141, -0.012042643502354622, -0.0899958685040474, 0.028515992686152458, 0.014700525440275669, -0.08275937288999557, -0.009466787800192833, 0.011829813942313194, 0.012612217105925083, -0.021074146032333374, 0.12443045526742935, -0.09125366806983948, 0.026630384847521782, -0.08290489763021469, -0.14222203195095062, 0.020907044410705566, -0.1019444465637207, 0.03156616538763046, -0.10406256467103958, -0.15677617490291595, -0.017356880009174347, 0.02831699512898922, -0.021404366940259933, -0.04873491823673248, -0.08159787952899933, -0.06206229701638222, 0.008093894459307194, -0.01429713610559702, 0.061528291553258896, -0.06116018444299698, 0.10459166020154953, 0.029545124620199203, 0.06479524821043015, -0.04801289737224579, 0.05762983858585358, -0.09480730444192886, 0.011706707999110222, -0.155715212225914, 0.058753129094839096, -0.051447827368974686, 0.0592346154153347, -0.08139853179454803, -0.10908516496419907, 0.0052769361063838005, 
0.013363610953092575, 0.07622544467449188, 0.10126931965351105, -0.19087010622024536, -0.10277104377746582, 0.1612265706062317, -0.07841821759939194, -0.1136787012219429, 0.1538745015859604, -0.0738774836063385, 0.04103143885731697, 0.09824708849191666, 0.19755877554416656, 0.050755973905324936, -0.10357607156038284, 0.010487372986972332, -0.0004042091895826161, 0.03159298002719879, -0.02691851183772087, 0.058778468519449234, -0.007658643648028374, 0.041450969874858856, 0.012233204208314419, -0.009608649648725986, 0.0560852475464344, -0.08598726242780685, -0.09119685739278793, -0.020568279549479485, -0.08793049305677414, 0.07248964160680771, 0.07057815045118332, 0.05807391554117203, -0.1135578528046608, -0.08539076149463654, 0.06878797709941864, 0.08574766665697098, -0.06453748792409897, 0.02409251593053341, -0.07025860249996185, 0.06997954100370407, -0.02209012396633625, -0.022330425679683685, -0.1587369441986084, -0.011258198879659176, 0.005647070240229368, 0.003957048989832401, 0.04364689067006111, 0.056335363537073135, 0.06730249524116516, 0.03444546461105347, -0.05908120423555374, -0.0002624760672915727, -0.04169180616736412, 0.011672739870846272, -0.12103516608476639, -0.20733658969402313, -0.012327768839895725, -0.016029316931962967, 0.15229502320289612, -0.23066942393779755, 0.03999920189380646, 0.01957464963197708, 0.07194413244724274, 0.0212174691259861, -0.012677585706114769, -0.032952096313238144, 0.07899896055459976, -0.03719569742679596, -0.03760924190282822, 0.0663456991314888, -0.0010663782013580203, -0.098584845662117, -0.024498503655195236, -0.15024885535240173, 0.1515215039253235, 0.1456270068883896, -0.13495728373527527, -0.08076631277799606, 0.014070054516196251, -0.03976406529545784, -0.01939472369849682, -0.037066176533699036, -0.009329462423920631, 0.19054388999938965, -0.0071370769292116165, 0.15187187492847443, -0.0790451392531395, -0.03837081417441368, 0.01575322076678276, -0.03701361268758774, 0.0002950486377812922, 0.131667822599411, 0.08759195357561111, -0.1087338775396347, 0.15522564947605133, 0.14709638059139252, -0.10548141598701477, 0.15371070802211761, -0.02415810152888298, -0.060913000255823135, -0.03163362294435501, -0.037504516541957855, -0.007678834721446037, 0.09821225702762604, -0.1857515126466751, -0.018029220402240753, 0.009714078158140182, 0.010177897289395332, 0.016544075682759285, -0.21280677616596222, -0.03574641793966293, 0.049947384744882584, -0.04985276609659195, -0.038593389093875885, -0.003474527271464467, 0.014441234990954399, 0.11292518675327301, 0.008869153447449207, -0.09667486697435379, 0.035031579434871674, -0.006611831020563841, -0.09404662996530533, 0.19751213490962982, -0.08922072499990463, -0.1608411967754364, -0.1084839478135109, -0.06062496453523636, -0.04955919831991196, 0.021932998672127724, 0.062092822045087814, -0.1027776375412941, -0.025981716811656952, -0.06274539977312088, 0.045944299548864365, -0.03647708520293236, 0.014614900574088097, 0.014795729890465736, -0.00208653649315238, 0.05448712408542633, -0.12282951921224594, -0.015140418894588947, -0.06524147093296051, -0.039662010967731476, 0.027057845145463943, 0.015300918370485306, 0.09852460026741028, 0.1715567260980606, -0.024679692462086678, 0.005323463119566441, -0.05093125253915787, 0.2530515193939209, -0.08248177170753479, -0.03161092847585678, 0.12753905355930328, -0.03972041606903076, 0.04623081162571907, 0.12005366384983063, 0.06755811721086502, -0.08999665826559067, -0.002034303965047002, 0.03520095720887184, -0.03530522435903549, -0.22855278849601746, 
-0.042340755462646484, -0.05053532496094704, 0.01661049574613571, 0.08446387201547623, 0.019308991730213165, 0.0532979816198349, 0.06319896876811981, 0.02868296019732952, 0.06691041588783264, -0.00574577646329999, 0.05591435730457306, 0.17028890550136566, 0.025300217792391777, 0.123532734811306, -0.03744949772953987, -0.07047117501497269, 0.04382387176156044, -0.020612141117453575, 0.19233642518520355, 0.02021683380007744, 0.16426649689674377, 0.05506550520658493, 0.12312155216932297, -0.004654699936509132, 0.0454055592417717, -0.012730888091027737, -0.03544071316719055, -0.014753571711480618, -0.032509710639715195, -0.04557535797357559, 0.027170391753315926, -0.0051757292822003365, 0.03946355730295181, -0.11813854426145554, -0.016929225996136665, 0.06053074076771736, 0.22111055254936218, 0.05639773979783058, -0.3235173523426056, -0.0930362418293953, 0.012619144283235073, -0.028718803077936172, -0.0010348086943849921, 0.025420576333999634, 0.09563757479190826, -0.07888822257518768, 0.04267740994691849, -0.050821270793676376, 0.09206470847129822, -0.08636555075645447, 0.06350497901439667, 0.03642972558736801, 0.09306303411722183, -0.011383582837879658, 0.07593821734189987, -0.30826860666275024, 0.27469900250434875, 0.00937921367585659, 0.06753133982419968, -0.06960471719503403, -0.019833745434880257, 0.033797621726989746, 0.05358145385980606, 0.06282801181077957, -0.01094765868037939, -0.004134227521717548, -0.21017667651176453, -0.026634713634848595, 0.014518411830067635, 0.09668053686618805, -0.01755261979997158, 0.08202901482582092, -0.021236620843410492, 0.013463209383189678, 0.09041772782802582, -0.0017745357472449541, -0.061602041125297546, -0.08920685946941376, 0.0015548828523606062, 0.06017019972205162, -0.039617277681827545, -0.05112626776099205, -0.11409558355808258, -0.11639125645160675, 0.13445189595222473, -0.045437417924404144, -0.05177935212850571, -0.090516097843647, 0.08980761468410492, 0.048238810151815414, -0.0876501277089119, 0.023764612153172493, 0.017391545698046684, 0.07575824856758118, 0.008253684267401695, -0.05194228142499924, 0.12377342581748962, -0.05754620209336281, -0.16010801494121552, -0.06958506256341934, 0.13079729676246643, 0.04714081808924675, 0.0720747858285904, 0.004778643138706684, 0.004884158261120319, -0.022796783596277237, -0.0749526172876358, 0.039698727428913116, -0.017551396042108536, 0.02245226688683033, 0.029732273891568184, -0.0393013097345829, -0.008296983316540718, -0.08542962372303009, -0.028946498408913612, 0.2045861780643463, 0.26239916682243347, -0.08336250483989716, 0.011521105654537678, 0.05668139457702637, -0.07318795472383499, -0.18525446951389313, 0.04443366080522537, 0.05607818812131882, 0.009552901610732079, 0.049438200891017914, -0.17605474591255188, 0.07088285684585571, 0.10512194037437439, -0.005852047819644213, 0.11595696955919266, -0.3188036382198334, -0.13368509709835052, 0.14067615568637848, 0.12835416197776794, 0.14651483297348022, -0.13510355353355408, -0.021462826058268547, -0.03718796744942665, -0.11375157535076141, 0.12047247588634491, -0.08058516681194305, 0.12393779307603836, -0.008410572074353695, 0.10831902921199799, 0.019832462072372437, -0.04427330940961838, 0.12428256869316101, 0.026416422799229622, 0.10040194541215897, -0.07289671897888184, -0.07181356847286224, 0.0475417897105217, -0.0304581206291914, 0.022985320538282394, -0.09475302696228027, 0.01634439453482628, -0.10511011630296707, -0.034451019018888474, -0.077627032995224, 0.030010631307959557, -0.040278397500514984, -0.06996789574623108, 
-0.043973345309495926, 0.022150086238980293, 0.041743602603673935, -0.014862527139484882, 0.12764225900173187, -0.011037240736186504, 0.14794708788394928, 0.10938282310962677, 0.11157476156949997, -0.046521227806806564, -0.05895267054438591, -0.003362968098372221, -0.0230382289737463, 0.05849545821547508, -0.12339767813682556, 0.03133142739534378, 0.13793766498565674, 0.02349400520324707, 0.1644577831029892, 0.07762856781482697, -0.045263443142175674, 0.034112218767404556, 0.05412014573812485, -0.13745185732841492, -0.11884976923465729, -0.017208632081747055, -0.09342252463102341, -0.12088967859745026, 0.03845541179180145, 0.12491080164909363, -0.06347984820604324, -0.005762550514191389, -0.0053209466859698296, 0.005348579026758671, -0.059498563408851624, 0.18566006422042847, 0.07836610823869705, 0.03834781423211098, -0.09310095012187958, 0.06528708338737488, 0.05659309774637222, -0.0838475152850151, 0.034206412732601166, 0.07545147836208344, -0.07125135511159897, -0.04489513486623764, 0.007175680715590715, 0.17915914952754974, -0.05654486268758774, -0.03451249748468399, -0.1480371057987213, -0.1335650384426117, 0.0623113252222538, 0.1664363592863083, 0.09842496365308762, 0.02434961125254631, -0.06139802187681198, 0.01943507045507431, -0.12376546859741211, 0.09535985440015793, 0.0507424995303154, 0.056708063930273056, -0.13181982934474945, 0.1702801138162613, 0.0033871722407639027, 0.035959646105766296, -0.02181267738342285, 0.014411368407309055, -0.11729563772678375, 0.018099825829267502, -0.13393180072307587, -0.02551979012787342, -0.027995554730296135, 0.010215922258794308, 0.002520836889743805, -0.058475054800510406, -0.05376333370804787, 0.015528534539043903, -0.11357150971889496, -0.014424415305256844, 0.04455799236893654, 0.054379913955926895, -0.12180186808109283, -0.04323827847838402, 0.018105655908584595, -0.05457683652639389, 0.08474472165107727, 0.05887199193239212, 0.003230109577998519, 0.05552069470286369, -0.15790154039859772, 0.017672892659902573, 0.06814579665660858, 0.01607155054807663, 0.058402035385370255, -0.08750040829181671, -0.0017064680578187108, -0.003383244387805462, 0.05770449340343475, 0.027918828651309013, 0.07912155240774155, -0.12209127843379974, 0.0175352543592453, 0.009072220884263515, -0.0889483317732811, -0.058156393468379974, 0.036030884832143784, 0.07087080180644989, 0.018807856366038322, 0.1999252885580063, -0.08070332556962967, 0.04881791025400162, -0.21014422178268433, 0.010183241218328476, -0.0019076301250606775, -0.10877617448568344, -0.10093752294778824, -0.06470033526420593, 0.06887124478816986, -0.07267555594444275, 0.12766708433628082, 0.04496388882398605, 0.0117484200745821, 0.04498358070850372, -0.007404575124382973, -0.0012691511074081063, 0.010228642262518406, 0.18589593470096588, 0.024308374151587486, -0.05738212540745735, 0.0195932500064373, 0.02712610550224781, 0.1136552020907402, 0.10474738478660583, 0.17721666395664215, 0.15276643633842468, -0.004333484452217817, 0.08937957882881165, 0.06245355308055878, -0.0374099537730217, -0.16473105549812317, 0.023483984172344208, -0.02619338035583496, 0.10683970898389816, -0.017883485183119774, 0.20758448541164398, 0.08888023346662521, -0.14674988389015198, 0.04281313717365265, -0.05791155993938446, -0.09513458609580994, -0.12303292006254196, -0.07217581570148468, -0.08366409689188004, -0.13760265707969666, 0.01015566848218441, -0.11910169571638107, 0.02259891852736473, 0.09329713135957718, 0.018463706597685814, -0.044851262122392654, 0.1398232877254486, 0.012353102676570415, 
0.004661046899855137, 0.09083780646324158, 0.0000013409612620307598, -0.031820423901081085, -0.11356524378061295, -0.06040791794657707, 0.0032133893109858036, -0.01001670490950346, 0.030633898451924324, -0.061168037354946136, -0.056118711829185486, 0.013481871224939823, -0.037109341472387314, -0.11095153540372849, 0.012300834991037846, 0.03259756416082382, 0.06318309158086777, 0.07098297029733658, 0.010755873285233974, 0.013989314436912537, -0.0018275987822562456, 0.19378113746643066, -0.05941198766231537, -0.06342404335737228, -0.09389832615852356, 0.24371998012065887, 0.04172665253281593, -0.0009418983827345073, 0.02418862096965313, -0.08734561502933502, 0.012046711519360542, 0.2223293036222458, 0.23316924273967743, -0.10057900846004486, 0.006927763111889362, -0.01822424866259098, 0.004417988937348127, -0.006050282157957554, 0.08844584971666336, 0.10851363092660904, 0.04751121252775192, -0.07375209778547287, 0.003913233056664467, -0.04998498782515526, -0.0015221760841086507, -0.014442258514463902, 0.06936225295066833, 0.051207974553108215, 0.0019409729866310954, -0.046457771211862564, 0.07914671301841736, -0.0624653659760952, -0.1013794094324112, 0.028802433982491493, -0.23439756035804749, -0.15158650279045105, -0.007768782787024975, 0.08555033802986145, 0.02962568588554859, 0.07598534226417542, -0.020582647994160652, -0.008429333567619324, 0.08849228173494339, -0.01933455839753151, -0.10201232880353928, -0.0865231305360794, 0.10002805292606354, -0.09569886326789856, 0.1802893579006195, -0.054006461054086685, 0.07959651201963425, 0.11841750144958496, 0.07088974118232727, -0.047203440219163895, 0.0759923979640007, 0.04510938376188278, -0.05481984093785286, 0.016749054193496704, 0.09098421037197113, -0.04086725413799286, 0.08566645532846451, 0.05922476947307587, -0.1338200718164444, 0.023742426186800003, -0.05451981723308563, -0.07972594350576401, -0.051910094916820526, -0.017959576100111008, -0.05862417072057724, 0.11766541004180908, 0.21256814897060394, -0.03434384614229202, -0.012110602110624313, -0.0510622076690197, -0.0177488774061203, 0.07757305353879929, 0.023118074983358383, -0.06236535310745239, -0.22575873136520386, 0.003005832666531205, 0.07397432625293732, -0.0009547699592076242, -0.2650952935218811, -0.07563632726669312, -0.01178827229887247, -0.05913333594799042, -0.08016319572925568, 0.07531599700450897, 0.06810221076011658, 0.04804094880819321, -0.051026780158281326, -0.05647372826933861, -0.058420952409505844, 0.164356529712677, -0.14678435027599335, -0.09078209847211838 ]