omarmomen committed on
Commit 6f0ed27 · verified · 1 Parent(s): 97d925e

Update README.md

Files changed (1):
  README.md +1 -1
README.md CHANGED
@@ -13,7 +13,7 @@ library_name: transformers
 This model is part of the experiments in the published paper at the BabyLM workshop in CoNLL 2023.
 The paper titled "Increasing The Performance of Cognitively Inspired Data-Efficient Language Models via Implicit Structure Building" (https://aclanthology.org/2023.conll-babylm.29/)
 
-<strong>omarmomen/structroberta_sx2_final</strong> is a modification of the vanilla transformer encoder to incorporate syntactic inductive bias using an unsupervised parsing mechanism.
+<strong>omarmomen/structformer_s1_final_with_pos</strong> is a modification of the vanilla transformer encoder to incorporate syntactic inductive bias using an unsupervised parsing mechanism.
 
 This model variant places the parser network ahead of all the attention blocks.