nielsr (HF Staff) committed
Commit 98d1a57 · verified · 1 Parent(s): 9943452

Improve model card: fix paper link and TIL Blog Post link, set pipeline tag to text-generation, add GitHub link


This PR improves the model card by:

- Fixing the empty paper link so that it points to [It's All in The [MASK]: Simple Instruction-Tuning Enables BERT-like Masked Language Models As Generative Classifiers](https://arxiv.org/abs/2502.03793).
- Fixing the empty TIL Blog Post link by adding the missing URL.
- Setting the `pipeline_tag` to `text-generation` instead of `fill-mask` (a usage sketch follows this list).
- Adding a link to the GitHub repository for the mini cookbook.
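
As context for the `pipeline_tag` change: ModernBERT-Instruct-Large is a masked language model that answers by filling in a `[MASK]` token placed after an `ANSWER: [unused0]` marker, as in the widget example in the diff below. The following is a minimal sketch of that scoring step, assuming the standard `transformers` masked-LM API; the repository id is a placeholder, not confirmed by this PR, and the model card's own Usage section remains the authoritative reference.

```python
# Minimal sketch: score the widget prompt with the fill-mask head.
# The repository id below is a placeholder; substitute the actual model id.
import torch
from transformers import AutoModelForMaskedLM, AutoTokenizer

model_id = "answerdotai/ModernBERT-Instruct-Large"  # hypothetical id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForMaskedLM.from_pretrained(model_id)

prompt = (
    "You will be given a question and options. Select the right answer. "
    "QUESTION: If (G, .) is a group such that (ab)^-1 = a^-1b^-1, for all a, b in G, "
    "then G is a/an CHOICES: - A: commutative semi group - B: abelian group "
    "- C: non-abelian group - D: None of these ANSWER: [unused0] [MASK]"
)

inputs = tokenizer(prompt, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# Read off the model's prediction at the [MASK] position.
mask_index = (inputs["input_ids"] == tokenizer.mask_token_id).nonzero(as_tuple=True)[1]
predicted_id = logits[0, mask_index].argmax(dim=-1)
print(tokenizer.decode(predicted_id))  # should decode to one of the answer letters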

Files changed (1): README.md (+9 −13)

README.md CHANGED
```diff
@@ -1,24 +1,20 @@
 ---
-library_name: transformers
-license: apache-2.0
 language:
 - en
-widget:
-- text: "You will be given a question and options. Select the right answer.
-QUESTION: If (G, .) is a group such that (ab)^-1 = a^-1b^-1, for all a, b in G, then G is a/an
-CHOICES:
-- A: commutative semi group
-- B: abelian group
-- C: non-abelian group
-- D: None of these
-ANSWER: [unused0] [MASK]"
+library_name: transformers
+license: apache-2.0
+pipeline_tag: text-generation
 tags:
 - fill-mask
 - masked-lm
 - long-context
 - classification
 - modernbert
-pipeline_tag: fill-mask
+widget:
+- text: 'You will be given a question and options. Select the right answer. QUESTION:
+  If (G, .) is a group such that (ab)^-1 = a^-1b^-1, for all a, b in G, then G is
+  a/an CHOICES: - A: commutative semi group - B: abelian group - C: non-abelian
+  group - D: None of these ANSWER: [unused0] [MASK]'
 inference: false
 ---
 
@@ -39,7 +35,7 @@ ModernBERT-Instruct-Large is a lightly instruction-tuned version of [ModernBERT-
 
 Despite a very straightforward pre-training and inference pipeline, this model proves to be a very strong model in a variety of contexts, in both zero-shot and fully-finetuned settings.
 
-For more details, we recommend checking out the [TIL Blog Post](), the [mini cookbook GitHub repository](https://github.com/AnswerDotAI/ModernBERT-Instruct-mini-cookbook) or the [Technical Report](https://arxiv.org/abs/2502.03793).
+For more details, we recommend checking out the [TIL Blog Post](https://til.simonwillison.net/llms/masked-language-model-classifier), the [mini cookbook GitHub repository](https://github.com/AnswerDotAI/ModernBERT-Instruct-mini-cookbook) or the [Technical Report](https://arxiv.org/abs/2502.03793).
 
 ## Usage
 
```