ncoop57 committed
Commit: b89f82b
1 Parent(s): a98fa4a

Update README.md

Files changed (1)
  1. README.md +2 -0
README.md CHANGED
@@ -85,6 +85,8 @@ model-index:
 
 `stable-code-3b` is a 2.7 billion parameter decoder-only language model pre-trained on 1.3 trillion tokens of diverse textual and code datasets. `stable-code-3b` is trained on 18 programming languages (selected based on the 2023 StackOverflow Developer Survey) and demonstrates state-of-the-art performance (compared to models of similar size) on the MultiPL-E metrics across multiple programming languages, tested using [BigCode's Evaluation Harness](https://github.com/bigcode-project/bigcode-evaluation-harness/tree/main).
 
+![performance](stable_code_3b_evals.png)
+
 **Key Features**
 * Fill in Middle Capability (FIM)
 * Supports Long Context, trained with sequences up to 16,384 tokens
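
For context on the features named in the diff, here is a minimal usage sketch, assuming the model is published as `stabilityai/stable-code-3b` on the Hugging Face Hub and loads with the standard `transformers` APIs; the Fill-in-Middle prompt format with `<fim_prefix>`, `<fim_suffix>`, and `<fim_middle>` sentinel tokens is likewise an assumption, not part of this commit.

```python
# Minimal sketch: load stable-code-3b and generate a completion.
# Assumptions (not from this commit): repo id "stabilityai/stable-code-3b",
# bfloat16 weights, and StarCoder-style FIM sentinel tokens.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "stabilityai/stable-code-3b"  # assumed Hub repo id
tokenizer = AutoTokenizer.from_pretrained(repo_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    repo_id,
    torch_dtype=torch.bfloat16,  # assumed dtype; use float32 on CPU
    trust_remote_code=True,
)
device = "cuda" if torch.cuda.is_available() else "cpu"
model.to(device)

# Plain left-to-right completion.
prompt = "def fibonacci(n):"
inputs = tokenizer(prompt, return_tensors="pt").to(device)
out = model.generate(**inputs, max_new_tokens=64, do_sample=False)
print(tokenizer.decode(out[0], skip_special_tokens=True))

# Fill-in-Middle (FIM) prompt, assuming StarCoder-style sentinel tokens.
fim_prompt = "<fim_prefix>def add(a, b):\n    <fim_suffix>\n    return result<fim_middle>"
fim_inputs = tokenizer(fim_prompt, return_tensors="pt").to(device)
fim_out = model.generate(**fim_inputs, max_new_tokens=32, do_sample=False)
print(tokenizer.decode(fim_out[0], skip_special_tokens=True))
```

Greedy decoding (`do_sample=False`) keeps the sketch deterministic; the long-context support listed above would be exercised simply by passing longer inputs, up to the stated 16,384-token window.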