Update README.md
README.md CHANGED
@@ -20,7 +20,7 @@ library_name: transformers
 
 ## Model Description
 
-`stable-code-3b` is a 2.7 billion parameter decoder-only language model pre-trained on 1.3 trillion tokens of diverse textual and code datasets. `stable-code-3b` is trained on
+`stable-code-3b` is a 2.7 billion parameter decoder-only language model pre-trained on 1.3 trillion tokens of diverse textual and code datasets. `stable-code-3b` is trained on 18 programming languages (selected based on the 2023 StackOverflow Developer Survey) and demonstrates state-of-the-art performance (compared to models of similar size) on the MultiPL-E metrics across multiple programming languages, tested using [BigCode's Evaluation Harness](https://github.com/bigcode-project/bigcode-evaluation-harness/tree/main).
 
 **Key Features**
 * Fill in Middle Capability (FIM)
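For quick reference, a minimal sketch of loading and prompting the model described in this change with `transformers`, assuming the Hugging Face model id `stabilityai/stable-code-3b` and the standard causal-LM generation API (illustrative only, not part of this commit):

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Assumed model id; adjust if the card lives under a different namespace.
model_id = "stabilityai/stable-code-3b"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype="auto")
model.to("cuda" if torch.cuda.is_available() else "cpu")

# Plain left-to-right code completion.
inputs = tokenizer("import torch\nimport torch.nn as nn\n", return_tensors="pt").to(model.device)
tokens = model.generate(
    **inputs,
    max_new_tokens=48,
    temperature=0.2,
    do_sample=True,
)
print(tokenizer.decode(tokens[0], skip_special_tokens=True))
```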
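The Fill in Middle (FIM) capability listed among the key features is typically exercised by wrapping the prompt in sentinel tokens. The sketch below assumes StarCoder-style `<fim_prefix>` / `<fim_suffix>` / `<fim_middle>` special tokens are present in the tokenizer's vocabulary; confirm against the model's tokenizer config before relying on them.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "stabilityai/stable-code-3b"  # assumed model id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype="auto")
model.to("cuda" if torch.cuda.is_available() else "cpu")

# Assumed FIM sentinel tokens; check tokenizer.additional_special_tokens to confirm.
prompt = (
    "<fim_prefix>def fib(n):\n"
    "<fim_suffix>\n"
    "    else:\n"
    "        return fib(n - 2) + fib(n - 1)<fim_middle>"
)

inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
tokens = model.generate(**inputs, max_new_tokens=48, temperature=0.2, do_sample=True)
# The generated continuation is the "middle" that fits between the prefix and suffix.
print(tokenizer.decode(tokens[0], skip_special_tokens=True))
```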