Update README.md
README.md CHANGED
@@ -25,6 +25,8 @@ BigCode is an open scientific collaboration working on responsible training of l
 StarCoder is a 15.5B parameter language model for code trained for 1T tokens on 80+ programming languages. It uses MQA for efficient generation, has an 8,192-token context window and can do fill-in-the-middle.

 ### Models
+- [Paper](): A technical report about StarCoder.
+- [GitHub](https://github.com/bigcode-project/starcoder/tree/main): All you need to know about using or fine-tuning StarCoder.
 - [StarCoder](https://huggingface.co/bigcode/starcoder): StarCoderBase further trained on Python.
 - [StarCoderBase](https://huggingface.co/bigcode/starcoderbase): Trained on 80+ languages from The Stack.
 - [StarEncoder](https://huggingface.co/bigcode/starencoder): Encoder model trained on The Stack.
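The README excerpt above notes that StarCoder supports fill-in-the-middle generation. As a rough illustration (not part of this commit), the sketch below shows how that mode is typically exercised with the Hugging Face Transformers library; the FIM sentinel tokens (`<fim_prefix>`, `<fim_suffix>`, `<fim_middle>`) and the generation settings are assumptions drawn from the StarCoder GitHub repository linked in the diff, so verify them against the tokenizer you actually load.

```python
# Minimal fill-in-the-middle sketch for StarCoder (assumed usage, not from the diff).
# The <fim_prefix>/<fim_suffix>/<fim_middle> sentinels follow the
# bigcode-project/starcoder README; adjust if your tokenizer differs.
from transformers import AutoModelForCausalLM, AutoTokenizer

checkpoint = "bigcode/starcoder"  # gated checkpoint: accept the license on the Hub first
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForCausalLM.from_pretrained(checkpoint)

# The model generates the code that belongs between prefix and suffix.
prefix = "def print_one_two_three():\n    print('one')\n    "
suffix = "\n    print('three')\n"
prompt = f"<fim_prefix>{prefix}<fim_suffix>{suffix}<fim_middle>"

inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=32, pad_token_id=tokenizer.eos_token_id)
print(tokenizer.decode(outputs[0]))
```

Loading the full 15.5B-parameter checkpoint needs a sizeable GPU or a quantized variant, but the prompt format stays the same either way.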