Update README.md
README.md
CHANGED
@@ -12,7 +12,7 @@ base_model:
 
 <!-- Provide a quick summary of what the model is/does. -->
 
-ModularStarEncoder-finetuned (MoSE) is an encoder built on top of [ModularStarEncoder-1B Pre-trained](https://huggingface.co/andreagurioli1995/ModularStarEncoder) on [
+ModularStarEncoder-finetuned (MoSE) is an encoder built on top of [ModularStarEncoder-1B Pre-trained](https://huggingface.co/andreagurioli1995/ModularStarEncoder) on [SynthCoNL](https://huggingface.co/datasets/andreagurioli1995/SynthCode2Code2NL-neardedup).
 ModularStarEncoder, fine-tuned, is an encoder for code-to-code and text-to-code retrieval tasks, enabling the end user to select the model size that meets their memory and computational constraints.
 We built ModularStarEncoder on top of [StarCoder-2](https://huggingface.co/bigcode/starcoder2-15b), reducing its size from 15B to 1B parameters in bfloat16.
 
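For context on the retrieval use case described in the changed lines, a minimal sketch of scoring a text-to-code pair follows. It assumes the checkpoint loads through `transformers` `AutoModel` with `trust_remote_code=True`, exposes a standard `last_hidden_state`, and that first-token pooling is acceptable; the repo id used is the linked pre-trained model and would be swapped for the fine-tuned checkpoint this README accompanies. Consult the model card for the recommended pooling and exit layer.

```python
# Illustrative sketch only; assumptions are noted inline.
import torch
from transformers import AutoModel, AutoTokenizer

# Assumed repo id: the linked pre-trained model; swap in the fine-tuned checkpoint.
repo_id = "andreagurioli1995/ModularStarEncoder"

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModel.from_pretrained(repo_id, torch_dtype=torch.bfloat16, trust_remote_code=True)
model.eval()

query = "return the indices of the two numbers that add up to a target"
code = (
    "def two_sum(nums, target):\n"
    "    seen = {}\n"
    "    for i, n in enumerate(nums):\n"
    "        if target - n in seen:\n"
    "            return [seen[target - n], i]\n"
    "        seen[n] = i"
)

def embed(text: str) -> torch.Tensor:
    # Assumes the model output exposes last_hidden_state; first-token pooling is a placeholder choice.
    inputs = tokenizer(text, return_tensors="pt", truncation=True)
    with torch.no_grad():
        out = model(**inputs)
    return out.last_hidden_state[:, 0].float()

score = torch.nn.functional.cosine_similarity(embed(query), embed(code))
print(f"text-to-code similarity: {score.item():.3f}")
```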