Update README.md
README.md CHANGED
@@ -13,7 +13,7 @@ base_model:
 <!-- Provide a quick summary of what the model is/does. -->
 
 ModularStarEncoder-finetuned is an encoder built on top of [ModularStarEncoder-1B Pre-trained](https://huggingface.co/andreagurioli1995/ModularStarEncoder) on [SynthCode2Code2NL](https://huggingface.co/datasets/andreagurioli1995/SynthCode2Code2NL-neardedup).
 
-ModularStarEncoder fine-tuned is an encoder for
+ModularStarEncoder, fine-tuned, is an encoder for code-to-code and nl-to-code retrieval tasks, enabling the end user to select the model size that meets their memory and computational constraints.
 
 We built ModularStarEncoder on top of [StarCoder-2](https://huggingface.co/bigcode/starcoder2-15b), reducing its size from 15B to 1B parameters in bfloat16.
 
 The model is finetuned with [CLIP objective](https://github.com/mlfoundations/open_clip/blob/main/src/open_clip/loss.py).
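
For readers unfamiliar with the CLIP objective linked in the README, the sketch below illustrates the symmetric contrastive loss idea in PyTorch. It is a minimal approximation of the open_clip `ClipLoss` referenced above; the function name, temperature value, batch size, and embedding dimension are illustrative assumptions, not the actual fine-tuning hyperparameters or training code.

```python
# Minimal sketch of a CLIP-style symmetric contrastive objective over
# paired code/text embeddings. Hypothetical shapes and temperature.
import torch
import torch.nn.functional as F


def clip_style_loss(code_emb: torch.Tensor,
                    text_emb: torch.Tensor,
                    temperature: float = 0.07) -> torch.Tensor:
    """Symmetric InfoNCE loss over L2-normalized code/text embeddings."""
    code_emb = F.normalize(code_emb, dim=-1)
    text_emb = F.normalize(text_emb, dim=-1)
    # (B, B) similarity matrix; matching pairs lie on the diagonal.
    logits = code_emb @ text_emb.t() / temperature
    targets = torch.arange(logits.size(0), device=logits.device)
    # Cross-entropy in both directions: code -> text and text -> code.
    return (F.cross_entropy(logits, targets)
            + F.cross_entropy(logits.t(), targets)) / 2


if __name__ == "__main__":
    # Toy usage with random embeddings (batch of 8, 1024-dim, both assumed).
    code = torch.randn(8, 1024)
    text = torch.randn(8, 1024)
    print(clip_style_loss(code, text))
```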