Feature Extraction · Transformers · Safetensors · ModularStarEncoder · custom_code
andreagurioli1995 committed (verified) · Commit 65836a2 · Parent(s): 0fa8d8d

Update README.md

Files changed (1): README.md (+1, -1)
README.md CHANGED
@@ -91,7 +91,7 @@ Here we briefly show our codeSearchNet (codeXGLUE) results between different lay
 | [Layer 27](https://huggingface.co/modularStarEncoder/ModularStarEncoder-finetuned-27)* | 80.3 |
 | [Layer 36](https://huggingface.co/modularStarEncoder/ModularStarEncoder-finetuned)* | 79.6 |

- * size and corresponding projection head present in this model
+ - (* size and corresponding projection head present in this model)

 ## Licence
 The model is licensed under the BigCode OpenRAIL-M v1 license agreement. You can find the full agreement [here](https://huggingface.co/spaces/bigcode/bigcode-model-license-agreement).
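For readers who want to try the checkpoints linked in the table above, here is a minimal sketch of pulling embeddings from the Layer-36 finetuned repo with the `transformers` library. It assumes the repo ships a tokenizer and custom modeling code (hence `trust_remote_code=True`) and that the model returns `last_hidden_state` like a standard encoder; the mean-pooling step is illustrative, not the model card's prescribed usage.

```python
# Minimal sketch: extract an embedding from the finetuned checkpoint linked above.
# Assumptions: the repo provides a tokenizer and custom modeling code
# (hence trust_remote_code=True), and the outputs expose last_hidden_state
# like a standard encoder. The pooling below is illustrative only.
import torch
from transformers import AutoModel, AutoTokenizer

repo_id = "modularStarEncoder/ModularStarEncoder-finetuned"  # linked in the diff above

tokenizer = AutoTokenizer.from_pretrained(repo_id, trust_remote_code=True)
model = AutoModel.from_pretrained(repo_id, trust_remote_code=True)
model.eval()

code_snippet = "def add(a, b):\n    return a + b"
inputs = tokenizer(code_snippet, return_tensors="pt", truncation=True)

with torch.no_grad():
    outputs = model(**inputs)

# Mean-pool token states into a single vector per input (illustrative pooling).
embedding = outputs.last_hidden_state.mean(dim=1)
print(embedding.shape)
```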