# 🦙💻 Safurai-Csharp-34B
📝 [Article](https://www.safurai.com/blog/introducing-safurai-csharp)

📄 [Paper](https://www.safurai.com/)

<center><img src="https://i.imgur.com/REPqbYM.png" width="300"></center>

This is a [`codellama/CodeLlama-34b-hf`](https://huggingface.co/codellama/CodeLlama-34b-hf) model fine-tuned using QLoRA (4-bit precision) on 13B tokens of evolved C# Q&A data.
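
As a minimal sketch of running the model in the same 4-bit (NF4) setup via `transformers` and `bitsandbytes` (the hub id `Safurai/Safurai-Csharp-34B` and the example prompt are assumptions, not taken from this card):

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

model_id = "Safurai/Safurai-Csharp-34B"  # assumed repository id

# 4-bit NF4 quantization, mirroring the 4-bit precision used for QLoRA fine-tuning
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_compute_dtype=torch.bfloat16,
)

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    quantization_config=bnb_config,
    device_map="auto",
)

# Hypothetical C# completion prompt
prompt = "// Return the n-th Fibonacci number\npublic static long Fib(int n)\n{"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```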

We obtained state-of-the-art performance on the MultiPL-E code LLM benchmark for C#, reaching 56% pass@1 with n=5 samples per problem.
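
For reference, a short sketch of the unbiased pass@k estimator commonly used for this kind of sampled evaluation (with n=5 samples per problem, pass@1 reduces to the per-problem fraction of samples that pass the tests); this illustrates the metric, not the exact MultiPL-E harness:

```python
from math import comb

def pass_at_k(n: int, c: int, k: int) -> float:
    """Unbiased pass@k estimate: n samples generated, c of them pass the unit tests."""
    if n - c < k:
        return 1.0
    return 1.0 - comb(n - c, k) / comb(n, k)

# Example: 5 generations for one problem, 3 pass the tests -> estimated pass@1 = 0.6
print(pass_at_k(n=5, c=3, k=1))
```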

## 🔧 Training