Update README.md
README.md
CHANGED
```diff
@@ -109,7 +109,7 @@ DeciLM 6B is a 5.7 billion parameter decoder-only text generation model. With a
 
 ### Model Description
 
-Deci developed and publicly released the DeciLM 6B large language model, a pretrained, high-efficiency generative text model with 5.7 billion parameters. DeciLM 6B outpaces pretrained models in its class, with throughput up to 15 times that of Llama 2 7B. DeciLM-6B was further LoRA fine-tuned for instruction following on a subset of the OpenOrca dataset, creating DeciLM 6B
+Deci developed and publicly released the DeciLM 6B large language model, a pretrained, high-efficiency generative text model with 5.7 billion parameters. DeciLM 6B outpaces pretrained models in its class, with throughput up to 15 times that of Llama 2 7B. DeciLM-6B was further LoRA fine-tuned for instruction following on a subset of the OpenOrca dataset, creating [DeciLM 6B-Instruct](https://huggingface.co/Deci/DeciLM-6b-instruct)
 
 - **Developed by:** Deci
 - **Model type:** DeciLM is an auto-regressive language model using an optimized transformer decoder architecture that includes variable Grouped-Query Attention.
```
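The "variable Grouped-Query Attention" named in the model description means DeciLM can use fewer key/value heads than query heads, with the ratio varying per layer. The core grouping mechanism can be sketched as follows (an illustrative NumPy sketch of generic GQA, not Deci's actual implementation; all names are hypothetical):

```python
import numpy as np

def grouped_query_attention(q, k, v, num_kv_heads):
    """Single-example GQA sketch (hypothetical helper, not Deci's code).

    q: (num_q_heads, seq, d); k, v: (num_kv_heads, seq, d).
    Each KV head is shared by num_q_heads // num_kv_heads query heads.
    """
    num_q_heads, seq, d = q.shape
    group = num_q_heads // num_kv_heads
    # Repeat each KV head so it lines up with its group of query heads.
    k = np.repeat(k, group, axis=0)
    v = np.repeat(v, group, axis=0)
    # Standard scaled dot-product attention per head.
    scores = q @ k.transpose(0, 2, 1) / np.sqrt(d)
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ v  # (num_q_heads, seq, d)
```

With `num_kv_heads == num_q_heads` this reduces to ordinary multi-head attention; shrinking `num_kv_heads` cuts the KV cache size, which is one source of the throughput advantage claimed above.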