Text Generation
Transformers
Safetensors
mixtral
Mixture of Experts
frankenmoe
Merge
mergekit
lazymergekit
Locutusque/TinyMistral-248M-v2
Locutusque/TinyMistral-248M-v2.5
Locutusque/TinyMistral-248M-v2.5-Instruct
jtatman/tinymistral-v2-pycoder-instruct-248m
Felladrin/TinyMistral-248M-SFT-v4
Locutusque/TinyMistral-248M-v2-Instruct
text-generation-inference
Inference Endpoints
Update README.md
README.md CHANGED
@@ -23,7 +23,7 @@ base_model:
 
 # TinyMistral-6x248M
 
-TinyMistral-6x248M is a Mixture of Experts (MoE) made with the following models using [LazyMergekit](https://colab.research.google.com/drive/1obulZ1ROXHjYLn6PPZJwRR6GzgQogxxb?usp=sharing):
+TinyMistral-6x248M is a Mixture of Experts (MoE) inspired by @jtatman, made with the following models using [LazyMergekit](https://colab.research.google.com/drive/1obulZ1ROXHjYLn6PPZJwRR6GzgQogxxb?usp=sharing):
 * [Locutusque/TinyMistral-248M-v2](https://huggingface.co/Locutusque/TinyMistral-248M-v2)
 * [Locutusque/TinyMistral-248M-v2.5](https://huggingface.co/Locutusque/TinyMistral-248M-v2.5)
 * [Locutusque/TinyMistral-248M-v2.5-Instruct](https://huggingface.co/Locutusque/TinyMistral-248M-v2.5-Instruct)
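The merged checkpoint loads like any other causal language model in 🤗 Transformers. Below is a minimal usage sketch; the repo id `your-username/TinyMistral-6x248M` is a placeholder assumption, since the commit view does not show the namespace the merge was published under.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Placeholder repo id -- the actual namespace is not shown in this commit,
# so replace it with the real repository path.
model_id = "your-username/TinyMistral-6x248M"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

prompt = "TinyMistral is"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```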