Update README.md
README.md CHANGED
@@ -1,7 +1,10 @@
 ---
-license:
+license: llama2
 language:
 - en
+pipeline_tag: conversational
+tags:
+- merge
 ---
 # Athena 120b
 
@@ -25,7 +28,4 @@ Coming soon.
 [@alpindale](https://huggingface.co/alpindale) - for [Goliath-120B](https://huggingface.co/alpindale/goliath-120b?text=Hey+my+name+is+Thomas%21+How+are+you%3F) that started this crazy endeavor for us all
 [@nsfwthrowitaway69](https://huggingface.co/nsfwthrowitaway69) - for sharing the merge config for [Venus-120B](https://huggingface.co/nsfwthrowitaway69/Venus-120b-v1.1) and getting me off the starting block with some questions on mergekit and tokenizers
 
 Keep it open and keep sharing, everyone! With the Mixtral and MoE changes coming to mergekit, coupled with these larger merged models, I think the sky is the limit for us all. I can only imagine what would happen if we took a group of these 120B models, fine-tuned each of them a bit, and applied the Mixtral-style MoE merge method to them. And if a clever VC came along and funded that work, the people you need are right here on Hugging Face; all they need is the equipment to do it on.
-
-
-
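For anyone trying to follow the same path, the acknowledgments above refer to a mergekit merge config. As a point of reference only, here is a minimal sketch of the general shape of a passthrough (layer-stacking) mergekit config of the kind used for 120B-class merges; the model names and layer ranges are placeholders, not the actual Athena, Venus, or Goliath recipe.

```yaml
# Sketch of a passthrough layer-interleave merge.
# All model names and layer ranges are placeholders, not the Athena-120b recipe.
slices:
  - sources:
      - model: some-org/model-a-70b   # placeholder donor model
        layer_range: [0, 20]
  - sources:
      - model: some-org/model-b-70b   # placeholder donor model
        layer_range: [10, 30]
  - sources:
      - model: some-org/model-a-70b
        layer_range: [20, 40]
merge_method: passthrough             # stack the slices as-is; no weight averaging
dtype: float16
```

With mergekit installed, a config like this is typically run with `mergekit-yaml config.yml ./output-dir`; the repeated, overlapping slices are what push the layer and parameter count up into the 120B range.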
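The closing note above imagines applying the Mixtral-style MoE merge to a set of fine-tuned 120B merges. Purely as an illustration of what that could look like, here is a sketch in the style of a mergekit-moe config, with placeholder expert models and routing prompts; the exact schema and options depend on the mergekit version, so treat this as an assumption rather than a recipe.

```yaml
# Hypothetical Mixtral-style MoE built from several fine-tuned 120B merges.
# All model names and prompts are placeholders.
base_model: some-org/athena-120b-finetune
gate_mode: hidden            # route tokens by hidden-state similarity to the prompts below
dtype: bfloat16
experts:
  - source_model: some-org/athena-120b-finetune
    positive_prompts:
      - "write a story"
      - "roleplay a character"
  - source_model: some-org/other-120b-finetune
    positive_prompts:
      - "explain step by step"
      - "solve this problem"
```

The idea is the same "experts from existing dense models" approach used for community Mixtral-style merges: each fine-tuned model becomes an expert behind a router initialized from the prompts, rather than training an MoE from scratch.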