---
library_name: transformers
license: other
license_name: gemma-terms-of-use
license_link: https://ai.google.dev/gemma/terms
---

![image/jpeg](https://cdn-uploads.huggingface.co/production/uploads/655dc641accde1bbc8b41aec/xOe1Nb3S9Nb53us7_Ja3s.jpeg)

# Gemma-Wukong-2b

Gemma-Wukong-2b is a dealigned chat finetune of the original Gemma 2b, which was developed by Google DeepMind and various other teams.

This model was trained on teknium's OpenHermes-2.5 dataset and the Dolphin dataset from Cognitive Computations (https://erichartford.com/dolphin) 🐬.

Training ran for 3 epochs on four RTX 3090 GPUs.
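
Since the card declares `library_name: transformers`, a minimal inference sketch with the 🤗 Transformers library is shown below. The repository ID and the prompt are placeholders, not values from this card; adjust them to match the actual release.

```python
# Minimal sketch, assuming the weights are published on the Hugging Face Hub.
# "your-namespace/Gemma-Wukong-2b" is a placeholder repository ID.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "your-namespace/Gemma-Wukong-2b"  # placeholder, replace with the real repo ID

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # assumption: bf16 inference on GPU; use float32 on CPU
    device_map="auto",
)

prompt = "Write a short poem about the Monkey King."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=256, do_sample=True, temperature=0.7)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```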

# Original Model Card Below