Compute sponsored by Arrow Denmark and Nvidia
- Developed by: ThatsGroes
- License: apache-2.0
- Finetuned from model: unsloth/gemma-2-27b-it
This Gemma-2 model was trained 2x faster with Unsloth and Hugging Face's TRL library.
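The card does not include the training script itself. As a rough sketch of how a Gemma-2 fine-tune with Unsloth and TRL is typically set up (the dataset, sequence length, LoRA settings, and training arguments below are placeholders, not the values used for this model):

```python
# Illustrative sketch only: dataset, LoRA rank, and hyperparameters are
# placeholders, not the settings actually used for this model.
from unsloth import FastLanguageModel
from transformers import TrainingArguments
from trl import SFTTrainer
from datasets import load_dataset

# Load the base model in 4-bit with Unsloth's patched loader.
model, tokenizer = FastLanguageModel.from_pretrained(
    model_name="unsloth/gemma-2-27b-it",
    max_seq_length=2048,
    load_in_4bit=True,
)

# Attach LoRA adapters so only a small set of weights is trained.
model = FastLanguageModel.get_peft_model(
    model,
    r=16,
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj",
                    "gate_proj", "up_proj", "down_proj"],
    lora_alpha=16,
)

# Hypothetical dataset: any dataset with a "text" column works here.
dataset = load_dataset("json", data_files="train.jsonl", split="train")

trainer = SFTTrainer(
    model=model,
    tokenizer=tokenizer,
    train_dataset=dataset,
    dataset_text_field="text",
    max_seq_length=2048,
    args=TrainingArguments(
        per_device_train_batch_size=2,
        gradient_accumulation_steps=4,
        num_train_epochs=1,
        learning_rate=2e-4,
        output_dir="outputs",
    ),
)
trainer.train()
```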
Energy use reported by codecarbon during training: 2.748063 kWh of electricity since the beginning of the run.
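For reference, a minimal sketch of how codecarbon is typically wrapped around a training run to produce this kind of log line (the `trainer` object refers to the fine-tuning sketch above and is an assumption, not part of this card):

```python
from codecarbon import EmissionsTracker

# Track electricity use and estimated emissions around the training run;
# codecarbon periodically logs cumulative kWh, as quoted above.
tracker = EmissionsTracker()
tracker.start()
try:
    trainer.train()  # the SFTTrainer from the sketch above
finally:
    emissions = tracker.stop()  # returns estimated kg CO2-eq
    print(f"Estimated emissions: {emissions:.4f} kg CO2-eq")
```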