---
license: bigscience-openrail-m
datasets:
- iamplus/Instruction_Tuning
---
First version of the instruction-tuned Bloomz-7B1 model, trained on a ChatGPT dataset (85k examples) using ***HF Deepspeed***.
**Base Model:** bigscience/bloomz-7b1

**Training Details:**
* Epochs: 5
* Batch Size: 5 (instantaneous, per device) × 2 gradient accumulation steps × 8 GPUs = 80 (effective)
* Max Length: 512
* Weight Decay: 0
* Learning Rate: 5e-5
* Learning Rate Scheduler Type: Linear
* Number of warmup steps: 0
* Machine: 8x A100 80GB
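The effective global batch size of 80 follows from the per-device batch size, gradient accumulation steps, and GPU count listed above; a quick sanity check (not part of the released training code):

```python
# Effective global batch size implied by the training settings above.
per_device_batch = 5   # instantaneous batch size per device
grad_accum_steps = 2   # gradient accumulation steps
num_gpus = 8           # 8x A100 80GB

effective_batch = per_device_batch * grad_accum_steps * num_gpus
print(effective_batch)  # 80
```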
**Dataset Details:**

Dataset: iamplus/Instruction_Tuning

Files:
* chat_gpt_v1.csv