The LHC (Large Heap o' Chuubas) aims to be the model for all of your VTuber needs.
Currently on version 0.5 (considered alpha). It is a v-prediction model, with the necessary settings embedded so that ComfyUI and (Re)Forge detect it automatically. For a comprehensive overview of all VTubers the model was trained on, along with useful/necessary tagging, refer to https://catbox.moe/c/pjfwt1 or the accompanying txt file.
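For frontends that do not pick up the embedded prediction-type metadata, the v-prediction and zero-terminal-SNR settings can be applied by hand. The snippet below is a minimal sketch using diffusers, not an official loading recipe: the checkpoint filename and prompt tags are placeholders, not part of this release.

```python
# Minimal sketch: load the checkpoint with diffusers and explicitly
# configure v-prediction for tools that do not auto-detect it.
# "LHC_XL_v0.5.safetensors" is a placeholder filename.
import torch
from diffusers import StableDiffusionXLPipeline, EulerDiscreteScheduler

pipe = StableDiffusionXLPipeline.from_single_file(
    "LHC_XL_v0.5.safetensors",
    torch_dtype=torch.float16,
)
# Force v-prediction and zero-terminal-SNR rescaling on the sampler.
pipe.scheduler = EulerDiscreteScheduler.from_config(
    pipe.scheduler.config,
    prediction_type="v_prediction",
    rescale_betas_zero_snr=True,
)
pipe.to("cuda")

# Placeholder tags; use the tag list linked above for actual VTuber tags.
image = pipe(
    "1girl, vtuber, smile, masterpiece",
    num_inference_steps=28,
    guidance_scale=5.0,
).images[0]
image.save("sample.png")
```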
Training Details
Due to some major changes between version 0.4 and 0.5, the training parameters were not yet dialed in, resulting in a very long training process that was resumed multiple times from intermediate checkpoints that had not been trained to satisfaction. Exact training parameters are therefore not stated; instead, the training logs have been uploaded and a general overview is given here.
Parameters
- Total Epochs: 102
- Effective batch size: 32 (16 or 8 in some resumed runs; see the sketch after this list)
- Initial Learning Rate: 5e-5
- Scheduler: Cosine
- Text encoder (TE) training epochs: ~10
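As a rough illustration of how these parameters fit together (this is not the actual training script), the sketch below sets up a cosine learning-rate schedule starting at 5e-5 and reaches an effective batch size of 32 through gradient accumulation. The steps-per-epoch value and the stand-in model are placeholders that depend on the dataset and architecture.

```python
# Illustrative sketch only: effective batch size = micro-batch * accumulation
# steps, with a cosine LR schedule decaying from the initial 5e-5.
import torch
from torch.optim.lr_scheduler import CosineAnnealingLR

model = torch.nn.Linear(16, 16)           # stand-in for the actual network
optimizer = torch.optim.AdamW(model.parameters(), lr=5e-5)

epochs = 102
steps_per_epoch = 1000                    # placeholder; depends on dataset size
scheduler = CosineAnnealingLR(optimizer, T_max=epochs * steps_per_epoch)

micro_batch = 8                           # what fits in VRAM per step
accumulation = 4                          # 8 * 4 = effective batch of 32

for step in range(epochs * steps_per_epoch):
    for _ in range(accumulation):
        loss = model(torch.randn(micro_batch, 16)).pow(2).mean()
        (loss / accumulation).backward()  # average gradients over micro-batches
    optimizer.step()
    optimizer.zero_grad()
    scheduler.step()                      # cosine decay per optimizer step
```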
Base model: Laxhar/noobai-XL-1.0