Trained for 1500 steps (less than one full epoch).
Trained with datasets ['text-embeds', 'cardatasets_w_wlh2'].
Learning rate 0.0001, batch size 1, and 1 gradient accumulation step.
Used the DDPM noise scheduler for training with the epsilon prediction type and rescaled_betas_zero_snr=False.
Used 'trailing' timestep spacing.
Base model: /data/shared_workspace/zgt/text2car_dataset/FLUX.1-dev
VAE: /data/shared_workspace/zgt/text2car_dataset/FLUX.1-dev
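The settings above can be approximated with the diffusers library as in the sketch below. This is not the original training or inference script: the DDPMScheduler arguments mirror the listed configuration (exact keyword names may vary slightly between diffusers versions), the base-model path is the FLUX.1-dev path given above, and the prompt, dtype, and device are placeholders.

```python
import torch
from diffusers import DDPMScheduler, FluxPipeline

# Training-time noise scheduler as described above: epsilon prediction,
# 'trailing' timestep spacing, rescale_betas_zero_snr left at False.
noise_scheduler = DDPMScheduler(
    prediction_type="epsilon",
    timestep_spacing="trailing",
    rescale_betas_zero_snr=False,
)

# Load the FLUX.1-dev base model (path taken from the card above) and apply
# the trained LoRA weights for inference.
pipe = FluxPipeline.from_pretrained(
    "/data/shared_workspace/zgt/text2car_dataset/FLUX.1-dev",
    torch_dtype=torch.bfloat16,
)
pipe.load_lora_weights("pytorch_lora_weights.safetensors")
pipe.to("cuda")

# Placeholder prompt and sampling settings; not taken from the training data.
image = pipe(
    "a silver sedan, studio lighting",
    num_inference_steps=28,
    guidance_scale=3.5,
).images[0]
image.save("sample.png")
```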
pytorch_lora_weights.safetensors (changed), stored as a Git LFS pointer:
version https://git-lfs.github.com/spec/v1
oid sha256:51def8a38cd880d80f9b83dc3e5dacd75aad92bfd5b17b4e8ec1e6e0ab2aba0d
size 109535132