GeroldMeisinger committed
Commit 59da8a0 · 1 Parent(s): fcb72eb

Update README.md

Files changed (1): README.md (+43 -1)

README.md CHANGED
@@ -4,4 +4,46 @@ datasets:
  - laion/laion2B-en-aesthetic
  language:
  - en
- ---
+ ---
+
+ Based on https://github.com/lllyasviel/ControlNet/discussions/318
+
+ ```
+ accelerate launch train_controlnet.py ^
+ --pretrained_model_name_or_path="runwayml/stable-diffusion-v1-5" ^
+ --output_dir="control-edgedrawing-default-drop50-fp16/" ^
+ --dataset_name="mydataset" ^
+ --mixed_precision="fp16" ^
+ --proportion_empty_prompts=0.5 ^
+ --resolution=512 ^
+ --learning_rate=1e-5 ^
+ --train_batch_size=1 ^
+ --gradient_accumulation_steps=4 ^
+ --gradient_checkpointing ^
+ --use_8bit_adam ^
+ --enable_xformers_memory_efficient_attention ^
+ --set_grads_to_none ^
+ --seed=0
+ ```
+
+ Trained for 40000 steps on images converted with https://github.com/shaojunluo/EDLinePython using `smoothed = False` and default settings:
+
+ ```
+ { 'ksize'            : 5
+ , 'sigma'            : 1.0
+ , 'gradientThreshold': 36
+ , 'anchorThreshold'  : 8
+ , 'scanIntervals'    : 1
+ }
+ ```
+
+ **TODO**
+
+ Results are not good so far:
+
+ * `--proportion_empty_prompts=0.5` may be excessive for 40000 steps
+ * Use `smoothed = True` next time; maybe ControlNet doesn't pick up on single pixels
+ * Find a better parameter spread instead of the default values; most images are very sparse
+ * Train for more steps
+ * Train on a more diverse dataset
+ * Train at higher precision
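
For context, below is a minimal inference sketch (not part of this commit) showing how a ControlNet checkpoint produced by `train_controlnet.py` is typically loaded with diffusers. The checkpoint path reuses the `--output_dir` from the command above; the prompt and the control image filename are placeholders, and the conditioning image is assumed to be an edge-drawing map generated with the same EDLinePython settings as the training data.

```python
# Sketch only: paths, prompt, and control image are placeholders, not from the original README.
import torch
from diffusers import ControlNetModel, StableDiffusionControlNetPipeline, UniPCMultistepScheduler
from diffusers.utils import load_image

# train_controlnet.py saves the trained ControlNet weights into --output_dir
controlnet = ControlNetModel.from_pretrained(
    "control-edgedrawing-default-drop50-fp16", torch_dtype=torch.float16
)
pipe = StableDiffusionControlNetPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5", controlnet=controlnet, torch_dtype=torch.float16
)
pipe.scheduler = UniPCMultistepScheduler.from_config(pipe.scheduler.config)
pipe.enable_model_cpu_offload()

# Conditioning image: assumed to be an edge-drawing map made with the same EDLinePython settings as the training set
control_image = load_image("edge_map.png").resize((512, 512))

image = pipe(
    "a photo of a house by a lake",  # placeholder prompt
    image=control_image,
    num_inference_steps=20,
).images[0]
image.save("output.png")
```

Loading the weights in `torch.float16` matches the fp16 mixed-precision setting used for training above.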