rahul7star committed
Commit 23b96a6 · verified · 1 Parent(s): fcc02a2

Update README.md

Files changed (1)
  1. README.md +10 -416
README.md CHANGED
@@ -1,416 +1,10 @@
- # AI Toolkit by Ostris
-
- AI Toolkit is an all-in-one training suite for diffusion models. I try to support all the latest image and video models on consumer-grade hardware. It can be run as a GUI or CLI, and it is designed to be easy to use while still offering every feature imaginable.
-
- ## Support My Work
-
- If you enjoy my projects or use them commercially, please consider sponsoring me. Every bit helps! 💖
-
- [Sponsor on GitHub](https://github.com/orgs/ostris) | [Support on Patreon](https://www.patreon.com/ostris) | [Donate on PayPal](https://www.paypal.com/donate/?hosted_button_id=9GEFUKC8T9R9W)
-
- ### Current Sponsors
-
- These are the people and organizations who selflessly make this project possible. Thank you!!
-
- _Last updated: 2025-04-22 16:45 UTC_
-
- <p align="center">
- <a href="https://github.com/replicate" target="_blank" rel="noopener noreferrer"><img src="https://avatars.githubusercontent.com/u/60410876?v=4" alt="Replicate" width="200" height="200" style="border-radius:8px;margin:5px;"></a>
- <a href="https://github.com/josephrocca" target="_blank" rel="noopener noreferrer"><img src="https://avatars.githubusercontent.com/u/1167575?u=92d92921b4cb5c8c7e225663fed53c4b41897736&v=4" alt="josephrocca" width="200" height="200" style="border-radius:8px;margin:5px;"></a>
- </p>
- <hr style="width:100%;border:none;height:2px;background:#ddd;margin:30px 0;">
- <p align="center">
- <img src="https://c10.patreonusercontent.com/4/patreon-media/p/user/162524101/81a72689c3754ac5b9e38612ce5ce914/eyJ3IjoyMDB9/1.png?token-time=2145916800&token-hash=3XLSlLFCWAQ-0wd2_vZMikyotdQNSzKOjoyeoJiZEw0%3D" alt="Prasanth Veerina" width="150" height="150" style="border-radius:8px;margin:5px;">
- <a href="https://github.com/weights-ai" target="_blank" rel="noopener noreferrer"><img src="https://avatars.githubusercontent.com/u/185568492?v=4" alt="Weights" width="150" height="150" style="border-radius:8px;margin:5px;"></a>
- <img src="https://c10.patreonusercontent.com/4/patreon-media/p/user/161471720/dd330b4036d44a5985ed5985c12a5def/eyJ3IjoyMDB9/1.jpeg?token-time=2145916800&token-hash=qkRvrEc5gLPxaXxLvcvbYv1W1lcmOoTwhj4A9Cq5BxQ%3D" alt="Vladimir Sotnikov" width="150" height="150" style="border-radius:8px;margin:5px;">
- <img src="https://c8.patreon.com/3/200/33158543" alt="clement Delangue" width="150" height="150" style="border-radius:8px;margin:5px;">
- <img src="https://c10.patreonusercontent.com/4/patreon-media/p/user/54890369/45cea21d82974c78bf43956de7fb0e12/eyJ3IjoyMDB9/2.jpeg?token-time=2145916800&token-hash=IK6OT6UpusHgdaC4y8IhK5XxXiP5TuLy3vjvgL77Fho%3D" alt="Eli Slugworth" width="150" height="150" style="border-radius:8px;margin:5px;">
- <img src="https://c10.patreonusercontent.com/4/patreon-media/p/user/8654302/b0f5ebedc62a47c4b56222693e1254e9/eyJ3IjoyMDB9/1.jpeg?token-time=2145916800&token-hash=lpeicIh1_S-3Ji3W27gyiRB7iXurp8Bx8HAzDHftOuo%3D" alt="Misch Strotz" width="150" height="150" style="border-radius:8px;margin:5px;">
- <img src="https://c8.patreon.com/3/200/93304" alt="Joseph Rocca" width="150" height="150" style="border-radius:8px;margin:5px;">
- </p>
- <hr style="width:100%;border:none;height:2px;background:#ddd;margin:30px 0;">
- <p align="center">
- <a href="https://x.com/NuxZoe" target="_blank" rel="noopener noreferrer"><img src="https://pbs.twimg.com/profile_images/1714760743273574400/tdvQjNTl_400x400.jpg" alt="tungsten" width="100" height="100" style="border-radius:8px;margin:5px;"></a>
- <img src="https://c10.patreonusercontent.com/4/patreon-media/p/user/2298192/1228b69bd7d7481baf3103315183250d/eyJ3IjoyMDB9/1.jpg?token-time=2145916800&token-hash=1B7dbXy_gAcPT9WXBesLhs7z_9APiz2k1Wx4Vml_-8Q%3D" alt="Mohamed Oumoumad" width="100" height="100" style="border-radius:8px;margin:5px;">
- <img src="https://c10.patreonusercontent.com/4/patreon-media/p/user/120239481/49b1ce70d3d24704b8ec34de24ec8f55/eyJ3IjoyMDB9/1.jpeg?token-time=2145916800&token-hash=Dv1NPKwdv9QT8fhYYwbGnQIvfiyqTUlh52bjDW1vYxY%3D" alt="nitish PNR" width="100" height="100" style="border-radius:8px;margin:5px;">
- <img src="https://c8.patreon.com/3/200/548524" alt="Steve Hanff" width="100" height="100" style="border-radius:8px;margin:5px;">
- <img src="https://c10.patreonusercontent.com/4/patreon-media/p/user/152118848/3b15a43d71714552b5ed1c9f84e66adf/eyJ3IjoyMDB9/1.png?token-time=2145916800&token-hash=IEKE18CBHVZ3k-08UD7Dkb7HbiFHb84W0FATdLMI0Dg%3D" alt="Kristjan Retter" width="100" height="100" style="border-radius:8px;margin:5px;">
- <img src="https://c8.patreon.com/3/200/83319230" alt="Miguel Lara" width="100" height="100" style="border-radius:8px;margin:5px;">
- </p>
- <hr style="width:100%;border:none;height:2px;background:#ddd;margin:30px 0;">
- <p align="center">
- <img src="https://c8.patreon.com/3/200/8449560" alt="Patron" width="60" height="60" style="border-radius:8px;margin:5px;">
- <img src="https://c10.patreonusercontent.com/4/patreon-media/p/user/27288932/6c35d2d961ee4e14a7a368c990791315/eyJ3IjoyMDB9/1.jpeg?token-time=2145916800&token-hash=dpFFssZXZM_KZMKQhl3uDwwusdFw1c_v9x_ChJU7_zc%3D" alt="David Garrido" width="60" height="60" style="border-radius:8px;margin:5px;">
- <img src="https://c8.patreon.com/3/200/2410522" alt="George Gostyshev" width="60" height="60" style="border-radius:8px;margin:5px;">
- <img src="https://c10.patreonusercontent.com/4/patreon-media/p/user/16287560/78130de30950410ca528d8a888997081/eyJ3IjoyMDB9/1.png?token-time=2145916800&token-hash=Ok-HSL2MthKXF09SmCOlPFCPfbMctFBZKCuTnPwxZ3A%3D" alt="Vitaly Golubenko" width="60" height="60" style="border-radius:8px;margin:5px;">
- <img src="https://c10.patreonusercontent.com/4/patreon-media/p/user/570742/4ceb33453a5a4745b430a216aba9280f/eyJ3IjoyMDB9/1.jpg?token-time=2145916800&token-hash=wUzsI5cO5Evp2ukIGdSgBbvKeYgv5LSOQMa6Br33Rrs%3D" alt="Al H" width="60" height="60" style="border-radius:8px;margin:5px;">
- <img src="https://c10.patreonusercontent.com/4/patreon-media/p/user/131773947/eda3405aa582437db4582fce908c8739/eyJ3IjoyMDB9/1.png?token-time=2145916800&token-hash=S4Bh0sMqTNmJlo3uRr7co5d_kxvBjITemDTfi_1KrCA%3D" alt="Jodh Singh" width="60" height="60" style="border-radius:8px;margin:5px;">
- <img src="https://c8.patreon.com/3/200/22809690" alt="Michael Levine" width="60" height="60" style="border-radius:8px;margin:5px;">
- <img src="https://c10.patreonusercontent.com/4/patreon-media/p/user/99036356/7ae9c4d80e604e739b68cca12ee2ed01/eyJ3IjoyMDB9/3.png?token-time=2145916800&token-hash=zK0dHe6A937WtNlrGdefoXFTPPzHUCfn__23HP8-Ui0%3D" alt="Noctre" width="60" height="60" style="border-radius:8px;margin:5px;">
- <img src="https://c10.patreonusercontent.com/4/patreon-media/p/user/141098579/1a9f0a1249d447a7a0df718a57343912/eyJ3IjoyMDB9/2.png?token-time=2145916800&token-hash=Rd_AjZGhMATVkZDf8E95ILc0n93gvvFWe1Ig0_dxwf4%3D" alt="The Local Lab" width="60" height="60" style="border-radius:8px;margin:5px;">
- <img src="https://c10.patreonusercontent.com/4/patreon-media/p/user/98811435/3a3632d1795b4c2b9f8f0270f2f6a650/eyJ3IjoyMDB9/1.jpeg?token-time=2145916800&token-hash=93w8RMxwXlcM4X74t03u6P5_SrKvlm1IpjnD2SzVpJk%3D" alt="EmmanuelMr18" width="60" height="60" style="border-radius:8px;margin:5px;">
- <img src="https://c10.patreonusercontent.com/4/patreon-media/p/user/338551/e8f257d8d3dd46c38272b391a5785948/eyJ3IjoyMDB9/1.jpg?token-time=2145916800&token-hash=GLom1rGgOZjBeO7I1OnjiIgWmjl6PO9ZjBB8YTvc7AM%3D" alt="Plaidam" width="60" height="60" style="border-radius:8px;margin:5px;">
- <img src="https://c10.patreonusercontent.com/4/patreon-media/p/user/82763/f99cc484361d4b9d94fe4f0814ada303/eyJ3IjoyMDB9/1.jpeg?token-time=2145916800&token-hash=BpwC020pR3TRZ4r0RSCiSIOh-jmatkrpy1h2XU4sGa4%3D" alt="Doron Adler" width="60" height="60" style="border-radius:8px;margin:5px;">
- <img src="https://c10.patreonusercontent.com/4/patreon-media/p/user/103077711/bb215761cc004e80bd9cec7d4bcd636d/eyJ3IjoyMDB9/2.jpeg?token-time=2145916800&token-hash=zvtBie29rRTKTXvAA2KhOI-l3mSMk9xxr-mg_CksLtc%3D" alt="John Dopamine" width="60" height="60" style="border-radius:8px;margin:5px;">
- <img src="https://c10.patreonusercontent.com/4/patreon-media/p/user/93348210/5c650f32a0bc481d80900d2674528777/eyJ3IjoyMDB9/1.jpeg?token-time=2145916800&token-hash=PpXK9B_iy288annlNdLOexhiQHbTftPEDeCh-sTQ2KA%3D" alt="Armin Behjati" width="60" height="60" style="border-radius:8px;margin:5px;">
- <img src="https://c10.patreonusercontent.com/4/patreon-media/p/user/155963250/6f8fd7075c3b4247bfeb054ba49172d6/eyJ3IjoyMDB9/1.png?token-time=2145916800&token-hash=twmKs4mADF_h7bKh5jBuigYVScMeaeHv2pEPin9K0Dg%3D" alt="Un Defined" width="60" height="60" style="border-radius:8px;margin:5px;">
- <img src="https://c10.patreonusercontent.com/4/patreon-media/p/user/45562978/0de33cf52ec642ae8a2f612cddec4ca6/eyJ3IjoyMDB9/1.jpeg?token-time=2145916800&token-hash=hSAvaD4phiLcF0pvX7FP0juI5NQWCon-_TZSNpJzQJg%3D" alt="Jack English" width="60" height="60" style="border-radius:8px;margin:5px;">
- <img src="https://c8.patreon.com/3/200/27791680" alt="Jean-Tristan Marin" width="60" height="60" style="border-radius:8px;margin:5px;">
- <img src="https://c10.patreonusercontent.com/4/patreon-media/p/user/60995694/92e0e8f336eb4a5bb8d99b940247d1d1/eyJ3IjoyMDB9/1.png?token-time=2145916800&token-hash=pj6Tm8XRdpGJcAEdnCakqYSNiSjoAYjvZescX7d0ic0%3D" alt="Abraham Irawan" width="60" height="60" style="border-radius:8px;margin:5px;">
- <img src="https://c10.patreonusercontent.com/4/patreon-media/p/user/164958178/4eb7a37baa0541bab7a091f2b14615b7/eyJ3IjoyMDB9/1.png?token-time=2145916800&token-hash=_aaum7fBJAGaJhMBhlR8vqYavDhExdVxmO9mwd3_XMw%3D" alt="Austin Robinson" width="60" height="60" style="border-radius:8px;margin:5px;">
- <img src="https://c10.patreonusercontent.com/4/patreon-media/p/user/134129880/680c7e14cd1a4d1a9face921fb010f88/eyJ3IjoyMDB9/1.png?token-time=2145916800&token-hash=vNKojv67krNqx7gdpKBX1R_stX2TkMRYvRc0xZrbY6s%3D" alt="Bharat Prabhakar" width="60" height="60" style="border-radius:8px;margin:5px;">
- <img src="https://c8.patreon.com/3/200/70218846" alt="Cosmosis" width="60" height="60" style="border-radius:8px;margin:5px;">
- <img src="https://c10.patreonusercontent.com/4/patreon-media/p/user/83054970/13de6cb103ad41a5841edf549e66cd51/eyJ3IjoyMDB9/1.jpeg?token-time=2145916800&token-hash=wU_Eke9VYcfI40FAQvdEV84Xspqlo5VSiafLqhg_FOE%3D" alt="Gili Ben Shahar" width="60" height="60" style="border-radius:8px;margin:5px;">
- <img src="https://c10.patreonusercontent.com/4/patreon-media/p/user/30931983/54ab4e4ceab946e79a6418d205f9ed51/eyJ3IjoyMDB9/1.png?token-time=2145916800&token-hash=LBmsSsMQZhO6yRZ_YyRwTgE6a7BVWrGNsAVveLXHXR0%3D" alt="HestoySeghuro ." width="60" height="60" style="border-radius:8px;margin:5px;">
- <img src="https://c8.patreon.com/3/200/4105384" alt="Jack Blakely" width="60" height="60" style="border-radius:8px;margin:5px;">
- <img src="https://c8.patreon.com/3/200/494309" alt="Julian Tsependa" width="60" height="60" style="border-radius:8px;margin:5px;">
- <img src="https://c8.patreon.com/3/200/24653779" alt="RayHell" width="60" height="60" style="border-radius:8px;margin:5px;">
- <img src="https://c8.patreon.com/3/200/4541423" alt="Sören " width="60" height="60" style="border-radius:8px;margin:5px;">
- <img src="https://c10.patreonusercontent.com/4/patreon-media/p/user/31950857/c567dc648f6144be9f6234946df05da2/eyJ3IjoyMDB9/1.jpeg?token-time=2145916800&token-hash=3Vx4R1eOfD4X_ZPPd40MsZ-3lyknLM35XmaHRELnWjM%3D" alt="Trent Hunter" width="60" height="60" style="border-radius:8px;margin:5px;">
- <img src="https://c10.patreonusercontent.com/4/patreon-media/p/user/110407414/30f9e9d88ef945ddb0f47fd23a8cbac2/eyJ3IjoyMDB9/1.jpeg?token-time=2145916800&token-hash=QQRWOkMyOfDBERHn4O8N2wMB32zeiIEsydVTbSNUw-I%3D" alt="Wesley Reitzfeld" width="60" height="60" style="border-radius:8px;margin:5px;">
- <img src="https://c10.patreonusercontent.com/4/patreon-media/p/user/162398691/89d78d89eecb4d6b981ce8c3c6a3d4b8/eyJ3IjoyMDB9/1.png?token-time=2145916800&token-hash=SWhI-0jGpY6Nc_bUQeXz4pa9DRURi9VnnnJ3Mxjg1po%3D" alt="Zoltán-Csaba Nyiró" width="60" height="60" style="border-radius:8px;margin:5px;">
- <img src="https://c10.patreonusercontent.com/4/patreon-media/p/user/97985240/3d1d0e6905d045aba713e8132cab4a30/eyJ3IjoyMDB9/1.png?token-time=2145916800&token-hash=pG3X2m-py2lRYI2aoJiXI47_4ArD78ZHdSm6jCAHA_w%3D" alt="עומר מכלוף" width="60" height="60" style="border-radius:8px;margin:5px;">
- <img src="https://c10.patreonusercontent.com/4/patreon-media/p/user/140599287/cff037fb93804af28bc3a4f1e91154f8/eyJ3IjoyMDB9/1.png?token-time=2145916800&token-hash=vkscmpmFoM5wq7GnsLmOEgNhvyXe-774kNGNqD0wurE%3D" alt="Lukas" width="60" height="60" style="border-radius:8px;margin:5px;">
- <img src="https://c10.patreonusercontent.com/4/patreon-media/p/user/96561218/b0694642d13a49faa75aec9762ff2aeb/eyJ3IjoyMDB9/1.jpeg?token-time=2145916800&token-hash=sLQXomYm1iMYpknvGwKQ49f30TKQ0B1R2W3EZfCJqr8%3D" alt="Ultimate Golf Archives" width="60" height="60" style="border-radius:8px;margin:5px;">
- <img src="https://c10.patreonusercontent.com/4/patreon-media/p/user/81275465/1e4148fe9c47452b838949d02dd9a70f/eyJ3IjoyMDB9/1.jpeg?token-time=2145916800&token-hash=uzJzkUq9rte3wx8wDLjGAgvSoxdtZcAnH7HctDhdYEo%3D" alt="Aaron Amortegui" width="60" height="60" style="border-radius:8px;margin:5px;">
- <img src="https://c10.patreonusercontent.com/4/patreon-media/p/user/44568304/a9d83a0e786b41b4bdada150f7c9271c/eyJ3IjoyMDB9/1.jpeg?token-time=2145916800&token-hash=SBphTD654nwr-OTrvIBIJBEQho7GE2PtRre8nyaG1Fk%3D" alt="Albert Bukoski" width="60" height="60" style="border-radius:8px;margin:5px;">
- <img src="https://c10.patreonusercontent.com/4/patreon-media/p/user/49304261/d0a730de1c3349e585c49288b9f419c6/eyJ3IjoyMDB9/1.png?token-time=2145916800&token-hash=C2BMZ3ci-Ty2nhnSwKZqsR-5hOGsUNDYcvXps0Geq9w%3D" alt="Arvin Flores" width="60" height="60" style="border-radius:8px;margin:5px;">
- <img src="https://c8.patreon.com/3/200/5048649" alt="Ben Ward" width="60" height="60" style="border-radius:8px;margin:5px;">
- <img src="https://c10.patreonusercontent.com/4/patreon-media/p/user/130338124/f904a3bb76cd4588ac8d8f595c6cb486/eyJ3IjoyMDB9/1.png?token-time=2145916800&token-hash=k-inISRUtYDu9q7fNAKc3S2S7qcaw26fr1pj7PqU28Q%3D" alt="Bnp" width="60" height="60" style="border-radius:8px;margin:5px;">
- <img src="https://c10.patreonusercontent.com/4/patreon-media/p/user/111904990/08b1cf65be6a4de091c9b73b693b3468/eyJ3IjoyMDB9/1.png?token-time=2145916800&token-hash=OAJc9W5Ak0uJfQ2COlo1Upo38K3aj1fMQFCMC7ft5tM%3D" alt="Brian Smith" width="60" height="60" style="border-radius:8px;margin:5px;">
- <img src="https://c10.patreonusercontent.com/4/patreon-media/p/user/113207022/d4a67cc113e84fb69032bef71d068720/eyJ3IjoyMDB9/1.png?token-time=2145916800&token-hash=mu-tIg88VwoQdgLEOmxuVkhVm9JT59DdnHXJstmkkLU%3D" alt="Fagem X" width="60" height="60" style="border-radius:8px;margin:5px;">
- <img src="https://c8.patreon.com/3/200/5602036" alt="Kelevra" width="60" height="60" style="border-radius:8px;margin:5px;">
- <img src="https://c8.patreon.com/3/200/358350" alt="L D" width="60" height="60" style="border-radius:8px;margin:5px;">
- <img src="https://c10.patreonusercontent.com/4/patreon-media/p/user/159203973/36c817f941ac4fa18103a4b8c0cb9cae/eyJ3IjoyMDB9/1.png?token-time=2145916800&token-hash=9toslDfsO14QyaOiu6vIf--d4marBsWCZWN3gdPqbIU%3D" alt="Marko jak" width="60" height="60" style="border-radius:8px;margin:5px;">
- <img src="https://c10.patreonusercontent.com/4/patreon-media/p/user/11198131/e696d9647feb4318bcf16243c2425805/eyJ3IjoyMDB9/1.jpeg?token-time=2145916800&token-hash=o6Hrpzw9rf2Ucd4cZ-hdUkGejLNv44-pqF8smeOF3ts%3D" alt="Nicholas Agranoff" width="60" height="60" style="border-radius:8px;margin:5px;">
- <img src="https://c10.patreonusercontent.com/4/patreon-media/p/user/785333/bdb9ede5765d42e5a2021a86eebf0d8f/eyJ3IjoyMDB9/2.jpg?token-time=2145916800&token-hash=dr5eaMg3Ua0wyCy40Qv3F-ZFajWZmuz2fWG55FskREc%3D" alt="Sapjes " width="60" height="60" style="border-radius:8px;margin:5px;">
- <img src="https://c10.patreonusercontent.com/4/patreon-media/p/user/44738426/b01ff676da864d4ab9c21f226275b63e/eyJ3IjoyMDB9/1.png?token-time=2145916800&token-hash=54nIkcxFaGszJ3q0jNhtrVSBbV3WNK9e5WX9VzXltYk%3D" alt="Shakeel Saleemi" width="60" height="60" style="border-radius:8px;margin:5px;">
- <img src="https://c10.patreonusercontent.com/4/patreon-media/p/user/76566911/6485eaf5ec6249a7b524ee0b979372f0/eyJ3IjoyMDB9/1.jpeg?token-time=2145916800&token-hash=S1QK78ief5byQU7tB_reqnw4V2zhW_cpwTqHThk-tGc%3D" alt="the biitz" width="60" height="60" style="border-radius:8px;margin:5px;">
- <img src="https://c8.patreon.com/3/200/83034" alt="william tatum" width="60" height="60" style="border-radius:8px;margin:5px;">
- <img src="https://c10.patreonusercontent.com/4/patreon-media/p/user/32633822/1ab5612efe80417cbebfe91e871fc052/eyJ3IjoyMDB9/1.png?token-time=2145916800&token-hash=RHYMcjr0UGIYw5FBrUfJdKMGuoYWhBQlLIykccEFJvo%3D" alt="Zack Abrams" width="60" height="60" style="border-radius:8px;margin:5px;">
- <img src="https://c10.patreonusercontent.com/4/patreon-media/p/user/138787189/2b5662dcb638466282ac758e3ac651b4/eyJ3IjoyMDB9/1.png?token-time=2145916800&token-hash=IlUAs9JAlVRphfx81V-Jt-nMiSBS8mPewRr9u6pQjaQ%3D" alt="Антон Антонио" width="60" height="60" style="border-radius:8px;margin:5px;">
- </p>
-
- ---
-
- ## Installation
-
- Requirements:
- - python >3.10
- - Nvidia GPU with enough RAM to do what you need
- - python venv
- - git
-
-
- Linux:
- ```bash
- git clone https://github.com/ostris/ai-toolkit.git
- cd ai-toolkit
- python3 -m venv venv
- source venv/bin/activate
- # install torch first
- pip3 install --no-cache-dir torch==2.6.0 torchvision==0.21.0 --index-url https://download.pytorch.org/whl/cu126
- pip3 install -r requirements.txt
- ```
-
- Windows:
- ```bash
- git clone https://github.com/ostris/ai-toolkit.git
- cd ai-toolkit
- python -m venv venv
- .\venv\Scripts\activate
- pip install --no-cache-dir torch==2.6.0 torchvision==0.21.0 --index-url https://download.pytorch.org/whl/cu126
- pip install -r requirements.txt
- ```
-
-
- # AI Toolkit UI
-
- <img src="https://ostris.com/wp-content/uploads/2025/02/toolkit-ui.jpg" alt="AI Toolkit UI" width="100%">
-
- The AI Toolkit UI is a web interface for the AI Toolkit. It lets you easily start, stop, and monitor jobs and train models with a few clicks. You can also set a token for the UI to prevent unauthorized access, so it is mostly safe to run on an exposed server.
-
- ## Running the UI
-
- Requirements:
- - Node.js > 18
-
- The UI does not need to be kept running for jobs to run; it is only needed to start, stop, and monitor them. The commands below will install or update the UI and its dependencies and then start it.
-
- ```bash
- cd ui
- npm run build_and_start
- ```
-
- You can now access the UI at `http://localhost:8675`, or at `http://<your-ip>:8675` if you are running it on a server.
-
- ## Securing the UI
-
- If you are hosting the UI on a cloud provider or any network that is not secure, I highly recommend securing it with an auth token. You can do this by setting the environment variable `AI_TOOLKIT_AUTH` to a super-secure password. This token will be required to access the UI. You can set it when starting the UI like so:
-
- ```bash
- # Linux
- AI_TOOLKIT_AUTH=super_secure_password npm run build_and_start
-
- # Windows
- set AI_TOOLKIT_AUTH=super_secure_password && npm run build_and_start
-
- # Windows PowerShell
- $env:AI_TOOLKIT_AUTH="super_secure_password"; npm run build_and_start
- ```
-
-
- ## FLUX.1 Training
-
- ### Tutorial
-
- To get started quickly, check out [@araminta_k](https://x.com/araminta_k)'s tutorial on [Finetuning Flux Dev on a 3090](https://www.youtube.com/watch?v=HzGW_Kyermg) with 24GB VRAM.
-
-
- ### Requirements
- You currently need a GPU with **at least 24GB of VRAM** to train FLUX.1. If the GPU is also driving your monitors, you probably need to set the flag `low_vram: true` in the config file under `model:`. This will quantize the model on the CPU and should allow it to train with monitors attached. Users have gotten it to work on Windows with WSL, but there are some reports of a bug when running on Windows natively. I have only tested on Linux for now. This is still extremely experimental, and a lot of quantizing and tricks had to happen to get it to fit on 24GB at all.
-
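- For reference, that flag goes in your config roughly like this (only the relevant key is shown; the rest of the `model:` section comes from the example configs):
-
- ```yaml
- model:
-   low_vram: true  # quantizes on CPU so training can fit with monitors attached
- ```
-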
- ### FLUX.1-dev
-
- FLUX.1-dev has a non-commercial license, which means anything you train will inherit that non-commercial license. It is also a gated model, so you need to accept the license on HF before using it; otherwise, this will fail. Here are the required steps to set up access:
-
- 1. Sign in to HF and accept the model access here: [black-forest-labs/FLUX.1-dev](https://huggingface.co/black-forest-labs/FLUX.1-dev)
- 2. Make a file named `.env` in the root of this folder
- 3. [Get a READ key from Hugging Face](https://huggingface.co/settings/tokens/new?) and add it to the `.env` file like so: `HF_TOKEN=your_key_here` (the complete file is shown below)
-
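- The resulting `.env` file needs only that one line, with your own token in place of the placeholder:
-
- ```
- HF_TOKEN=your_key_here
- ```
-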
- ### FLUX.1-schnell
-
- FLUX.1-schnell is Apache 2.0. Anything trained on it can be licensed however you want, and it does not require an `HF_TOKEN` to train. However, it does require a special adapter to train with it, [ostris/FLUX.1-schnell-training-adapter](https://huggingface.co/ostris/FLUX.1-schnell-training-adapter). It is also highly experimental. For best overall quality, training on FLUX.1-dev is recommended.
-
- To use it, you just need to add the assistant adapter to the `model` section of your config file like so:
-
- ```yaml
- model:
-   name_or_path: "black-forest-labs/FLUX.1-schnell"
-   assistant_lora_path: "ostris/FLUX.1-schnell-training-adapter"
-   is_flux: true
-   quantize: true
- ```
-
- You also need to adjust your sample steps, since schnell does not require as many:
-
- ```yaml
- sample:
-   guidance_scale: 1 # schnell does not do guidance
-   sample_steps: 4 # 1 - 4 works well
- ```
-
- ### Training
- 1. Copy the example config file located at `config/examples/train_lora_flux_24gb.yaml` (`config/examples/train_lora_flux_schnell_24gb.yaml` for schnell) to the `config` folder and rename it to `whatever_you_want.yml`
- 2. Edit the file following the comments in the file
- 3. Run the file like so: `python run.py config/whatever_you_want.yml` (a combined sketch of these steps follows below)
-
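- As a rough end-to-end sketch of the steps above (shown here for the dev example config; adjust the file names to taste):
-
- ```bash
- # copy the example config, then edit it following its comments
- cp config/examples/train_lora_flux_24gb.yaml config/whatever_you_want.yml
- # start training (resuming later picks up from the last checkpoint)
- python run.py config/whatever_you_want.yml
- ```
-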
- A folder named after your job, inside the training folder set in the config file, will be created when you start. It will contain all checkpoints and sample images. You can stop the training at any time using ctrl+c, and when you resume, it will pick back up from the last checkpoint.
-
- IMPORTANT: If you press ctrl+c while it is saving, it will likely corrupt that checkpoint, so wait until it has finished saving.
-
- ### Need help?
-
- Please do not open a bug report unless it is a bug in the code. You are welcome to [join my Discord](https://discord.gg/VXmU2f5WEU) and ask for help there. However, please refrain from PMing me directly with general questions or support requests. Ask in the Discord and I will answer when I can.
-
- ## Gradio UI
-
- To get started training locally with a custom UI, once you have followed the steps above and `ai-toolkit` is installed:
-
- ```bash
- cd ai-toolkit # in case you are not yet in the ai-toolkit folder
- huggingface-cli login # provide a `write` token to publish your LoRA at the end
- python flux_train_ui.py
- ```
-
- This will launch a UI that lets you upload your images, caption them, train, and publish your LoRA.
- ![image](assets/lora_ease_ui.png)
-
-
- ## Training in RunPod
- Example RunPod template: **runpod/pytorch:2.2.0-py3.10-cuda12.1.1-devel-ubuntu22.04**
- > You need a minimum of 24GB VRAM; pick a GPU by your preference.
-
- #### Example config ($0.5/hr):
- - 1x A40 (48 GB VRAM)
- - 19 vCPU 100 GB RAM
-
- #### Custom overrides (you need some storage to clone FLUX.1, store datasets, and store trained models and samples):
- - ~120 GB Disk
- - ~120 GB Pod Volume
- - Start Jupyter Notebook
-
- ### 1. Setup
- ```
- git clone https://github.com/ostris/ai-toolkit.git
- cd ai-toolkit
- git submodule update --init --recursive
- python -m venv venv
- source venv/bin/activate
- pip install torch
- pip install -r requirements.txt
- pip install --upgrade accelerate transformers diffusers huggingface_hub # Optional, run it if you run into issues
- ```
- ### 2. Upload your dataset
- - Create a new folder in the root and name it `dataset` or whatever you like.
- - Drag and drop your .jpg, .jpeg, or .png images and .txt files inside the newly created dataset folder.
-
- ### 3. Log in to Hugging Face with an Access Token
- - Get a READ token from [here](https://huggingface.co/settings/tokens) and request access to the FLUX.1-dev model [here](https://huggingface.co/black-forest-labs/FLUX.1-dev).
- - Run `huggingface-cli login` and paste your token.
-
- ### 4. Training
- - Copy an example config file located at `config/examples` to the `config` folder and rename it to `whatever_you_want.yml`.
- - Edit the config following the comments in the file.
- - Change `folder_path: "/path/to/images/folder"` to your dataset path, for example `folder_path: "/workspace/ai-toolkit/your-dataset"`.
- - Run the file: `python run.py config/whatever_you_want.yml`.
-
- ### Screenshot from RunPod
- <img width="1728" alt="RunPod Training Screenshot" src="https://github.com/user-attachments/assets/53a1b8ef-92fa-4481-81a7-bde45a14a7b5">
-
- ## Training in Modal
-
- ### 1. Setup
- #### ai-toolkit:
- ```
- git clone https://github.com/ostris/ai-toolkit.git
- cd ai-toolkit
- git submodule update --init --recursive
- python -m venv venv
- source venv/bin/activate
- pip install torch
- pip install -r requirements.txt
- pip install --upgrade accelerate transformers diffusers huggingface_hub # Optional, run it if you run into issues
- ```
- #### Modal:
- - Run `pip install modal` to install the modal Python package.
- - Run `modal setup` to authenticate (if this doesn’t work, try `python -m modal setup`).
-
- #### Hugging Face:
- - Get a READ token from [here](https://huggingface.co/settings/tokens) and request access to the FLUX.1-dev model [here](https://huggingface.co/black-forest-labs/FLUX.1-dev).
- - Run `huggingface-cli login` and paste your token.
-
- ### 2. Upload your dataset
- - Drag and drop your dataset folder containing the .jpg, .jpeg, or .png images and .txt files into `ai-toolkit`.
-
- ### 3. Configs
- - Copy an example config file located at `config/examples/modal` to the `config` folder and rename it to `whatever_you_want.yml`.
- - Edit the config following the comments in the file, **<ins>being careful to follow the example `/root/ai-toolkit` paths</ins>**.
-
- ### 4. Edit run_modal.py
- - Set your entire local `ai-toolkit` path at `code_mount = modal.Mount.from_local_dir` like:
-
- ```
- code_mount = modal.Mount.from_local_dir("/Users/username/ai-toolkit", remote_path="/root/ai-toolkit")
- ```
- - Choose a `GPU` and `Timeout` in `@app.function` _(the default is an A100 40GB and a 2-hour timeout)_.
-
- ### 5. Training
- - Run the config file in your terminal: `modal run run_modal.py --config-file-list-str=/root/ai-toolkit/config/whatever_you_want.yml`.
- - You can monitor your training in your local terminal, or on [modal.com](https://modal.com/).
- - Models, samples, and optimizer state will be stored in `Storage > flux-lora-models`.
-
- ### 6. Saving the model
- - Check the contents of the volume by running `modal volume ls flux-lora-models`.
- - Download the content by running `modal volume get flux-lora-models your-model-name`.
- - Example: `modal volume get flux-lora-models my_first_flux_lora_v1` (these commands are gathered in the sketch below).
-
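- Putting steps 5 and 6 together, a typical session looks roughly like this (volume and model names match the examples above):
-
- ```bash
- # launch training on Modal
- modal run run_modal.py --config-file-list-str=/root/ai-toolkit/config/whatever_you_want.yml
- # after training, inspect and download the results from the shared volume
- modal volume ls flux-lora-models
- modal volume get flux-lora-models my_first_flux_lora_v1
- ```
-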
- ### Screenshot from Modal
-
- <img width="1728" alt="Modal Training Screenshot" src="https://github.com/user-attachments/assets/7497eb38-0090-49d6-8ad9-9c8ea7b5388b">
-
- ---
-
- ## Dataset Preparation
-
- Datasets generally need to be a folder containing images and associated text files. Currently, the only supported formats are jpg, jpeg, and png; webp currently has issues. The text files should be named the same as the images but with a `.txt` extension, for example `image2.jpg` and `image2.txt`. The text file should contain only the caption. You can add the token `[trigger]` in the caption file, and if you have `trigger_word` in your config, it will be automatically replaced (see the layout sketch below).
-
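- As a rough sketch, a dataset folder might look like this (file names are placeholders):
-
- ```
- dataset/
- ├── image1.jpg
- ├── image1.txt   <- caption only, may contain [trigger]
- ├── image2.jpg
- └── image2.txt
- ```
-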
- Images are never upscaled, but they are downscaled and placed into buckets for batching. **You do not need to crop/resize your images**. The loader will automatically resize them and can handle varying aspect ratios.
-
-
- ## Training Specific Layers
-
- To train specific layers with LoRA, you can use the `only_if_contains` network kwarg. For instance, if you want to train only the 2 layers used by The Last Ben, [mentioned in this post](https://x.com/__TheBen/status/1829554120270987740), you can adjust your network kwargs like so:
-
- ```yaml
- network:
-   type: "lora"
-   linear: 128
-   linear_alpha: 128
-   network_kwargs:
-     only_if_contains:
-       - "transformer.single_transformer_blocks.7.proj_out"
-       - "transformer.single_transformer_blocks.20.proj_out"
- ```
-
- The layer names follow the diffusers format, so checking the state dict of a model will reveal the suffixes of the layers you want to train. You can also use this method to only train specific groups of weights. For instance, to only train the `single_transformer` blocks for FLUX.1, you can use the following:
-
- ```yaml
- network:
-   type: "lora"
-   linear: 128
-   linear_alpha: 128
-   network_kwargs:
-     only_if_contains:
-       - "transformer.single_transformer_blocks."
- ```
-
- You can also exclude layers by name using the `ignore_if_contains` network kwarg. So, to exclude all the single transformer blocks:
-
- ```yaml
- network:
-   type: "lora"
-   linear: 128
-   linear_alpha: 128
-   network_kwargs:
-     ignore_if_contains:
-       - "transformer.single_transformer_blocks."
- ```
-
- `ignore_if_contains` takes priority over `only_if_contains`, so if a weight is covered by both, it will be ignored.
-
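- For example, with the purely illustrative combination below, block 7's `proj_out` is skipped even though it also matches `only_if_contains`:
-
- ```yaml
- network:
-   type: "lora"
-   linear: 128
-   linear_alpha: 128
-   network_kwargs:
-     only_if_contains:
-       - "transformer.single_transformer_blocks."
-     ignore_if_contains:
-       - "transformer.single_transformer_blocks.7.proj_out"
- ```
-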
- ## LoKr Training
-
- To learn more about LoKr, read about it at [KohakuBlueleaf/LyCORIS](https://github.com/KohakuBlueleaf/LyCORIS/blob/main/docs/Guidelines.md). To train a LoKr model, you can adjust the network type in the config file like so:
-
- ```yaml
- network:
-   type: "lokr"
-   lokr_full_rank: true
-   lokr_factor: 8
- ```
-
- Everything else should work the same, including layer targeting.
-
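- For instance, combining LoKr with the layer targeting from the previous section might look like this (an illustrative mix of the two options shown above, not a separately documented recipe):
-
- ```yaml
- network:
-   type: "lokr"
-   lokr_full_rank: true
-   lokr_factor: 8
-   network_kwargs:
-     only_if_contains:
-       - "transformer.single_transformer_blocks."
- ```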
 
+ title: AI-TOOLKIT-Training
+ emoji: 📹⚡️
+ colorFrom: pink
+ colorTo: gray
+ sdk: gradio
+ sdk_version: 5.29.0
+ app_file: flux_train_ui.py
+ pinned: false
+ license: apache-2.0
+ short_description: fast video generation from images & text