
malgosia-st-sd3.5-lokr-adamw-1e-6-bs4-v03

This is a LyCORIS adapter derived from stabilityai/stable-diffusion-3.5-large.

No validation prompt was used during training.

Validation settings

  • CFG: 4.5
  • CFG Rescale: 0.0
  • Steps: 20
  • Sampler: None
  • Seed: 420
  • Resolution: 832x1216

Note: The validation settings are not necessarily the same as the training settings.

The example images in the gallery were generated with the following prompts:

  • Prompt: unconditional (blank prompt)
    • Negative prompt: blurry, cropped, ugly
  • Prompt: malgosia, a young woman with long, wavy brown hair and bangs wearing an elegant metallic bronze bustier paired with a teal satin-like short cape, set against a dreamy, pastel-colored fantasy background with abstract, ethereal structures, with an oversized pink flower headpiece, styled minimally with a natural, light makeup look, her hand gently touching her face, adorned with a detailed statement necklace, exuding a serene and contemplative mood, through her soft expression and calm demeanor, high detailed face
    • Negative prompt: blurry, cropped, ugly
  • Prompt: malgosia, a young woman stands boldly with striking white hair styled in a jagged, edgy cut. She is set against a moody, artistic backdrop featuring a dark, abstract design. Her daring, modern look includes a sleek black leather jacket, and her makeup is minimal yet striking, with bold eyeliner highlighting her intense eyes. She exudes confidence and rebellion, with a pose that embodies empowerment and defiance
    • Negative prompt: blurry, cropped, ugly
  • Prompt: malgosia, a vibrant young woman with multicolored hair and striking features stands against a neon-infused digital backdrop, embodying an energetic urban style. Her hair cascades in vivid shades of blue, pink, and yellow, while her bold makeup features bright eyeshadow and lips painted in coordinating colors. With long, sharp-painted nails, she sports a playful expression, her tongue playfully sticking out and hand raised. This evokes a lively and fun mood, radiating a sense of rebellious joy and youthful exuberance
    • Negative prompt: blurry, cropped, ugly
  • Prompt: malgosia, a young woman with long, flowing hair and captivating eyes stands in a minimalist studio with a white background, embodying an artistic style characterized by bold lines and expressive details. Her hair is styled in voluminous waves, adding a dynamic flair to her look. She conveys an intense and introspective mood through her focused expression and the intricate design elements surrounding her
    • Negative prompt: blurry, cropped, ugly
  • Prompt: malgosia, a woman with long, flowing red hair sits gracefully on a wooden block in a minimalist studio with a gray background. She wears a chic black Chanel blouse and skirt, complemented by Brian Atwood pumps. Her styling features elegantly tousled hair and natural makeup, creating a dynamic and mysterious mood. Her poised, contemplative posture and partially hidden face convey a sense of intrigue and style
    • Negative prompt: blurry, cropped, ugly
  • Prompt: malgosia, a young woman with long blonde hair stands outside by a white picket fence in a suburban setting, likely during golden hour. She wears a stylish, colorful floral-patterned top paired with light blue jeans featuring a distinctive belt. Her hair is loose and natural, contributing to a casual vibe. As she smokes with a thoughtful expression, she embodies a laid-back, contemplative mood in this relaxed outdoor environment
    • Negative prompt: blurry, cropped, ugly
  • Prompt: malgosia, a young woman stands confidently, leaning against a concrete backdrop in a dimly lit urban night setting. She wears a modern, edgy outfit that features a colorful striped sequined romper paired with a stylish black leather jacket. Her blonde hair flows in loose, casual waves, and she completes the bold look with high platform heels. The atmosphere is daring and fierce, with her posture and style reflecting a strong sense of unapologetic self-expression
    • Negative prompt: blurry, cropped, ugly
  • Prompt: malgosia, a woman with long red hair is shown in a curled-up pose, her face tilted toward her knees, gazing intensely at the camera with a look that combines vulnerability and curiosity, her lips are slightly parted, adding a sense of intrigue, while her green eyes, half-shaded by her bangs, convey a mixture of thoughtfulness and quiet intensity, she wears a bright orange headband that contrasts vividly with the green of her outfit and the background, the outfit is made of olive-green lace, with intricate embroidery that stands out against her fair skin, her left arm is bent and rests on her leg, adorned with an ornate gold bracelet featuring white and black details, the background is a muted teal-green, which enhances the bold colors of the image, creating an elegant contrast between warm and cool tones, the overall atmosphere is sophisticated, with a mix of sensuality, mystery, and strength conveyed through her expressive gaze and compact pose
    • Negative prompt: blurry, cropped, ugly
  • Prompt: malgosia, a woman in her twenties, with long, sleek blonde hair and striking facial features, poses in a minimalist studio with a plain wall. She models a seductive, fashion-forward look in a sheer lace bodysuit paired with fishnet gloves. Her natural makeup highlights her eyes, and her hand rests casually on a chair. She exudes a bold and alluring mood, with an intense, captivating gaze and poised posture that create a striking contrast against the background
    • Negative prompt: blurry, cropped, ugly
  • Prompt: malgosia, a woman with strong features and slicked-back hair stands confidently in a minimalist studio against a soft gray background. She wears a modern, elegant long-sleeved gown with draping fabric, and her natural makeup accentuates her defined cheekbones. With a powerful pose and an arm raised, she exudes strength and determination. Her intense gaze conveys sophistication and purpose, creating a dramatic and striking presence, free of any distracting accessories
    • Negative prompt: blurry, cropped, ugly

The text encoder was not trained. You may reuse the base model text encoder for inference.
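 
Since only the transformer carries adapter weights, the frozen text encoders (CLIP-L, OpenCLIP bigG and T5-XXL) can simply be loaded from the base repository. A minimal sketch; the bfloat16 dtype is an assumption chosen to match the training precision:

import torch
from diffusers import DiffusionPipeline

# The base checkpoint ships the three untouched text encoders; nothing extra is needed for them.
pipeline = DiffusionPipeline.from_pretrained(
    'stabilityai/stable-diffusion-3.5-large', torch_dtype=torch.bfloat16
)
# Only pipeline.transformer is modified when the LoKr adapter is merged (see the Inference section below).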

Training settings

  • Training epochs: 24
  • Training steps: 11000
  • Learning rate: 1e-06
  • Max grad norm: 0.01
  • Effective batch size: 4
    • Micro-batch size: 4
    • Gradient accumulation steps: 1
    • Number of GPUs: 1
  • Prediction type: flow-matching
  • Rescaled betas zero SNR: False
  • Optimizer: adamw_bf16
  • Precision: Pure BF16
  • Quantised: Yes (int8-quanto)
  • Xformers: Not used
  • LyCORIS Config:
{
    "algo": "lokr",
    "multiplier": 1.0,
    "linear_dim": 1000000,
    "linear_alpha": 1,
    "factor": 1,
    "full_matrix": true,
    "apply_preset": {
        "target_module": [
            "JointTransformerBlock"
        ],
        "name_algo_map": {
            "transformer_blocks.0.norm1*": {
                "algo": "lokr",
                "factor": 1,
                "linear_dim": 1000000,
                "linear_alpha": 1,
                "full_matrix": true
            },
            "transformer_blocks.0.norm1_context*": {
                "algo": "lokr",
                "factor": 1,
                "linear_dim": 1000000,
                "linear_alpha": 1,
                "full_matrix": true
            },
            "transformer_blocks.0.ff*": {
                "algo": "lokr",
                "factor": 1,
                "linear_dim": 1000000,
                "linear_alpha": 1,
                "full_matrix": true
            },
            "transformer_blocks.0.*": {
                "algo": "lokr",
                "factor": 2,
                "linear_dim": 1000000,
                "linear_alpha": 1,
                "full_matrix": true
            },
            "transformer_blocks.1.norm1*": {
                "algo": "lokr",
                "factor": 3,
                "linear_dim": 1000000,
                "linear_alpha": 1,
                "full_matrix": true
            },
            "transformer_blocks.1.norm1_context*": {
                "algo": "lokr",
                "factor": 3,
                "linear_dim": 1000000,
                "linear_alpha": 1,
                "full_matrix": true
            },
            "transformer_blocks.1.ff*": {
                "algo": "lokr",
                "factor": 3,
                "linear_dim": 1000000,
                "linear_alpha": 1,
                "full_matrix": true
            },
            "transformer_blocks.1.*": {
                "algo": "lokr",
                "factor": 6,
                "linear_dim": 1000000,
                "linear_alpha": 1,
                "full_matrix": true
            },
            "transformer_blocks.2.norm1*": {
                "algo": "lokr",
                "factor": 4,
                "linear_dim": 1000000,
                "linear_alpha": 1,
                "full_matrix": true
            },
            "transformer_blocks.2.norm1_context*": {
                "algo": "lokr",
                "factor": 4,
                "linear_dim": 1000000,
                "linear_alpha": 1,
                "full_matrix": true
            },
            "transformer_blocks.2.ff*": {
                "algo": "lokr",
                "factor": 4,
                "linear_dim": 1000000,
                "linear_alpha": 1,
                "full_matrix": true
            },
            "transformer_blocks.2.*": {
                "algo": "lokr",
                "factor": 8,
                "linear_dim": 1000000,
                "linear_alpha": 1,
                "full_matrix": true
            },
            "transformer_blocks.3.norm1*": {
                "algo": "lokr",
                "factor": 7,
                "linear_dim": 1000000,
                "linear_alpha": 1,
                "full_matrix": true
            },
            "transformer_blocks.3.norm1_context*": {
                "algo": "lokr",
                "factor": 7,
                "linear_dim": 1000000,
                "linear_alpha": 1,
                "full_matrix": true
            },
            "transformer_blocks.3.ff*": {
                "algo": "lokr",
                "factor": 7,
                "linear_dim": 1000000,
                "linear_alpha": 1,
                "full_matrix": true
            },
            "transformer_blocks.3.*": {
                "algo": "lokr",
                "factor": 14,
                "linear_dim": 1000000,
                "linear_alpha": 1,
                "full_matrix": true
            },
            "transformer_blocks.4.norm1*": {
                "algo": "lokr",
                "factor": 6,
                "linear_dim": 1000000,
                "linear_alpha": 1,
                "full_matrix": true
            },
            "transformer_blocks.4.norm1_context*": {
                "algo": "lokr",
                "factor": 6,
                "linear_dim": 1000000,
                "linear_alpha": 1,
                "full_matrix": true
            },
            "transformer_blocks.4.ff*": {
                "algo": "lokr",
                "factor": 6,
                "linear_dim": 1000000,
                "linear_alpha": 1,
                "full_matrix": true
            },
            "transformer_blocks.4.*": {
                "algo": "lokr",
                "factor": 12,
                "linear_dim": 1000000,
                "linear_alpha": 1,
                "full_matrix": true
            },
            "transformer_blocks.5.norm1*": {
                "algo": "lokr",
                "factor": 7,
                "linear_dim": 1000000,
                "linear_alpha": 1,
                "full_matrix": true
            },
            "transformer_blocks.5.norm1_context*": {
                "algo": "lokr",
                "factor": 7,
                "linear_dim": 1000000,
                "linear_alpha": 1,
                "full_matrix": true
            },
            "transformer_blocks.5.ff*": {
                "algo": "lokr",
                "factor": 7,
                "linear_dim": 1000000,
                "linear_alpha": 1,
                "full_matrix": true
            },
            "transformer_blocks.5.*": {
                "algo": "lokr",
                "factor": 14,
                "linear_dim": 1000000,
                "linear_alpha": 1,
                "full_matrix": true
            },
            "transformer_blocks.6.norm1*": {
                "algo": "lokr",
                "factor": 7,
                "linear_dim": 1000000,
                "linear_alpha": 1,
                "full_matrix": true
            },
            "transformer_blocks.6.norm1_context*": {
                "algo": "lokr",
                "factor": 7,
                "linear_dim": 1000000,
                "linear_alpha": 1,
                "full_matrix": true
            },
            "transformer_blocks.6.ff*": {
                "algo": "lokr",
                "factor": 7,
                "linear_dim": 1000000,
                "linear_alpha": 1,
                "full_matrix": true
            },
            "transformer_blocks.6.*": {
                "algo": "lokr",
                "factor": 14,
                "linear_dim": 1000000,
                "linear_alpha": 1,
                "full_matrix": true
            },
            "transformer_blocks.7.norm1*": {
                "algo": "lokr",
                "factor": 7,
                "linear_dim": 1000000,
                "linear_alpha": 1,
                "full_matrix": true
            },
            "transformer_blocks.7.norm1_context*": {
                "algo": "lokr",
                "factor": 7,
                "linear_dim": 1000000,
                "linear_alpha": 1,
                "full_matrix": true
            },
            "transformer_blocks.7.ff*": {
                "algo": "lokr",
                "factor": 7,
                "linear_dim": 1000000,
                "linear_alpha": 1,
                "full_matrix": true
            },
            "transformer_blocks.7.*": {
                "algo": "lokr",
                "factor": 14,
                "linear_dim": 1000000,
                "linear_alpha": 1,
                "full_matrix": true
            },
            "transformer_blocks.8.norm1*": {
                "algo": "lokr",
                "factor": 8,
                "linear_dim": 1000000,
                "linear_alpha": 1,
                "full_matrix": true
            },
            "transformer_blocks.8.norm1_context*": {
                "algo": "lokr",
                "factor": 8,
                "linear_dim": 1000000,
                "linear_alpha": 1,
                "full_matrix": true
            },
            "transformer_blocks.8.ff*": {
                "algo": "lokr",
                "factor": 8,
                "linear_dim": 1000000,
                "linear_alpha": 1,
                "full_matrix": true
            },
            "transformer_blocks.8.*": {
                "algo": "lokr",
                "factor": 16,
                "linear_dim": 1000000,
                "linear_alpha": 1,
                "full_matrix": true
            },
            "transformer_blocks.9.norm1*": {
                "algo": "lokr",
                "factor": 8,
                "linear_dim": 1000000,
                "linear_alpha": 1,
                "full_matrix": true
            },
            "transformer_blocks.9.norm1_context*": {
                "algo": "lokr",
                "factor": 8,
                "linear_dim": 1000000,
                "linear_alpha": 1,
                "full_matrix": true
            },
            "transformer_blocks.9.ff*": {
                "algo": "lokr",
                "factor": 8,
                "linear_dim": 1000000,
                "linear_alpha": 1,
                "full_matrix": true
            },
            "transformer_blocks.9.*": {
                "algo": "lokr",
                "factor": 16,
                "linear_dim": 1000000,
                "linear_alpha": 1,
                "full_matrix": true
            },
            "transformer_blocks.10.norm1*": {
                "algo": "lokr",
                "factor": 7,
                "linear_dim": 1000000,
                "linear_alpha": 1,
                "full_matrix": true
            },
            "transformer_blocks.10.norm1_context*": {
                "algo": "lokr",
                "factor": 7,
                "linear_dim": 1000000,
                "linear_alpha": 1,
                "full_matrix": true
            },
            "transformer_blocks.10.ff*": {
                "algo": "lokr",
                "factor": 7,
                "linear_dim": 1000000,
                "linear_alpha": 1,
                "full_matrix": true
            },
            "transformer_blocks.10.*": {
                "algo": "lokr",
                "factor": 14,
                "linear_dim": 1000000,
                "linear_alpha": 1,
                "full_matrix": true
            },
            "transformer_blocks.11.norm1*": {
                "algo": "lokr",
                "factor": 7,
                "linear_dim": 1000000,
                "linear_alpha": 1,
                "full_matrix": true
            },
            "transformer_blocks.11.norm1_context*": {
                "algo": "lokr",
                "factor": 7,
                "linear_dim": 1000000,
                "linear_alpha": 1,
                "full_matrix": true
            },
            "transformer_blocks.11.ff*": {
                "algo": "lokr",
                "factor": 7,
                "linear_dim": 1000000,
                "linear_alpha": 1,
                "full_matrix": true
            },
            "transformer_blocks.11.*": {
                "algo": "lokr",
                "factor": 14,
                "linear_dim": 1000000,
                "linear_alpha": 1,
                "full_matrix": true
            },
            "transformer_blocks.12.norm1*": {
                "algo": "lokr",
                "factor": 7,
                "linear_dim": 1000000,
                "linear_alpha": 1,
                "full_matrix": true
            },
            "transformer_blocks.12.norm1_context*": {
                "algo": "lokr",
                "factor": 7,
                "linear_dim": 1000000,
                "linear_alpha": 1,
                "full_matrix": true
            },
            "transformer_blocks.12.ff*": {
                "algo": "lokr",
                "factor": 7,
                "linear_dim": 1000000,
                "linear_alpha": 1,
                "full_matrix": true
            },
            "transformer_blocks.12.*": {
                "algo": "lokr",
                "factor": 14,
                "linear_dim": 1000000,
                "linear_alpha": 1,
                "full_matrix": true
            },
            "transformer_blocks.13.norm1*": {
                "algo": "lokr",
                "factor": 8,
                "linear_dim": 1000000,
                "linear_alpha": 1,
                "full_matrix": true
            },
            "transformer_blocks.13.norm1_context*": {
                "algo": "lokr",
                "factor": 8,
                "linear_dim": 1000000,
                "linear_alpha": 1,
                "full_matrix": true
            },
            "transformer_blocks.13.ff*": {
                "algo": "lokr",
                "factor": 8,
                "linear_dim": 1000000,
                "linear_alpha": 1,
                "full_matrix": true
            },
            "transformer_blocks.13.*": {
                "algo": "lokr",
                "factor": 16,
                "linear_dim": 1000000,
                "linear_alpha": 1,
                "full_matrix": true
            },
            "transformer_blocks.14.norm1*": {
                "algo": "lokr",
                "factor": 7,
                "linear_dim": 1000000,
                "linear_alpha": 1,
                "full_matrix": true
            },
            "transformer_blocks.14.norm1_context*": {
                "algo": "lokr",
                "factor": 7,
                "linear_dim": 1000000,
                "linear_alpha": 1,
                "full_matrix": true
            },
            "transformer_blocks.14.ff*": {
                "algo": "lokr",
                "factor": 7,
                "linear_dim": 1000000,
                "linear_alpha": 1,
                "full_matrix": true
            },
            "transformer_blocks.14.*": {
                "algo": "lokr",
                "factor": 14,
                "linear_dim": 1000000,
                "linear_alpha": 1,
                "full_matrix": true
            },
            "transformer_blocks.15.norm1*": {
                "algo": "lokr",
                "factor": 6,
                "linear_dim": 1000000,
                "linear_alpha": 1,
                "full_matrix": true
            },
            "transformer_blocks.15.norm1_context*": {
                "algo": "lokr",
                "factor": 6,
                "linear_dim": 1000000,
                "linear_alpha": 1,
                "full_matrix": true
            },
            "transformer_blocks.15.ff*": {
                "algo": "lokr",
                "factor": 6,
                "linear_dim": 1000000,
                "linear_alpha": 1,
                "full_matrix": true
            },
            "transformer_blocks.15.*": {
                "algo": "lokr",
                "factor": 12,
                "linear_dim": 1000000,
                "linear_alpha": 1,
                "full_matrix": true
            },
            "transformer_blocks.16.norm1*": {
                "algo": "lokr",
                "factor": 6,
                "linear_dim": 1000000,
                "linear_alpha": 1,
                "full_matrix": true
            },
            "transformer_blocks.16.norm1_context*": {
                "algo": "lokr",
                "factor": 6,
                "linear_dim": 1000000,
                "linear_alpha": 1,
                "full_matrix": true
            },
            "transformer_blocks.16.ff*": {
                "algo": "lokr",
                "factor": 6,
                "linear_dim": 1000000,
                "linear_alpha": 1,
                "full_matrix": true
            },
            "transformer_blocks.16.*": {
                "algo": "lokr",
                "factor": 12,
                "linear_dim": 1000000,
                "linear_alpha": 1,
                "full_matrix": true
            },
            "transformer_blocks.17.norm1*": {
                "algo": "lokr",
                "factor": 6,
                "linear_dim": 1000000,
                "linear_alpha": 1,
                "full_matrix": true
            },
            "transformer_blocks.17.norm1_context*": {
                "algo": "lokr",
                "factor": 6,
                "linear_dim": 1000000,
                "linear_alpha": 1,
                "full_matrix": true
            },
            "transformer_blocks.17.ff*": {
                "algo": "lokr",
                "factor": 6,
                "linear_dim": 1000000,
                "linear_alpha": 1,
                "full_matrix": true
            },
            "transformer_blocks.17.*": {
                "algo": "lokr",
                "factor": 12,
                "linear_dim": 1000000,
                "linear_alpha": 1,
                "full_matrix": true
            },
            "transformer_blocks.18.norm1*": {
                "algo": "lokr",
                "factor": 6,
                "linear_dim": 1000000,
                "linear_alpha": 1,
                "full_matrix": true
            },
            "transformer_blocks.18.norm1_context*": {
                "algo": "lokr",
                "factor": 6,
                "linear_dim": 1000000,
                "linear_alpha": 1,
                "full_matrix": true
            },
            "transformer_blocks.18.ff*": {
                "algo": "lokr",
                "factor": 6,
                "linear_dim": 1000000,
                "linear_alpha": 1,
                "full_matrix": true
            },
            "transformer_blocks.18.*": {
                "algo": "lokr",
                "factor": 12,
                "linear_dim": 1000000,
                "linear_alpha": 1,
                "full_matrix": true
            },
            "transformer_blocks.19.norm1*": {
                "algo": "lokr",
                "factor": 4,
                "linear_dim": 1000000,
                "linear_alpha": 1,
                "full_matrix": true
            },
            "transformer_blocks.19.norm1_context*": {
                "algo": "lokr",
                "factor": 4,
                "linear_dim": 1000000,
                "linear_alpha": 1,
                "full_matrix": true
            },
            "transformer_blocks.19.ff*": {
                "algo": "lokr",
                "factor": 4,
                "linear_dim": 1000000,
                "linear_alpha": 1,
                "full_matrix": true
            },
            "transformer_blocks.19.*": {
                "algo": "lokr",
                "factor": 8,
                "linear_dim": 1000000,
                "linear_alpha": 1,
                "full_matrix": true
            },
            "transformer_blocks.20.norm1*": {
                "algo": "lokr",
                "factor": 5,
                "linear_dim": 1000000,
                "linear_alpha": 1,
                "full_matrix": true
            },
            "transformer_blocks.20.norm1_context*": {
                "algo": "lokr",
                "factor": 5,
                "linear_dim": 1000000,
                "linear_alpha": 1,
                "full_matrix": true
            },
            "transformer_blocks.20.ff*": {
                "algo": "lokr",
                "factor": 5,
                "linear_dim": 1000000,
                "linear_alpha": 1,
                "full_matrix": true
            },
            "transformer_blocks.20.*": {
                "algo": "lokr",
                "factor": 10,
                "linear_dim": 1000000,
                "linear_alpha": 1,
                "full_matrix": true
            },
            "transformer_blocks.21.norm1*": {
                "algo": "lokr",
                "factor": 4,
                "linear_dim": 1000000,
                "linear_alpha": 1,
                "full_matrix": true
            },
            "transformer_blocks.21.norm1_context*": {
                "algo": "lokr",
                "factor": 4,
                "linear_dim": 1000000,
                "linear_alpha": 1,
                "full_matrix": true
            },
            "transformer_blocks.21.ff*": {
                "algo": "lokr",
                "factor": 4,
                "linear_dim": 1000000,
                "linear_alpha": 1,
                "full_matrix": true
            },
            "transformer_blocks.21.*": {
                "algo": "lokr",
                "factor": 8,
                "linear_dim": 1000000,
                "linear_alpha": 1,
                "full_matrix": true
            },
            "transformer_blocks.22.norm1*": {
                "algo": "lokr",
                "factor": 5,
                "linear_dim": 1000000,
                "linear_alpha": 1,
                "full_matrix": true
            },
            "transformer_blocks.22.norm1_context*": {
                "algo": "lokr",
                "factor": 5,
                "linear_dim": 1000000,
                "linear_alpha": 1,
                "full_matrix": true
            },
            "transformer_blocks.22.ff*": {
                "algo": "lokr",
                "factor": 5,
                "linear_dim": 1000000,
                "linear_alpha": 1,
                "full_matrix": true
            },
            "transformer_blocks.22.*": {
                "algo": "lokr",
                "factor": 10,
                "linear_dim": 1000000,
                "linear_alpha": 1,
                "full_matrix": true
            },
            "transformer_blocks.23.norm1*": {
                "algo": "lokr",
                "factor": 4,
                "linear_dim": 1000000,
                "linear_alpha": 1,
                "full_matrix": true
            },
            "transformer_blocks.23.norm1_context*": {
                "algo": "lokr",
                "factor": 4,
                "linear_dim": 1000000,
                "linear_alpha": 1,
                "full_matrix": true
            },
            "transformer_blocks.23.ff*": {
                "algo": "lokr",
                "factor": 4,
                "linear_dim": 1000000,
                "linear_alpha": 1,
                "full_matrix": true
            },
            "transformer_blocks.23.*": {
                "algo": "lokr",
                "factor": 8,
                "linear_dim": 1000000,
                "linear_alpha": 1,
                "full_matrix": true
            },
            "transformer_blocks.24.norm1*": {
                "algo": "lokr",
                "factor": 4,
                "linear_dim": 1000000,
                "linear_alpha": 1,
                "full_matrix": true
            },
            "transformer_blocks.24.norm1_context*": {
                "algo": "lokr",
                "factor": 4,
                "linear_dim": 1000000,
                "linear_alpha": 1,
                "full_matrix": true
            },
            "transformer_blocks.24.ff*": {
                "algo": "lokr",
                "factor": 4,
                "linear_dim": 1000000,
                "linear_alpha": 1,
                "full_matrix": true
            },
            "transformer_blocks.24.*": {
                "algo": "lokr",
                "factor": 8,
                "linear_dim": 1000000,
                "linear_alpha": 1,
                "full_matrix": true
            },
            "transformer_blocks.25.norm1*": {
                "algo": "lokr",
                "factor": 4,
                "linear_dim": 1000000,
                "linear_alpha": 1,
                "full_matrix": true
            },
            "transformer_blocks.25.norm1_context*": {
                "algo": "lokr",
                "factor": 4,
                "linear_dim": 1000000,
                "linear_alpha": 1,
                "full_matrix": true
            },
            "transformer_blocks.25.ff*": {
                "algo": "lokr",
                "factor": 4,
                "linear_dim": 1000000,
                "linear_alpha": 1,
                "full_matrix": true
            },
            "transformer_blocks.25.*": {
                "algo": "lokr",
                "factor": 8,
                "linear_dim": 1000000,
                "linear_alpha": 1,
                "full_matrix": true
            },
            "transformer_blocks.26.norm1*": {
                "algo": "lokr",
                "factor": 4,
                "linear_dim": 1000000,
                "linear_alpha": 1,
                "full_matrix": true
            },
            "transformer_blocks.26.norm1_context*": {
                "algo": "lokr",
                "factor": 4,
                "linear_dim": 1000000,
                "linear_alpha": 1,
                "full_matrix": true
            },
            "transformer_blocks.26.ff*": {
                "algo": "lokr",
                "factor": 4,
                "linear_dim": 1000000,
                "linear_alpha": 1,
                "full_matrix": true
            },
            "transformer_blocks.26.*": {
                "algo": "lokr",
                "factor": 8,
                "linear_dim": 1000000,
                "linear_alpha": 1,
                "full_matrix": true
            },
            "transformer_blocks.27.norm1*": {
                "algo": "lokr",
                "factor": 4,
                "linear_dim": 1000000,
                "linear_alpha": 1,
                "full_matrix": true
            },
            "transformer_blocks.27.norm1_context*": {
                "algo": "lokr",
                "factor": 4,
                "linear_dim": 1000000,
                "linear_alpha": 1,
                "full_matrix": true
            },
            "transformer_blocks.27.ff*": {
                "algo": "lokr",
                "factor": 4,
                "linear_dim": 1000000,
                "linear_alpha": 1,
                "full_matrix": true
            },
            "transformer_blocks.27.*": {
                "algo": "lokr",
                "factor": 8,
                "linear_dim": 1000000,
                "linear_alpha": 1,
                "full_matrix": true
            },
            "transformer_blocks.28.norm1*": {
                "algo": "lokr",
                "factor": 4,
                "linear_dim": 1000000,
                "linear_alpha": 1,
                "full_matrix": true
            },
            "transformer_blocks.28.norm1_context*": {
                "algo": "lokr",
                "factor": 4,
                "linear_dim": 1000000,
                "linear_alpha": 1,
                "full_matrix": true
            },
            "transformer_blocks.28.ff*": {
                "algo": "lokr",
                "factor": 4,
                "linear_dim": 1000000,
                "linear_alpha": 1,
                "full_matrix": true
            },
            "transformer_blocks.28.*": {
                "algo": "lokr",
                "factor": 8,
                "linear_dim": 1000000,
                "linear_alpha": 1,
                "full_matrix": true
            },
            "transformer_blocks.29.norm1*": {
                "algo": "lokr",
                "factor": 4,
                "linear_dim": 1000000,
                "linear_alpha": 1,
                "full_matrix": true
            },
            "transformer_blocks.29.norm1_context*": {
                "algo": "lokr",
                "factor": 4,
                "linear_dim": 1000000,
                "linear_alpha": 1,
                "full_matrix": true
            },
            "transformer_blocks.29.ff*": {
                "algo": "lokr",
                "factor": 4,
                "linear_dim": 1000000,
                "linear_alpha": 1,
                "full_matrix": true
            },
            "transformer_blocks.29.*": {
                "algo": "lokr",
                "factor": 8,
                "linear_dim": 1000000,
                "linear_alpha": 1,
                "full_matrix": true
            },
            "transformer_blocks.30.norm1*": {
                "algo": "lokr",
                "factor": 3,
                "linear_dim": 1000000,
                "linear_alpha": 1,
                "full_matrix": true
            },
            "transformer_blocks.30.norm1_context*": {
                "algo": "lokr",
                "factor": 3,
                "linear_dim": 1000000,
                "linear_alpha": 1,
                "full_matrix": true
            },
            "transformer_blocks.30.ff*": {
                "algo": "lokr",
                "factor": 3,
                "linear_dim": 1000000,
                "linear_alpha": 1,
                "full_matrix": true
            },
            "transformer_blocks.30.*": {
                "algo": "lokr",
                "factor": 6,
                "linear_dim": 1000000,
                "linear_alpha": 1,
                "full_matrix": true
            },
            "transformer_blocks.31.norm1*": {
                "algo": "lokr",
                "factor": 1,
                "linear_dim": 1000000,
                "linear_alpha": 1,
                "full_matrix": true
            },
            "transformer_blocks.31.norm1_context*": {
                "algo": "lokr",
                "factor": 1,
                "linear_dim": 1000000,
                "linear_alpha": 1,
                "full_matrix": true
            },
            "transformer_blocks.31.ff*": {
                "algo": "lokr",
                "factor": 1,
                "linear_dim": 1000000,
                "linear_alpha": 1,
                "full_matrix": true
            },
            "transformer_blocks.31.*": {
                "algo": "lokr",
                "factor": 2,
                "linear_dim": 1000000,
                "linear_alpha": 1,
                "full_matrix": true
            },
            "transformer_blocks.32.norm1*": {
                "algo": "lokr",
                "factor": 4,
                "linear_dim": 1000000,
                "linear_alpha": 1,
                "full_matrix": true
            },
            "transformer_blocks.32.norm1_context*": {
                "algo": "lokr",
                "factor": 4,
                "linear_dim": 1000000,
                "linear_alpha": 1,
                "full_matrix": true
            },
            "transformer_blocks.32.ff*": {
                "algo": "lokr",
                "factor": 4,
                "linear_dim": 1000000,
                "linear_alpha": 1,
                "full_matrix": true
            },
            "transformer_blocks.32.*": {
                "algo": "lokr",
                "factor": 8,
                "linear_dim": 1000000,
                "linear_alpha": 1,
                "full_matrix": true
            },
            "transformer_blocks.33.norm1*": {
                "algo": "lokr",
                "factor": 3,
                "linear_dim": 1000000,
                "linear_alpha": 1,
                "full_matrix": true
            },
            "transformer_blocks.33.norm1_context*": {
                "algo": "lokr",
                "factor": 3,
                "linear_dim": 1000000,
                "linear_alpha": 1,
                "full_matrix": true
            },
            "transformer_blocks.33.ff*": {
                "algo": "lokr",
                "factor": 3,
                "linear_dim": 1000000,
                "linear_alpha": 1,
                "full_matrix": true
            },
            "transformer_blocks.33.*": {
                "algo": "lokr",
                "factor": 6,
                "linear_dim": 1000000,
                "linear_alpha": 1,
                "full_matrix": true
            },
            "transformer_blocks.34.norm1*": {
                "algo": "lokr",
                "factor": 3,
                "linear_dim": 1000000,
                "linear_alpha": 1,
                "full_matrix": true
            },
            "transformer_blocks.34.norm1_context*": {
                "algo": "lokr",
                "factor": 3,
                "linear_dim": 1000000,
                "linear_alpha": 1,
                "full_matrix": true
            },
            "transformer_blocks.34.ff*": {
                "algo": "lokr",
                "factor": 3,
                "linear_dim": 1000000,
                "linear_alpha": 1,
                "full_matrix": true
            },
            "transformer_blocks.34.*": {
                "algo": "lokr",
                "factor": 6,
                "linear_dim": 1000000,
                "linear_alpha": 1,
                "full_matrix": true
            },
            "transformer_blocks.35.norm1*": {
                "algo": "lokr",
                "factor": 1,
                "linear_dim": 1000000,
                "linear_alpha": 1,
                "full_matrix": true
            },
            "transformer_blocks.35.norm1_context*": {
                "algo": "lokr",
                "factor": 1,
                "linear_dim": 1000000,
                "linear_alpha": 1,
                "full_matrix": true
            },
            "transformer_blocks.35.ff*": {
                "algo": "lokr",
                "factor": 1,
                "linear_dim": 1000000,
                "linear_alpha": 1,
                "full_matrix": true
            },
            "transformer_blocks.35.*": {
                "algo": "lokr",
                "factor": 2,
                "linear_dim": 1000000,
                "linear_alpha": 1,
                "full_matrix": true
            },
            "transformer_blocks.36.norm1*": {
                "algo": "lokr",
                "factor": 3,
                "linear_dim": 1000000,
                "linear_alpha": 1,
                "full_matrix": true
            },
            "transformer_blocks.36.norm1_context*": {
                "algo": "lokr",
                "factor": 3,
                "linear_dim": 1000000,
                "linear_alpha": 1,
                "full_matrix": true
            },
            "transformer_blocks.36.ff*": {
                "algo": "lokr",
                "factor": 3,
                "linear_dim": 1000000,
                "linear_alpha": 1,
                "full_matrix": true
            },
            "transformer_blocks.36.*": {
                "algo": "lokr",
                "factor": 6,
                "linear_dim": 1000000,
                "linear_alpha": 1,
                "full_matrix": true
            },
            "transformer_blocks.37.norm1*": {
                "algo": "lokr",
                "factor": 3,
                "linear_dim": 1000000,
                "linear_alpha": 1,
                "full_matrix": true
            },
            "transformer_blocks.37.norm1_context*": {
                "algo": "lokr",
                "factor": 3,
                "linear_dim": 1000000,
                "linear_alpha": 1,
                "full_matrix": true
            },
            "transformer_blocks.37.ff*": {
                "algo": "lokr",
                "factor": 3,
                "linear_dim": 1000000,
                "linear_alpha": 1,
                "full_matrix": true
            },
            "transformer_blocks.37.*": {
                "algo": "lokr",
                "factor": 6,
                "linear_dim": 1000000,
                "linear_alpha": 1,
                "full_matrix": true
            }
        },
        "use_fnmatch": true
    }
}
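 
For reference, here is a hedged sketch of how a preset like this can be handed to LyCORIS when wrapping the SD3.5 transformer for training. The config file name and the way the transformer is loaded are illustrative assumptions; SimpleTuner performs the equivalent steps internally.

import json
import torch
from diffusers import SD3Transformer2DModel
from lycoris import LycorisNetwork, create_lycoris

# Load the base transformer that the LoKr modules will be attached to.
transformer = SD3Transformer2DModel.from_pretrained(
    'stabilityai/stable-diffusion-3.5-large', subfolder='transformer', torch_dtype=torch.bfloat16
)

with open('lycoris_config.json') as f:  # the JSON shown above; the file name is hypothetical
    cfg = json.load(f)

# Register the per-block preset (target modules plus the name -> algo map) before wrapping.
LycorisNetwork.apply_preset(cfg['apply_preset'])

# Wrap the transformer with full-matrix LoKr; `factor` sets the Kronecker factorisation per layer.
wrapper = create_lycoris(
    transformer,
    cfg['multiplier'],
    linear_dim=cfg['linear_dim'],
    linear_alpha=cfg['linear_alpha'],
    algo=cfg['algo'],
    factor=cfg['factor'],
    full_matrix=cfg['full_matrix'],
)
wrapper.apply_to()  # inject the trainable LoKr parameters into the transformer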

Datasets

MALGOSIA-SD35-V03-512

  • Repeats: 1
  • Total number of images: 285
  • Total number of aspect buckets: 2
  • Resolution: 0.262144 megapixels
  • Cropped: True
  • Crop style: random
  • Crop aspect: closest
  • Used for regularisation data: No

MALGOSIA-SD35-V03-768

  • Repeats: 1
  • Total number of images: 235
  • Total number of aspect buckets: 4
  • Resolution: 0.589824 megapixels
  • Cropped: True
  • Crop style: random
  • Crop aspect: closest
  • Used for regularisation data: No

MALGOSIA-SD35-V03-1024

  • Repeats: 1
  • Total number of images: 125
  • Total number of aspect buckets: 7
  • Resolution: 1.048576 megapixels
  • Cropped: True
  • Crop style: random
  • Crop aspect: closest
  • Used for regularisation data: No

MALGOSIA-RAFRAF-SD35-V03-512

  • Repeats: 2
  • Total number of images: 30
  • Total number of aspect buckets: 1
  • Resolution: 0.262144 megapixels
  • Cropped: True
  • Crop style: random
  • Crop aspect: closest
  • Used for regularisation data: No

MALGOSIA-RAFRAF-SD35-V03-768

  • Repeats: 2
  • Total number of images: 30
  • Total number of aspect buckets: 1
  • Resolution: 0.589824 megapixels
  • Cropped: True
  • Crop style: random
  • Crop aspect: closest
  • Used for regularisation data: No

MALGOSIA-RAFRAF-SD35-V03-1024

  • Repeats: 2
  • Total number of images: 30
  • Total number of aspect buckets: 1
  • Resolution: 1.048576 megapixels
  • Cropped: True
  • Crop style: random
  • Crop aspect: closest
  • Used for regularisation data: No

MALGOSIA-PINTEREST-SD35-V03-512

  • Repeats: 1
  • Total number of images: 56
  • Total number of aspect buckets: 7
  • Resolution: 0.262144 megapixels
  • Cropped: True
  • Crop style: random
  • Crop aspect: closest
  • Used for regularisation data: No
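 
For anyone reproducing the dataloader setup, the sketch below shows how the first dataset above might be declared in a SimpleTuner multidatabackend.json entry. The key names and the local path are assumptions based on SimpleTuner's dataloader format, not the exact configuration used for this run.

import json

# Hypothetical dataloader entry mirroring MALGOSIA-SD35-V03-512 (0.262144 MP = a 512x512 pixel area).
dataset_entry = {
    "id": "MALGOSIA-SD35-V03-512",
    "type": "local",
    "instance_data_dir": "/path/to/malgosia-512",  # hypothetical path
    "repeats": 1,
    "resolution": 0.262144,
    "resolution_type": "area",
    "crop": True,
    "crop_style": "random",
    "crop_aspect": "closest",
}

with open("multidatabackend.json", "w") as f:
    json.dump([dataset_entry], f, indent=4)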

Inference

import torch
from diffusers import DiffusionPipeline
from lycoris import create_lycoris_from_weights

model_id = 'stabilityai/stable-diffusion-3.5-large'
adapter_id = 'pytorch_lora_weights.safetensors'  # you will have to download this manually
lora_scale = 1.0

# Load the base pipeline first; the LoKr weights are merged into its transformer.
pipeline = DiffusionPipeline.from_pretrained(model_id, torch_dtype=torch.bfloat16)
wrapper, _ = create_lycoris_from_weights(lora_scale, adapter_id, pipeline.transformer)
wrapper.merge_to()

device = 'cuda' if torch.cuda.is_available() else 'mps' if torch.backends.mps.is_available() else 'cpu'
pipeline.to(device)

prompt = "An astronaut is riding a horse through the jungles of Thailand."
negative_prompt = 'blurry, cropped, ugly'
image = pipeline(
    prompt=prompt,
    negative_prompt=negative_prompt,
    num_inference_steps=20,
    generator=torch.Generator(device=device).manual_seed(1641421826),
    width=832,
    height=1216,
    guidance_scale=4.5,
).images[0]
image.save("output.png", format="PNG")