burnerbaby/blah converted into TensorRT

Model converted from diffusers to TensorRT for accelerated inference, up to 4x faster.

For instructions on how to use the model, see https://github.com/nicholaskao1029/sda-node

Forked from https://github.com/chavinlo/sda-node/

This model was automatically converted by SDA-node.

Compilation configuration:

{
    "_class_name": "StableDiffusionAccelerated_Base",
    "_sda_version": "0.1.2",
    "_trt_version": "8.5.3",
    "_cuda_version": "none",
    "_cudnn_version": "none",
    "_onnx2trt_version": "8.5.3",
    "unet": {
        "precision": "fp16",
        "path": "engine/unet.plan"
    },
    "clip": {
        "path": "engine/clip.plan"
    },
    "de_vae": {
        "path": "engine/de_vae.plan"
    }
}
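
The configuration points to three serialized TensorRT engines (.plan files): an fp16 UNet, the CLIP text encoder, and the VAE decoder. As a rough sketch of how such engines can be deserialized with the standard TensorRT Python API, independent of the SDA-node loader (the config file name "model_index.json" below is an assumption; only the engine paths come from the configuration above):

    import json
    import tensorrt as trt

    # Load the compilation configuration shipped with the model.
    # The file name is hypothetical; use the actual config file in this repo.
    with open("model_index.json") as f:
        config = json.load(f)

    logger = trt.Logger(trt.Logger.WARNING)
    runtime = trt.Runtime(logger)

    def load_engine(plan_path: str) -> trt.ICudaEngine:
        """Deserialize a prebuilt TensorRT engine (.plan file)."""
        with open(plan_path, "rb") as f:
            return runtime.deserialize_cuda_engine(f.read())

    # Deserialize the three engines referenced in the configuration.
    unet_engine = load_engine(config["unet"]["path"])    # fp16 UNet
    clip_engine = load_engine(config["clip"]["path"])    # text encoder
    vae_engine = load_engine(config["de_vae"]["path"])   # VAE decoder

    # An execution context is then created per engine to run inference.
    unet_context = unet_engine.create_execution_context()

Note that the engines were built with TensorRT 8.5.3 (see _trt_version above); loading them generally requires a compatible TensorRT runtime.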