Place this snippet at the top of your inference code to download the model checkpoint automatically:

from huggingface_hub import snapshot_download

model_path = "..."  # Local directory where the downloaded checkpoint will be saved
snapshot_download(
    "radna/LTX-Video-Minimal",
    local_dir=model_path,
    local_dir_use_symlinks=False,  # deprecated and ignored in recent huggingface_hub versions; safe to omit
    repo_type="model",
)
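
Once the checkpoint is on disk, pass the local path wherever your inference code expects the model directory. As a minimal sketch, assuming the checkpoint is stored in diffusers format and that the diffusers LTXPipeline applies to this repository, loading and generating a clip might look like this (the prompt and generation settings below are purely illustrative):

import torch
from diffusers import LTXPipeline
from diffusers.utils import export_to_video

# Assumption: the downloaded checkpoint is a diffusers-format LTX-Video pipeline.
pipe = LTXPipeline.from_pretrained(model_path, torch_dtype=torch.bfloat16)
pipe.to("cuda")

# Illustrative prompt and settings; adjust resolution, frame count, and steps as needed.
frames = pipe(
    prompt="A waterfall in a lush forest, cinematic lighting",
    num_inference_steps=50,
    num_frames=65,
    height=480,
    width=704,
).frames[0]

export_to_video(frames, "output.mp4", fps=24)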
Note: the serverless Inference API does not yet support diffusers models for this pipeline type.
