MagicAnimate: Temporally Consistent Human Image Animation using Diffusion Model
Zhongcong Xu · Jianfeng Zhang · Jun Hao Liew · Hanshu Yan · Jia-Wei Liu · Chenxu Zhang · Jiashi Feng · Mike Zheng Shou
National University of Singapore | ByteDance
## 📢 News
* **[2023.12.4]** Released inference code and Gradio demo. We are working to improve MagicAnimate; stay tuned!
* **[2023.11.23]** Released the MagicAnimate paper and project page.
## 🏃‍♂️ Getting Started
Download the pretrained base models for [StableDiffusion V1.5](https://huggingface.co/runwayml/stable-diffusion-v1-5) and [MSE-finetuned VAE](https://huggingface.co/stabilityai/sd-vae-ft-mse).
Download our MagicAnimate [checkpoints](https://huggingface.co/zcxu-eric/MagicAnimate).
Please follow the Hugging Face download instructions for the models and checkpoints above; `git lfs` is recommended.
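For example, with `git lfs` installed, all three repositories can be cloned directly into `pretrained_models/` (a sketch; any download method that produces the layout below works):
```bash
# Sketch: fetch the three model repos into pretrained_models/ via git lfs.
git lfs install
mkdir -p pretrained_models && cd pretrained_models
git clone https://huggingface.co/zcxu-eric/MagicAnimate
git clone https://huggingface.co/stabilityai/sd-vae-ft-mse
git clone https://huggingface.co/runwayml/stable-diffusion-v1-5
cd ..
```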
Place the base models and checkpoints as follows:
```bash
magic-animate
|----pretrained_models
    |----MagicAnimate
        |----appearance_encoder
            |----diffusion_pytorch_model.safetensors
            |----config.json
        |----densepose_controlnet
            |----diffusion_pytorch_model.safetensors
            |----config.json
        |----temporal_attention
            |----temporal_attention.ckpt
    |----sd-vae-ft-mse
        |----config.json
        |----diffusion_pytorch_model.safetensors
    |----stable-diffusion-v1-5
        |----scheduler
            |----scheduler_config.json
        |----text_encoder
            |----config.json
            |----pytorch_model.bin
        |----tokenizer (all)
        |----unet
            |----diffusion_pytorch_model.bin
            |----config.json
        |----v1-5-pruned-emaonly.safetensors
        |----...
```
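After downloading, a quick listing helps confirm the layout matches the tree above (a hypothetical check; adjust the depth or paths as needed):
```bash
# Print the first two directory levels under pretrained_models for comparison.
find pretrained_models -maxdepth 2 -type d
```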
## ⚒️ Installation
Prerequisites: `python>=3.8`, `CUDA>=11.3`, and `ffmpeg`.
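A quick way to sanity-check the prerequisites (version strings will vary by system):
```bash
python3 --version   # should report 3.8 or newer
nvcc --version      # CUDA toolkit; should report 11.3 or newer
                    # if nvcc is missing, nvidia-smi also shows the supported CUDA version
ffmpeg -version     # any recent build is fine
```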
Install with `conda`:
```bash
conda env create -f environment.yaml
conda activate manimate
```
or `pip`:
```bash
pip3 install -r requirements.txt
```
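Either way, it is worth confirming that PyTorch can see the GPU before running inference (a minimal check, assuming the environment installed a CUDA-enabled `torch`):
```bash
python3 -c "import torch; print(torch.__version__, torch.cuda.is_available())"
```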
## 💃 Inference
Run inference on a single GPU:
```bash
bash scripts/animate.sh
```
Run inference with multiple GPUs:
```bash
bash scripts/animate_dist.sh
```
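On a multi-GPU machine, the single-GPU script can be pinned to a specific device with the standard CUDA device mask (assuming the script does not override it; most PyTorch code respects this variable):
```bash
# Run single-GPU inference on GPU 1 only (device indices start at 0).
CUDA_VISIBLE_DEVICES=1 bash scripts/animate.sh
```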
## 🎨 Gradio Demo
#### Online Gradio Demo:
Try our [online Gradio demo](https://huggingface.co/spaces/zcxu-eric/magicanimate) for a quick start.
#### Local Gradio Demo:
Launch the local Gradio demo on a single GPU:
```bash
python3 -m demo.gradio_animate
```
Launch the local Gradio demo with multiple GPUs:
```bash
python3 -m demo.gradio_animate_dist
```
Then open the Gradio demo in your local browser (Gradio typically serves at http://localhost:7860 by default).
## 🙏 Acknowledgements
We would like to thank [AK(@_akhaliq)](https://twitter.com/_akhaliq?lang=en) and the Hugging Face team for their help in setting up the online Gradio demo.
## 🎓 Citation
If you find this codebase useful for your research, please cite it using the following entry.
```BibTeX
@inproceedings{xu2023magicanimate,
  author    = {Xu, Zhongcong and Zhang, Jianfeng and Liew, Jun Hao and Yan, Hanshu and Liu, Jia-Wei and Zhang, Chenxu and Feng, Jiashi and Shou, Mike Zheng},
  title     = {MagicAnimate: Temporally Consistent Human Image Animation using Diffusion Model},
  booktitle = {arXiv},
  year      = {2023}
}
```