---
title: vits_onnx
emoji: 🐳
colorFrom: purple
colorTo: gray
sdk: docker
app_port: 7860
---
# ONNX inference server in a Docker container
The demo web UI is copied from link.
Thanks a lot to @CjangCjengh and to wetts.
For entertainment use only. Do not use it for business purposes.
## Quick start
To use another model and config, mount your model directory into the container with `-v /path/to/dir:/app/.model`.
```bash
export name=vits_onnx
docker stop $name
docker rm $name
docker run -d \
    --name $name \
    -p 7860:7860 \
    ccdesue/vits_demo:onnx
    # add -v /path/to/dir:/app/.model before the image name to mount your own model
```
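For example, a run with a custom model directory mounted (the host path is a placeholder), followed by a quick check that the Gradio app answers on port 7860, might look like this:

```bash
# run with a custom model/config mounted into the container
docker run -d \
    --name $name \
    -p 7860:7860 \
    -v /path/to/dir:/app/.model \
    ccdesue/vits_demo:onnx

# the web UI should return HTTP 200 once the container has started
curl -s -o /dev/null -w "%{http_code}\n" http://localhost:7860
```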
## Dir structure
```
.
├── app             # gradio code
├── build.sh
├── Dockerfile
├── export          # utilities for exporting the model (see the sketch below)
├── LICENSE
├── poetry.lock
├── __pycache__
├── pyproject.toml
├── README.md
├── setup.sh
└── util            # some possible utilities
```
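The `export/` directory holds the model-export utilities. As a rough, hypothetical illustration of what such an export boils down to, here is a minimal `torch.onnx.export` call on a stand-in module; the real scripts operate on the trained VITS generator and its actual inputs, so treat this only as a sketch of the mechanism:

```python
# Illustrative only: a minimal torch.onnx.export call on a stand-in module.
# The real export utilities live in export/ and work on the trained VITS
# generator with its actual inputs (phoneme ids, lengths, scales, speaker id).
import torch


class Toy(torch.nn.Module):
    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return torch.sin(x)


model = Toy().eval()
dummy_input = torch.randn(1, 16)

torch.onnx.export(
    model,
    (dummy_input,),
    "toy.onnx",
    input_names=["x"],
    output_names=["y"],
    dynamic_axes={"x": {1: "length"}, "y": {1: "length"}},  # variable-length axis
    opset_version=17,
)
```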
## Helpful info
- please read the source code to better understand the project (a rough sketch of the ONNX inference it performs follows this list)
- refer to the demo config.json and tailor it to your own model config
- refer to the Dockerfile
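The snippet below is only a sketch of that inference step: it loads an exported model with onnxruntime and runs it once. The model path, input names (`x`, `x_lengths`, `scales`, `sid`), and the scale values are assumptions and will likely differ from your exported model, so check the app code and the exported graph for the real calling convention.

```python
# Hypothetical sketch of VITS ONNX inference with onnxruntime.
# Input names, shapes, and the model path are assumptions; inspect the
# exported model and the app code for the actual ones.
import numpy as np
import onnxruntime as ort

sess = ort.InferenceSession(".model/model.onnx", providers=["CPUExecutionProvider"])

phoneme_ids = np.array([[12, 5, 33, 7, 2]], dtype=np.int64)  # cleaned, tokenized text
inputs = {
    "x": phoneme_ids,
    "x_lengths": np.array([phoneme_ids.shape[1]], dtype=np.int64),
    "scales": np.array([0.667, 1.0, 0.8], dtype=np.float32),  # noise / length / noise_w
    "sid": np.array([0], dtype=np.int64),                     # speaker id
}
audio = sess.run(None, inputs)[0]  # waveform samples as float32
print(audio.shape)
```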
## Limitations
- only tested with japanese_cleaners and japanese_cleaners2 (selected via `text_cleaners` in config.json, as illustrated below) with raw VITS
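For illustration, in a standard raw-VITS config.json the cleaner is selected by the `text_cleaners` field under `data` (all other fields omitted here); check the bundled demo config.json for the exact layout your model needs:

```json
{
  "data": {
    "text_cleaners": ["japanese_cleaners"]
  }
}
```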
## License
GPLv2