Download the CPU or GPU version of the models and place them under model/. CPU inference uses the .ort files:
model/
  F5-TTS-ONNX/
    F5_Preprocess.ort
    F5_Transformer.ort
    F5_Decode.ort
For inference with the GPU option, use the .onnx models instead:
model/
  F5-TTS-ONNX/
    F5_Preprocess.onnx
    F5_Transformer.onnx
    F5_Decode.onnx
ONNX GitHub: https://github.com/DakeQQ/F5-TTS-ONNX
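A minimal sketch of how the three model files above might be opened with onnxruntime, assuming the directory layout shown here and the CPU (.ort) / GPU (.onnx) split described above; the load_f5_sessions helper and the use_gpu flag are illustrative, not part of the project's own loader.

import os
import onnxruntime as ort

def load_f5_sessions(model_dir="model/F5-TTS-ONNX", use_gpu=False):
    # GPU inference uses the .onnx files, CPU inference the .ort files,
    # following the two layouts listed above.
    ext = ".onnx" if use_gpu else ".ort"
    providers = (["CUDAExecutionProvider", "CPUExecutionProvider"]
                 if use_gpu else ["CPUExecutionProvider"])
    sessions = {}
    for name in ("F5_Preprocess", "F5_Transformer", "F5_Decode"):
        path = os.path.join(model_dir, name + ext)
        sessions[name] = ort.InferenceSession(path, providers=providers)
    return sessions

# Example: load the CPU (.ort) models placed under model/F5-TTS-ONNX/
# sessions = load_f5_sessions(use_gpu=False)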