Adding ONNX file of this model

#9
by Haolin99 - opened

Beep boop I am the ONNX export bot πŸ€–πŸŽοΈ. On behalf of Haolin99, I would like to add the ONNX-converted model to this repository.

What is ONNX? It stands for "Open Neural Network Exchange", and is the most commonly used open standard for machine learning interoperability. You can find out more at onnx.ai!

The exported ONNX model can then be consumed by various backends such as TensorRT or TVM, or simply be used in a few lines with πŸ€— Optimum through ONNX Runtime; check out how here!

The quantized ONNX model is missing from the onnx folder.
I tried to create it and convert it on my side using the convert script, but I am getting this error:

The ONNX export succeeded and the exported model was saved at: models/xlm-roberta-large-finetuned-conll03-english
Quantizing: 0%| | 0/1 [00:01<?, ?it/s]
Traceback (most recent call last):
File "", line 198, in _run_module_as_main
File "", line 88, in _run_code
File "C:\huggingface\convert.py", line 545, in
main()
File "C:\huggingface\convert.py", line 521, in main
quantize([
File "C:\huggingface\convert.py", line 294, in quantize
quantize_dynamic(
TypeError: quantize_dynamic() got an unexpected keyword argument 'optimize_model'
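This `TypeError` typically means the installed `onnxruntime` no longer accepts the `optimize_model` keyword that the convert script still passes to `quantize_dynamic` (newer releases dropped it). A minimal, version-tolerant workaround is to filter out keyword arguments the current function signature does not accept before calling it. This is a sketch of my own helper, not part of the convert script, and `fake_quantize_dynamic` below is a stand-in used only for illustration:

```python
import inspect

def call_without_unsupported_kwargs(func, /, **kwargs):
    """Call func, silently dropping kwargs it no longer accepts.

    Hypothetical helper: if quantize_dynamic's signature removed
    `optimize_model`, filtering kwargs against the live signature
    avoids the TypeError instead of crashing the quantize step.
    """
    params = inspect.signature(func).parameters
    accepts_var_kw = any(
        p.kind is inspect.Parameter.VAR_KEYWORD for p in params.values()
    )
    if not accepts_var_kw:
        kwargs = {k: v for k, v in kwargs.items() if k in params}
    return func(**kwargs)

# Stand-in mimicking a newer quantize_dynamic signature (no optimize_model):
def fake_quantize_dynamic(model_input, model_output, weight_type="QInt8"):
    return (model_input, model_output, weight_type)

result = call_without_unsupported_kwargs(
    fake_quantize_dynamic,
    model_input="model.onnx",
    model_output="model_quantized.onnx",
    optimize_model=True,  # dropped instead of raising TypeError
)
print(result)
```

In practice, simply deleting the `optimize_model=...` argument from the `quantize_dynamic(...)` call in `convert.py` (around line 294 in the traceback) should have the same effect; the wrapper just makes the script tolerant of both old and new `onnxruntime` versions.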

