Small model

#6 opened by Mantrytounderstand

I am using 'numind/NuExtract-v1.5' on Colab and I'm impressed with the results.
Do you have any smaller/quantised model that we can load on a local machine and use for a project?

NuMind org

Hello!

We just put up a 1.7B-parameter version based on SmolLM2 today, which you can find here: https://huggingface.co/numind/NuExtract-1.5-smol.

Alternatively, there are third-party quantizations of the main model (e.g. https://huggingface.co/bartowski/NuExtract-v1.5-GGUF), or even a 0.5B tiny variant (https://huggingface.co/numind/NuExtract-1.5-tiny).
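
If you want to try the smol model on a local machine, here is a minimal sketch using transformers. The JSON template and example text are placeholders, and the prompt layout shown is an assumption based on the NuExtract-style template/text format; check the model card for the exact prompt format before relying on it.

```python
import json
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Hypothetical local-loading sketch for the 1.7B "smol" variant.
model_name = "numind/NuExtract-1.5-smol"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(
    model_name,
    torch_dtype=torch.bfloat16,  # small enough for a single consumer GPU or CPU
    device_map="auto",
)

# Placeholder extraction template and input text (not from the model card).
template = json.dumps({"name": "", "email": ""}, indent=4)
text = "Contact Jane Doe at jane.doe@example.com for details."
prompt = f"<|input|>\n### Template:\n{template}\n### Text:\n{text}\n\n<|output|>\n"

inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=256, do_sample=False)
# Decode only the newly generated tokens (the extracted JSON).
print(tokenizer.decode(outputs[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True))
```

For the GGUF quantizations, the same idea applies with a llama.cpp-based runtime instead of transformers.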
