---
license: apache-2.0
---

Here is an inference demo on Colab.
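
If you just want a quick local test outside Colab, the sketch below shows one way to run the checkpoint with the Hugging Face `transformers` auto classes. It is a minimal example, not the official demo: it assumes the repository's custom modeling code loads via `trust_remote_code=True` and that greedy generation with default settings is acceptable; refer to the Colab notebook for the exact, supported setup.

```python
# Minimal inference sketch for openmoe-base (assumptions noted above).
import torch
from transformers import AutoTokenizer, AutoModelForCausalLM

model_id = "OrionZheng/openmoe-base"

# Load tokenizer and model; trust_remote_code is assumed to be required
# because OpenMoE ships custom modeling code with the checkpoint.
tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float32,  # the base model is small enough for fp32 on CPU
    trust_remote_code=True,
)

prompt = "Question: What is the capital of France?\nAnswer:"
inputs = tokenizer(prompt, return_tensors="pt")

# Generate a short continuation; expect noisy output from this debugging model.
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```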

Please note that openmoe-base is only a small, debugging-focused model and has not been extensively trained; its output is often not valid.
If you want decent performance on question answering, please try openmoe-8b (with compute comparable to a 1.6B LLaMA), or the upcoming openmoe-34B version.