---
license: apache-2.0
---

[Here](https://colab.research.google.com/drive/1xIfIVafnlCP2XVICmRwkUFK3cwTJYjCY#scrollTo=bG1ed2WQfoU0) is an inference demo on Colab. Please note that openmoe-base is a small, debugging-focused model that has not been extensively trained, so its output is often invalid. For decent performance on QA tasks, please try [openmoe-8b](https://huggingface.co/OrionZheng/openmoe-8B-chat) (with FLOPs comparable to a 1.6B LLaMA), or the upcoming openmoe-34B version.
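
Below is a minimal quick-start sketch for local inference, assuming the checkpoint can be loaded through transformers' `AutoModelForCausalLM` with `trust_remote_code=True`; the repo id `OrionZheng/openmoe-base` is assumed here, and the Colab notebook linked above remains the authoritative demo.

```python
from transformers import AutoTokenizer, AutoModelForCausalLM

# Assumed repo id; swap in "OrionZheng/openmoe-8B-chat" for the stronger model.
model_name = "OrionZheng/openmoe-base"

# trust_remote_code is assumed to be required for the custom MoE architecture.
tokenizer = AutoTokenizer.from_pretrained(model_name, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(model_name, trust_remote_code=True)

prompt = "Question: What is the capital of France?\nAnswer:"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Keep in mind that openmoe-base's generations may be invalid; the snippet is mainly useful for verifying that the model loads and runs.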