This repo contains the quantized TinyLlama-1.1B-Chat-v1.0 model for the MLX backend of the WasmEdge Wasi-NN plugin.

For a detailed Wasi-NN tutorial, please refer to [Wasi-NN-examples](https://github.com/second-state/WasmEdge-WASINN-examples).
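
Below is a minimal sketch of how a guest program might run this model through the Wasi-NN interface, following the general pattern used in the linked WasmEdge-WASINN-examples repo. It assumes the `wasmedge_wasi_nn` Rust crate, a `GraphEncoding::Mlx` variant, the `"default"` model alias, and the example prompt format; these are assumptions, so consult the examples repo for the exact MLX workflow.

```rust
// A minimal sketch, not a confirmed implementation for this repo.
use wasmedge_wasi_nn::{ExecutionTarget, GraphBuilder, GraphEncoding, TensorType};

fn main() {
    // Load the model registered with WasmEdge (e.g. via --nn-preload)
    // under the alias "default". The Mlx encoding variant is assumed here.
    let graph = GraphBuilder::new(GraphEncoding::Mlx, ExecutionTarget::AUTO)
        .build_from_cache("default")
        .expect("failed to load the preloaded model");
    let mut ctx = graph
        .init_execution_context()
        .expect("failed to create an execution context");

    // Feed the prompt as a UTF-8 byte tensor at input index 0, then run inference.
    let prompt = "<|user|>\nWhat is WasmEdge?</s>\n<|assistant|>\n";
    ctx.set_input(0, TensorType::U8, &[1], prompt.as_bytes())
        .expect("failed to set the input tensor");
    ctx.compute().expect("inference failed");

    // Read the generated text back from output tensor 0.
    let mut out = vec![0u8; 4096];
    let n = ctx.get_output(0, &mut out).expect("failed to read the output");
    println!("{}", String::from_utf8_lossy(&out[..n]));
}
```

The guest would typically be compiled to a wasm target and run with the WasmEdge CLI, with the model file registered through the `--nn-preload` option; the exact preload string for the MLX backend is described in the linked examples.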