---
library_name: transformers
license: apache-2.0
datasets:
- netcat420/MFANN
---
All future releases of the MFANN experiment will use Llama-3 as the base model; I may continue fine-tuning Mistral-7B every other release.

This model uses Meta's Llama-3 as its base, and benchmarks are pending.

Changed the model name to MFANNV0.6 due to a failed benchmark and the need to resubmit.

Edit: due to continued benchmark failures, I am renaming the model back to MFANNver0.6. The 3B model is also failing benchmarks for some reason, even though both models run fine on my machine :(
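
A minimal sketch for loading the model with the transformers library is below. The repository id is an assumption based on the names mentioned above; substitute the actual model id for this release.

```python
# Minimal usage sketch with the transformers library.
# The repo id is assumed from the MFANNver0.6 name above; adjust as needed.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "netcat420/MFANNver0.6"  # assumed repository id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

prompt = "Explain the MFANN fine-tuning experiment in one sentence."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```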