Add MPT-7B-Chat to the models to choose from
#180
by FlipTip - opened
Please add MPT-7B-Chat; it was built by fine-tuning MPT-7B on the ShareGPT-Vicuna, HC3, Alpaca, HH-RLHF, and Evol-Instruct datasets.
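For reference, here is a minimal sketch of how the requested checkpoint can be loaded with the `transformers` library (the `mosaicml/mpt-7b-chat` repo ships custom modeling code, so `trust_remote_code=True` is needed). This is only an illustration, not necessarily how this Space loads its models:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "mosaicml/mpt-7b-chat"

# MPT uses the GPT-NeoX tokenizer; AutoTokenizer resolves it from the repo.
tokenizer = AutoTokenizer.from_pretrained(model_name)

# trust_remote_code is required because MPT defines its own model class.
model = AutoModelForCausalLM.from_pretrained(
    model_name,
    torch_dtype=torch.bfloat16,
    trust_remote_code=True,
)

inputs = tokenizer("What datasets was MPT-7B-Chat trained on?", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```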
We are investigating which model to add next; adding this one to the list.
MosaicML has since released a larger 30B-parameter version, MPT-30B-Chat, that performs very well on benchmarks (see the Open LLM Leaderboard or here); that one would be the better addition.
FlipTip changed discussion status to closed