Hugging Face
kvssetty's Collections

LLMs
updated Mar 15
NousResearch/Nous-Hermes-2-Mixtral-8x7B-DPO
Text Generation • Updated Apr 30 • 4.11k • 417