---
license: apache-2.0
library_name: transformers
pipeline_tag: text-generation
tags:
  - CPO
---

This model was released with the preprint *SimPO: Simple Preference Optimization with a Reference-Free Reward*. Please refer to our repository for more details.
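
Below is a minimal usage sketch for loading the checkpoint with Hugging Face `transformers`, consistent with the metadata above (`library_name: transformers`, `pipeline_tag: text-generation`). The model ID `your-org/your-cpo-model` is a placeholder; substitute the actual repository ID from this model page.

```python
# Minimal text-generation sketch using the transformers pipeline API.
# NOTE: "your-org/your-cpo-model" is a placeholder model ID, not the
# real repository name; replace it with this model's actual ID.
from transformers import pipeline

generator = pipeline("text-generation", model="your-org/your-cpo-model")

output = generator(
    "Explain preference optimization in one sentence.",
    max_new_tokens=64,
)
print(output[0]["generated_text"])
```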