# dpo_summarization

## Training procedure
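
The card does not document the training setup beyond the repository name, which suggests DPO (Direct Preference Optimization) fine-tuning of a PEFT adapter for summarization. Below is a minimal sketch of what such a run could look like with TRL's `DPOTrainer` and a LoRA adapter; the base model, dataset, and all hyperparameters are assumptions for illustration, not values taken from this card.

```python
# Illustrative DPO + LoRA training sketch (TRL ~0.7.x, PEFT 0.5.0).
# Base model, dataset, and hyperparameters are hypothetical.
from datasets import load_dataset
from peft import LoraConfig
from transformers import AutoModelForCausalLM, AutoTokenizer, TrainingArguments
from trl import DPOTrainer

base_model = "meta-llama/Llama-2-7b-hf"  # hypothetical base model

model = AutoModelForCausalLM.from_pretrained(base_model)
tokenizer = AutoTokenizer.from_pretrained(base_model)
tokenizer.pad_token = tokenizer.eos_token

# LoRA adapter configuration; values are illustrative.
peft_config = LoraConfig(
    r=16,
    lora_alpha=32,
    lora_dropout=0.05,
    task_type="CAUSAL_LM",
)

# Preference dataset with "prompt", "chosen", and "rejected" columns
# (hypothetical choice of summarization comparison data).
dataset = load_dataset("CarperAI/openai_summarize_comparisons", split="train")

training_args = TrainingArguments(
    output_dir="dpo_summarization",
    per_device_train_batch_size=4,
    learning_rate=5e-5,
    num_train_epochs=1,
    remove_unused_columns=False,  # required by DPOTrainer's data collator
)

trainer = DPOTrainer(
    model,
    ref_model=None,          # TRL derives the frozen reference model when a peft_config is given
    args=training_args,
    beta=0.1,                # DPO temperature; illustrative value
    train_dataset=dataset,
    tokenizer=tokenizer,
    peft_config=peft_config,
    max_length=512,
    max_prompt_length=256,
)
trainer.train()
```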

### Framework versions

- PEFT 0.5.0
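
A PEFT adapter like this one is typically loaded on top of its base model for inference. The sketch below shows the usual pattern with `PeftModel.from_pretrained`; the base model name and adapter repository id are assumptions, since the card does not state them.

```python
# Minimal inference sketch for a PEFT adapter (base model and repo id are hypothetical).
import torch
from peft import PeftModel
from transformers import AutoModelForCausalLM, AutoTokenizer

base_model = "meta-llama/Llama-2-7b-hf"  # hypothetical base model
adapter_id = "ojo2/dpo_summarization"    # hypothetical adapter repo id

tokenizer = AutoTokenizer.from_pretrained(base_model)
model = AutoModelForCausalLM.from_pretrained(base_model, torch_dtype=torch.float16)
model = PeftModel.from_pretrained(model, adapter_id)  # attach the trained adapter
model.eval()

prompt = "Summarize the following article:\n..."
inputs = tokenizer(prompt, return_tensors="pt")
with torch.no_grad():
    outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```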