Spaces: Dovakiins / qwerrwe (Build error)

qwerrwe / src / axolotl / prompt_strategies / orpo
  • 100 contributors
History: 4 commits
winglian: re-enable DPO for tests in modal ci (#1374)
1f151c0 unverified 12 months ago
  • __init__.py (201 Bytes): ORPO Trainer replacement (#1551), about 1 year ago
  • chat_template.py (9.41 kB): re-enable DPO for tests in modal ci (#1374), 12 months ago