---
library_name: transformers
base_model:
- mistralai/Mistral-Nemo-Instruct-2407
datasets:
- jondurbin/gutenberg-dpo-v0.1
- nbeerbower/gutenberg2-dpo
license: apache-2.0
---
# Mistral-Nemo-Gutenberg-Doppel-12B
[mistralai/Mistral-Nemo-Instruct-2407](https://huggingface.co/mistralai/Mistral-Nemo-Instruct-2407) finetuned on [jondurbin/gutenberg-dpo-v0.1](https://huggingface.co/datasets/jondurbin/gutenberg-dpo-v0.1) and [nbeerbower/gutenberg2-dpo](https://huggingface.co/datasets/nbeerbower/gutenberg2-dpo).
## Method
Fine-tuned with ORPO on a single RTX 3090 for 3 epochs.
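A minimal sketch of what this kind of ORPO fine-tune looks like with TRL's `ORPOTrainer`, assuming both datasets follow the `prompt`/`chosen`/`rejected` preference schema. Only the base model, datasets, and epoch count come from this card; all other hyperparameters (`beta`, batch size, learning rate) are illustrative assumptions, not the actual training settings.

```python
# Hypothetical reproduction sketch; hyperparameters other than epochs are assumed.
from datasets import load_dataset, concatenate_datasets
from transformers import AutoModelForCausalLM, AutoTokenizer
from trl import ORPOConfig, ORPOTrainer

model_id = "mistralai/Mistral-Nemo-Instruct-2407"
model = AutoModelForCausalLM.from_pretrained(model_id)
tokenizer = AutoTokenizer.from_pretrained(model_id)

# Combine the two Gutenberg preference datasets (prompt/chosen/rejected columns).
train = concatenate_datasets([
    load_dataset("jondurbin/gutenberg-dpo-v0.1", split="train"),
    load_dataset("nbeerbower/gutenberg2-dpo", split="train"),
])

config = ORPOConfig(
    output_dir="mistral-nemo-gutenberg-doppel-12b",
    num_train_epochs=3,              # matches the card
    beta=0.1,                        # ORPO odds-ratio weight; assumed value
    per_device_train_batch_size=1,   # assumed, to fit a single 24 GB GPU
    gradient_accumulation_steps=8,   # assumed
    learning_rate=5e-6,              # assumed
)

trainer = ORPOTrainer(
    model=model,
    args=config,
    train_dataset=train,
    tokenizer=tokenizer,
)
trainer.train()
```

In practice a 12B model does not fit in full precision on a single RTX 3090, so a real run at this scale would also need memory-saving measures (e.g. QLoRA-style quantized adapters or gradient checkpointing), which are omitted here for brevity.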