---
license: other
datasets:
  - Open-Orca/OpenOrca
  - ehartford/wizard_vicuna_70k_unfiltered
tags:
  - code
  - prompt
  - reverse prompt
widget:
  - text: >-
      [RESPONSE]The given phrase is incorrect because it contains a grammatical
      error. The correct phrase should be 'You were late.' The word 'you' is a
      pronoun that refers to a person or people, and 'were' is the past tense
      form of the verb 'to be.' In this sentence, 'were' is used to indicate
      that the person being referred to was late in the past. The word 'your' is
      a possessive adjective that is used to show ownership or possession of
      something. It is not a substitute for the subject pronoun 'you.'
      Therefore, the phrase 'Your was late' is grammatically incorrect.

      [REVERSED-PROMPT]
    example_title: reverse prompt
---

# core-prompt-reverser-opt-1.3b

This model is a fine-tuned version of facebook/opt-1.3b on an unknown dataset. It achieves the following results on the evaluation set:

- Loss: 1.4784
- Accuracy: 0.6753
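If the reported loss is the mean token-level cross-entropy in nats (the usual `Trainer` convention, which is an assumption here), it maps directly to perplexity:

```python
import math

eval_loss = 1.4784  # evaluation loss reported above

# Perplexity is the exponential of the mean cross-entropy loss.
perplexity = math.exp(eval_loss)
print(f"perplexity = {perplexity:.2f}")  # about 4.39
```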

## Model description

The model supports two prompt formats:

```
[INSTRUCTION] {your question}
[RESPONSE] {model response}
```

or, for prompt reversal:

```
[RESPONSE] {response}
[REVERSED-PROMPT] {model prompt reversed}
```
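The two templates above can be assembled with small string helpers before passing the result to the model as a generation prompt. The helper names below (`build_forward_prompt`, `build_reverse_prompt`) are illustrative, not part of any API; the reverse template follows the widget example, which places the response text directly after the `[RESPONSE]` tag and leaves a blank line before `[REVERSED-PROMPT]`:

```python
def build_forward_prompt(question: str) -> str:
    """Format a question for the instruction-following mode."""
    return f"[INSTRUCTION] {question}\n[RESPONSE]"

def build_reverse_prompt(response: str) -> str:
    """Format a response so the model reconstructs the prompt that produced it."""
    return f"[RESPONSE]{response}\n\n[REVERSED-PROMPT]"

print(build_forward_prompt("Why is 'Your was late' incorrect?"))
print(build_reverse_prompt("The given phrase is incorrect because ..."))
```

Whatever the model generates after the final tag is, respectively, its answer or its reconstruction of the original instruction.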

## Intended uses & limitations

More information needed

## Training and evaluation data

wizard_vicuna_70k_unfiltered, OpenOrca, and custom data

## Training hyperparameters

The following hyperparameters were used during training:

- learning_rate: 5e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 1.0
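With a linear scheduler, the learning rate decays from its base value to zero over the course of training. A minimal sketch of that schedule (no warmup steps are listed in the configuration, so warmup is assumed to be zero):

```python
def linear_lr(step: int, total_steps: int, base_lr: float = 5e-5) -> float:
    """Linear decay from base_lr at step 0 to 0 at total_steps (no warmup)."""
    return base_lr * max(0.0, (total_steps - step) / total_steps)

# Learning rate at the start, midpoint, and end of training:
for step in (0, 500, 1000):
    print(step, linear_lr(step, total_steps=1000))
```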

## Training results

This model is still training: it has only run through 5% of the total training data so far. Training is expected to finish on September 4.

## Framework versions

- Transformers 4.33.0.dev0
- Pytorch 2.1.0.dev20230605+cu121
- Datasets 2.14.4
- Tokenizers 0.13.3