Intro

This is an OPT-125m model trained with a Hugging Face dataset on a single RTX 3090 GPU.

How to use

You can use this model directly with a pipeline for text generation.

>>> from transformers import pipeline

>>> generator = pipeline('text-generation', model="facebook/opt-125m")
>>> generator("Hello, I am conscious and")
[{'generated_text': 'Hello, I am conscious and aware of the fact that I am a woman. I am aware of'}]

By default, generation is deterministic. To use top-k sampling, set do_sample to True.

>>> from transformers import pipeline, set_seed

>>> set_seed(32)
>>> generator = pipeline('text-generation', model="facebook/opt-125m", do_sample=True)
>>> generator("Hello, I am conscious and")
[{'generated_text': 'Hello, I am conscious and active member of the Khaosan Group, a private, self'}]
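The same sampling behavior is available through the lower-level generate API instead of the pipeline. The sketch below is a typical-usage assumption, not taken from this model card; it loads the base facebook/opt-125m checkpoint named above (substitute this fine-tuned checkpoint's ID if it differs).

```python
from transformers import AutoModelForCausalLM, AutoTokenizer, set_seed

# Assumed checkpoint: the base model referenced in this card.
tokenizer = AutoTokenizer.from_pretrained("facebook/opt-125m")
model = AutoModelForCausalLM.from_pretrained("facebook/opt-125m")

set_seed(32)
inputs = tokenizer("Hello, I am conscious and", return_tensors="pt")

# do_sample=True enables sampling; top_k restricts each step to the
# 50 most likely next tokens.
outputs = model.generate(**inputs, do_sample=True, top_k=50, max_new_tokens=20)
text = tokenizer.decode(outputs[0], skip_special_tokens=True)
print(text)
```

With do_sample=False (the default), generate falls back to greedy decoding, which matches the deterministic pipeline behavior described above.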

Training data

This model was trained with RL on the Anthropic HH-RLHF dataset: https://huggingface.co/datasets/Anthropic/hh-rlhf
