Model Overview
This is a Tensor Train (TT) compressed version of the original BART-based detoxification model s-nlp/bart-base-detox.
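
For readers unfamiliar with TT compression, the sketch below illustrates the general idea on a single linear layer: the dense weight matrix is replaced by small TT cores that are contracted with the input at run time, so the full matrix is never stored. This is a minimal PyTorch illustration, not the implementation shipped in this repository; the class name TTLinear2 and all shapes are assumptions chosen for the example.

import torch
import torch.nn as nn

class TTLinear2(nn.Module):
    """Illustrative 2-core TT factorization of a dense linear layer.
    Input features are viewed as a (in1, in2) grid and outputs as (out1, out2);
    only the two small cores (and a bias) are stored as parameters."""
    def __init__(self, in1, in2, out1, out2, rank):
        super().__init__()
        self.in_shape = (in1, in2)
        # core1 couples input mode 1 to output mode 1 through TT rank `rank`;
        # core2 couples input mode 2 to output mode 2 through the same rank.
        self.core1 = nn.Parameter(torch.randn(in1, out1, rank) * 0.02)
        self.core2 = nn.Parameter(torch.randn(rank, in2, out2) * 0.02)
        self.bias = nn.Parameter(torch.zeros(out1 * out2))

    def forward(self, x):
        # x: (batch, in1 * in2)
        b = x.shape[0]
        x = x.view(b, *self.in_shape)                      # (b, in1, in2)
        y = torch.einsum('bij,rjk->birk', x, self.core2)   # contract in2 -> (b, in1, rank, out2)
        y = torch.einsum('birk,iur->buk', y, self.core1)   # contract in1, rank -> (b, out1, out2)
        return y.reshape(b, -1) + self.bias

# Example: a 768 x 768 dense weight (589,824 parameters) is replaced by
# 32*32*8 + 8*24*24 = 12,800 core parameters (plus bias) at TT rank 8.
layer = TTLinear2(in1=32, in2=24, out1=32, out2=24, rank=8)
x = torch.randn(4, 768)
print(layer(x).shape)  # torch.Size([4, 768])
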
How to use
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

# Load the TT-compressed model (custom modeling code lives in the repo,
# so trust_remote_code=True is required) and the original BART tokenizer.
model = AutoModelForSeq2SeqLM.from_pretrained('s-nlp/bart-base-detox-ttd',
                                              trust_remote_code=True)
tokenizer = AutoTokenizer.from_pretrained('facebook/bart-base')

toxics = ['that sick fuck is going to be out in 54 years.']
tokens = tokenizer(toxics, return_tensors='pt')  # PyTorch tensors are needed for generate()

outputs = model.generate(**tokens, num_return_sequences=1, do_sample=False,
                         temperature=1.0, repetition_penalty=10.0,
                         max_length=128, num_beams=5)
neutrals = tokenizer.decode(outputs[0], skip_special_tokens=True)

print(neutrals)  # stdout: She is going to be out in 54 years.