default parameters for model.generate

#39 by christinagottsch - opened

I can't seem to find documentation listing the default values for generation parameters like top_p and temperature. What defaults apply if you don't set them?
I'm using the model via transformers and the Auto classes:
from transformers import AutoModelForCausalLM, AutoTokenizer

model = AutoModelForCausalLM.from_pretrained("mistralai/Mistral-7B-Instruct-v0.3")
tokenizer = AutoTokenizer.from_pretrained("mistralai/Mistral-7B-Instruct-v0.3")

# input_ids comes from tokenizing my prompt
input_ids = tokenizer("...", return_tensors="pt").input_ids

outputs = model.generate(
    input_ids,
    max_length=1024,
    num_beams=5,
    no_repeat_ngram_size=2,
    eos_token_id=tokenizer.eos_token_id,
    pad_token_id=tokenizer.eos_token_id,
    do_sample=True,
    top_k=50,
    top_p=0.95,
    temperature=0.8,
)

I hope someone can help. Thank you!

Good question, any answers appreciated.
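As far as I know, any parameter you omit falls back to model.generation_config, which transformers builds from the repo's generation_config.json (if the repo ships one) layered over the library defaults (do_sample=False, temperature=1.0, top_k=50, top_p=1.0, num_beams=1, max_length=20). A minimal sketch to print both, assuming you already have the model loaded:

from transformers import AutoModelForCausalLM, GenerationConfig

model = AutoModelForCausalLM.from_pretrained("mistralai/Mistral-7B-Instruct-v0.3")

# Effective fallbacks for this model: repo config merged over library defaults
print(model.generation_config)

# Library-wide defaults, independent of any model
print(GenerationConfig())

Note that with do_sample=False (the default), decoding is greedy, so temperature, top_k, and top_p have no effect unless you enable sampling.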

I have gone through the official mistral-inference repo and can see that temperature is set to 0 in one place, 0.7 in another, and 0.35 in the README.

I haven't been able to find the other parameters yet though :/
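For what it's worth, the temperatures in the mistral-inference repo only apply to Mistral's own inference stack, not to transformers. If you want to see what the Hub repo itself ships for transformers, and assuming it includes a generation_config.json, you can fetch just that file without downloading the weights; a minimal sketch:

from transformers import GenerationConfig

# Downloads and parses only generation_config.json from the model repo
gen_config = GenerationConfig.from_pretrained("mistralai/Mistral-7B-Instruct-v0.3")
print(gen_config)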
