---
license: apache-2.0
---
FP32 version of the original BF16 [mistralai/Mistral-7B-v0.1](https://huggingface.co/mistralai/Mistral-7B-v0.1) safetensors weights, produced with the conversion script below:
```python
from transformers import AutoTokenizer, AutoModelForCausalLM
import torch

# Load the original BF16 checkpoint and upcast the weights to FP32
tokenizer = AutoTokenizer.from_pretrained("mistralai/Mistral-7B-v0.1")
model = AutoModelForCausalLM.from_pretrained(
    "mistralai/Mistral-7B-v0.1",
    torch_dtype=torch.float32,
)

# Save the FP32 model and tokenizer locally (safetensors shards by default)
model.save_pretrained("./Mistral-7B-v0.1-fp32")
tokenizer.save_pretrained("./Mistral-7B-v0.1-fp32")
```
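
To sanity-check the result, here is a minimal sketch (assuming the local output directory from the script above) that reloads the checkpoint with `torch_dtype="auto"`, so transformers uses the dtype recorded in the saved config rather than its usual default of loading in FP32:

```python
from transformers import AutoModelForCausalLM
import torch

# "auto" picks up the dtype stored in the converted checkpoint's config
model = AutoModelForCausalLM.from_pretrained(
    "./Mistral-7B-v0.1-fp32",
    torch_dtype="auto",
)

# All parameters should report torch.float32 after the conversion
print(next(model.parameters()).dtype)  # expected: torch.float32
```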