Question about psychbert
Hello,
I want to use psychbert.
Is the following code correct?
```python
model = AutoModel.from_pretrained("mnaylor/psychbert-cased", from_flax=True)
tokenizer = AutoTokenizer.from_pretrained("mnaylor/psychbert-cased")
```
Thank you!
Hi @lilias ! Apologies for the delay - I just saw this message.
Yes, that code works, although `AutoModel` returns a `BertModel` whose pooler layer will be randomly initialized. We did not use the pooler layer during pretraining, so no pretrained weights are lost, and you can still fine-tune on whatever task you'd like :)
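For fine-tuning, a minimal sketch of one common route, assuming a classification task (the `num_labels=2` choice and the example input are illustrative placeholders, not part of this thread):

```python
from transformers import AutoModelForSequenceClassification, AutoTokenizer

# Loading with a task head reuses the pretrained encoder weights;
# the classification head (and pooler) start randomly initialized,
# which is expected and fixed by fine-tuning.
model = AutoModelForSequenceClassification.from_pretrained(
    "mnaylor/psychbert-cased",
    from_flax=True,
    num_labels=2,  # assumption: binary classification
)
tokenizer = AutoTokenizer.from_pretrained("mnaylor/psychbert-cased")

inputs = tokenizer("example text", return_tensors="pt")
outputs = model(**inputs)  # outputs.logits comes from the untrained head
```

You should see a warning about newly initialized weights when loading; that matches the pooler/head situation described above and is safe to ignore before fine-tuning.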
Do you have a GGUF?
@kfsone
No - I haven't actually done anything with this model in the last couple of years. However, this is a generic BERT model from Hugging Face (it actually originated from the `bert-base-cased` checkpoint), so presumably any other process for converting those models would also apply to this one.
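One hedged sketch of such a route, assuming you convert via llama.cpp's `convert_hf_to_gguf.py` script (which reads PyTorch/safetensors checkpoints, so the Flax weights need to be re-saved in PyTorch format first; whether that script accepts this particular BERT variant is not something this thread confirms):

```python
from transformers import AutoModel, AutoTokenizer

# Load the Flax checkpoint into PyTorch, then write a local
# PyTorch-format copy that GGUF converters can read.
model = AutoModel.from_pretrained("mnaylor/psychbert-cased", from_flax=True)
tokenizer = AutoTokenizer.from_pretrained("mnaylor/psychbert-cased")

model.save_pretrained("psychbert-pt")      # writes PyTorch weights + config
tokenizer.save_pretrained("psychbert-pt")  # writes tokenizer files alongside

# Then, from a llama.cpp checkout (assumption, not verified for this model):
#   python convert_hf_to_gguf.py psychbert-pt --outfile psychbert.gguf
```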