Running into "Flash attention implementation does not support kwargs: prompt_length" when using the exact example from the Readme
#49 opened 3 days ago by HotSauce7
Bulk embedding error · 3
#48 opened 4 days ago by aikokul
RuntimeError: FlashAttention is not installed. · 2
#47 opened 7 days ago by seregadgl
SentenceTransformerTrainer: checkpoint saves custom_st.py in the wrong folder · 1
#46 opened 7 days ago by botkop
Release custom fine-tune code with torch? · 4
#39 opened 20 days ago by CVHvn
How can I fine-tune my own dataset? · 3
#36 opened 24 days ago by liu0815
Accept tokens instead of string and question regarding tokenizer behaviour · 2
#35 opened 24 days ago by MH1P
Fine-tuning 'retrieval.query' and 'retrieval.passage' LoRA adapters together? · 2
#31 opened 27 days ago by miweru
ONNX ERROR: Unsupported model IR version: 10, max supported IR version: 9 · 1
#30 opened 29 days ago by cseeeee
Created a custom branch for serving with text-embeddings-inference
#28 opened about 1 month ago by sigridjineth