Can't be run
#5 opened 4 days ago by qe2
Quantized Version
#4 opened 5 days ago by bobwu
Can we have Qwen2.5-vl-72b-abliterated?
#2 opened 6 days ago by chibop
Can you provide a GGUF model usable with Ollama locally?
#1 opened 26 days ago by ryg81