Yet Another Smaller Version Request

#7
by chilegazelle - opened

Yet another request, just in case the previous one gets closed. MiniMaxAI/MiniMax-VL-01 is undoubtedly the best VL model, but its size makes deployment really tough. It would be amazing to have a smaller distilled version or at least a well-optimized quantized one to make it more accessible while preserving its VL capabilities. Hoping this gets considered!
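To illustrate what a quantized release buys in practice, here is a minimal pure-Python sketch of symmetric int8 post-training quantization (the simplest form of the "well-optimized quantized" option above). The function names are hypothetical illustrations; a real release would use a library such as GPTQ, AWQ, or bitsandbytes rather than hand-rolled code.

```python
def quantize_int8(weights):
    """Map float weights to int8 using one symmetric scale.

    This is a toy illustration of post-training quantization:
    store 1 byte per weight instead of 4, at the cost of a small
    rounding error bounded by half the scale.
    """
    scale = max(abs(w) for w in weights) / 127 or 1.0
    q = [max(-128, min(127, round(w / scale))) for w in weights]
    return q, scale


def dequantize(q, scale):
    """Recover approximate float weights from int8 values."""
    return [v * scale for v in q]


weights = [0.42, -1.3, 0.07, 0.99, -0.55]
q, scale = quantize_int8(weights)
restored = dequantize(q, scale)
max_err = max(abs(a - b) for a, b in zip(weights, restored))
```

The rounding error per weight is at most half of `scale`, which is why int8 (and even int4) quantization usually preserves most of a model's quality while cutting memory 4x or more.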

A good example of this is what DeepSeek did with DeepSeek-R1. They applied knowledge distillation to create smaller, more efficient versions while keeping strong reasoning capabilities. Something similar for MiniMax-VL-01 could make deployment much easier while maintaining its VL strengths.
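For context, the core of the distillation recipe mentioned above is training the small model to match the large model's softened output distribution. A minimal sketch of that loss (temperature-scaled KL divergence, following the standard Hinton-style formulation; all names here are illustrative, not MiniMax's actual training code):

```python
import math


def softmax(logits, temperature=1.0):
    """Temperature-scaled softmax over a list of logits."""
    scaled = [l / temperature for l in logits]
    m = max(scaled)  # subtract max for numerical stability
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    return [e / total for e in exps]


def distillation_loss(teacher_logits, student_logits, temperature=2.0):
    """KL(teacher || student) on temperature-softened distributions.

    Scaled by T^2 so gradient magnitudes stay comparable across
    temperatures, as in the classic distillation setup.
    """
    p = softmax(teacher_logits, temperature)
    q = softmax(student_logits, temperature)
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)
    return temperature ** 2 * kl


teacher = [3.2, 1.1, -0.4]   # logits from the large teacher model
student = [2.5, 1.4, 0.0]    # logits from the small student model
loss = distillation_loss(teacher, student)
```

The loss is zero when the student exactly matches the teacher and grows as the distributions diverge, so minimizing it over a training set transfers the teacher's behavior into the smaller model.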

MiniMax org

Hello! If we have any plans regarding this, we will inform you and other developers in advance. Thank you for your support of our MiniMax-VL-01 model, and we invite you to follow its future progress!

MiniMax-AI changed discussion status to closed
