⚠️ You are here because you're looking for an FP8 version of MiniMax-M2.5 ⚠️
‼️ But MiniMax-M2.5 is already natively FP8!!!
So just use the original checkpoint, MiniMaxAI/MiniMax-M2.5. Thank you for coming to my TED Talk.
Now that I've saved you the confusion, why not drop me a follow? ;)
Model tree for marksverdhei/MiniMax-M2.5-FP8
Base model: MiniMaxAI/MiniMax-M2.5