A GPT-4o Level MLLM for Vision, Speech and Multimodal Live Streaming on Your Phone

MiniCPM-o 2.6 int4

This is the int4 quantized version of MiniCPM-o 2.6.
Running the int4 version uses less GPU memory (about 9 GB).
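
A minimal loading sketch is shown below. It assumes this repo follows the usual MiniCPM-o pattern of loading through `transformers` with `AutoModel`/`AutoTokenizer` and `trust_remote_code=True`; the `chat` call and its arguments are likewise assumed from the MiniCPM family rather than taken from this card, so refer to the upstream MiniCPM-o 2.6 model card for the authoritative recipe.

```python
from PIL import Image
from transformers import AutoModel, AutoTokenizer

model_id = "openbmb/MiniCPM-o-2_6-int4"

# trust_remote_code is required because the repo ships custom modeling code
# (the same reason the serverless Inference API cannot host it).
model = AutoModel.from_pretrained(model_id, trust_remote_code=True)
tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model.eval()

# Single-image chat in the style of the MiniCPM-o interface; the exact
# `chat` signature is an assumption based on the MiniCPM family, not this card.
image = Image.open("example.jpg").convert("RGB")
msgs = [{"role": "user", "content": [image, "Describe this image."]}]
answer = model.chat(msgs=msgs, tokenizer=tokenizer)
print(answer)
```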

Model size: 2.97B params · Tensor types: F32, I32, BF16 (Safetensors)
Note: the serverless Inference API does not yet support model repos that contain custom code, so this model cannot be run through it.