A GPT-4o Level MLLM for Vision, Speech and Multimodal Live Streaming on Your Phone
MiniCPM-o 2.6 int4
This is the int4 quantized version of MiniCPM-o 2.6.
Running the int4 version uses less GPU memory (about 9 GB).
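
A minimal loading sketch is shown below. The repo id `openbmb/MiniCPM-o-2_6-int4` and the plain `transformers` `AutoModel` path are assumptions here, not the card's verbatim example; see the main MiniCPM-o 2.6 card for full vision/speech inference examples.

```python
# Sketch: load the int4 checkpoint with transformers.
# Assumption: the repo id below points at this int4 build.
from transformers import AutoModel, AutoTokenizer

model_id = "openbmb/MiniCPM-o-2_6-int4"  # assumed Hugging Face repo id

# The repo ships custom modeling code, so trust_remote_code=True is required.
model = AutoModel.from_pretrained(model_id, trust_remote_code=True)
tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)

model.eval()
# Inference then goes through the repo's custom chat interface; refer to the
# main MiniCPM-o 2.6 model card for complete usage examples.
```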