Model Card for MiniCPM-V-2_6-VPM

Uses

from transformers import AutoProcessor, AutoModel
import torch
from PIL import Image

model = AutoModel.from_pretrained(
    'princepride/MiniCPM-V-2_6-VPM',
    trust_remote_code=True,
    attn_implementation='flash_attention_2',
    torch_dtype=torch.bfloat16,
)
model = model.eval().cuda()  # move to GPU and switch to eval mode before inference
processor = AutoProcessor.from_pretrained('princepride/MiniCPM-V-2_6-VPM', trust_remote_code=True)

image = Image.open(r'workspace/00002-2654981627.png').convert('RGB')

# Example preprocessing settings (illustrative values; tune for your inputs)
max_slice_nums = 9
use_image_id = False
max_inp_length = 8192

inputs = processor(
    [image],
    max_slice_nums=max_slice_nums,
    use_image_id=use_image_id,
    return_tensors="pt",
    max_length=max_inp_length,
)
outputs = model(inputs)
Model size: 418M params (Safetensors, BF16)