Does this model support batch inference?
#8 opened by vahyd
Does this model support batch inference? I see `bs` variables and list-based implementations in the source code, but I cannot run inference with a list of images and prompts. I use the default code from the model card and pass a list of images and messages to it.
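For reference, this is roughly what I am trying: a minimal sketch assuming the standard `transformers` VLM processor API (the model id, image paths, and prompts below are placeholders, not the actual checkpoint):

```python
import torch
from PIL import Image
from transformers import AutoModelForVision2Seq, AutoProcessor

model_id = "org/model-id"  # placeholder, not the real checkpoint name
processor = AutoProcessor.from_pretrained(model_id)
model = AutoModelForVision2Seq.from_pretrained(
    model_id, torch_dtype=torch.bfloat16, device_map="auto"
)

# One image and one chat-style message list per sample in the batch.
images = [Image.open("cat.jpg"), Image.open("dog.jpg")]
messages = [
    [{"role": "user", "content": [{"type": "image"}, {"type": "text", "text": "Describe this image."}]}],
    [{"role": "user", "content": [{"type": "image"}, {"type": "text", "text": "What animal is shown?"}]}],
]

# Build one prompt string per sample, then tokenize with padding so the
# batch can be stacked into a single tensor.
prompts = [processor.apply_chat_template(m, add_generation_prompt=True) for m in messages]
inputs = processor(text=prompts, images=images, padding=True, return_tensors="pt").to(model.device)

with torch.no_grad():
    output_ids = model.generate(**inputs, max_new_tokens=128)
print(processor.batch_decode(output_ids, skip_special_tokens=True))
```

This pattern works for many vision-language models on the hub, but it fails for this one, which is why I am asking whether batching is supported at all.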
Not supported yet
Is batch inference supported now?