How to do batch inference?

#1
by ppsking - opened

I'm wondering how to use this fantastic model to do batch inference. When using the tokenizer to pad the batch inputs, it fails with "Asking to pad but the tokenizer does not have a padding token."
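For reference, a common workaround when a causal LM's tokenizer ships without a pad token is to reuse the EOS token for padding and pad on the left so generation continues from real tokens. This is a minimal sketch, not necessarily this model's intended usage; the model name below is a placeholder, so substitute the actual checkpoint.

```python
from transformers import AutoTokenizer, AutoModelForCausalLM
import torch

model_name = "gpt2"  # placeholder; replace with the actual model ID

tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)

# Many causal-LM tokenizers define no pad token; reusing EOS is a
# common workaround that avoids resizing the embedding matrix.
if tokenizer.pad_token is None:
    tokenizer.pad_token = tokenizer.eos_token

# Left padding so each sequence ends with real tokens and
# generation does not start from pad positions.
tokenizer.padding_side = "left"

prompts = ["Hello, how are", "The capital of France is"]
inputs = tokenizer(prompts, return_tensors="pt", padding=True)

with torch.no_grad():
    outputs = model.generate(
        **inputs,
        max_new_tokens=20,
        pad_token_id=tokenizer.pad_token_id,
    )

print(tokenizer.batch_decode(outputs, skip_special_tokens=True))
```

Passing `pad_token_id` explicitly to `generate` also silences the warning it would otherwise emit when the model config lacks one.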
