Input validation error: `inputs` tokens + `max_new_tokens` must be <= 32768
#43 opened by rg2410
The context length of command-r should be about 128k, but I'm getting this error message:
```
huggingface_hub.inference._text_generation.ValidationError: Input validation error: `inputs` tokens + `max_new_tokens` must be <= 32768. Given: 33717 `inputs` tokens and 256 `max_new_tokens`
```
Any ideas why?
Your prompt is too long for this endpoint. Even though the model itself supports a much longer context, the hosted endpoint is deployed with a total-token limit of 32768, so each request must satisfy prompt tokens + `max_new_tokens` <= 32768. You'll need to shorten the prompt or reduce `max_new_tokens`.
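As a minimal sketch of how to stay under that budget client-side (the helper name and the keep-the-tail truncation strategy are illustrative assumptions, and you would tokenize the prompt with the model's own tokenizer first):

```python
def truncate_to_budget(input_ids, max_new_tokens, max_total_tokens=32768):
    """Trim token ids so len(input_ids) + max_new_tokens <= max_total_tokens.

    Keeps the most recent tokens (the end of the prompt), which is
    usually what matters most for chat-style prompts.
    """
    budget = max_total_tokens - max_new_tokens
    if budget <= 0:
        raise ValueError("max_new_tokens alone exceeds the endpoint limit")
    return input_ids[-budget:]

# Using the numbers from the error message: 33717 prompt tokens and
# 256 new tokens must fit within 32768 total.
ids = list(range(33717))
trimmed = truncate_to_budget(ids, max_new_tokens=256)
print(len(trimmed))        # 32512
print(len(trimmed) + 256)  # 32768
```

After trimming, re-detokenize and send the shortened prompt to the endpoint as usual.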
Hi, this issue looks resolved so closing it but feel free to reopen in case you still need help!
shivi changed discussion status to closed