Error
After sending a request, I get this error:
"Input validation error: inputs tokens + max_new_tokens must be <= 32768. Given: 71508 inputs tokens and 0 max_new_tokens
{"error":"Input validation error: inputs tokens + max_new_tokens must be <= 32768. Given: 71508 inputs tokens and 0 max_new_tokens","error_type":"validation"}"
I greeted 'Hi Alfred' and got the same error message.
Error: Error in generating model output:
422 Client Error: Unprocessable Entity for url: https://router.huggingface.co/hf-inference/models/Qwen/Qwen2.5-Coder-32B-Instruct/v1/chat/completions (Request ID: Root=1-67f8fb2e-0a231650278948ce63026324;5bae414c-d947-4fae-80fd-f35effe2ab6c)
Input validation error: inputs tokens + max_new_tokens must be <= 32768. Given: 84327 inputs tokens and 0 max_new_tokens
{"error":"Input validation error: inputs tokens + max_new_tokens must be <= 32768. Given: 84327 inputs tokens and 0 max_new_tokens","error_type":"validation"}
I'm seeing a similar error after duplicating the Space. How can I resolve this? The error is below.
Thanks!
Error: 404 Client Error: Not Found for url: https://router.huggingface.co/fireworks-ai/inference/v1/chat/completions (Request ID: Root=1-6827709c-12c679c0567c28f719fc57bc;564bfc8d-8814-4fcb-8b44-f1e84c8aa4d3)
Model not found, inaccessible, and/or not deployed
Solved! It was a matter of the priority order of Inference Providers in the account preferences!
Thanks!
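For others hitting the 404: the router was sending the request to a provider (fireworks-ai here) that doesn't serve the model. Besides reordering the Inference Providers in your account preferences, you can pin a provider explicitly in code. A minimal sketch with huggingface_hub; "hf-inference" is just an example provider, so pick one that actually hosts the model:

```python
from huggingface_hub import InferenceClient

# Pin a specific Inference Provider instead of letting the router follow
# the account-level preference order.
client = InferenceClient(
    provider="hf-inference",  # example provider; check which ones deploy this model
    model="Qwen/Qwen2.5-Coder-32B-Instruct",
)

response = client.chat_completion(
    messages=[{"role": "user", "content": "Hi Alfred"}],
    max_tokens=256,
)
print(response.choices[0].message.content)
```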