
I can't find the max_sequence_length that BLOOM supports

#45
by ShaneSue - opened

BigScience Workshop org

Maximum sequence length at training time was 2048 tokens, but since the model uses ALiBi positional encodings, it supports sequences longer than that at inference time.
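
A minimal sketch of this behaviour with the `transformers` library (the small `bigscience/bloom-560m` checkpoint is assumed here purely to keep the example lightweight; the same applies to the full model):

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "bigscience/bloom-560m"  # small BLOOM variant, chosen for illustration
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)

# Build a prompt longer than the 2048-token training context.
prompt = "The quick brown fox jumps over the lazy dog. " * 300
inputs = tokenizer(prompt, return_tensors="pt")
print(inputs["input_ids"].shape[-1])  # comfortably above 2048 tokens

# ALiBi biases attention scores by token distance instead of adding learned
# position embeddings, so there is no hard positional cutoff at 2048;
# quality may still degrade the further the input extrapolates.
outputs = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(outputs[0][inputs["input_ids"].shape[-1]:]))
```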

ShaneSue changed discussion status to closed