Bf16 not supported
#1 opened by samuelqy
We only have V100s, which don't support bf16. Could you release f16/f32 versions?
You can set `torch_dtype` to `torch.float16` when loading the model.
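A minimal sketch of that suggestion: pick fp16 when the GPU lacks bf16 support (as on V100s, compute capability 7.0), and pass the chosen dtype to a Transformers-style `from_pretrained` call. The model ID shown is a placeholder, not this repo's actual checkpoint.

```python
import torch

def pick_dtype() -> torch.dtype:
    # Prefer bf16 where the hardware supports it; V100s do not,
    # so fall back to fp16 there (and on CPU-only machines).
    if torch.cuda.is_available() and torch.cuda.is_bf16_supported():
        return torch.bfloat16
    return torch.float16

dtype = pick_dtype()

# Hypothetical usage -- "your-org/your-model" is a placeholder model ID:
# from transformers import AutoModelForCausalLM
# model = AutoModelForCausalLM.from_pretrained(
#     "your-org/your-model",
#     torch_dtype=dtype,  # casts the bf16 checkpoint weights at load time
# )
```

Note that casting a bf16 checkpoint to fp16 narrows the exponent range, so extreme activation values can overflow; in practice this is usually fine for inference but worth validating.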
samuelqy changed discussion status to closed