TypeError: bad operand type for unary -: 'NoneType'
When I generate text, I get the following error:
TypeError: bad operand type for unary -: 'NoneType'
File "/mnt/p/home/flyang/text-generation-webui/modules/callbacks.py", line 57, in gentask
ret = self.mfunc(callback=_callback, *args, **self.kwargs)
File "/mnt/p/home/flyang/text-generation-webui/modules/text_generation.py", line 351, in generate_with_callback
shared.model.generate(**kwargs)
File "/home/flyang/.local/lib/python3.10/site-packages/torch/utils/_contextlib.py", line 115, in decorate_context
return func(*args, **kwargs)
File "/home/flyang/.local/lib/python3.10/site-packages/transformers/generation/utils.py", line 1652, in generate
return self.sample(
File "/home/flyang/.local/lib/python3.10/site-packages/transformers/generation/utils.py", line 2734, in sample
outputs = self(
File "/home/flyang/.local/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1518, in _wrapped_call_impl
return self._call_impl(*args, **kwargs)
File "/home/flyang/.local/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1527, in _call_impl
return forward_call(*args, **kwargs)
File "/home/flyang/.local/lib/python3.10/site-packages/transformers/models/mistral/modeling_mistral.py", line 1045, in forward
outputs = self.model(
File "/home/flyang/.local/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1518, in _wrapped_call_impl
return self._call_impl(*args, **kwargs)
File "/home/flyang/.local/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1527, in _call_impl
return forward_call(*args, **kwargs)
File "/home/flyang/.local/lib/python3.10/site-packages/transformers/models/mistral/modeling_mistral.py", line 888, in forward
attention_mask = self._prepare_decoder_attention_mask(
File "/home/flyang/.local/lib/python3.10/site-packages/transformers/models/mistral/modeling_mistral.py", line 796, in _prepare_decoder_attention_mask
combined_attention_mask = _make_sliding_window_causal_mask(
File "/home/flyang/.local/lib/python3.10/site-packages/transformers/models/mistral/modeling_mistral.py", line 88, in _make_sliding_window_causal_mask
mask = torch.triu(mask, diagonal=-sliding_window)
TypeError: bad operand type for unary -: 'NoneType'
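For context, the failure comes from `sliding_window` being `null` in the model's config.json: it is loaded as Python `None`, and the unary minus in `torch.triu(mask, diagonal=-sliding_window)` inside `_make_sliding_window_causal_mask` then raises. A minimal sketch of that failure mode (without torch, since the broken operation is just the unary minus):

```python
# Minimal repro of the failure mode: "sliding_window": null in config.json
# is deserialized to Python None, and negating None raises a TypeError.
import json

config = json.loads('{"model_type": "mistral", "sliding_window": null}')
sliding_window = config["sliding_window"]  # None

try:
    diagonal = -sliding_window  # same operation as in _make_sliding_window_causal_mask
except TypeError as e:
    print(e)  # bad operand type for unary -: 'NoneType'
```

Newer transformers releases handle a missing/`null` `sliding_window`, which is why upgrading (as suggested below) resolves it.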
I got the same error:
User: hi
Assistant: Exception in thread Thread-3 (generate):
Traceback (most recent call last):
File "/root/miniconda3/envs/llama_factory/lib/python3.10/threading.py", line 1016, in _bootstrap_inner
self.run()
File "/root/miniconda3/envs/llama_factory/lib/python3.10/threading.py", line 953, in run
self._target(*self._args, **self._kwargs)
File "/root/miniconda3/envs/llama_factory/lib/python3.10/site-packages/torch/utils/_contextlib.py", line 115, in decorate_context
return func(*args, **kwargs)
File "/root/miniconda3/envs/llama_factory/lib/python3.10/site-packages/transformers/generation/utils.py", line 1652, in generate
return self.sample(
File "/root/miniconda3/envs/llama_factory/lib/python3.10/site-packages/transformers/generation/utils.py", line 2734, in sample
outputs = self(
File "/root/miniconda3/envs/llama_factory/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1518, in _wrapped_call_impl
return self._call_impl(*args, **kwargs)
File "/root/miniconda3/envs/llama_factory/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1527, in _call_impl
return forward_call(*args, **kwargs)
File "/root/miniconda3/envs/llama_factory/lib/python3.10/site-packages/transformers/models/mistral/modeling_mistral.py", line 1045, in forward
outputs = self.model(
File "/root/miniconda3/envs/llama_factory/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1518, in _wrapped_call_impl
return self._call_impl(*args, **kwargs)
File "/root/miniconda3/envs/llama_factory/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1527, in _call_impl
return forward_call(*args, **kwargs)
File "/root/miniconda3/envs/llama_factory/lib/python3.10/site-packages/transformers/models/mistral/modeling_mistral.py", line 888, in forward
attention_mask = self._prepare_decoder_attention_mask(
File "/root/miniconda3/envs/llama_factory/lib/python3.10/site-packages/transformers/models/mistral/modeling_mistral.py", line 796, in _prepare_decoder_attention_mask
combined_attention_mask = _make_sliding_window_causal_mask(
File "/root/miniconda3/envs/llama_factory/lib/python3.10/site-packages/transformers/models/mistral/modeling_mistral.py", line 88, in _make_sliding_window_causal_mask
mask = torch.triu(mask, diagonal=-sliding_window)
TypeError: bad operand type for unary -: 'NoneType'
Hi everyone, can you try updating the transformers package? pip install -U transformers
Upgrading to a newer version of transformers worked for me.
If you cannot update transformers (as in my case), an alternative is to change sliding_window in config.json from null to something like 4096.
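The workaround above can be scripted. This is a minimal sketch, assuming you point it at your local model copy's config.json; the helper name is illustrative, and 4096 follows the "something like 4096" suggestion above (Mistral-7B's original sliding-window size). The demo at the bottom runs on a throwaway file so nothing real is touched:

```python
import json
import tempfile
from pathlib import Path

def patch_sliding_window(config_path: Path, window: int = 4096) -> None:
    """Replace "sliding_window": null with a positive integer in config.json."""
    config = json.loads(config_path.read_text())
    if config.get("sliding_window") is None:
        config["sliding_window"] = window
        config_path.write_text(json.dumps(config, indent=2))

# Demo on a throwaway file; point config_path at your model's config.json instead.
demo = Path(tempfile.mkdtemp()) / "config.json"
demo.write_text('{"model_type": "mistral", "sliding_window": null}')
patch_sliding_window(demo)
print(json.loads(demo.read_text())["sliding_window"])  # 4096
```

After patching, `-sliding_window` in `_make_sliding_window_causal_mask` sees an int again and the generation call goes through, though upgrading transformers remains the cleaner fix.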
Updating transformers worked for me!