runtime error
Exit code: 1. Reason: You are using the default legacy behaviour of the <class 'transformers.models.t5.tokenization_t5.T5Tokenizer'>. This is expected, and simply means that the `legacy` (previous) behavior will be used so nothing changes for you. If you want to use the new behaviour, set `legacy=False`. This should only be set if you understand what it means, and thoroughly read the reason why this was added as explained in https://github.com/huggingface/transformers/pull/24565

Traceback (most recent call last):
  File "/home/user/app/app.py", line 10, in <module>
    numini_model = T5ForConditionalGeneration.from_pretrained("sanjudebnath/Numini")
  File "/usr/local/lib/python3.10/site-packages/transformers/modeling_utils.py", line 3913, in from_pretrained
    raise EnvironmentError(
OSError: sanjudebnath/Numini does not appear to have a file named pytorch_model.bin, model.safetensors, tf_model.h5, model.ckpt or flax_model.msgpack.
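The OSError means the Hub repository referenced in app.py does not contain a weight file in any format from_pretrained can load, so the app container exits at startup. Below is a minimal diagnostic sketch, assuming the repo id taken from the traceback (sanjudebnath/Numini) and the standard huggingface_hub / transformers APIs; the variable names and the weight-file check are illustrative and not part of the original app.

# Diagnostic sketch: list the files actually present in the Hub repo before
# calling from_pretrained, to confirm whether a weight file exists.
from huggingface_hub import list_repo_files
from transformers import T5ForConditionalGeneration, T5Tokenizer

repo_id = "sanjudebnath/Numini"  # repo id from the traceback

# from_pretrained needs one of these files (per the error message above).
weight_files = {"pytorch_model.bin", "model.safetensors", "tf_model.h5",
                "model.ckpt", "flax_model.msgpack"}

files = set(list_repo_files(repo_id))
print(files)

if not weight_files & files:
    print(f"{repo_id} has no model weight file; from_pretrained will raise OSError.")
else:
    # Mirrors the failing call from app.py line 10; legacy=False only
    # silences the tokenizer warning and is unrelated to the OSError.
    tokenizer = T5Tokenizer.from_pretrained(repo_id, legacy=False)
    numini_model = T5ForConditionalGeneration.from_pretrained(repo_id)

If the repository really lacks weights, the fix is to upload model.safetensors (or pytorch_model.bin) to that repo, or point from_pretrained at a repository that includes them; the legacy-tokenizer notice at the top of the log is only a warning and does not cause the exit.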