runtime error
Exit code: 1. Reason: Xet Storage is enabled for this repo, but the 'hf_xet' package is not installed. Falling back to regular HTTP download. For better performance, install the package with: `pip install huggingface_hub[hf_xet]` or `pip install hf_xet`

pytorch_model.bin:   0%|          | 0.00/990M [00:00<?, ?B/s]
pytorch_model.bin:   6%|▋         | 62.9M/990M [00:01<00:16, 57.4MB/s]
pytorch_model.bin:  26%|██▌       | 256M/990M [00:02<00:05, 132MB/s]
pytorch_model.bin:  85%|████████▌ | 844M/990M [00:05<00:00, 160MB/s]

Traceback (most recent call last):
  File "/home/user/app/app.py", line 36, in <module>
    model = build_model(cfg, datamodule)
  File "/home/user/app/mGPT/models/build_model.py", line 8, in build_model
    return instantiate_from_config(model_config)
  File "/home/user/app/mGPT/config.py", line 42, in instantiate_from_config
    return get_obj_from_str(config["target"])(**config.get("params", dict()))
  File "/home/user/app/mGPT/models/mgpt.py", line 44, in __init__
    self.lm = instantiate_from_config(lm)
  File "/home/user/app/mGPT/config.py", line 42, in instantiate_from_config
    return get_obj_from_str(config["target"])(**config.get("params", dict()))
  File "/home/user/app/mGPT/archs/mgpt_lm.py", line 54, in __init__
    self.language_model = T5ForConditionalGeneration.from_pretrained(
  File "/home/user/.pyenv/versions/3.10.17/lib/python3.10/site-packages/transformers/modeling_utils.py", line 279, in _wrapper
    return func(*args, **kwargs)
  File "/home/user/.pyenv/versions/3.10.17/lib/python3.10/site-packages/transformers/modeling_utils.py", line 4260, in from_pretrained
    checkpoint_files, sharded_metadata = _get_resolved_checkpoint_files(
  File "/home/user/.pyenv/versions/3.10.17/lib/python3.10/site-packages/transformers/modeling_utils.py", line 1080, in _get_resolved_checkpoint_files
    raise EnvironmentError(
OSError: google/flan-t5-base does not appear to have a file named pytorch_model.bin but there is a file for TensorFlow weights. Use `from_tf=True` to load this model from those weights.
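The fatal part is the final OSError: after the partial download, transformers could not resolve PyTorch (or safetensors) weights for google/flan-t5-base and only saw TensorFlow weights, so the `T5ForConditionalGeneration.from_pretrained` call in mGPT/archs/mgpt_lm.py aborts the app. A minimal sketch of the workaround the error message itself suggests is shown below; the checkpoint name comes from the traceback, everything else (variable names, the try/except structure) is illustrative, and the `from_tf=True` fallback only works if TensorFlow is installed in the container.

from transformers import T5ForConditionalGeneration

model_name = "google/flan-t5-base"  # checkpoint named in the traceback
try:
    # Normal path: load the PyTorch / safetensors weights from the Hub.
    language_model = T5ForConditionalGeneration.from_pretrained(model_name)
except OSError:
    # Fallback suggested by the error: convert the TensorFlow checkpoint
    # on the fly. Requires the `tensorflow` package in the environment.
    language_model = T5ForConditionalGeneration.from_pretrained(
        model_name, from_tf=True
    )

Alternatively, since the download of pytorch_model.bin was visibly in progress when the error was raised, clearing the Hub cache and retrying (or upgrading transformers/huggingface_hub so the safetensors variant resolves) may avoid the TensorFlow fallback entirely; that is a guess based on the partial-download lines above, not something the log confirms.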