runtime error
Exit code: 1. Reason:

Traceback (most recent call last):
  File "app.py", line 7, in <module>
    detector = pipeline(model=checkpoint, task="zero-shot-object-detection")
  File "/home/user/.local/lib/python3.8/site-packages/transformers/pipelines/__init__.py", line 967, in pipeline
    tokenizer = AutoTokenizer.from_pretrained(
  File "/home/user/.local/lib/python3.8/site-packages/transformers/models/auto/tokenization_auto.py", line 787, in from_pretrained
    return tokenizer_class.from_pretrained(pretrained_model_name_or_path, *inputs, **kwargs)
  File "/home/user/.local/lib/python3.8/site-packages/transformers/tokenization_utils_base.py", line 2028, in from_pretrained
    return cls._from_pretrained(
  File "/home/user/.local/lib/python3.8/site-packages/transformers/tokenization_utils_base.py", line 2060, in _from_pretrained
    slow_tokenizer = (cls.slow_tokenizer_class)._from_pretrained(
  File "/home/user/.local/lib/python3.8/site-packages/transformers/tokenization_utils_base.py", line 2260, in _from_pretrained
    tokenizer = cls(*init_inputs, **init_kwargs)
  File "/home/user/.local/lib/python3.8/site-packages/transformers/models/clip/tokenization_clip.py", line 332, in __init__
    with open(merges_file, encoding="utf-8") as merges_handle:
TypeError: expected str, bytes or os.PathLike object, not NoneType
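The traceback ends inside the CLIP slow tokenizer's __init__: merges_file is None when open() is called, because the slow-tokenizer code path found no merges file for the checkpoint. A minimal sketch of that final frame's failure mode (only merges_file and the open() call are taken from the traceback; everything else here is illustrative):

```python
# Reproduce the last frame of the traceback: open() receives None
# because no merges.txt was resolved for the slow CLIP tokenizer.
merges_file = None  # the value the tokenizer constructor ended up with

try:
    with open(merges_file, encoding="utf-8") as merges_handle:
        merges_handle.read()
except TypeError as exc:
    error_message = str(exc)

print(error_message)
# → expected str, bytes or os.PathLike object, not NoneType
```

Since the checkpoint name is not visible in this log, the cause can only be guessed: this pattern usually appears when transformers falls back to the slow tokenizer for a repo that ships only fast-tokenizer files. Upgrading transformers (so the fast tokenizer is used) or choosing a checkpoint that includes merges.txt are common remedies, but both are assumptions, not conclusions from this log alone.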