runtime error
Exit code: 1. Reason: cs/transformers/en/model_doc/auto#auto-classes
- If you are the owner of the model architecture code, please modify your model class such that it inherits from `GenerationMixin` (after `PreTrainedModel`, otherwise you'll get an exception).
- If you are not the owner of the model architecture class, please contact the model code owner to update it.

Loading checkpoint shards: 100%|██████████| 2/2 [00:08<00:00, 4.06s/it]
generation_config.json: 100%|██████████| 111/111 [00:00<00:00, 797kB/s]
tokenizer_config.json: 100%|██████████| 2.58k/2.58k [00:00<00:00, 13.4MB/s]

Traceback (most recent call last):
  File "/home/user/app/app.py", line 13, in <module>
    tokenizer = AutoTokenizer.from_pretrained('openbmb/MiniCPM-V', trust_remote_code=True)
  File "/usr/local/lib/python3.10/site-packages/transformers/models/auto/tokenization_auto.py", line 919, in from_pretrained
    return tokenizer_class.from_pretrained(
  File "/usr/local/lib/python3.10/site-packages/transformers/utils/import_utils.py", line 1690, in __getattribute__
    requires_backends(cls, cls._backends)
  File "/usr/local/lib/python3.10/site-packages/transformers/utils/import_utils.py", line 1678, in requires_backends
    raise ImportError("".join(failed))
ImportError: LlamaTokenizerWrapper requires the SentencePiece library but it was not found in your environment. Checkout the instructions on the installation page of its repo: https://github.com/google/sentencepiece#installation and follow the ones that match your environment. Please note that you may need to restart your runtime after installation.
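The crash itself is the final ImportError: the remote tokenizer class that AutoTokenizer resolves for openbmb/MiniCPM-V needs the SentencePiece backend, which is not installed in the container. A minimal sketch of the fix, assuming the app runs in an environment where you can add Python dependencies (for example by listing `sentencepiece` in a requirements.txt used at build time, or running `pip install sentencepiece`):

# Assumes the `sentencepiece` package has been installed before the app starts,
# as the error message itself instructs.
from transformers import AutoTokenizer

# Line 13 of app.py from the traceback above; with sentencepiece available,
# the remote LlamaTokenizerWrapper tokenizer can be constructed.
tokenizer = AutoTokenizer.from_pretrained('openbmb/MiniCPM-V', trust_remote_code=True)

The GenerationMixin notice earlier in the log is a separate warning aimed at the owner of the model code rather than at app.py: per that message, the model class should list GenerationMixin after PreTrainedModel in its bases. A sketch of the base-class order only, with a hypothetical class name, not the model's actual code:

from transformers import PreTrainedModel, GenerationMixin

class MyRemoteModel(PreTrainedModel, GenerationMixin):
    ...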