TypeError when trying to run model with Transformers
#1
opened by afmelani
Hi!
I am trying to use this model with the transformers library, but even when using just the provided code snippet I have not been able to get it working. I keep getting the following error:
TypeError: stat: path should be string, bytes, os.PathLike or integer, not NoneType.
I have tried upgrading my libraries, but that has not solved the problem. I would appreciate any clue as to what might be happening.
Thanks in advance!
Hi!
You could try the following code to load the model.
from transformers import AutoModelForCausalLM, AutoTokenizer
import torch

model_id = 'linjc16/Panacea-7B-Chat'

# Load the weights in half precision and let device_map="auto" place them on the available device(s)
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype=torch.float16, device_map="auto")
tokenizer = AutoTokenizer.from_pretrained(model_id)
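If the model and tokenizer load without errors, a quick generation call can confirm that everything works end to end. This is only a minimal sketch; the prompt text and the generation parameters (max_new_tokens, do_sample) are illustrative choices, not taken from the model card.

# Minimal smoke test (illustrative prompt and settings)
prompt = "What is the primary endpoint of a clinical trial?"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
with torch.no_grad():
    outputs = model.generate(**inputs, max_new_tokens=128, do_sample=False)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))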