
How to load this on oobabooga

#1
by yideli - opened

How to load this on oobabooga, I get an error when loading.
The error message is:

Traceback (most recent call last):
  File "I:\Chat-AI-3\text-generation-webui\server.py", line 76, in load_model_wrapper
    shared.model, shared.tokenizer = load_model(shared.model_name)
  File "I:\Chat-AI-3\text-generation-webui\modules\models.py", line 116, in load_model
    output = load_func(model_name)
  File "I:\Chat-AI-3\text-generation-webui\modules\models.py", line 204, in huggingface_loader
    model = LoaderClass.from_pretrained(
  File "I:\Chat-AI-3\installer_files\env\lib\site-packages\transformers\models\auto\auto_factory.py", line 493, in from_pretrained
    return model_class.from_pretrained(
  File "I:\Chat-AI-3\installer_files\env\lib\site-packages\transformers\modeling_utils.py", line 2474, in from_pretrained
    raise EnvironmentError(
OSError: Error no file named pytorch_model.bin, tf_model.h5, model.ckpt.index or flax_model.msgpack found in directory models\voxta_MLewd-L2-Chat-13B-5.0bpw-exl2.
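For context, the OSError comes from a file check inside `from_pretrained`: the plain Transformers loader looks for one of the weight filenames listed in the error message, but exl2-quantized repos ship `.safetensors` weights instead, so the check fails. The sketch below (a simplified illustration, not the actual Transformers code; the `has_transformers_weights` helper name is invented) shows why an exl2 model directory trips this error, and why selecting the ExLlamav2 loader in oobabooga's Model tab is the usual fix:

```python
import os

# Weight filenames the plain Transformers loader looks for, per the
# OSError in the traceback above.
EXPECTED_WEIGHTS = [
    "pytorch_model.bin",
    "tf_model.h5",
    "model.ckpt.index",
    "flax_model.msgpack",
]

def has_transformers_weights(model_dir):
    """Return True if the directory contains a weight file the plain
    Transformers (huggingface) loader can open directly."""
    return any(
        os.path.isfile(os.path.join(model_dir, name))
        for name in EXPECTED_WEIGHTS
    )

# An exl2 directory typically contains only files like
# output.safetensors plus config/tokenizer files, so this check fails
# and from_pretrained raises the OSError seen above.
```

In oobabooga this means the model should not be loaded with the default Transformers loader at all; exl2 quants are meant for the ExLlamav2 loader.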

If this is the wrong way to do it, please tell me the correct way to use it.
In addition, I am very much looking forward to a language model well suited for lewd content. The original author's 13B model is too large and runs too slowly locally.

Thanks a lot.
