Error while deserializing header: MetadataIncompleteBuffer (nvm, fixed)

#2
by Lolkid654 - opened

Is anyone getting this on OOBABOOGA, or is it just me?

2023-07-18 15:56:34 INFO:Loading TheBloke_Llama-2-13B-chat-GPTQ...
2023-07-18 15:56:34 ERROR:Failed to load the model.
Traceback (most recent call last):
  File "D:\OOBABOOGA\text-generation-webui\server.py", line 68, in load_model_wrapper
    shared.model, shared.tokenizer = load_model(shared.model_name, loader)
  File "D:\OOBABOOGA\text-generation-webui\modules\models.py", line 79, in load_model
    output = load_func_map[loader](model_name)
  File "D:\OOBABOOGA\text-generation-webui\modules\models.py", line 327, in ExLlama_HF_loader
    return ExllamaHF.from_pretrained(model_name)
  File "D:\OOBABOOGA\text-generation-webui\modules\exllama_hf.py", line 126, in from_pretrained
    return ExllamaHF(config)
  File "D:\OOBABOOGA\text-generation-webui\modules\exllama_hf.py", line 31, in __init__
    self.ex_model = ExLlama(self.ex_config)
  File "D:\OOBABOOGA\installer_files\env\lib\site-packages\exllama\model.py", line 660, in __init__
    with safe_open(self.config.model_path, framework = "pt", device = "cpu") as f:
safetensors_rust.SafetensorError: Error while deserializing header: MetadataIncompleteBuffer

NVM, it was an error while downloading. The file was incomplete and I needed to redownload it. Ignore this.
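For anyone hitting the same thing: `MetadataIncompleteBuffer` usually means the .safetensors file is truncated, i.e. shorter than its header says it should be. A safetensors file starts with an 8-byte little-endian length, followed by that many bytes of JSON header, followed by the tensor data. Here's a stdlib-only sketch (function name is mine, not part of any library) you can run on a downloaded file before loading it, to tell a truncated download apart from some other problem:

```python
import json
import os
import struct

def safetensors_header_ok(path):
    """Return True if the file is long enough to contain the JSON header
    declared in its first 8 bytes, and that header parses as JSON.
    A False result on a fresh download usually means the download was cut off."""
    file_size = os.path.getsize(path)
    with open(path, "rb") as f:
        prefix = f.read(8)
        if len(prefix) < 8:
            return False  # not even a length prefix
        (header_len,) = struct.unpack("<Q", prefix)  # little-endian u64
        if 8 + header_len > file_size:
            # The file claims a header longer than the file itself:
            # this is the truncated-download case behind MetadataIncompleteBuffer.
            return False
        try:
            json.loads(f.read(header_len))
        except ValueError:
            return False
        return True
```

If this returns False, redownloading the model file (as above) is the fix; if it returns True and loading still fails, the problem is elsewhere.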

Lolkid654 changed discussion title from Error while deserializing header: MetadataIncompleteBuffer to Error while deserializing header: MetadataIncompleteBuffer (nvm, fixed)
Lolkid654 changed discussion status to closed

Hi, I also have the same problem. How did you solve it?
