Unable to load the model (#1), opened by ParthMandaliya
@alexandreteles I tried to load the model the way you mentioned, using the Bonito class (code below):
from bonito import Bonito
# Initialize the Bonito model
bonito = Bonito("alexandreteles/bonito-v1-awq", dtype="float16")
But I am getting the following error:
...
/usr/local/lib/python3.10/dist-packages/vllm/model_executor/model_loader.py in get_model(model_config, device_config, **kwargs)
84 else:
85 # Load the weights from the cached or downloaded files.
---> 86 model.load_weights(model_config.model, model_config.download_dir,
87 model_config.load_format, model_config.revision)
88 return model.eval()
/usr/local/lib/python3.10/dist-packages/vllm/model_executor/models/llama.py in load_weights(self, model_name_or_path, cache_dir, load_format, revision)
389 weight_loader = getattr(param, "weight_loader",
390 default_weight_loader)
--> 391 weight_loader(param, loaded_weight)
/usr/local/lib/python3.10/dist-packages/vllm/model_executor/layers/linear.py in weight_loader(self, param, loaded_weight)
550 shard_size = param_data.shape[input_dim]
551 start_idx = tp_rank * shard_size
--> 552 loaded_weight = loaded_weight.narrow(input_dim, start_idx,
553 shard_size)
554 assert param_data.shape == loaded_weight.shape
RuntimeError: start (0) + length (14336) exceeds dimension size (4096).
The stack trace is truncated; let me know if you need the complete one. Perhaps I am loading the model incorrectly or missing some additional steps?
Your help is appreciated.
The model card is just a copy of the original published by BatsResearch. To load an AWQ model you should follow the instructions for a runtime that supports the format (e.g., vLLM). I will look into updating the card when I get some free time.
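For reference, a minimal sketch of loading this AWQ checkpoint directly with vLLM. The shape-mismatch error above is what vLLM's loader raises when it expects full-precision weight shapes but finds AWQ-packed tensors, so the key is telling vLLM the checkpoint is quantized. The model ID comes from this thread; the rest is standard vLLM usage (this assumes a CUDA GPU and that your vLLM build supports AWQ):

```python
# Sketch: load the AWQ checkpoint with vLLM directly, declaring the
# quantization format so the loader expects packed AWQ weight shapes.
from vllm import LLM, SamplingParams

llm = LLM(
    model="alexandreteles/bonito-v1-awq",
    quantization="awq",   # without this, the loader fails with a shape mismatch
    dtype="float16",
)

# Hypothetical prompt for illustration; see the Bonito docs for the
# actual task-conditioning prompt format the model expects.
sampling = SamplingParams(temperature=0.7, max_tokens=256)
outputs = llm.generate(["Generate a question-answer pair from this text: ..."], sampling)
print(outputs[0].outputs[0].text)
```

If the Bonito class simply forwards keyword arguments to vLLM's LLM constructor, passing quantization="awq" to Bonito may work as well, but that depends on the wrapper's implementation.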