Rename gptq_model-4bit-128g.safetensors to model.safetensors

#7 by matatonic - opened

Resolves the following loading error:

    self.model = AutoModelForCausalLM.from_pretrained(**self.params).eval()
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/transformers/models/auto/auto_factory.py", line 559, in from_pretrained
    return model_class.from_pretrained(
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/transformers/modeling_utils.py", line 3929, in from_pretrained
    raise EnvironmentError(
OSError: AIDC-AI/Ovis1.6-Gemma2-9B-GPTQ-Int4 does not appear to have a file named pytorch_model.bin, model.safetensors, tf_model.h5, model.ckpt or flax_model.msgpack.
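Until this rename is merged, a possible local workaround is to download the repository and rename the weight file by hand before loading. The sketch below is only an illustration of the same rename, not part of this PR; the target directory, the `device_map="auto"` / `trust_remote_code=True` arguments, and the assumption that the GPTQ kernels required by transformers are installed are all hypothetical.

    import os

    from huggingface_hub import snapshot_download
    from transformers import AutoModelForCausalLM

    # Download the repo into a plain local directory so the weight file can be renamed.
    local_dir = snapshot_download(
        "AIDC-AI/Ovis1.6-Gemma2-9B-GPTQ-Int4",
        local_dir="Ovis1.6-Gemma2-9B-GPTQ-Int4",  # hypothetical local path
    )

    src = os.path.join(local_dir, "gptq_model-4bit-128g.safetensors")
    dst = os.path.join(local_dir, "model.safetensors")
    if os.path.exists(src) and not os.path.exists(dst):
        os.rename(src, dst)  # the same rename this PR applies in the repo

    model = AutoModelForCausalLM.from_pretrained(
        local_dir,
        device_map="auto",       # assumption: a GPU is available
        trust_remote_code=True,  # assumption: Ovis ships custom modeling code
    ).eval()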
