Unable to use with `device_map='auto'`
I got the following error when trying to load the model using `device_map='auto'`:
Traceback (most recent call last):
File "/NS/llm-1/work/afkhan/fact/src/main.py", line 46, in main
run_project(
File "/NS/llm-1/work/afkhan/fact/src/lib_project/lib_project/project.py", line 74, in run_project
handle.experiment(experiment_config)
File "/NS/llm-1/work/afkhan/fact/src/lib_project/lib_project/experiment.py", line 73, in experiment_wrapper
value = func(dataclass_config, task_description)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/NS/llm-1/work/afkhan/fact/src/experiments/knowledge_probing/experiment1.py", line 77, in kp_experiment
model, tokenizer = load_model_tokenizer(config.model)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/NS/llm-1/work/afkhan/fact/src/lib_llm/lib_llm/models/load.py", line 97, in load_model_tokenizer
model = load_model(
^^^^^^^^^^^
File "/NS/llm-1/work/afkhan/fact/src/lib_llm/lib_llm/models/load.py", line 50, in load_model
model = AutoModelForCausalLM.from_pretrained(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/NS/venvs/nobackup/afkhan/pypoetry_cache/virtualenvs/llm-knowledge-extraction-tF3ncB3K-py3.11/lib/python3.11/site-packages/transformers/models/auto/auto_factory.py", line 566, in from_pretrained
return model_class.from_pretrained(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/NS/venvs/nobackup/afkhan/pypoetry_cache/virtualenvs/llm-knowledge-extraction-tF3ncB3K-py3.11/lib/python3.11/site-packages/transformers/modeling_utils.py", line 3606, in from_pretrained
no_split_modules = model._get_no_split_modules(device_map)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/NS/venvs/nobackup/afkhan/pypoetry_cache/virtualenvs/llm-knowledge-extraction-tF3ncB3K-py3.11/lib/python3.11/site-packages/transformers/modeling_utils.py", line 1690, in _get_no_split_modules
raise ValueError(
ValueError: PhiForCausalLM does not support `device_map='auto'`. To implement support, the model class needs to implement the `_no_split_modules` attribute.
My code is pretty generic, so I'd rather not add exceptions for specific models, and a lot of other people would probably benefit if this could be added to the main release. The error seemed related, so I decided to raise a PR myself, but I noticed that `_no_split_modules`
is actually already present, so I'm not sure why this occurs.
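For context, the check that raises here can be mirrored with dummy classes (a simplified, hypothetical sketch, not the actual transformers source): on releases where a model class leaves `_no_split_modules` unset, the `device_map='auto'` path has no way to know which submodules must stay on one device, so it raises the ValueError above.

```python
# Hypothetical, simplified mirror of the `_get_no_split_modules` check.
# Class and attribute names below are illustrative only.

class DummyPhiForCausalLM:
    _no_split_modules = None  # unset, as in releases without sharding support

class DummyLlamaForCausalLM:
    _no_split_modules = ["LlamaDecoderLayer"]  # set, so 'auto' can shard

def check_no_split_modules(model_cls):
    # Mimics the failure mode: no `_no_split_modules` means no safe way
    # to split the model across devices, so refuse `device_map='auto'`.
    if model_cls._no_split_modules is None:
        raise ValueError(
            f"{model_cls.__name__} does not support `device_map='auto'`. "
            "To implement support, the model class needs to implement "
            "the `_no_split_modules` attribute."
        )
    return model_cls._no_split_modules
```

So even though `_no_split_modules` exists for Phi on the current main branch, an installed release that predates that addition would still hit this branch.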
What is your transformers version?
@gugarosa I'm using version 4.36.2, do I need to upgrade it?
You may need to use the latest version from the transformers repository (4.37.0.dev0) via `pip uninstall -y transformers && pip install git+https://github.com/huggingface/transformers`.
Thanks! I'll give this a go