transformers.js 3 error with "model_q4f16.onnx_data"

#7
by FcoDonDev - opened

I'm getting the following error when trying to create an instance of the model in my Node.js app:

2024-11-14 13:14:32.3905471 [E:onnxruntime:, inference_session.cc:2106 onnxruntime::InferenceSession::Initialize::<lambda_667f556cc066b06c92362bf5f36cce18>::operator ()] Exception during initialization: file_size: unknown error: "model_q4f16.onnx_data"
[Nest] 32920  - 14-11-2024, 1:14:32 p. m.   ERROR [ExceptionsHandler] Exception during initialization: file_size: unknown error: "model_q4f16.onnx_data"

This is my code (it was inspired by another thread in this same repo):

    this.text_generation_instance = await pipeline(
        'text-generation',
        'onnx-community/Phi-3.5-mini-instruct-onnx-web',
        {
            dtype: 'q4f16',
            use_external_data_format: true,
        },
    );

I already tried both the remote and local file options:

// env.allowRemoteModels = true;
env.allowRemoteModels = false;
env.localModelPath = "./.my_models_path";

I manually downloaded the file "model_q4f16.onnx_data" and put it in the same folder as "model_q4f16.onnx", just in case... but it didn't work.
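For what it's worth, this error usually means onnxruntime resolved "model_q4f16.onnx" but could not find (or size) the external-data file next to it. A minimal sketch to sanity-check the folder layout before creating the pipeline (the helper name and paths here are hypothetical, not part of transformers.js):

```javascript
// Hypothetical helper: verify that the external-data file sits in the same
// directory as the graph file and is non-empty, which is what onnxruntime
// needs when it resolves "model_q4f16.onnx_data" relative to the model.
import * as fs from 'node:fs';
import * as path from 'node:path';

function hasExternalData(modelDir, baseName = 'model_q4f16') {
    const graph = path.join(modelDir, `${baseName}.onnx`);
    const data = path.join(modelDir, `${baseName}.onnx_data`);
    return (
        fs.existsSync(graph) &&
        fs.existsSync(data) &&
        fs.statSync(data).size > 0
    );
}

// Example (path is illustrative; adjust to wherever your local copy lives):
// hasExternalData('./.my_models_path/onnx-community/Phi-3.5-mini-instruct-onnx-web/onnx');
```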

pls help :c
