This model was exported from ibm-granite/granite-3b-code-instruct-128k using Optimum, converting the weights to float16 and applying ONNX Runtime's basic graph optimization.
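An export of this kind can be reproduced with an `optimum-cli` invocation along these lines. This is a sketch, not the exact command used for this repository; the task name and the output directory are assumptions, and fp16 conversion during export requires a CUDA device.

```shell
# Export the model to ONNX with float16 weights and ONNX Runtime's
# basic (O1) graph optimizations.
# Requires: pip install "optimum[exporters]" onnxruntime-gpu
optimum-cli export onnx \
  --model ibm-granite/granite-3b-code-instruct-128k \
  --task text-generation-with-past \
  --dtype fp16 \
  --device cuda \
  --optimize O1 \
  granite-3b-code-instruct-128k-onnx-float16/
```

Because the float16 weights of a 3B-parameter model exceed the 2 GB protobuf limit, the exporter stores them in the ONNX external data format alongside the graph file.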

The repository owner maintains this model for use with Transformers.js; unfortunately, Transformers.js does not yet support the ONNX external data format.
