Introduction

This repository hosts the SSDLite320 MobileNetV3 Large model for the React Native ExecuTorch library. It includes the model exported for the XNNPACK backend in the .pte format, ready for use in the ExecuTorch runtime.

If you'd like to run these models in your own ExecuTorch runtime, refer to the official documentation for setup instructions.

⚠️ Warning: This model does not contain the NMS layers! If you intend to use this model outside of React Native ExecuTorch or the useObjectDetection hook, you have to implement the postprocessing yourself. For more information, refer to this code
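Since the exported graph omits NMS, the raw output contains many overlapping candidate boxes per object. The postprocessing typically applies a score threshold and then greedy non-maximum suppression. The sketch below is a minimal, self-contained illustration of that idea; the box layout, thresholds, and function names are assumptions for this example, not the library's actual API.

```typescript
interface Detection {
  box: [number, number, number, number]; // [x1, y1, x2, y2], assumed corner format
  score: number;
  label: number;
}

// Intersection-over-union of two axis-aligned boxes.
function iou(a: Detection["box"], b: Detection["box"]): number {
  const x1 = Math.max(a[0], b[0]);
  const y1 = Math.max(a[1], b[1]);
  const x2 = Math.min(a[2], b[2]);
  const y2 = Math.min(a[3], b[3]);
  const inter = Math.max(0, x2 - x1) * Math.max(0, y2 - y1);
  const areaA = (a[2] - a[0]) * (a[3] - a[1]);
  const areaB = (b[2] - b[0]) * (b[3] - b[1]);
  return inter / (areaA + areaB - inter);
}

// Greedy NMS: drop low-confidence boxes, then repeatedly keep the
// highest-scoring remaining box and discard same-label boxes that overlap it.
function nms(
  detections: Detection[],
  scoreThreshold = 0.5, // illustrative default
  iouThreshold = 0.55 // illustrative default
): Detection[] {
  const sorted = detections
    .filter((d) => d.score >= scoreThreshold)
    .sort((a, b) => b.score - a.score);
  const kept: Detection[] = [];
  for (const d of sorted) {
    const overlaps = kept.some(
      (k) => k.label === d.label && iou(k.box, d.box) >= iouThreshold
    );
    if (!overlaps) kept.push(d);
  }
  return kept;
}
```

The per-label check mirrors the common "batched" NMS behavior, where boxes of different classes never suppress each other.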

Compatibility

If you intend to use this model outside of React Native ExecuTorch, make sure your runtime is compatible with the ExecuTorch version used to export the .pte files. For more details, see the compatibility note in the ExecuTorch GitHub repository. If you work with React Native ExecuTorch, the constants shipped with the library guarantee compatibility with the runtime used behind the scenes.

These models were exported using commit fe20be98c, and no forward compatibility is guaranteed; older versions of the runtime may not work with these files.

Repository Structure

  • The URL of the ssdlite320-mobilenet-v3-large.pte file in the root of this repository should be passed to the modelSource parameter.
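As a rough pseudocode sketch of how that URL is wired into the useObjectDetection hook mentioned above — the import path, hook options, and result shape here are assumptions, so check the React Native ExecuTorch documentation for the actual API:

```tsx
// Hypothetical usage sketch, not a verbatim copy of the library's API.
import { useObjectDetection } from 'react-native-executorch';

function Detector({ imageUri }: { imageUri: string }) {
  const model = useObjectDetection({
    // URL of the .pte file from the root of this repository (placeholder below).
    modelSource: 'https://…/ssdlite320-mobilenet-v3-large.pte',
  });

  // Run inference once the model is ready; the hook handles the missing
  // NMS postprocessing internally.
  const detect = async () => {
    const detections = await model.forward(imageUri);
    console.log(detections);
  };
  // … render UI that calls detect()
}
```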