Immich machine learning with the Rockchip NPU (RKNPU), using models in RKNN format.
Modify your docker-compose file:
  immich-machine-learning:
    container_name: immich_machine_learning
    # For hardware acceleration, add one of -[armnn, cuda, openvino, rknn] to the image tag.
    # Example tag: ${IMMICH_VERSION:-release}-cuda
    image: registry.cn-hangzhou.aliyuncs.com/devinzhang91/immich-machine-learning:rknn-toolkit-lite2-rknn
    extends: # uncomment this section for hardware acceleration - see https://immich.app/docs/features/ml-hardware-acceleration
      file: hwaccel.ml.yml
      service: rknn # set to one of [armnn, cuda, openvino, openvino-wsl, rknn] for accelerated inference - use the `-wsl` version for WSL2 where applicable
    volumes:
      - cache-{YOUR_PLATFORM}:/cache
Or, if you run the container directly with docker run:

docker run -v cache-{YOUR_PLATFORM}:/cache ...
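Whichever way you start the container, cache-{YOUR_PLATFORM} is a named volume (substitute your platform for the placeholder). In the docker-compose case it must also be declared once at the top level of the file, otherwise `docker compose up` aborts with an undefined-volume error. A minimal sketch, keeping the placeholder name used above:

volumes:
  # Named cache volume referenced by the immich-machine-learning service above.
  cache-{YOUR_PLATFORM}: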
- GitHub: here
immich-rknn Docker images (Alibaba Cloud ACR):
docker pull registry.cn-hangzhou.aliyuncs.com/devinzhang91/immich-machine-learning:rknn-toolkit-lite2-rknn
docker pull registry.cn-hangzhou.aliyuncs.com/devinzhang91/immich-server:rknn-toolkit-lite2-rknn
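To sanity-check the setup after pulling, bring the stack up with the modified compose file and watch the machine-learning container's logs; the exact messages depend on the image, but it should start without falling back to CPU-only inference. A minimal example, assuming the container_name from the snippet above:

docker compose up -d
docker logs -f immich_machine_learning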