The original model is here. This model was created by bdsqlsz. Hyper FLUX.1-dev-related LoRAs are here.

Notice

This is an experimental conversion made in Spaces using a homebrew script. The serverless Inference API does not currently support torch.float8_e4m3fn, so this model does not work there. I have not been able to confirm whether the conversion is working properly. Please consider this a test run only.
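For reference, a minimal sketch of the kind of fp8 cast such a conversion script might perform, assuming a local FLUX.1-dev-style .safetensors checkpoint; the file names are hypothetical and this is not the actual script used.

```python
# Hypothetical fp8 conversion sketch (not the actual homebrew script).
import torch
from safetensors.torch import load_file, save_file

state_dict = load_file("hyper-flux1-dev.safetensors")  # hypothetical input path

fp8_state = {}
for name, tensor in state_dict.items():
    # Cast floating-point weights to float8_e4m3fn; leave other dtypes untouched.
    if tensor.dtype in (torch.float32, torch.float16, torch.bfloat16):
        fp8_state[name] = tensor.to(torch.float8_e4m3fn)
    else:
        fp8_state[name] = tensor

save_file(fp8_state, "hyper-flux1-dev-fp8.safetensors")  # hypothetical output path
```

Note that torch.float8_e4m3fn requires a recent PyTorch build, and a checkpoint stored this way must be loaded by a runtime that understands the dtype.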
