Flux.1-Dev Hyper NF4 + Flux.1-Dev BNB NF4 + Flux.1-Schnell BNB NF4 - Flux.1-Dev BNB NF4 v2
Recommended Parameters
steps
Version Highlights
V2 is 0.5 GB larger than the previous version because the chunk-64 norms are now stored in full-precision float32, which makes it noticeably more precise than V1. Also, since V2 does not have a second compression stage, the on-the-fly decompression carries less computation overhead, making inference a bit faster!
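For anyone who wants to reproduce this setup without the prebuilt checkpoint, here is a minimal sketch (my own assumption, using the diffusers + bitsandbytes path rather than these exact files) of an NF4 config that mirrors V2: 4-bit NF4 weights in 64-value blocks with no second compression stage, so the per-block norms stay in float32.

```python
# Minimal sketch: NF4 quantization in the style of V2, done on the fly with
# diffusers + bitsandbytes instead of loading this exact checkpoint.
# (Model ID and dtype are assumptions for illustration.)
import torch
from diffusers import BitsAndBytesConfig, FluxTransformer2DModel

nf4_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",             # 4-bit NormalFloat, 64-value blocks by default
    bnb_4bit_use_double_quant=False,       # no second compression stage: block norms stay float32, as in V2
    bnb_4bit_compute_dtype=torch.bfloat16,
)

transformer = FluxTransformer2DModel.from_pretrained(
    "black-forest-labs/FLUX.1-dev",
    subfolder="transformer",
    quantization_config=nf4_config,
    torch_dtype=torch.bfloat16,
)
```

Setting bnb_4bit_use_double_quant=True instead would re-compress those block norms to 8-bit, which is roughly where the ~0.5 GB size difference between V1 and V2 comes from.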
Creator Sponsors
☕ Buy me a coffee: https://ko-fi.com/ralfingerai
🍺 Join my discord: https://discord.com/invite/pAz4Bt3rqb
Flux.1-Dev Hyper NF4:
Source: https://huggingface.co/ZhenyaYang/flux_1_dev_hyper_8steps_nf4/tree/main from ZhenyaYang, who converted Hyper-SD to NF4 (8 steps)
Flux.1-Dev BNB NF4 (v1 & v2):
Source: https://huggingface.co/lllyasviel/flux1-dev-bnb-nf4/tree/main from lllyasviel
Flux.1-Schnell BNB NF4:
Source: https://huggingface.co/silveroxides/flux1-nf4-weights/tree/main from silveroxides
ComfyUI: https://github.com/comfyanonymous/ComfyUI_bitsandbytes_NF4
Forge: https://github.com/lllyasviel/stable-diffusion-webui-forge/discussions/981
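If you prefer plain Python over ComfyUI or Forge, a rough, hypothetical end-to-end sketch with diffusers looks like the following: it quantizes the Flux transformer to NF4 on the fly and samples with 8 steps, the step count the Hyper variant targets. The model ID, prompt, and parameter values are illustrative assumptions, not taken from these checkpoint files.

```python
# Hypothetical end-to-end example: build an NF4-quantized Flux pipeline with
# diffusers and generate an image in 8 steps. All values are illustrative.
import torch
from diffusers import BitsAndBytesConfig, FluxPipeline, FluxTransformer2DModel

nf4_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_compute_dtype=torch.bfloat16,
)

transformer = FluxTransformer2DModel.from_pretrained(
    "black-forest-labs/FLUX.1-dev",           # assumed base repo (gated, needs HF access)
    subfolder="transformer",
    quantization_config=nf4_config,
    torch_dtype=torch.bfloat16,
)

pipe = FluxPipeline.from_pretrained(
    "black-forest-labs/FLUX.1-dev",
    transformer=transformer,
    torch_dtype=torch.bfloat16,
)
pipe.enable_model_cpu_offload()               # keeps VRAM use low on consumer GPUs

image = pipe(
    "a red fox in a snowy forest, golden hour",
    num_inference_steps=8,                    # 8 steps, as recommended for the Hyper variant
    guidance_scale=3.5,
).images[0]
image.save("flux_nf4_8steps.png")
```

Note that 8 steps only makes sense when the Hyper-SD distilled weights (or LoRA) are actually merged in; the plain dev base normally needs more steps (around 20 to 30).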