32 GB RAM failed to run
#3 opened by ZKong
The model eats too much RAM; it can't run even with 32 GB RAM + 32 GB virtual RAM.
The model itself at 4-bit is 26 GB. There isn't much I can suggest other than using Linux instead of Windows or getting 64 GB of RAM.
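As a back-of-envelope check on that figure, 26 GB of 4-bit weights implies roughly 52B parameters. A minimal sketch of the arithmetic (the parameter count is inferred from the 26 GB figure above, not confirmed anywhere in this thread):

```python
def model_size_gb(params_billion: float, bits_per_param: int) -> float:
    """Approximate weight-file size in GB: params * bits / 8 bits-per-byte."""
    return params_billion * bits_per_param / 8

# Hypothetical parameter count, back-solved from 26 GB @ 4-bit:
params = 26 * 8 / 4  # = 52.0 (billion)

print(model_size_gb(params, 4))   # → 26.0  (4-bit quantized)
print(model_size_gb(params, 16))  # → 104.0 (fp16, for comparison)
```

This is why quantization alone may not be enough here: even at 4-bit, the weights nearly fill a 32 GB machine before activations and the OS are counted.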
OK, I can run it in ComfyUI with a GGUF, which is small. I have also tried your Z-Image: very small and as fast as ComfyUI, but with a much lower RAM cost.
ZKong changed discussion status to closed
The transformer is the most important part. If your model can decouple it from the text encoder and VAE, I think it will gain much more attention, because we could then pair it with a smaller text encoder or a more efficient VAE.
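The saving that the decoupling argument points at can be sketched with a toy peak-RAM budget. All component sizes below are illustrative assumptions, not measurements from this model:

```python
# Hypothetical component sizes (GB) when everything ships as one monolithic
# pipeline and must be resident at once.
full = {"transformer": 26.0, "text_encoder": 9.5, "vae": 0.3}

# With decoupled components you can (a) swap in a smaller text encoder and
# (b) run the stages one at a time (encode text, free it, then denoise),
# so peak RAM approaches the largest single stage rather than the sum.
swapped = {**full, "text_encoder": 1.0}  # hypothetical smaller encoder

monolithic_peak = sum(full.values())   # everything loaded together
staged_peak = max(swapped.values())    # one stage resident at a time

print(monolithic_peak)  # → 35.8
print(staged_peak)      # → 26.0
```

Under these assumed numbers, decoupling brings the peak from well over a 32 GB budget down to the transformer alone, which is exactly the component a quantized (e.g. GGUF) variant would then shrink further.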