[bug]: Image finishes generating, then OOMs; models well under 24 GB OOM after generating the image, even models as small as 6 GB #7691
Labels
bug
Something isn't working
Is there an existing issue for this problem?
Operating system
Linux
GPU vendor
Nvidia (CUDA)
GPU model
Tesla P40
GPU VRAM
24 GB
Version number
5.7.0rc1
Browser
Chrome, Firefox, Edge
Python dependencies
Local System
accelerate: 1.0.1
compel: 2.0.2
cuda: 12.4
diffusers: 0.31.0
numpy: 1.26.3
opencv: 4.9.0.80
onnx: 1.16.1
pillow: 11.0.0
python: 3.11.10
torch: 2.4.1+cu124
torchvision: 0.19.1+cu124
transformers: 4.46.3
xformers: Not Installed
What happened
I select a model and Invoke starts the generation fine; I can see the VRAM being taken up in nvtop. Once the image is done, but before it is placed in the gallery, it OOMs, even with smaller models no bigger than 6 GB. This started happening with most models a few weeks ago and I can't figure out how to correct it. Partial loading was enabled as well, but I had never used it before.
invoke error.docx
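In case it helps with diagnosis, here is a small snippet that could be run in the same Python environment to capture the CUDA memory state around the failure. It is only a sketch: it assumes the torch 2.4.1+cu124 build listed above and that the P40 is device index 0.

```python
import torch

# Report how much of the P40's 24 GB PyTorch is actually holding.
# memory_allocated() is what live tensors occupy; memory_reserved() is
# what the caching allocator has claimed from the driver (device 0 assumed).
device = torch.device("cuda:0")
allocated_gb = torch.cuda.memory_allocated(device) / 1024**3
reserved_gb = torch.cuda.memory_reserved(device) / 1024**3
print(f"allocated: {allocated_gb:.2f} GiB, reserved: {reserved_gb:.2f} GiB")

# Full breakdown from the caching allocator, useful to attach to this report.
print(torch.cuda.memory_summary(device))
```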
What you expected to happen
The image should be placed in the gallery.
How to reproduce the problem
Start the app, choose a model, and start generating; when it finishes, it OOMs.
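To watch the VRAM live while reproducing this (the same numbers nvtop shows), something like the following could be left running in another terminal. This is a sketch that assumes the nvidia-ml-py (pynvml) package is installed and the P40 is GPU index 0.

```python
import time
import pynvml

# Poll the GPU's memory every second so the spike at the end of
# generation (right before the image reaches the gallery) is visible.
pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # GPU index 0 assumed
try:
    while True:
        mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
        print(f"used: {mem.used / 1024**3:.2f} GiB / {mem.total / 1024**3:.2f} GiB")
        time.sleep(1)
except KeyboardInterrupt:
    pass
finally:
    pynvml.nvmlShutdown()
```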
Additional context
No response
Discord username
No response