
[Bug]: VRAM not released after failing to create image due to lack of VRAM #12511

Closed
1 task done
hungtooc opened this issue Aug 13, 2023 · 5 comments
Labels
bug-report Report of a bug, yet to be confirmed

Comments

@hungtooc

Is there an existing issue for this?

  • I have searched the existing issues and checked the recent builds/commits

What happened?

Everything worked fine until I requested an image with settings too demanding for my machine. As expected, the image could not be created, but the VRAM was never freed afterward.

Steps to reproduce the problem

Launch with --api and create a txt2img request via the API.
Request an image with settings the machine cannot handle, typically a very high resolution.

What should have happened?

The image creation should fail, and the VRAM should then be released.
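The expected behavior above follows a common PyTorch pattern: run the generation under try/finally so that, even when a CUDA out-of-memory error is raised, Python references to intermediate tensors are dropped and the cached allocator blocks are returned. A minimal sketch of that pattern (this is an illustration, not the webui's actual code; `generate_with_cleanup` and `generate_fn` are hypothetical names):

```python
import gc

def generate_with_cleanup(generate_fn, *args, **kwargs):
    """Run an image-generation callable and always attempt to release
    cached VRAM, even when generation raises (e.g. CUDA out-of-memory)."""
    try:
        return generate_fn(*args, **kwargs)
    finally:
        # Drop Python references to any intermediate tensors first,
        # so the allocator can actually reclaim their memory.
        gc.collect()
        try:
            import torch
            if torch.cuda.is_available():
                # Return cached allocator blocks to the CUDA driver.
                torch.cuda.empty_cache()
        except ImportError:
            pass  # torch not installed; nothing to free
```

The key point is the `finally` block: without it, an out-of-memory exception can leave the failed request's tensors alive (e.g. held by the traceback or by module-level state), which is consistent with the symptom reported here.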

Version or Commit where the problem happens

version: v1.5.1  •  python: 3.11.3  •  torch: 2.1.0.dev20230727+cu121  •  xformers: N/A  •  gradio: 3.32.0  •  checkpoint: ed989d673d

What Python version are you running on ?

Python 3.11.x (above, not supported yet)

What platforms do you use to access the UI ?

Linux

What device are you running WebUI on?

Other GPUs

Cross attention optimization

None

What browsers do you use to access the UI ?

Google Chrome

Command Line Arguments

--api

List of extensions

Console logs

out of vram

Additional information

none

@hungtooc hungtooc added the bug-report Report of a bug, yet to be confirmed label Aug 13, 2023
@catboxanon
Collaborator

I think this was fixed on the dev branch with 0af4127 and ccb9233. When you can, I would check out the dev branch and see whether that resolves it.

@hungtooc
Author

> I think this was fixed on the dev branch with 0af4127 and ccb9233. When you can I would checkout the dev branch and see if that resolves it or not.

Oh, thanks. My mistake; I will check out the dev branch first.

@catboxanon
Collaborator

catboxanon commented Aug 13, 2023

Just a heads up: I made two PRs to fix the (to my knowledge) remaining VRAM issues. You might want to wait until those are reviewed and merged too.
#12514
#12515

@2blackbar

This comment was marked as off-topic.

@catboxanon

This comment was marked as off-topic.
