
>11GB VRAM

Aaarrrgghh let me know when it's down to 4GB like Stable Diffusion

The prompt-based masking sounds incredible, with either pixel +/- or Prompt Relevance +/-

VERY impressive img2img capabilities!



It is Stable Diffusion, but yes, my fork doesn't have the memory optimizations needed to run it on only 4 GB.


You can get a used 2080Ti for under $300 on eBay


That's a lot of money for most people. It also means they have to have a PC to put it in.


Thank you for your sensible response... Very happy with my 6 GB VRAM card, and I don't have $300 lying around to spend on a git repo that will probably be slimmed down in a month or two.


Or, if there is a Colab version, I'd be happy to pay Google for premium GPU.


Well, just open a new GPU Colab, create a cell with "!pip install imaginairy", and you should be good to go...
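
For reference, a minimal Colab notebook cell might look like the sketch below. This assumes imaginAIry exposes an `imagine` command-line entry point after install; the exact flags and default output location may differ by version, and a GPU runtime must be selected first.

```shell
# Install imaginAIry into the Colab runtime (GPU runtime required)
!pip install imaginairy

# Generate an image from a text prompt
# (output directory is an assumption; check the project README)
!imagine "a scenic mountain landscape"
```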


It does work in non-pro colab apparently. Here you go: https://colab.research.google.com/drive/1rOvQNs0Cmn_yU1bKWjC...



