
I have the same problem, and I have a powerful 2080 Ti that I got from an Illumicorp member. So what the heck shall I do to fix that? Can someone give me a solution? Please answer fast, I don't want to wait years for an answer.

Hi daciansolgen3,

Just a few questions first:

1. What resolution is your input media?

2. Are you using the Split Frames option? If so, what are your settings?

3. In the error message, how much VRAM does it say is reserved by other applications? How much VRAM is reserved by PyTorch and how much does it say you have left?
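For question 3, if you want to double-check the numbers yourself outside of the error message, here's a quick sketch in Python. This assumes you have a CUDA build of PyTorch installed (the same library DainApp uses under the hood) and that it's recent enough to have mem_get_info:

    import torch

    # Free and total VRAM on GPU 0, as reported by the driver (in bytes)
    free, total = torch.cuda.mem_get_info()
    # VRAM currently held by this PyTorch process's caching allocator (in bytes)
    reserved = torch.cuda.memory_reserved()

    print(f"Total VRAM: {total / 1024**3:.2f} GiB")
    print(f"Free VRAM: {free / 1024**3:.2f} GiB")
    print(f"Reserved by this process: {reserved / 1024**3:.2f} GiB")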


I can't offer much specific help without answers to the above questions, but I'll try to give some general advice.

If you're running out of VRAM with a 2080 Ti, then I can only assume your input media is larger than 1080p. If you're trying to interpolate these larger frame sizes in DainApp and you get an out-of-memory message, you need to turn on the "Split Frames" option under the "Fix OutOfMemory Options" tab.
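For a rough sense of scale, here's some simple pixel arithmetic (an illustration of why larger-than-1080p frames hit the limit, not an exact VRAM figure):

    # 4K UHD has four times the pixels of 1080p, so the per-frame memory
    # DAIN needs grows roughly in proportion.
    pixels_1080p = 1920 * 1080
    pixels_4k = 3840 * 2160
    print(pixels_4k / pixels_1080p)   # 4.0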

Leave the X=2, Y=2 defaults and the 150px padding as they are for now and try feeding the frames to Dain. If it starts rendering with no error, I'd close DainApp, restart it, re-select all the previous options, and then reduce either the X or Y splits to 1.
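To illustrate why splitting helps, here's some back-of-the-envelope arithmetic. I'm assuming the padding is added as a border on each tile, which may not be exactly how DainApp applies it internally, so treat this as an illustration of the idea rather than the app's exact behaviour:

    # Rough per-tile size for a 4K frame split into an X by Y grid with padding.
    width, height = 3840, 2160        # example 4K source frame
    x_splits, y_splits = 2, 2         # the default split setting
    padding = 150                     # the default padding (assumed per tile edge)

    tile_w = width // x_splits + 2 * padding
    tile_h = height // y_splits + 2 * padding

    full_pixels = width * height
    tile_pixels = tile_w * tile_h
    print(f"Full frame: {full_pixels:,} px")
    print(f"One tile: {tile_pixels:,} px (~{tile_pixels / full_pixels:.0%} of the full frame per pass)")

Even with the padding overhead, each pass only has to hold a fraction of the frame in VRAM, which is why more splits get you past the OutOfMemory error at the cost of extra processing time.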

The idea is that you want as few splits as possible whilst avoiding the OutOfMemory error.

If you still get an error at X=2 Y=2, then add 1 to either axis and try again. Keep adding 1 to one axis, then the other, until you don't get an OutOfMemory error.
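If it helps to see that trial order written down, here's a little sketch. DainApp is a GUI, so this is just a way of writing out the strategy, not something the app runs, and the function name is made up:

    def split_candidates(start=2, limit=6):
        """Yield (X, Y) split settings: start at (start, start), then add 1
        to one axis, then the other, alternating, until both hit the limit."""
        x = y = start
        yield x, y
        while x < limit or y < limit:
            if x <= y:
                x += 1
            else:
                y += 1
            yield x, y

    print(list(split_candidates()))
    # [(2, 2), (3, 2), (3, 3), (4, 3), (4, 4), (5, 4), (5, 5), (6, 5), (6, 6)]

Stop at the first setting that renders without the error, and if the very first one works, try going the other way (reducing a split to 1) as described above.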

Also, the Experimental interpolation algorithm uses more VRAM than the Default algorithm, so bear that in mind.


Don't run any other GPU-based program at the same time as Dain, as this will reduce your available VRAM as well as increase your interpolation time. Even if a GPU-intensive program has been closed, VRAM may still be reserved for that program, effectively reducing the available VRAM for other programs, including Dain.
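If you want to see what's actually holding VRAM at any given moment, NVIDIA's nvidia-smi tool (installed with the driver) prints a process table showing which programs are currently using the GPU and, where the driver exposes it, how much memory each one holds (on Windows the per-process figure can show as N/A). You can run it straight from a terminal, or from Python like this:

    import subprocess

    # Print nvidia-smi's summary; the process table at the bottom shows
    # which programs are currently holding VRAM.
    print(subprocess.run(["nvidia-smi"], capture_output=True, text=True).stdout)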


"i have power ful 2080 ti that i got from illumicorp member"

I had to look up what Illumicorp is, but I would try to stick to reputable sources for GPU procurement. There have been many scams over the last couple of years selling GPUs advertised as the latest models that are actually lower-cost GPUs fitted with cooler shrouds from expensive models and modified BIOSes that make them report themselves as the expensive models. Pretty much the only ways to spot these are, firstly, the rule "if it sounds too good to be true, it most likely is", then seller ratings and reviews, and finally benchmarks and/or a tear-down of the physical hardware.

Maybe the first thing I would do is benchmark your GPU using something like Geekbench and compare the result to average CUDA benchmark scores for a 2080 Ti.
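Alongside a benchmark, a quick sanity check is to see what the card reports about itself, assuming the same CUDA-enabled PyTorch install as in the snippet above. A genuine 2080 Ti should report roughly 11 GiB of total memory; a very different figure is a red flag, although a matching name alone proves nothing, since a modified BIOS can spoof it, which is exactly why the benchmark comparison matters:

    import torch

    props = torch.cuda.get_device_properties(0)
    print(props.name)                                        # should mention "RTX 2080 Ti"
    print(f"{props.total_memory / 1024**3:.1f} GiB VRAM")    # roughly 11 GiB on a real 2080 Ti
    print(f"Compute capability: {props.major}.{props.minor}")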


Or this post might be an r/whoosh moment.