My current setup is two GTX 1080s. Is it possible to distribute the load between them instead of using just one? Or is there an update planned to make this possible?
I found a rather convoluted way to make DAIN on Windows use 2 or more GPUs. It's a bit of a hack, but it's working for me. Here are the steps:
1. Extract 2 DAIN packages to 2 different folders. Call them DAIN GPU1 and DAIN GPU2 just for your reference.
2. Split the video you wish to process into 2 different videos.
3. Set the environment variable CUDA_VISIBLE_DEVICES to 0. To do this, go to This PC -> Properties -> Advanced system settings -> Environment Variables -> System variables.
4. Start the first DAIN (DAIN GPU1) with the first half of the video. It should start processing with the first GPU.
5. Set the value of the CUDA_VISIBLE_DEVICES environment variable to 1.
6. Start the second DAIN (DAIN GPU2) with the second half of the video.
7. You should now have both GPUs working. As far as I can tell this works for even more GPUs: just keep incrementing the CUDA_VISIBLE_DEVICES value and launch more DAIN instances.
8. Join both resulting videos with a video editing app of your choice.
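The per-GPU pinning in steps 3–6 can also be done without editing the system-wide environment variable between launches: each process can be handed its own copy of the environment with CUDA_VISIBLE_DEVICES set. A minimal Python sketch of the idea (the echo command is a placeholder, not DAIN's real command line):

```python
import os
import subprocess
import sys

def run_on_gpu(command, gpu_index):
    """Run a command pinned to one GPU by setting CUDA_VISIBLE_DEVICES for it only."""
    env = os.environ.copy()
    env["CUDA_VISIBLE_DEVICES"] = str(gpu_index)  # child process sees only this GPU
    return subprocess.run(command, env=env, capture_output=True, text=True)

# Placeholder command that just echoes the variable back; swap in the real
# DAIN launch command (one per half of the split video) here.
echo_cmd = [sys.executable, "-c",
            "import os; print(os.environ['CUDA_VISIBLE_DEVICES'])"]
results = [run_on_gpu(echo_cmd, gpu) for gpu in range(2)]
for r in results:
    print(r.stdout.strip())  # prints 0, then 1
```

Note that `subprocess.run` blocks until the command finishes, so to actually keep both GPUs busy at once you would launch with `subprocess.Popen` and wait on both handles; the per-process environment trick is the same either way.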
I hope this helps.
I would say official support for multiple GPUs should be top priority. I am trying to convert a 3-minute film from 16 fps to 64 fps, and it is likely to take over 2 days on a single 1080 Ti. There are render farms on the cloud that support GPU rendering, but without multiple-GPU support this gets prohibitively expensive.
For example, https://www.xesktop.com/ lets you log in via remote desktop and run your own software. Their machine specs and prices are listed below. With multiple-GPU support, using DAINapp for longer films suddenly becomes a viable option: instead of 48 hours and $288, I could get it done in about 5 hours for $30.
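The back-of-the-envelope math works out if you assume a node billed at roughly $6/hour with 10 usable GPUs and near-linear scaling across them (my reading of the numbers above, not a quote from the provider):

```python
# Assumptions (illustrative, matching the figures in the post):
# a node billed at $6/hour, 48 hours of render time on a single GPU,
# and near-linear speedup when splitting the work across GPUs.
RATE_PER_HOUR = 6.0       # assumed node price in USD
SINGLE_GPU_HOURS = 48.0   # estimated single-GPU render time

for gpus in (1, 10):
    hours = SINGLE_GPU_HOURS / gpus
    cost = hours * RATE_PER_HOUR
    print(f"{gpus:2d} GPU(s): {hours:.1f} h, ${cost:.0f}")
```

With one GPU that is 48 h and $288; with ten it drops to 4.8 h and about $29, matching the "5 hours for $30" estimate.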
GPU server specs:
Node type 1
Node type 2