
I might be in over my head..

I'm getting an error that reads

 File "transformers\models\gpt2\modeling_gpt2.py", line 322, in forward

torch.cuda.OutOfMemoryError: CUDA out of memory. Tried to allocate 20.00 MiB (GPU 0; 4.00 GiB total capacity; 2.56 GiB already allocated; 10.64 MiB free; 2.61 GiB reserved in total by PyTorch) If reserved memory is >> allocated memory try setting max_split_size_mb to avoid fragmentation.  See documentation for Memory Management and PYTORCH_CUDA_ALLOC_CONF

Any ideas?

This error means your GPU ran out of VRAM while running the model (the traceback points to GPT-2's forward pass, and only ~10 MiB of your 4 GiB card was free). Try closing any other apps that might be using the GPU, and restart your computer for good measure before relaunching the software.
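The error message itself also suggests a tweak: setting max_split_size_mb via PYTORCH_CUDA_ALLOC_CONF can help when reserved memory is much larger than allocated memory (fragmentation). A minimal sketch, assuming you can edit the script before torch is imported; the 128 MiB value is a starting point, not a tuned setting:

```python
import os

# Must be set before torch is imported, or the allocator won't pick it up.
# max_split_size_mb:128 is an assumed starting value from the error's hint.
os.environ["PYTORCH_CUDA_ALLOC_CONF"] = "max_split_size_mb:128"
```

If you have access to the model-loading code, transformers also lets you load GPT-2 in half precision (e.g. `from_pretrained("gpt2", torch_dtype=torch.float16)`), which roughly halves VRAM use on a 4 GiB card.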