
Problem downloading files

A topic by Taru created Jun 24, 2020 Views: 794

I've been trying to download some of the games I got in the Bundle for Racial Justice and Equality and have been having immense problems with the larger files. Games on the order of several hundred MB and upwards are either completely impossible to download or only come in after an inordinate number of tries. https://ko-op.itch.io/gnog (which is only 416 MB) only came in successfully after probably a dozen tries, with me watching it like a hawk. Larger games such as https://no-wand-studios.itch.io/the-fall-of-lazarus (2 GB), https://joure.itch.io/play-with-gilbert (4 GB) and https://joure.itch.io/stardrop (9 GB) just will not complete (thanks are due to Joure for splitting Gilbert and putting it somewhere else for me to grab; I feel awkward asking the same for Stardrop even though it looks awesome!). Perversely, https://deimos-games.itch.io/helium-rain, at 1 GB, did come in successfully (after a few tries, natch).

When a download attempt fails, even if I'm watching it and try to resume immediately, it will not resume properly and I end up with a 0-byte "file". Sometimes this failure happens a few hundred MB in, but occasionally - and this is the real kicker - it will give every appearance of success, slowly but steadily plodding along, and then fail in the last few seconds. Given that my ADSL runs over an ancient piece of rotting GPO copper that BT (I'm in the UK) refuse to replace because profit, and I only get circa 300 to 310 kB/s down, sustained, pulling in large files takes a small eternity. Add to that the fact that I can't just download massive amounts of stuff whenever I like - off-peak hours are 8pm to 8am - and every hour wasted is another brick in the wall of my mounting fury (as it were). Being a night-owl, I'm currently trying to make use of every last minute :D

Looking at the actual links for the files I notice this is part of them:

[...]?GoogleAccessId=uploader@moonscript2.iam.gserviceaccount.com&Expires=[a 10 digit number usually starting with 15929]&Signature=[...]

and I'm wondering if the problem is this timing out while the download is still in progress: the link presumably assumes the transfer will complete before the session expires, but it takes longer than the allotted time and, if my connection wobbles (as it is wont to do sometimes... if it's raining, the phase of the moon is wrong, or I've not kept up my dues with the Great Old Ones (may I be eaten first) or whatever), it can't resume.
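One way to sanity-check that theory: the Expires value in these signed links is a Unix timestamp, so you can extract it and see exactly when the link dies. A minimal sketch, using a made-up placeholder URL (the real GoogleAccessId is as quoted above, but the host, filename, Expires value and Signature here are invented for illustration):

```shell
# Placeholder signed URL; only the Expires extraction matters here.
url='https://example.storage.googleapis.com/game.zip?GoogleAccessId=uploader@moonscript2.iam.gserviceaccount.com&Expires=1592900000&Signature=abc'

# Pull out the 10-digit Expires value from the query string.
expires=$(printf '%s' "$url" | sed -n 's/.*Expires=\([0-9]*\).*/\1/p')

# Print it as a human-readable date (GNU date syntax).
date -u -d "@$expires"

# Seconds of validity remaining (negative means the link has already expired).
echo $(( expires - $(date +%s) ))
```

If the remaining seconds are less than your expected transfer time at ~300 kB/s, the session would indeed time out mid-download.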

I should note that I've had no problem with large files from other places. I recently grabbed the entirety of The Sims 3 + all DLCs in a 17 GB archive from old-games.com. It might have dropped a couple of times but I was using wget -c --tries=0 in xterm (I'm a penguinista and run 'doze games in Wine). Even with the auto-generated, time-limited links on that site, if one expires and I have to refresh, wget recognises that the file is the same and happily resumes even though the URL has changed. Using wget to grab stuff off itch does not work, however, as it barfs on the links. Success or failure also doesn't seem to depend on whether my housemate or I are browsing the net at the same time; it's just as likely to die with all the bandwidth to itself.
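For anyone trying the same, one common reason wget "barfs" on these links is shell quoting rather than wget itself: the itch URLs contain '&', and an unquoted '&' makes the shell cut the command off and background it, so wget only ever sees a truncated URL. A sketch of the resumable-download invocation, with a made-up placeholder URL (the command is printed rather than run here):

```shell
# Placeholder signed URL; real itch links are much longer but have the same shape.
url='https://example.storage.googleapis.com/game.zip?GoogleAccessId=uploader@moonscript2.iam.gserviceaccount.com&Expires=1592900000&Signature=abc'

# -c resumes a partial file, --tries=0 retries indefinitely.
# Note there is no space before the '=' (--tries=0, not --tries =0),
# and the URL must be quoted so the '&' characters reach wget intact.
cmd="wget -c --tries=0 -O game.zip '$url'"
echo "$cmd"
```

Whether resuming succeeds after the link's Expires timestamp passes is a separate question; -c only helps while the signed URL is still valid (or after fetching a fresh link and saving to the same filename).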

If it matters, I'm generally using Firefox 68.9.0esr 64-bit (on Gentoo Linux) but, in case there's an issue with plugins, I've also tried Vivaldi, Opera and SeaMonkey and had the same trouble, so I'm at a bit of a loss as to what to do...

(Apologies for long post)
