Thanks! There is some logic to filter out these kinds of submissions on the server, to avoid showing them to other players (the submitter will still see their scores, however, as in your screenshot).

Having said that, the current filtering mechanism is not perfect, so a determined player could certainly get their bogus scores to appear on those histograms (side question: if someone goes to that much trouble, should I be flattered?). I could add a heuristic that runs programs on random inputs and checks that the results are consistent with those produced on the fixed inputs.
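
For what it's worth, a heuristic like that could be fairly simple. Here's a rough Python sketch of the idea, not the actual server code; `run_program` and `reference_solution` are placeholders for the game's interpreter and a known-good solver:

```python
import random

def reference_solution(test_input):
    ...  # stand-in for a known-good solver

def run_program(program, test_input):
    ...  # stand-in for the game's interpreter/VM

def looks_legitimate(program, fixed_inputs, num_random=10, input_size=32):
    # Pass 1: the published, fixed inputs (what the player optimized against).
    for test_input in fixed_inputs:
        if run_program(program, test_input) != reference_solution(test_input):
            return False
    # Pass 2: freshly generated random inputs the submitter could not have hard-coded.
    for _ in range(num_random):
        test_input = [random.randint(0, 255) for _ in range(input_size)]
        if run_program(program, test_input) != reference_solution(test_input):
            return False
    return True
```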

I don't know if this is possible with the way the game is set up, but what I would recommend is simply restarting the player's solution on the second set of tests. The way I abused the system was by hard-coding the correct output for the first test above the actual solution; if the program were to run from the start for the second set of tests, that hard-coded output would cause the solution to fail.
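
Roughly, in Python rather than the game's own language (and with made-up values), the trick looks something like this: the canned answer is emitted exactly once, then the real logic takes over. If state carries over into the second test set, the canned answer has already been consumed; if the program restarts from scratch, it gets re-emitted against a different input and fails.

```python
HARDCODED_FIRST_OUTPUT = [4, 8, 15, 16, 23, 42]  # precomputed answer for test 1

already_emitted = False

def step(current_input):
    # Emit the canned answer once, then fall through to the real logic.
    global already_emitted
    if not already_emitted:
        already_emitted = True
        return HARDCODED_FIRST_OUTPUT
    return real_solution(current_input)

def real_solution(current_input):
    ...  # stand-in for the genuine algorithm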

I haven't looked at the code in a while, but it's likely that this is exactly what I'm doing on the server side. It would make sense to do the same thing on the client side (I don't recall if there was a reason for the discrepancy), although that could potentially break some existing solutions. Regardless, thanks for flagging this for me!

Just released an updated client that resets the simulation (and stats) when advancing to the next test set. This is similar to what is done server side, and should prevent this trick (except for the handful of "tutorial" assignments that don't validate solutions using random data).
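
In broad strokes, the reset boils down to something like the sketch below (Python for illustration; `Simulation`, `Stats`, and `run_test_set` are illustrative names, not the actual client code):

```python
class Stats:
    def __init__(self):
        self.cycles = 0
        self.instructions = 0

class Simulation:
    def __init__(self, program):
        self.program = program
        self.reset()

    def reset(self):
        # Wipe runtime state and counters so nothing carries over between test sets.
        self.memory = {}
        self.program_counter = 0
        self.stats = Stats()

def run_test_set(sim, test_set):
    ...  # stand-in: execute the program against this test set and score it

def run_all_test_sets(program, test_sets):
    sim = Simulation(program)
    results = []
    for test_set in test_sets:
        sim.reset()  # fresh start for every test set, matching the server
        results.append(run_test_set(sim, test_set))
    return results
```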