This is actually a very common problem in machine learning, including neural networks. I think what's happening here is that your neural network has reached an optimal (or close to optimal) combination of weights. So in the next generation, when the weight values are slightly "mutated" and then recombined, that optimal combination of weights is being "overshot."

One way of thinking about this: imagine you're trying to find the lowest point in a landscape. At 13.35, you've just about reached the lowest point possible (or at least gotten very close to it). So each generation after that, you're stepping over, or "overshooting," that minimum point.
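
To make the overshoot idea concrete, here's a tiny Python sketch (the loss function, starting weight, and step sizes are made up for illustration; this isn't your actual network). With a large mutation step, almost no random perturbation of an already near-optimal weight lands closer to the optimum, while a small step improves things about half the time:

```python
import random

# Toy 1-D "landscape": loss is the squared distance from an assumed optimum.
def loss(w, optimum=1.0):
    return (w - optimum) ** 2

w = 1.02  # a weight that is already very close to the optimum

for step in (0.5, 0.01):  # large vs. small mutation step
    mutated = [w + random.uniform(-step, step) for _ in range(1000)]
    improved = sum(loss(m) < loss(w) for m in mutated)
    print(f"step +/-{step}: {improved}/1000 mutations improved the loss")
```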

What you could do, to get even closer to that minimum point (or in this case, optimum fitness), is lower the mutation rate. That'll get you faster speeds, but the downside is that reaching those speeds will typically take longer.
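
In case it helps, here's a rough sketch of what a per-weight mutation rate could look like. The function name and parameters are placeholders since I haven't seen your code, and your GA may apply the rate per genome instead of per weight:

```python
import random

def mutate(weights, mutation_rate=0.05, strength=0.1):
    """Perturb each weight with probability `mutation_rate` (hypothetical helper)."""
    return [
        w + random.gauss(0, strength) if random.random() < mutation_rate else w
        for w in weights
    ]

# With a 5% rate, only about 1 in 20 weights changes per child,
# so a near-optimal genome is much less likely to be knocked off its peak.
parent = [random.uniform(-1, 1) for _ in range(20)]
child = mutate(parent, mutation_rate=0.05)
```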

Ok, thanks. The mutation rate was at 75%; now I'll lower it to 25%.

Ok, the mutation rate has been lowered to 5%... speed is now 11 and staying there.

Gonna try a 100% mutation rate.