

A member registered May 05, 2016

Recent community posts

Evolution community · Replied to wayfu in Bugs

Great catch, thanks! I'll fix this with the next update.

I was finally able to finish this one:

This is great! Thank you very much for this post!

Evolution community · Created a new topic FAQ

I created an FAQ page with answers to the most commonly asked questions about "Evolution":

Hi Bruno,

I hope you're doing well!

I finally managed to finish the FAQs. If you would like to see me elaborate on any of the answers there, just let me know and I'll add more information.

Best regards, 


It's a quick and easy way to exit the library, inspired by the fact that in the short story, the bodies of the dead librarians are thrown over the railings into the endless abyss.

So what resolution does your monitor have and what resolution were you trying to select?

The version number you see here is not the actual version number of the simulator; it just shows you how many files I've uploaded for a certain platform. I only recently began to include Linux builds at all. All of those builds have the same version.

That being said, I won't be fixing any Linux-specific bugs and the Linux build might be missing some features here and there. That's the only condition under which I'm able to offer a Linux version at all. It's either this or no Linux version.

This feature will be part of the next big update. I've already implemented it but there's a lot more that I'm adding in this update and I'm also quite busy with other projects and work at the moment, so it'll unfortunately take a little while before the new version gets released.

Thanks for your patience!

Go to the settings (the gear icon) and then click on the brain at the bottom. You can change the number of network layers and the number of nodes per layer.

Yes, I removed the evolution-core submodule and re-integrated all the scripts into the same repository. Man, was that submodule annoying to work with.

The only reason why the "evolution-core" submodule existed at all was because I played around with making a "true" 3D version of this simulator (possibly with AR integration) in a separate app and I wanted to reuse most of the existing scripts. The submodule allowed me to do just that but it was still a significant overhead. If I ever decide to make a 3D version of this, I'll just add it to the current app (it's probably not going to happen though).

Does this same creature design consistently glitch out in the current released version (other than Android)? If so, then the glitch wasn't introduced in the Optimization branch. There were definitely glitches happening before, but they increased drastically after my optimization attempts.

I'm in the process of adding import and export functionality to both creature designs and simulation save files. I'll let you know once that's ready so you can send me this design to test with.

I fixed the weird glitches that happened on the Optimization branch yesterday. (I also upgraded the project to Unity 2018.2 and reorganized the scripts a little bit). It's all merged into the master branch at the moment, if you're still interested. Thanks again for confirming the TextMesh instructions, they worked for me as well.

The problem was that I had forgotten to also reset the MuscleJoint rigidbodies when resetting the bones, which caused them to disconnect and glitch out.

I'm now working on the next big release with tons of new features that I'm very excited about.  

The short answers to your questions are the following:

The muscles work by applying force to the bones that they are connected to. The magnitude and direction (expansion vs. contraction) of that force is determined by the neural network output. Internally they use the Spring Joint component of Unity's physics system, which essentially acts like a real spring. The muscles fire at a constant rate.
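As a very rough illustration of that spring behaviour, here is a damped-spring sketch in Python. The names and the scaling are made up for illustration; this is not Unity's actual SpringJoint internals.

```python
def muscle_force(length, velocity, rest_length, stiffness, damping, nn_output):
    """Damped-spring force, where the neural network output in [-1, 1]
    shifts the spring's target length (contraction vs. expansion).
    Conceptual sketch only; names and scaling are illustrative."""
    target = rest_length * (1.0 + 0.5 * nn_output)  # made-up scaling factor
    return -stiffness * (length - target) - damping * velocity

# A contracting signal (nn_output < 0) produces a pulling force:
print(muscle_force(1.0, 0.0, 1.0, 10.0, 0.5, -1.0))  # -5.0
```

The key point is just that the network doesn't move the bones directly; it only modulates a spring-like force that the physics system then integrates.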

Thank you very much for all of your questions! (I've also seen the other batch but haven't had time to answer them appropriately yet.) I'm currently working on not only a big update for Evolution but also an FAQ page on my website, where I will try to answer all of these questions (and others that I've been getting) in more detail, so that I can then just link to those answers when someone asks them again. If you or your students have any more questions, just drop them here in this forum or send me an e-mail and I'll add them to my list of questions for the FAQ.

I'll let you know once that page is ready (I'm a little busy at the moment so it might take a while).

Here are two instances I just quickly found:

I definitely also remember there being more examples, since I specifically researched this question of potential payout fees a lot (around the beginning of this year).

I fully understand that the TOS have always given you the potential to apply any additional fees before making a payout, I have read them entirely. My point is that you have communicated something else, are now going against that and officially introducing new fees (perfectly okay) but also applying them retroactively (not okay).

Let's have an example. You are currently not charging a fixed hosting fee, but have instead implemented your "Open revenue sharing" model. You still have to pay hosting fees yourself, I think everybody knows that. Your TOS give you the option to deduct these fees from every payout but you have communicated that you don't. Based on this information people are deciding on a revenue share. Now, you could come along and replace the open revenue sharing model with a fixed percentage fee, but then saying "Well, we've always had to pay hosting fees, so now you're going to have to retroactively pay them as well" would again be unacceptable (even though technically in accordance with your TOS). It's a communication and trust issue.

As I've also tried to stress before, (at least to me) this is absolutely not about the amount of extra fees I've had to pay. As you said yourself, it was not that much in the grand scheme of things, which then again makes me understand even less why you didn't decide to just waive them as well instead of setting the precedent of introducing new fees against what was communicated and retroactively applying them.

Quick sidenote, since I just noticed it: The link for "Read more about PayPal fees" here is broken.

I agree that on average your payouts are definitely faster than most other platforms (I only experienced one very long and annoying delay when you were figuring out the whole tax stuff), but overall payout times are not really a problem at all. 

The part that is just unacceptable is retroactively applying payout fees after specifically stating multiple times that payouts have no fees associated with them. Just to be clear about this, I'm not talking about adjustments because of fraud or refunds. Those make perfect sense. It's also not about having to pay a payout fee at all if one exists (I understand that it's PayPal who's charging the fee).  It's about having trustworthy terms of service.

You cannot simply change your rules and fees and then retroactively apply them as if they had always existed (for the sellers - it doesn't matter if they already existed for you). All you're doing is making your site look less trustworthy. Again, I couldn't care less about the extra PayPal fee, since I can always readjust the revenue sharing slider to stay at an overall revenue loss of roughly 30% (payment processing fees + now payout fees + cut + PayPal currency conversion fees and lower rates).

I think you made a mistake here, not by introducing the new fees but by retroactively applying them. I really hope you guys can figure out ways for how to reduce these fees (e.g. talk to PayPal about payout fees, look into direct bank payouts at proper conversion rates...). I would love it if you were the ones to receive the 30% cut instead of PayPal but given the current amount of fees that are just passed onto the sellers that is not even close to being realistic at the moment.

I would highly recommend watching Grant Sanderson's video on neural networks. He visualizes everything beautifully:

It does take you to the room, I just tried it myself with the macOS version to double check. If you close the book and then pick it up again from the location that was shown to you in the search menu, it will be the exact same book that you just closed. 

If for some reason this doesn't work for you, then please let me know (including what exactly I have to do to reproduce the problem).

Hi Bruno,  it always makes me really happy to hear that this little simulator is being used for educational purposes, thanks for that!

The "fitness" score in this simulation is perfectly correlated to the creature's proficiency at the task. For example, the fitness score for the running task is computed as the horizontal distance from the start divided by a maximum distance value that a creature would have to reach in order to receive a perfect score of 100%. This maximum distance has been arbitrarily chosen by me. It obviously also scales with the simulation time of each generation, so that increasing that time doesn't make it any easier for the creatures to achieve a higher fitness score.
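As a rough sketch of that calculation (a Python illustration with made-up names and scaling, not the simulator's actual C# code):

```python
def running_fitness(horizontal_distance, simulation_time, base_max_distance, base_time):
    """Fitness as the fraction of an (arbitrarily chosen) maximum distance,
    scaled with simulation time so that longer generations aren't easier.
    All names and parameters here are illustrative."""
    max_distance = base_max_distance * (simulation_time / base_time)
    # Clamp to [0, 1], i.e. 0% to 100% fitness
    return max(0.0, min(1.0, horizontal_distance / max_distance))

print(running_fitness(30.0, 10.0, 60.0, 10.0))  # 0.5 -> 50% fitness
```

Doubling the simulation time also doubles the distance needed for the same score, which is the scaling behaviour described above.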

During the reproduction step, two creatures are selected and their chromosomes are recombined using 1-point crossover, which produces two new offspring. This step is repeated until the new generation has reached the chosen population size. Creatures with a higher fitness score have a higher chance of being selected as parents for reproduction and since the same creature can also be selected multiple times, the fitness score of a creature effectively contributes to the number of offspring that will carry part of the genetic information of this parent creature into the next generation.
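A minimal sketch of that selection-and-crossover loop, assuming chromosomes are plain strings (as in the simulator) but with otherwise illustrative names:

```python
import random

def select_parent(population, fitnesses):
    """Fitness-proportionate selection: a creature's chance of being
    picked as a parent is proportional to its fitness score."""
    return random.choices(population, weights=fitnesses, k=1)[0]

def one_point_crossover(parent_a, parent_b):
    """1-point crossover: cut both chromosome strings at the same
    random point and swap the tails, producing two offspring."""
    point = random.randrange(1, len(parent_a))
    return (parent_a[:point] + parent_b[point:],
            parent_b[:point] + parent_a[point:])

# Repeat until the new generation reaches the population size:
population, fitnesses = ["AAAAAA", "BBBBBB", "CCCCCC"], [0.2, 0.9, 0.5]
offspring = []
while len(offspring) < len(population):
    a = select_parent(population, fitnesses)
    b = select_parent(population, fitnesses)
    offspring.extend(one_point_crossover(a, b))
offspring = offspring[:len(population)]
```

Note that select_parent can return the same creature multiple times, which is exactly how a higher fitness score translates into more offspring carrying that creature's genes.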

I might allow the user to select between different selection and recombination strategies in a future update, in order to offer a few more options for everybody to experiment with.

Best regards, 

You can change the time per generation during the simulation from the pause screen.


I use TextMesh Pro to be able to inline images into the help text. It seems like I have to go through a whole process of upgrading the project to make sure TextMesh Pro continues to work properly.

Internally, the currentCreatureBatch array is always used to simplify the implementation. When you turn off batch simulation, it's practically the same as if you manually set the batch size to the population count. The sorting has to be done on currentGeneration in order for it to sort the entire population and not just the current batch before the selection step, but if you have batches turned off, both arrays contain the same creatures anyway, so it doesn't make a difference.

The SetActive(false) is definitely necessary. Removing it changes the behaviour of the creatures and therefore also any previously saved simulation. That's why you see the fitness drop from 11.5 to 8.5. Since the entire movement relies on Unity's physics system, it is incredibly easy to break accidentally (I wasted a lot of time on exactly this one SetActive(false) call when writing the optimization branch). So even if this was causing the problems you are referring to, I couldn't just remove the call but would have to work around it in another way. That being said, GameObjects in Unity don't lose their transform position values when you deactivate them, so the values should still all be valid at that point in time.

Thanks for helping me debug this by the way! I'm sorry I can't test the actual project at the moment, but I will as soon as I have time. The creature movement looks great by the way!

In the picture you posted, the horizontal distance also seems to be way outside of the normal range. It would be nice if you could send me the saved simulation file (stored in the Application.persistentDataPath). I'll probably have time to work on this project again in roughly a month from now.

Thanks for confirming the improved memory management!

Now, while I still believe that a regular creature is highly unlikely to reach 100% fitness without any bugs helping, if you want to allow your creatures to improve beyond 100% fitness, you could also just remove the upper clamping limit of 1.0 where the fitness values are calculated inside of each Brain implementation (it should be within the respective Update call). That way you could keep the MAX_DISTANCE value at whatever it is, which would allow you to compare your creatures' performance to designs made by other people who are running an unmodified version of the simulator. As far as I can remember, fitness values above 1.0 should not cause any issues with the remaining code.

By the way, now that you had to remove TextMesh Pro, does the Russian and Portuguese text in the help center still show? I think I read somewhere that TextMesh Pro is now part of Unity itself but I haven't had any time to try to port this project to the 2018 version yet.

There will be a lot of significant memory usage optimizations in the next update, so I'm hoping that that's going to fix it.

There is no limit on the number of save files. You can create new save files using the "Save" button or the autosave toggle during simulation. You can't create new save folders.

It's most likely due to Unity's physics system not being deterministic and consistent between different simulation runs. 

If you click on the filename or the little eject arrow, it opens up a dropdown view of all of your savefiles that you can then choose from. I'm going to have it open by default in the next update.

It's currently using the centroid (the average of all joint positions) for the fitness calculation.
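For reference, the centroid here is just the average of all the joint positions. A quick illustration (the function name is mine, not the actual code):

```python
def centroid(joint_positions):
    """Average of all joint (x, y) positions -- the point whose
    horizontal travel feeds into the fitness calculation.
    Illustrative sketch, not the simulator's actual code."""
    xs = [p[0] for p in joint_positions]
    ys = [p[1] for p in joint_positions]
    return (sum(xs) / len(xs), sum(ys) / len(ys))

print(centroid([(0.0, 0.0), (2.0, 0.0), (1.0, 3.0)]))  # (1.0, 1.0)
```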

I can't really give you that option within the release version of the simulator (other than maybe implementing different ones myself and just letting you select the one you want), but if you like, you can clone the source code from my Github and play around with that yourself. There's a Mutate method in the Evolution class that takes two parent chromosome strings and returns two offspring chromosomes, so that would be the only thing you'd have to change.

The project is not open-source - I'm currently not looking for collaborators and you can't redistribute the code so essentially all the regular copyright stuff applies - but feel free to experiment and play around with the code for any kind of personal or educational use! You can also send me an e-mail if you have problems with running the project or have any other questions. It might take a little while before I can answer because I'm a little busy at the moment, but I'll try to respond as quickly as I can. As a sidenote, I used Unity 2017.4.0 for the latest released version and haven't tested to see if anything is potentially broken in Unity 2018.2, so I'd recommend getting the 2017.4.0 version from the archives in case you want to edit the project.

Thanks! No, there is no manual learning rate or enforced convergence that I have control over. All algorithms are exactly the same no matter the size and complexity of the network. I have seen a few impressive results with networks that were significantly larger than the default setting, but I haven't found a clear guideline to consistently achieve that. 

Manual or auto saves? Also, are you simulating the exact same creature design with the same name (or unnamed) in all instances?

If you like, you can send me the full Unity player log to and I'll try to look into the problem for the next update.

On the topic of manually editing save files: Here's a fantastic video that offers a lot of insight into how the save files are structured and how you can even edit them by hand if you know what you're doing.


1. + 2. The distance to reach 100% fitness is set by me pretty arbitrarily. I just decided on distances that would be quite hard to reach with most creature designs (that aren't glitched out). So the fitness is just the percentage of this arbitrary distance that your creature managed to travel in the given time (where the max distance is also scaled by how much time you give the creature - so if you give them twice as much time, they would need to travel twice as far to get the same fitness score). 

The selection probability is proportional to the fitness (Fitness Proportionate Selection - Wikipedia). So if you have 4 creatures that all reach a 1% fitness score, they are each going to have a 25% chance of being selected for each necessary recombination step. 

3. The brain is a simple feed-forward neural network. Each node takes all of its inputs, multiplies each one with a weight (the weights are the things that are optimized during the simulation and they determine the behaviour), adds them all together and runs that result through a so-called activation function (1 / (1 + exp(-x))), which calculates the output of this particular node. 3Blue1Brown (on YouTube) made some really good videos on neural networks recently, so I'd definitely recommend checking them out. The whole topic is pretty math-heavy though.

4. I've definitely seen better results that way than having it the other way around (e.g. 5-20-100), but I've seen even better results with a more balanced node distribution. You're right, there are lots of deep neural networks, for example for pattern recognition, that are set up with decreasing layer sizes, but in this case that doesn't seem to be the optimal solution. Maybe it can actually produce better results, but only after a very long simulation.
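To make point 3 concrete, here is a single node of such a feed-forward network in Python (an illustration only; the actual implementation is in C#):

```python
import math

def sigmoid(x):
    """The activation function mentioned above: 1 / (1 + e^-x)."""
    return 1.0 / (1.0 + math.exp(-x))

def node_output(inputs, weights):
    """One node of a feed-forward network: multiply each input by its
    weight, sum everything up, and squash the result through the sigmoid.
    The weights are what evolution tunes over the generations."""
    return sigmoid(sum(i * w for i, w in zip(inputs, weights)))

print(node_output([1.0, 0.5], [0.0, 0.0]))  # sigmoid(0) = 0.5
```

A full layer is just many of these nodes fed with the same inputs but different weights, and the network chains several layers together.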

I only store the brain of the best creature in its chromosome representation, i.e. as a string, which is an immutable object in C#, so there is no way for me to accidentally modify it without replacing the whole thing. On top of that, I have also already tested and confirmed that the creature in the "best of" screen always has the exact same brain it had during the simulation.

In theory, the fitness should never decrease if the "keep the best creature" setting is turned on, but yes, I have also seen this happen. I currently attribute both of these problems to the fact that Unity's physics system (which the whole simulation is based on) depends quite a lot on random external variables, such as frame rate drops, and given its approximate nature isn't as deterministic as it should be.

I'm trying to optimize the performance as much as I possibly can for the next update in order to get rid of any potential performance spikes that might interfere with the physics system. I unfortunately can't change the physics system itself so there is a certain performance limit that I can possibly optimize to. If you create tons of creatures with a bunch of body parts each, the physics system is by far the main reason for the extreme lag that you will see, and there's pretty much nothing I can do about that, which sucks. I can't even hardware-accelerate it, even though the fact that none of the creatures collide with each other is a perfect basis to parallelize those physics calculations on the GPU.

Yes, I'm aware of that. It's a Unity bug, which I also pointed out over in this thread. They say that they fixed it in the latest update but that doesn't really mean anything and I haven't had time to check yet.

These updates are great!
Possibly another small and easy-to-implement improvement: add a quick link to a project's community page from the dashboard, for example in the "more" dropdown under "interact". That would be awesome.

When the "Keep best creatures for the next generation" option is selected, the chromosome / brain of the best creature is copied over to the next generation without any mutation. It might even be the best two creatures - I don't quite remember that right now. But they are definitely not altered in any way.
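Sketched out in illustrative Python (the elite_count parameter is made up, since as I said I don't remember whether it's the best one or two creatures):

```python
def next_generation_elites(population, fitnesses, elite_count=1):
    """'Keep best creatures': the top chromosome(s) are copied into the
    next generation completely unchanged (no mutation, no crossover).
    Names and elite_count are illustrative, not the actual code."""
    ranked = [c for _, c in sorted(zip(fitnesses, population), reverse=True)]
    elites = ranked[:elite_count]  # survive unaltered
    # ...the remaining slots are then filled via the usual
    # selection / crossover / mutation steps...
    return elites

print(next_generation_elites(["a", "b", "c"], [0.2, 0.9, 0.5]))  # ['b']
```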

There will definitely be more updates in the future but it's also going to take a while before I have time to work on it again. I really only have a couple of weeks every year to spend on this. This project isn't and has never been my primary focus, but it's also not going to be abandoned any time soon :)

Answer to Question No. 3 over here: