Compute Shaders or How I Stopped Worrying about AI and Loved Codex

A topic by The Comfy Collective created 90 days ago Views: 449 Replies: 4
(1 edit) (+1)

What if there was something that could unlock the power of compute shaders without taking 10 years of development from an 8-person team? Does anyone else have experience with AI agents and compute shaders? What ideas do you have for using them? I think using them for sims is a perfect match. For example, can you imagine how awesome pretty much anything from Oxygen Not Included to Dwarf Fortress, or even SimCity/Cities: Skylines, would be with the power of compute shaders? Let's talk about this, because it's like replacing a go-kart motor (the CPU) with a jet engine (the GPU). I think compute shaders in sim games are long overdue, and the tech is here now to do it. Does anyone have any thoughts, questions, or anything to keep the conversation going?

(+1)

Technically speaking, a shader is just any program that runs on the GPU. But based on what you're saying, I'm assuming that by "shaders" you mean programs that do things such as add a stylized look to the game or make it cel-shaded. The uses for these kinds of shaders aren't genre-specific; they're more art-style related. Also, the games you listed probably wouldn't look much better with shaders because they are already highly stylized (except for Cities: Skylines, but I don't think that one would look good in a cartoony style). I also don't understand how using shaders is "replacing the CPU with the GPU," because all shaders already run on the GPU, as does everything related to rendering. Additionally, how are shaders in sim games overdue? The tech has been there for a while; I think the problem is just that the sim player base doesn't want to play a heavily stylized city-building game. Realistically, the only shaders you'd write are ones that add style to the game, which either already exist on marketplaces or shouldn't be too outrageously hard to learn to make (or at least not 10-years hard). And even then you'd only be writing one or two, because after too many a game can start to look bad.
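For what it's worth, the kind of stylization shader described here really is only a few lines. A minimal sketch of the cel-shading step in HLSL (the `_LightDir` and `_Bands` names and the lighting setup are illustrative assumptions, not from any particular engine):

```hlsl
// Quantize diffuse lighting into a few discrete bands to get a
// cel-shaded look. The surrounding shader is assumed to supply
// _LightDir, _Bands, the surface normal, and the albedo color;
// this is only the toon step itself.
float3 CelShade(float3 normal, float3 albedo)
{
    float ndotl = saturate(dot(normalize(normal), -_LightDir));
    float banded = floor(ndotl * _Bands) / _Bands; // e.g. _Bands = 3
    return albedo * banded;
}
```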

(+1)

I kinda interpreted the question as being about tasks that are normally CPU-related.

I think the example in the OP about simulation games would be a good use. I think I'd be on board, but with the note that I don't think it'll change the fact that CPUs still do a lot of the heavy lifting.

(+2)

I'm realizing my post was wrong because I didn't realize what becool meant by compute shaders. But now that I know, I don't really see how this would help improve the game, nor do I understand how Oxygen Not Included would have been improved by having computational tasks executed on the GPU. Also, the only benefit you get is being able to run your code in parallel, which is harder to write and takes more learning. I agree that it won't replace CPUs doing the work, because 1) CPUs aren't "slower," they just can't run code as massively in parallel, and 2) most games won't have any need to move CPU computation onto the GPU. Also, using compute shaders can make a game incompatible with some graphics cards, which you don't want.

(3 edits)

If done with minimal CPU readbacks, you can get massive performance boosts, especially with simulations that have lots of units or pawns. Parallel computing is ideal for processing tons of units. For example, most games seem to start losing FPS when 100-200 units are processed at once. With GPU compute and the right architecture (minimal CPU readbacks), you can get into the hundreds of thousands at 60 FPS; I'm running 20+ FPS with over half a million units at once. I know because that's what I'm doing with my BeeCool project. Imagine having millions of dwarves in DF, or running Cities: Skylines in fast forward at 60 FPS with a giant city. Although, I actually think compute shaders would be worth it the most in 2D, because combining compute shaders with a lot of fancy 3D graphics might really hit the GPU hard.

The only way to program it solo is with the Codex AI agent, though. Compute shaders are hard to crack. I had to come up with a custom way to do it: I used Unity and instanced rendering, so the simulation is done with compute shaders and the graphics are rendered on the GPU at the same time, with minimal CPU readback. Readbacks are what tank performance, so you have to be careful how and when you ping the CPU.
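To make that pattern concrete: here's a minimal sketch of what a per-unit simulation kernel looks like in a Unity-style HLSL compute shader. The struct fields, buffer names, and thread-group size are illustrative assumptions, not taken from the BeeCool project:

```hlsl
// Each unit lives in a GPU buffer; the kernel updates all of them in
// parallel, one thread per unit. The instanced renderer draws straight
// from the same buffer, so positions never round-trip through the CPU.
// The C# side only dispatches this kernel each frame (roughly
// unitCount / 64 thread groups) and occasionally reads back a small
// counter, e.g. for an on-screen unit count.
struct Unit
{
    float2 pos;
    float2 vel;
};

RWStructuredBuffer<Unit> units;
float deltaTime;

[numthreads(64, 1, 1)]
void Step(uint3 id : SV_DispatchThreadID)
{
    Unit u = units[id.x];
    u.pos += u.vel * deltaTime; // simulation logic stays on the GPU
    units[id.x] = u;
}
```

The design point is that the expensive part isn't the math; it's any `GetData`-style readback that forces the CPU to wait on the GPU, which is why keeping both simulation and rendering GPU-side matters.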

Below is a screenshot from my BeeCool project. I have an in-game counter; this is at one hour: 24 FPS, 462,364 "bees".