
🎮 10 Years in Gaming & Web — Now Diving into DevOps for Game Development 🚀

A topic by Winkle created 17 days ago Views: 211 Replies: 7

Hey folks,

For the last ten years, I've been deep in the trenches of game development, building for mobile, web, PC, and console, with plenty of web development along the way.

I’ve worked with pretty much every major engine: Unity, Unreal, Godot, Cocos, and many of the web-based frameworks like Phaser and Construct. On the multiplayer and backend side, I’ve also spent years with tools like Photon, Socket.io, Colyseus, PlayFab, and more.

On the web side, I've enjoyed working with JS/TS frameworks like React/Next.js, Angular, Vue.js, Svelte, Gatsby, and Solid.js, styling with Tailwind CSS, shadcn/ui, and MUI, and building backends with Node/Express.js, Nest.js, MongoDB, and PostgreSQL.

It’s been an amazing ride, and I still love bringing games to life.

Lately, though, I’ve started diving into something a little different — DevOps for games. Right now, I’m building out a CI/CD pipeline on an Azure VM running Ubuntu 24.04, using Jenkins and Perforce to automate publishing an Unreal Engine project to Steam. Up next: bringing the backend into the automation loop.
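For the Steam-publishing leg specifically, the standard route is SteamPipe: steamcmd plus an app build script. A sketch of what that script can look like, where the AppID, depot ID, description, and paths are placeholders to swap for your own values:

```
// app_build.vdf — SteamPipe build script sketch.
// The AppID, depot ID, and paths below are placeholders.
"AppBuild"
{
    "AppID" "480"
    "Desc" "Automated build from Jenkins"
    "ContentRoot" "..\Archive\Windows\"
    "BuildOutput" "..\steam_build_output\"
    "Depots"
    {
        "481"
        {
            "FileMapping"
            {
                "LocalPath" "*"
                "DepotPath" "."
                "recursive" "1"
            }
        }
    }
}
```

Jenkins would then invoke it with something like `steamcmd +login "%STEAM_USER%" +run_app_build ..\scripts\app_build.vdf +quit`, with credentials coming from Jenkins secrets rather than being hardcoded in the pipeline.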

This is actually my second time setting up a pipeline like this — so I’m still very much learning and growing into the role. My experience here is nowhere near what I have in game and web development, but honestly? I really enjoy it. I love the challenge of making builds smoother, deployments faster, and processes repeatable. It feels like a natural evolution of my love for building systems — just from a different angle.

I’d love to hear from others who’ve made a similar shift, or who live in this intersection of game dev and DevOps. 

 What’s been your experience? 

Any tools, gotchas, or golden nuggets of advice you wish you knew earlier? 

 Let’s swap stories — I’m here to learn, and hopefully share something useful back.

Excited to be on this path, and glad to be sharing it with you all. 

 Cheers!

Moderator

Some time ago I set up CI/CD for my projects. I'm using a custom game engine written in C and OpenGL, so I'm not sure how this translates to bigger engines, but I'll share my experience anyway.

I set up a project with develop and main branches. Whenever I pushed changes to develop, CI would make builds for me to test, and also push the build to Steam so I could check that Steam-specific features worked too.

Every so often, if everything worked well, I'd merge my changes into main, which would create, push, and publish builds to pretty much every store I supported, including Itch.io via butler.

Overall it was great fun; I enjoyed making the builds more and more optimised. I did it all with GitHub Actions, though I also messed around with other build systems. I can definitely recommend playing around and seeing how much of your workflow you can automate. The ideal goal is to minimise the time from you making a commit to the users enjoying your update.
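That develop-branch flow could be sketched in a GitHub Actions workflow roughly like this; the build command, build directory, and the `your-user/your-game:channel` target are assumptions to adapt:

```yaml
# .github/workflows/dev-build.yml (sketch)
name: dev-build
on:
  push:
    branches: [develop]

jobs:
  build-and-push:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4

      # Placeholder: replace with your engine's actual build command
      - name: Build game
        run: make release

      # Push the build to itch.io on a dev channel via butler
      - name: Push to itch.io
        env:
          BUTLER_API_KEY: ${{ secrets.BUTLER_API_KEY }}
        run: |
          curl -L -o butler.zip "https://broth.itch.ovh/butler/linux-amd64/LATEST/archive/default"
          unzip butler.zip && chmod +x butler
          ./butler push build/ your-user/your-game:linux-develop
```

The Steam push on develop would be an additional step in the same job, typically gated behind secrets so forks can't trigger it.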

Hey, thanks for your reply, and awesome work!
I'd like to hear about your experience in more detail.
Can we connect on Discord?
Discord username: minian_toy

Hey, thanks a lot for sharing this — that workflow sounds really clean, and I appreciate you taking the time to explain it.

Your develop → test builds and main → release builds setup really resonates with what I’m aiming for long-term. Right now though, I’m still fighting through the foundations 😅

I’m currently setting up a CI pipeline on an Azure VM (Ubuntu 24.04) using Jenkins + Perforce to automate builds for an Unreal Engine project, with the final goal being Win64 builds published to Steam.

Here’s where I’m stuck and would really love your perspective:

  • I’m building on Linux, but targeting Win64 eventually.
    Right now I’m compiling a Linux client/editor first just to stabilize the pipeline — but I’m running into a mix of:

    • platform-specific C++ issues (Windows-only includes, missing headers on Linux)

    • Unreal AutomationTool / UBT failures after long build times

    • cases where things work fine on Windows locally, but break in Linux CI

So I’m trying to figure out:

  • Is building a Linux client first on a Linux CI server the right stepping stone, even if the real target is Win64 + Steam?

  • Or did you find it better to jump straight into Windows builds (Windows runner / VM) once CI basics were in place?

I’m also curious about your experience with:

  • How you decided when a pipeline was “stable enough” to trust

  • Whether you had long compile times + late failures, and how you reduced iteration pain

  • Any early mistakes or architectural decisions you’d redo if starting again

At the moment, I’m intentionally keeping things simple:

  • Only automating main → Steam

  • No dev/release branch split yet, until the pipeline itself is reliable

If you’re open to it, I’d love to keep discussing this here — or even chat in real time (Discord works great for me). I’m still early on the DevOps side, but genuinely enjoying the challenge and trying to learn from people who’ve already walked this path.

Really appreciate you sharing your experience — this is exactly the kind of insight I was hoping for 🙏

Cheers!

Moderator

I’m building on Linux, but targeting Win64 eventually. Right now I’m compiling a Linux client/editor first just to stabilize the pipeline — but I’m running into a mix of:

  • platform-specific C++ issues (Windows-only includes, missing headers on Linux)
  • Unreal AutomationTool / UBT failures after long build times
  • cases where things work fine on Windows locally, but break in Linux CI

I'm primarily working on Linux, so I started with the Linux builds first. Once that was done, I focused on Windows builds, to keep things separate. This is for a custom engine written in C, using GitHub, so I'm not sure how it translates to Unreal and Azure, but here are some details:

  • When making Windows builds, I did that on a Windows CI server (GitHub runner), which helps deal with issues like the platform-specific code you mention.
  • GitHub does let you run Bash commands on the Windows machine, which was more familiar to me.
  • I attempted cross-compilation (mostly compiling Windows builds from a Linux machine), but it was a lot of headaches and I gave up.

Is building a Linux client first on a Linux CI server the right stepping stone, even if the real target is Win64 + Steam?

My advice would be to focus on a Windows CI server if your target is a Windows build. I’m assuming Unreal behaves better this way too.

How you decided when a pipeline was “stable enough” to trust

This happens over time. No matter how much effort you put into the pipeline now, it will only feel stable after you've spent six months developing your game and you always have up-to-date builds ready without thinking about it. If you get occasional build errors, that's your sign to revisit the pipeline and make it more robust.

You can't fix everything from the get-go. It's absolutely normal to make a pipeline that kinda works, and once you hit an error as you work on the game, return to the pipeline and improve it.

Whether you had long compile times + late failures, and how you reduced iteration pain

I'm working on a custom C engine, so compile times are always short. From what I've seen, Unity and Unreal have much longer compilation times. There are ways to cache parts of your project that don't need to be re-compiled, but I'm not entirely sure how friendly Unreal is with that.
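For what it's worth, Unreal does have a mechanism for this: the Derived Data Cache (DDC), which can be shared between machines so repeat cooks reuse already-derived assets (textures, shaders, and so on) instead of regenerating them. A sketch for a Windows build agent, where the network path is a placeholder:

```
rem One-time setup on the Windows build agent (run in cmd):
rem point Unreal's shared Derived Data Cache at a network location
rem so CI cooks reuse previously derived assets across runs.
setx UE-SharedDataCachePath "\\buildserver\SharedDDC"
```

For the C++ compile times themselves, keeping the Jenkins workspace between runs rather than wiping it lets Unreal's incremental builds do their job, which often helps more than any external cache.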

At the moment, I’m intentionally keeping things simple:

Honestly, that's the way to start. Just make a small pipeline that creates a releasable build but doesn't upload it anywhere. Once you have that, and over time it keeps giving you successful builds, it will feel more dependable, and a next step can be to automatically upload it to different stores. It's easy to get overwhelmed if you aim for too many things at once.

If you’re open to it, I’d love to keep discussing this here

I’m happy to keep chatting here, this way future readers going through the same experience can read this thread :)

Hey, thanks for your kind response.
I tried to build the Linux client of the project manually on Linux before moving to auto-builds with Jenkins.
But it was tricky and stressful, and it ultimately failed; I have no idea what went wrong.
Is there a reliable way to build manually on Linux?
And I think a Win64 build is possible using Jenkins, right? What do you think about that?

Moderator

I tried Jenkins once before locally. From what I remember, it wouldn’t let me cross-compile my project from Windows to Linux and vice-versa.

If you don't have access to a Windows machine, the only suggestion I have is to find a cloud Windows machine to compile your project. I've used GitHub Actions before and can recommend them as a learning experience. If your project is private you can use GitHub Actions for free, but they have limits.

I'm not aware of many alternatives. I tried Docker in the past, as I thought it would work like a virtual machine running another OS, so I'd be able to compile my project there, but I ran into issues trying to cross-compile again, so I gave up.

At the moment, I've set up and validated the CI pipeline on Linux: Perforce repo access, pipeline flow, artifact stages, and basic validation.
But I heard Win64 builds from Linux aren't supported right now, right?
So I'm planning to keep the Jenkins CI/CD controller on Linux and do the Win64 build and packaging on a Windows VM.
Is that a good option?
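That Linux-controller / Windows-agent split can be expressed directly in a Jenkinsfile by pinning the build stage to a Windows node. A sketch, where the agent label, engine path, and project name are all assumptions rather than a tested setup:

```groovy
// Jenkinsfile sketch: Jenkins controller on Linux,
// Unreal Win64 build + packaging on a Windows agent.
pipeline {
    agent none
    stages {
        stage('Win64 build + package') {
            // 'windows-build' is a hypothetical label assigned to the Windows VM
            agent { label 'windows-build' }
            steps {
                // Perforce workspace sync is assumed to be configured
                // on the job via the Jenkins P4 plugin.
                bat '''
                "C:\\UE_5.4\\Engine\\Build\\BatchFiles\\RunUAT.bat" BuildCookRun ^
                  -project="%WORKSPACE%\\MyGame\\MyGame.uproject" ^
                  -platform=Win64 -clientconfig=Development ^
                  -build -cook -stage -pak -archive ^
                  -archivedirectory="%WORKSPACE%\\Archive"
                '''
            }
        }
    }
}
```

The Steam-upload stage can then run on either node, since steamcmd only needs the archived build directory as input.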