
Devlog 1: Research

UnFair
A downloadable game for Windows

Introduction

Hello and welcome to the first devlog of our game for the course "Game Projects"!

Our team consists of two programmers, Bram and Sam, and four artists: Heather, Silvia, Emma, and Lowie. We are happy to take you along on our game-making journey over the next couple of weeks.

We will be creating a 3D top-down social deduction (imposter-style) game. Our game is set in a very sketchy fair. The players are staff workers at the fair and need to keep everything running. But at the start, one of them is secretly chosen to be the imposter! He or she wants to see the fair fail and tries to sabotage things. Workers may develop a suspicion about who the imposter is, and if they work together, they can grab another worker and place them in the dunk tank! But don't worry, they will not be in there forever. A meter tracks all the repair and sabotage work: the more things go wrong, the further the meter moves toward the imposter's side. If, by the end of the day, the meter is closer to the imposter's side than to the workers' side, the imposter wins! If not, and the fair is still in working order, the worker who completed the most tasks successfully will be crowned employee of the month and win!
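The win meter described above can be sketched as a simple piece of game state. This is only our own illustration of the rule, with hypothetical names, not actual game code:

```cpp
#include <algorithm>
#include <string>

// Hypothetical sketch of the win meter: sabotage pushes the meter toward
// the imposter's side, repairs push it back toward the staff's side.
struct WinMeter {
    // Range [-1, 1]: -1 = staff side, +1 = imposter side, starts neutral.
    double value = 0.0;

    void OnSabotage(double amount) { value = std::min(1.0, value + amount); }
    void OnRepair(double amount)   { value = std::max(-1.0, value - amount); }

    // At the end of the day, whichever side the meter is closer to wins.
    std::string Winner() const { return value > 0.0 ? "imposter" : "staff"; }
};
```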

In this first devlog, we will show a few of our research questions and how we were able to answer them this week.

 

Will using AI for NPC behavior be doable?

In our game we need NPCs with a fairly complicated behavior tree. NPCs should be able to perform several actions and interact with the players. First, we researched group movement and crowd simulations, but that is not really what we need. Even though there has to be some kind of communication and group movement (like queuing correctly in front of an attraction), it is more important that NPCs can perform actions individually. We also do not need a lot of them, more in the range of 20–30.

We tested out AI in both Unreal Engine and Unity.

Unity: Unity does not have its own AI system, so we created a behavior tree system from scratch in C#, including a blackboard with keys, different node types, selectors, and sequences. As a test, we made the NPCs move around randomly on a navmesh. This took quite a long time because nothing was in place yet. Now that everything is set up, it is a lot easier to add new custom nodes to make the AI do things. However, there is no built-in way to visualize the behavior tree, so as the tree grows it will become hard to maintain and debug.
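The core of the node types mentioned above is quite small. Our actual implementation is in C#; this is a simplified, engine-agnostic C++ outline of the same idea (a real tree would also need a Running status and a blackboard):

```cpp
#include <functional>
#include <memory>
#include <vector>

// Minimal behavior tree sketch: a node ticks and reports success or failure.
enum class Status { Success, Failure };

struct Node {
    virtual ~Node() = default;
    virtual Status Tick() = 0;
};

// Leaf: wraps a single action.
struct Leaf : Node {
    std::function<Status()> action;
    explicit Leaf(std::function<Status()> a) : action(std::move(a)) {}
    Status Tick() override { return action(); }
};

// Selector: succeeds as soon as one child succeeds.
struct Selector : Node {
    std::vector<std::unique_ptr<Node>> children;
    Status Tick() override {
        for (auto& c : children)
            if (c->Tick() == Status::Success) return Status::Success;
        return Status::Failure;
    }
};

// Sequence: fails as soon as one child fails.
struct Sequence : Node {
    std::vector<std::unique_ptr<Node>> children;
    Status Tick() override {
        for (auto& c : children)
            if (c->Tick() == Status::Failure) return Status::Failure;
        return Status::Success;
    }
};
```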

Unreal Engine: Unreal Engine has a dedicated system for behavior trees. We added a behavior tree and a blackboard with the needed keys. We also created a couple of custom tasks (in Blueprints) so the NPCs walk around randomly and, when they get hungry, move to the closest food stand. This was a lot more convenient and easier to set up. Unreal Engine also has a built-in perception system with senses such as sight and touch.
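The "when hungry, move to the closest food stand" task boils down to a nearest-point search. In the project this lives in a Blueprint task; the sketch below is a hypothetical, engine-agnostic version of that logic:

```cpp
#include <cmath>
#include <vector>

struct Vec2 { double x, y; };

double Dist(const Vec2& a, const Vec2& b) {
    return std::hypot(a.x - b.x, a.y - b.y);
}

// Returns the index of the closest food stand, or -1 if there are none.
// A hungry NPC would then path to stands[index] on the navmesh.
int ClosestFoodStand(const Vec2& npc, const std::vector<Vec2>& stands) {
    int best = -1;
    double bestDist = 0.0;
    for (int i = 0; i < static_cast<int>(stands.size()); ++i) {
        double d = Dist(npc, stands[i]);
        if (best < 0 || d < bestDist) { best = i; bestDist = d; }
    }
    return best;
}
```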

In conclusion, if we use NPCs, it will be doable in Unreal Engine thanks to the graphical behavior tree editor and the perception code built into the engine.

 

Should we use Unreal Engine or Unity as our game engine?

We set up a pros and cons table comparing Unreal with Unity. While not every pro and con relates directly to our game, we chose Unreal Engine for several reasons. First, we will need either the behavior trees or the AI framework for our crowd simulation. Second, since we focus on playing our game with game controllers, Unreal offers a more extensive and more customizable force feedback system, as well as more console advantages than Unity. Third, the solid multiplayer framework that Unreal provides gives it a further edge over Unity. And as a more personal motivation, the team simply has more experience with Unreal.


Should we use deferred or forward rendering?

Again, we set up a pros and cons table, this time comparing deferred with forward rendering. We also built a prototype in Unreal Engine, which you can see in the following pictures:

Left is deferred rendering, right is forward rendering.

On the right you can see that forward rendering has some issues with having more than 3 movable lights in the scene.

Our team decided to use deferred rendering. While it has some issues with translucency, there are ways to get around them, and they should not have a significant impact on our game since the number of translucent materials we'll be using will be minimal (if any at all). While forward rendering might be a bit faster, we prefer to keep the freedom that deferred rendering provides when handling shadow-casting lights.


Which games will we use as art style references?

We do not yet have a clear outline of what our own art style will be, but we did decide which games we will use as our main sources of inspiration. Throughout the week, we gathered pictures of stylized games with a low-poly look, as this is the most efficient style to use within our timeframe. We then voted on our favorites and made this ranking of our most preferred styles:

  1. Rescue Party Live!
  2. Townscaper
  3. Plate Up

And an additional game we would use as reference for the characters:

  1. Escape Simulator
Left is Rescue Party Live!, right is Townscaper.
Left is Plate Up, right is Escape Simulator.

What we can already conclude is that we will use a shader with a thin outline.


What will the structure of the UI look like?

We will have to focus a bit more on our UI structure than usual because our game will have a lot happening on screen. The level will have a busy feel to it, with NPCs running around, attractions moving, thought bubbles popping up, etc. Before we begin filling our scene, we thought it was a good idea to discuss what our UI will look like.

First, we thought of giving each player their own tasks and writing those tasks out in a corner of the screen. After some discussion, we found it best to have each task show up at its location on the map in some sort of bubble. This way, each player can choose which tasks to do, and the players are more immersed in the game.

After some experimenting and discussing with the team, we came up with this UI layout below. There is a sun icon that changes depending on how much time you have left, a bar that shows who is winning (imposter or staff), and a money counter for the final standing in case the employees win.


Which controls did we decide on?

As input devices, we will be using Xbox controllers to control our character and the in-game menus. This means we will optimise our game for controller inputs rather than keyboard inputs. Alongside this, we will also make use of the controller's vibration functionality.

Vibration will be a key feature, because it will give you feedback on your tasks and, most importantly, it will reveal whether or not you are the imposter.
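One way the role reveal could work is a distinct rumble pattern at match start that only the imposter can feel. This is a hypothetical sketch with made-up names and values, not our final controller code:

```cpp
#include <vector>

// One burst of controller rumble.
struct RumblePulse {
    double intensity;  // 0.0 .. 1.0
    double seconds;
};

// Hypothetical role-reveal pattern played at the start of the match.
std::vector<RumblePulse> RoleRevealPattern(bool isImposter) {
    if (isImposter)
        // Three short, strong bursts: unmistakable to the imposter,
        // invisible to everyone else.
        return {{1.0, 0.15}, {1.0, 0.15}, {1.0, 0.15}};
    // A single gentle pulse so every player feels something at the reveal,
    // which keeps observers from spotting the imposter by watching hands.
    return {{0.3, 0.2}};
}
```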

In the following image you can find an overview of our planned controls.

Our plan for next week

Next week we will start building proper prototypes, which will include:

  • at least one attraction with a system for how it breaks down and how to fix it
  • a basic AI for customers that go on attractions, etc.
  • a system to let the imposter know they are the imposter
  • support for multiple controllers
  • a first definition of the art style and a start on the art bible
  • a prototype shader
  • all placeholder meshes imported into Unreal

See you next week!
