
First Steps as a VR dev

Diving headfirst into the Unknown

Picture credits: The Ghost Howls, XR journalist blog

My first day on the job as a VR developer with zero experience could best be described as intimidating. The sheer amount of knowledge and technical groundwork I needed to cover as a fresh graduate seemed like a jagged mountain I was too fragile to climb. Having no senior-level guidance to point me in the right direction, I frantically searched for a structured learning path that would give me a jump start and make sense of all the jargon and buzzwords associated with this field. What is XR? How is it any different from MR? Where do VR and AR fit in? What even is the Metaverse? What does it take to build a VR experience from scratch?

Luckily, I came across Unity Learn's free resources, which offer a plethora of up-to-date courses, tutorials, and entire structured pathways to get you on your feet. Most of the concepts and takeaways highlighted in this post, I learned from there.

How XR/MR tie into VR/AR

XR, or Extended Reality, is the umbrella term that encompasses everything VR, AR, and MR. More specifically, developers, documentation, and technical writing most often use XR as a placeholder for any VR/AR/MR experience. Toolkits and SDKs use XR extensively when referring to functionality that applies across VR/AR/MR.

VR, or Virtual Reality, refers to experiences where the environment you are viewing and interacting with is completely virtual (computer-generated). This means that you are completely cut off from the real world, at least in terms of visual senses.

AR, or Augmented Reality, is where smaller chunks of digital content are layered on top of your actual, physical view of an environment. You can still see the real world around you, but it is overlaid with 3D objects and other digital elements such as UIs. Think Pokémon GO!

MR, or Mixed Reality, as the name suggests, is a blend of both VR and AR. MR experiences vary in how far they lean towards VR or AR, and this is best expressed as a virtuality continuum.

This is summarized concisely in this blog post by the Interaction Design Foundation, as seen in the diagram below.


The XR Spectrum


The Ambition of the Metaverse

It would be fair to call the Metaverse the next significant development in global communications, as Tony Parisi highlights:

" There will come a day when we look upon what we have accomplished, and know that we have done it. As of now, we do not know exactly what shape the Metaverse will take. That does not matter, either. What matters is that someday, a global network of spatially organized, predominantly 3D content will be available to all without restriction, for use in all human endeavors — a new and profoundly transformational medium, enabled by major innovations in hardware, human-computer interface, network infrastructure, creator tools and digital economies." - Tony Parisi

You can read more in The Seven Rules of the Metaverse by Tony Parisi, Head of XR Ads/E-Commerce at Unity Technologies, where he lays out a thoughtful definition of the Metaverse and its implications.

If you're as confused as I am, or just want a quick laugh, check out the more satirical take, The Seven Present Laws of the Metaverse, by The Ghost Howls.

Building Blocks of a VR app/game

Now that we have all that jargon out of the way, what does it take to make a VR experience from scratch? I use the term experience quite often as a placeholder for applications and games alike because, as I have discovered, the vast majority of skills, technologies, practices, and functionality carry over between use cases. The bottom line is: if you can make a VR game, you can make a VR app, and vice versa. In fact, a VR game is a VR app.

The Game Engine

Because VR thrives on complete virtual environments, a game engine becomes almost a necessity for creating VR experiences. We need a robust system that handles all the rendering, physics, utilities, and other dirty work so that we can focus on the experience itself. We use game engines such as Unity and Unreal to build the 'front-end' (in the case of VR development: the virtual environment, objects, scenes, interactables and interactions, user interfaces, the client app experience, etc.).

Lots of SDKs

Chances are, if you want to implement a cool feature in VR, there's already an SDK, or Software Development Kit, built for it. Want to add hand-tracked interactions with virtual objects? Use an interaction SDK. Want to easily integrate avatars into your experience? Use an avatar SDK.

There are so many SDKs and toolkits covering specific functionality that there is little point in building your own. You are better off using well-documented, highly extensible kits that have been carefully developed and have evolved to cater to exactly what you are looking to achieve.

By extensible we mean that we can extend/customize these kits to meet our exact needs.

Take, for example, the concept of Sockets in Unity's XR Interaction Toolkit. Sockets hold interactables (objects) in place at a predetermined position and orientation, as if docking them in. We can extend the Socket Interactor class in the XR Interaction Toolkit to detect the vertical swipe of a card and implement a card reader in VR. This requires creativity and problem-solving, as well as ample experience with C# and vectors in Unity.
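To make this concrete, here is a minimal sketch of such an extension, assuming the toolkit's XRSocketInteractor class; the class name CardReaderSocket, the swipe threshold, and the hover-based tracking are my own illustration, and exact callback signatures vary between toolkit versions:

```csharp
using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;

// A card reader sketch: extends XRSocketInteractor and watches the
// vertical travel of whatever card hovers in the socket.
public class CardReaderSocket : XRSocketInteractor
{
    [SerializeField] float requiredSwipeDistance = 0.15f; // meters of downward travel

    Transform trackedCard;
    float swipeStartHeight;

    protected override void OnHoverEntered(HoverEnterEventArgs args)
    {
        base.OnHoverEntered(args);
        trackedCard = args.interactableObject.transform; // start watching the card
        swipeStartHeight = trackedCard.position.y;
    }

    protected override void OnHoverExited(HoverExitEventArgs args)
    {
        base.OnHoverExited(args);
        trackedCard = null; // card left without completing a swipe
    }

    void Update()
    {
        if (trackedCard == null) return;

        // Enough downward travel counts as a swipe.
        if (swipeStartHeight - trackedCard.position.y >= requiredSwipeDistance)
        {
            Debug.Log("Card swiped!");
            trackedCard = null; // fire once per swipe
        }
    }
}
```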

Back-end Services

In a similar vein, we often require a backend service, or a group of backend services, to store and transfer data. This may be for user accounts, multiplayer features, score leaderboards, and other data- or network-oriented features. Many organizations now offer industry-standard backend-as-a-service platforms that holistically take care of everything, such as Photon Engine, which provides backend cloud services as well as networking libraries for easy integration with game engines like Unity.

If you are looking to add multi-user features to your VR experience, these services are your best bet!
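For a taste of what that integration looks like, here is a minimal connection sketch using Photon PUN 2, assuming the PUN 2 package is installed and an App ID is configured in PhotonServerSettings; the room name and player cap are illustrative:

```csharp
using Photon.Pun;
using Photon.Realtime;
using UnityEngine;

// Connects to the Photon cloud and drops the player into a shared room.
public class MultiplayerLauncher : MonoBehaviourPunCallbacks
{
    void Start()
    {
        // Uses the App ID and region from the PhotonServerSettings asset.
        PhotonNetwork.ConnectUsingSettings();
    }

    public override void OnConnectedToMaster()
    {
        // Join the room if it exists, otherwise create it.
        PhotonNetwork.JoinOrCreateRoom("VRLobby",
            new RoomOptions { MaxPlayers = 8 }, TypedLobby.Default);
    }

    public override void OnJoinedRoom()
    {
        Debug.Log($"Joined room with {PhotonNetwork.CurrentRoom.PlayerCount} player(s).");
    }
}
```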

VR Development Concepts

XR Hardware Landscape

Because the field of XR is rapidly evolving, the hardware landscape is constantly changing. 

My understanding of XR (or simply VR, at the time) was that it was still in its primitive stages; however, after experiencing Meta's demo app First Steps, which showcases the capability and potential of VR devices, I was blown away. Hand tracking, eye tracking, 6DOF (six degrees of freedom) tracking, positional audio, a 120 Hz refresh rate, and beautifully crafted virtual environments make the experience surreal.

Note that HMDs, or Head-Mounted Displays (the devices that allow us to experience XR), come in two primary types: tethered HMDs and standalone HMDs. The key difference between the two is that a tethered HMD offloads all rendering and processing to a PC that is physically connected to it, with the headset simply displaying the output, whereas a standalone HMD performs all rendering, processing, and calculations on the headset itself, all the while displaying the content too.

The obvious trade-off is that tethered devices can power experiences that are more graphically and computationally demanding, while standalone headsets are limited in their capabilities. This may change as HMDs become lighter, more powerful, and more economical; meanwhile, most standalone HMDs, such as the Meta Quest 2, can already function as hybrids (running standalone or connected to a PC).

Then there are HMDs that do only VR, only AR, or both (XR). It all depends on the base use cases the devices are tailored for (gaming, business, productivity, education).

Key Challenges when Building a VR Experience

When building a VR experience, the most fundamental characteristic of the experience to focus on is, inherently, immersion. 

Immersion

Everything revolves around immersion. Immersion in VR can be defined as the perception of being physically present in a non-physical, virtual environment. Ensuring that experiences are immersive requires us to solve some key challenges. Let's run through the concepts that are critical to immersion:

World-scale

Getting real-world scale right is essential for immersive VR. Objects and environments must be the right size and proportion to sustain the sense of presence. As a reference, 1 unit in Unity translates to 1 meter in real life. Both programmers and designers must be aware of world scale and its importance to VR immersion.
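As a quick sanity check, here is a tiny script that sizes a stand-in door to real-world dimensions; the object and measurements are just an example of thinking in meters:

```csharp
using UnityEngine;

// World-scale sanity check: in Unity, 1 unit = 1 meter.
public class WorldScaleExample : MonoBehaviour
{
    void Start()
    {
        var door = GameObject.CreatePrimitive(PrimitiveType.Cube);
        door.name = "Door";
        // A real door is roughly 1 m wide, 2 m tall, and 5 cm thick.
        door.transform.localScale = new Vector3(1f, 2f, 0.05f);
        // A cube's pivot is its center, so lift it by half its height.
        door.transform.position = new Vector3(0f, 1f, 2f);
    }
}
```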

Locomotion

Locomotion? That's just a fancy word for movement, or more specifically, the movement of the user inside the virtual environment. How do we ensure that a user inside a virtual world can freely move and explore the environment around them?

Locomotion is a major design decision in VR that ties directly to the immersion of the user. Locomotion techniques in VR include:

  • Room-scale: The user moves as they would in real life. Intuitive, but difficult in practice because of physical space constraints; it also requires 6DOF (six degrees of freedom) tracking.
  • Continuous movement: Controller-based movement using analog sticks. That sounds like a good idea in theory; however, moving in VR without actually moving your head/body produces vection (the illusion of self-movement). This can cause simulator sickness/nausea and is poor from a UX standpoint.
  • Teleporting: Teleporting the user at the press of a button/trigger is a comfortable alternative that sidesteps both physical constraints and vection. Most VR apps primarily use teleportation for this reason (see the sketch after this list).
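Here is a minimal teleport sketch, assuming the XR Interaction Toolkit's TeleportationProvider is present on the XR rig; the keyboard key is a stand-in for a controller trigger, and real projects usually wire this through a ray interactor instead:

```csharp
using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;

// Queues a teleport to a fixed destination when a button is pressed.
public class SimpleTeleporter : MonoBehaviour
{
    [SerializeField] TeleportationProvider provider;  // on the XR rig
    [SerializeField] Transform destination;           // where to land

    void Update()
    {
        if (Input.GetKeyDown(KeyCode.T)) // stand-in for a controller trigger
        {
            var request = new TeleportRequest
            {
                destinationPosition = destination.position,
                destinationRotation = destination.rotation,
                matchOrientation = MatchOrientation.TargetUpAndForward
            };
            provider.QueueTeleportRequest(request);
        }
    }
}
```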

Interaction

Interactions in VR are closely tied to gameplay mechanics and player agency in video games, and are hence essential for immersion. Users in a VR experience must have some sort of impact on the virtual environment, and we achieve this through interaction. Interactions in VR can be broken down into two components: interactables and interactors.

  • Interactables: Virtual objects that a user can interact with in some form, whether it's poking, grabbing, shooting, pulling, turning, selecting, etc.
  • Interactors: Mechanics/other objects that allow the user to interact with the virtual world. They allow users to hover over, select, grab interactables, etc. Your virtual hands are interactors! (A minimal grabbable-object sketch follows this list.)
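To make the pairing concrete, here is a minimal sketch that turns an object into an interactable using the XR Interaction Toolkit's XRGrabInteractable; it assumes the object also has a Collider so interactors can find it:

```csharp
using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;

// Makes this object grabbable and logs grab/release events.
[RequireComponent(typeof(Rigidbody))]
public class GrabbableCube : MonoBehaviour
{
    void Awake()
    {
        var grab = gameObject.AddComponent<XRGrabInteractable>();
        grab.selectEntered.AddListener(_ => Debug.Log("Grabbed!"));
        grab.selectExited.AddListener(_ => Debug.Log("Released!"));
    }
}
```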

Comfort & Accessibility

Comfort and accessibility in a VR experience generally raise quality-of-life questions such as:

  • Can the user enjoy the experience in a seated position?
  • Can the user use a single controller to experience the app?
  • Can the user experience the app without controllers?
  • Does the app have audio/visual cues and audio/haptic feedback?
  • Does the app use fade-in/fade-out techniques so the user is not thrown into new scenes abruptly? (A minimal fade sketch follows this list.)
  • Is the app optimized to provide consistent fps across the board?
  • Does the app offer settings to enable/disable different locomotion features and distance interactions?
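Fades in particular are cheap to implement. Here is a minimal sketch that animates a full-screen black overlay via a CanvasGroup; production apps often fade through a compositor layer or post-processing instead, and the overlay reference is an assumption:

```csharp
using System.Collections;
using UnityEngine;

// Fades a full-screen black UI overlay in and out for gentle transitions.
public class ScreenFader : MonoBehaviour
{
    [SerializeField] CanvasGroup fadeOverlay; // full-screen black Image
    [SerializeField] float duration = 0.5f;

    public IEnumerator FadeOut() => Fade(0f, 1f); // to black
    public IEnumerator FadeIn()  => Fade(1f, 0f); // back to the scene

    IEnumerator Fade(float from, float to)
    {
        for (float t = 0f; t < duration; t += Time.deltaTime)
        {
            fadeOverlay.alpha = Mathf.Lerp(from, to, t / duration);
            yield return null;
        }
        fadeOverlay.alpha = to;
    }
}
```

Call StartCoroutine(fader.FadeOut()) before switching scenes, then FadeIn() once the new scene has loaded.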

Performance Optimization

As we previously discussed regarding the limited processing power of HMDs, performance optimization becomes essential when developing apps for VR. Depending on the type of experience and its target platform, accounting for constraints and navigating hardware limitations while still building an immersive app is critical.

Some metrics to keep in mind include but are not limited to:

  • FPS, or frames per second: Low fps is immersion-breaking and nauseating in VR. Most HMD platforms set a target fps your app must hit to be accepted on official stores (e.g., 72 fps for the Meta Quest 2).
  • Poly count, or tri count: Every 3D object modeled and rendered has a polygon count (a higher count means better-looking objects); however, more polygons across many objects create a heavy rendering load. Finding a middle ground between performance and aesthetics is essential.
  • Draw calls: Calls to the graphics API to draw objects; draw calls are performed in batches.

To optimize performance, we want to minimize draw calls, minimize poly count, and optimize textures, particles, lighting, and post-processing.
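As a starting point for keeping an eye on the first metric, here is a minimal frame-rate monitor sketch; the 72 fps target matches the Quest 2 requirement mentioned above, and the smoothing factor is illustrative:

```csharp
using UnityEngine;

// Smooths frame time and warns when the app dips below a target fps.
public class FpsMonitor : MonoBehaviour
{
    const float TargetFps = 72f; // e.g., Meta Quest 2 store requirement
    float smoothedDelta;

    void Start()
    {
        smoothedDelta = Time.unscaledDeltaTime;
    }

    void Update()
    {
        // Exponential moving average keeps the readout from jittering.
        smoothedDelta = Mathf.Lerp(smoothedDelta, Time.unscaledDeltaTime, 0.1f);
        float fps = 1f / smoothedDelta;
        if (fps < TargetFps)
            Debug.LogWarning($"FPS below target: {fps:F1}");
    }
}
```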



My Journey

I have linked my learning path below and update it frequently as I progress:

A Structured Learning Path for VR Development with Unity

VR Test Room
