Posted November 05, 2021 by AlterHowdegen
#Unity #VFX #Rendering #Shader #2D
Schildmaid MX's visual goal is to use retro-style sprites in a modern environment, enriching pixel art (for purists: I'm talking mostly about pixel art created from 3D renders) with modern effects. To achieve this, we are using Unity's Universal Render Pipeline, the Post Processing Stack v2 and the Visual Effect Graph, along with custom shaders and effects.
Using the URP's 2D Renderer (for 2D lighting) in conjunction with the Post Processing Stack v2 (for custom post-processing effects) rules out camera stacking, so the different layers are instead composited in the scene itself and rendered by an additional camera. Intermediate steps, like the inputs for distortion or a fixed-resolution layer for pixelized VFX, are first rendered into Render Textures and then placed in the scene as 16:9 quads with alpha or additive blending.
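As a minimal sketch of this compositing idea (the class and field names below are illustrative, not taken from the actual project): a layer camera writes into a Render Texture, and a quad's material displays that texture for the final composition camera.

```csharp
using UnityEngine;

public class LayerCompositor : MonoBehaviour
{
    public Camera layerCamera;         // renders one intermediate layer
    public RenderTexture layerTexture; // target for that layer
    public Renderer compositionQuad;   // 16:9 quad seen only by the final camera

    void Start()
    {
        // The layer camera writes into the Render Texture...
        layerCamera.targetTexture = layerTexture;

        // ...and the quad's (alpha- or additive-blended) material displays it
        // in front of the final composition camera.
        compositionQuad.material.mainTexture = layerTexture;
    }
}
```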
Speaking of VFX, Schildmaid MX uses the Visual Effect Graph, which is not supported by the 2D Renderer. Luckily, in a setup like this there is no need to use the same renderer for all cameras. The VFX Graph, along with some other traditional particle systems and effects, is rendered into a Render Texture by a camera using a standard forward renderer. This provides the additional bonus of being able to render at a fixed resolution to pixelize the VFX layer, while still retaining the ability to freely rotate and scale the various effects for a consistent look.
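In URP, assigning a different renderer per camera can be done from code; a sketch, assuming the forward renderer sits at index 1 of the Renderer List on the URP asset (the index and names are assumptions, not project settings):

```csharp
using UnityEngine;
using UnityEngine.Rendering.Universal;

public class VfxCameraSetup : MonoBehaviour
{
    public Camera vfxCamera;        // renders the VFX Graph layer
    public RenderTexture vfxTarget; // fixed-resolution target for the VFX layer

    void Start()
    {
        // Switch this camera from the 2D Renderer to the forward renderer
        // (index into the URP asset's Renderer List; 1 is assumed here).
        vfxCamera.GetUniversalAdditionalCameraData().SetRenderer(1);
        vfxCamera.targetTexture = vfxTarget;
    }
}
```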
The intermediate layers are all rendered at the game's fixed resolution of 480 x 270 (or lower, for example for the distortion direction), while the final composition camera renders at the resolution set in the options, allowing slowly moving sprites to scroll smoothly. So while all sprites' pixels have the same size and no rotation, the final composition is not snapped to an artificially low-resolution pixel grid.
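For the fixed-resolution targets themselves, a point-filtered Render Texture keeps the upscaled pixels crisp. A minimal sketch, created in code here, though an asset created in the editor works just as well:

```csharp
using UnityEngine;

public static class PixelLayerFactory
{
    // Creates a 480x270 target whose pixels stay crisp when its 16:9 quad
    // is scaled up to the final composition camera's native resolution.
    public static RenderTexture CreatePixelLayer()
    {
        return new RenderTexture(480, 270, 16)
        {
            filterMode = FilterMode.Point,   // no bilinear smearing
            wrapMode = TextureWrapMode.Clamp
        };
    }
}
```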
The screenshot above shows the various stages that make up the final composition.
There is only a single VFX Graph for all the GPU particles emitted by projectiles, exhaust sprites, explosions, hit effects etc., controlled by a single input texture. A camera renders a single layer containing specialized sprites into a Render Texture, then the graph samples that Render Texture in screen space to get the data necessary to control the particles' color, direction, speed and lifetime. The Render Texture's green channel controls speed or damping (added speed above 0.5f, stronger damping below 0.5f), while the blue channel controls the direction (0f to 1f mapped to a circular direction). Since there are not enough channels to also include full colors in a single texture, the red channel indexes a lookup texture, created from a ScriptableObject containing a color scheme.
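As an illustration of that channel scheme, a control sprite's tint could be built like this (the helper below is hypothetical, not from the actual project, and assumes 0 degrees points right):

```csharp
using UnityEngine;

public static class VfxControlColor
{
    // R: index into the color lookup texture (0..1).
    // G: speed or damping, neutral at 0.5f (above adds speed, below damps).
    // B: direction, 0..1 mapped onto a full circle.
    public static Color Encode(float paletteIndex01, float speedOrDamping01, float angleDegrees)
    {
        float direction = Mathf.Repeat(angleDegrees, 360f) / 360f;
        return new Color(paletteIndex01, speedOrDamping01, direction, 1f);
    }
}
```

For example, Encode(0.25f, 0.75f, 0f) would push particles to the right at above-neutral speed while coloring them from the first quarter of the lookup texture.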
This setup makes it very easy for an artist to add new effects to the game: particles moving out from or toward a point or area, zones that stop particles or change their direction, or GPU particles spawned with higher or lower speeds and longer or shorter lifetimes. All of this is done by simply adding a sprite or a CPU particle system with a relatively low particle count, without ever having to touch the VFX Graph. The same goes for the distortion system, which works along the same lines but is a lot simpler.
All these modern effects come at a performance cost, of course, so they can all be switched off individually to ensure a stable framerate on lower-spec PCs and, at some point in the future, consoles.
(jn)