The Making of Gravity Voyager: Starry Background and Node-Based Image Compositing

(Warning: Nerd Content Ahead)

While making my game Gravity Voyager, one of the biggest challenges was creating a dreamlike, fantastic starry background. I think I have finally achieved that goal, and now I want to share some of the experience, particularly node-based image processing.

First, what is image compositing? When you adjust the hue/saturation or curves of a photo, or combine two images into one in Photoshop, you are doing image compositing. Naturally, the starry image resource used in my game relies on a lot of image compositing tricks. But rather than Photoshop, I used … Blender, a free 3D software package, to make it.

Why use Blender instead of Photoshop? Because Photoshop is a layer-based image editing application. When things get complicated, it becomes almost impossible to manage the relationships between images, filters, and adjustment operations. See this:

This is the final compositing node graph of the starry image resource. If I had used Photoshop to make it, it would have driven me crazy. Using node-based instead of layer-based compositing makes the flow of image data through every filter and adjustment explicit and easy to follow.

This is the starry background image resource used in the game (note: the background is actually transparent; the black background here is just to make things visible).

Now I'll briefly show you how the image was made in Blender. The scene of this Blender project is fairly simple: a transparent plane with some spheres, and an orthographic camera facing down along the negative Z axis. I used a particle system to scatter the spheres across the plane:
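If you prefer scripting, a setup like this can also be built with Blender's Python API. Here is a minimal sketch (Blender 2.8+); the counts, sizes, and names are my own illustrative assumptions, not the actual project values:

```python
import bpy

# A plane emitter scattering small spheres as particles (Blender 2.8+).
bpy.ops.mesh.primitive_plane_add(size=20)
plane = bpy.context.active_object

bpy.ops.mesh.primitive_uv_sphere_add(radius=0.02, location=(0, 0, -10))
star = bpy.context.active_object          # the sphere each particle instances

mod = plane.modifiers.new(name="Stars", type='PARTICLE_SYSTEM')
settings = mod.particle_system.settings
settings.count = 800                      # number of stars
settings.frame_start = 1                  # emit everything at once
settings.frame_end = 1
settings.physics_type = 'NO'              # particles stay where they spawn
settings.render_type = 'OBJECT'           # render each particle as an object
settings.instance_object = star
settings.size_random = 0.5                # vary star sizes a little
```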

The material of the spheres representing stars looks like this, also built as a node graph:

It is a relatively simple node setup, and I'll begin my explanation of node-based compositing with it. You can think of a node as a processing unit: it takes some input and generates output based on the input you give it. Blender provides a variety of nodes, such as RGB Curves, MixRGB, and Blur. A node doesn't necessarily have both input and output. Nodes such as Render Layers and Color have no inputs, because they provide information themselves and are the beginning of the node flow; Material Output and Composite have no outputs, because they are the end of the node graph.
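A toy analogy in Python, just to make the idea concrete (the function names here are made up, not real Blender nodes):

```python
# Toy analogy: a node is a function; a node graph is function composition.
def rgb_source():                 # no input: a source node (like Render Layers)
    return (0.2, 0.2, 0.2)

def brighten(rgb, amount=0.3):    # input and output: a filter node
    return tuple(min(c + amount, 1.0) for c in rgb)

def composite(rgb):               # no output: an end node (like Composite)
    print("final pixel:", rgb)

composite(brighten(rgb_source()))   # "wires": rgb_source -> brighten -> composite
```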

A node can take another node's output as its input, and its own output can in turn feed a third node. These relationships are represented by lines connecting the nodes, and a node setup should be read from left to right. Before reading the node setup of the star material, though, we should discuss what kind of result we are aiming for.

We should apply some color to these stars. A physically accurate result can be achieved with a Blackbody node. A Blackbody node takes a numeric input, Temperature (in Kelvin), and provides a color as output. A lower temperature results in a more reddish color, and a higher temperature gives a more bluish color; an input of exactly 6500 results in pure white. This matches how real stars behave (red dwarfs have low surface temperatures and reddish surfaces, while blue giants have high surface temperatures and bluish surfaces), so we should use a Blackbody node in our setup.

But obviously the color and brightness of the stars should differ from star to star. We introduce a random variable ranging from 0 to 1 and denote it X. We want the temperature fed into the Blackbody node to range from 1000 to 15000 (so that red, white, and blue stars all appear), and the brightness of the stars should vary as well.

The formula for the final color can then be given:

Color.RGB = Blackbody(14000 * X + 1000)
Color.A = 2 * X + 0.05

Now we can put the nodes into the node editor and connect them according to the formula above. Have a look at the final node setup again:

You can see that at the beginning there is an Object Info node. It gives each star a random number between 0 and 1. That number passes through a set of math operations implementing the formula above, is finally converted to a color by a Blackbody node, and becomes the color input of the Emission material node.
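As a sketch, the same chain can be reproduced with Blender's Python API (Blender 2.8+/Cycles node names). Only the temperature-to-color chain is shown; the RGB Curves remap discussed next is left out for brevity:

```python
import bpy

# Rebuild the star material's node chain: Object Info -> Math -> Blackbody -> Emission.
mat = bpy.data.materials.new("Star")
mat.use_nodes = True
nodes, links = mat.node_tree.nodes, mat.node_tree.links
nodes.clear()

info = nodes.new('ShaderNodeObjectInfo')      # "Random" output: per-object 0..1
scale = nodes.new('ShaderNodeMath')           # 14000 * X
scale.operation = 'MULTIPLY'
scale.inputs[1].default_value = 14000.0
offset = nodes.new('ShaderNodeMath')          # ... + 1000
offset.operation = 'ADD'
offset.inputs[1].default_value = 1000.0
body = nodes.new('ShaderNodeBlackbody')       # temperature (K) -> color
emit = nodes.new('ShaderNodeEmission')
out = nodes.new('ShaderNodeOutputMaterial')

links.new(info.outputs['Random'], scale.inputs[0])
links.new(scale.outputs['Value'], offset.inputs[0])
links.new(offset.outputs['Value'], body.inputs['Temperature'])
links.new(body.outputs['Color'], emit.inputs['Color'])
links.new(emit.outputs['Emission'], out.inputs['Surface'])
```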

You may have noticed there is also an RGB Curves node. Usually an RGB Curves node works exactly like Photoshop's RGB curves adjustment, but here it changes the distribution of the random variable X so that more stars appear white rather than blue or red. If you are a math person, you may recognize it as the inverse of the cumulative distribution function (CDF) of the normal distribution. We denote it InvNormCDF(). If X is a uniformly distributed random variable, InvNormCDF(X) is a new random variable with a normal distribution.

The following formula then represents what our node setup means:

Color.RGB = Blackbody(14000 * InvNormCDF(X) + 1000)
Color.A = 2 * InvNormCDF(X) + 0.05
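If you want to play with the idea outside Blender, here is a small Python sketch of the same sampling. The sigma of the normal distribution is my guess; in Blender the curve is drawn by hand:

```python
import random
from statistics import NormalDist

# Sketch of the remap: a uniform X pushed through InvNormCDF, then clamped
# back into 0..1 the way the hand-drawn RGB Curves node effectively does.
norm = NormalDist(mu=0.5, sigma=0.15)     # assumed spread; tune to taste

def star_color_inputs():
    u = random.uniform(1e-6, 1 - 1e-6)    # inv_cdf needs the open interval (0, 1)
    x = min(max(norm.inv_cdf(u), 0.0), 1.0)
    temperature = 14000 * x + 1000        # Kelvin, as in the formula above
    alpha = 2 * x + 0.05                  # brightness varies with the same X
    return temperature, alpha

# Most samples land near the middle of the range (about 8000 K),
# so most stars come out whitish.
print([round(star_color_inputs()[0]) for _ in range(5)])
```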

If we render the scene, the result (the "raw" image) looks like this:


As you can see, the colors of the stars vary. You may be wondering why there is no blue star. In fact, some stars are blue, but their intensity is so strong that it exceeds the color range a PNG file can represent, so they appear white in the PNG. Blender's internal image format uses 32 bits per channel, so there the colors are represented correctly.
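A quick Python sketch of the effect: an intense blue star whose channels all exceed 1.0 clamps to pure white once quantized to 8 bits.

```python
# Why intense blue stars look white in a PNG: every channel is clamped at 1.0.
hdr_blue_star = (2.5, 2.8, 4.0)   # bluish in 32-bit float (blue is the largest)
png_pixel = tuple(min(int(c * 255), 255) for c in hdr_blue_star)
print(png_pixel)                  # (255, 255, 255): pure white after clamping
```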

Processing the "raw" image into the image used in the game was also done with a node graph. If I explained every node and every connection in the compositing setup one by one, this article would get too long and boring, and I'm sure you would dislike that. So instead of explaining everything in detail, I'll show you some pictures of the node setup:

The nodes' names are quite self-explanatory. Glare is a filter node that generates glare on the input image. RGB Curves makes color adjustments using curves, similar to the same function in Photoshop. Blur is a filter node too, and I don't think it needs an explanation. Alpha Over blends two images into one by their alpha channels, just like putting one image on top of another in Photoshop, or one layer on top of another. Multiply combines two images by multiplying them channel by channel, pixel by pixel, working exactly like the "Multiply" blend mode in Photoshop, Illustrator, and other image/video editing packages.
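For the curious, here is roughly what Alpha Over and Multiply compute, sketched with NumPy. This assumes float RGBA images with premultiplied alpha; Blender's actual nodes have more options:

```python
import numpy as np

def alpha_over(fg, bg):
    """Porter-Duff 'over': layer fg on top of bg (premultiplied RGBA arrays)."""
    a = fg[..., 3:4]              # foreground alpha, kept broadcastable
    return fg + bg * (1.0 - a)

def multiply(a, b):
    """Multiply blend mode: each channel of each pixel multiplied pairwise."""
    return a * b
```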


Now let's see the final starry effect:

As you can see, there is an aurora in our scene. To make the aurora, I wrote some custom shader code. A shader is a small program running on the GPU; it determines how a model or sprite is rendered.

I'll show the essential part of the shader code below.

Properties
    {
        [PerRendererData] _MainTex ("Sprite Texture", 2D) = "white" {}
        _UpperColor ("Upper", Color) = (0,0,0,1)
        _DownColor ("Down", Color) = (1,1,1,1)
        [MaterialToggle] PixelSnap ("Pixel snap", Float) = 0
        _Alpha ("Alpha", Range(0.0,1.0)) = 1.0
        _Pow("Power", Range(0.0, 1.0)) = 1.0
    }

This part tells Unity that six variables should be exposed in the Unity editor: a texture, two colors (Upper and Down), an option for whether pixel snap is turned on, a number for the image's alpha, and a number (Power) that controls how the gradient is shaped.

Now the shader's properties look like this in the Unity editor's inspector:

Next, let's check out this line of code:

Blend One One

It specifies that the blend factors for our sprite and for the background are both One. The usual alpha blend factors are One and (1 - source alpha), i.e., the normal alpha blending method. Using additive blending instead makes our aurora and star sprites clearer and brighter.
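Conceptually, the GPU's fixed-function blending computes result = src * srcFactor + dst * dstFactor per channel. A tiny Python sketch of the difference, with illustrative values only:

```python
def blend(src, dst, sf, df):
    # result = src * src_factor + dst * dst_factor, per color channel
    return [s * sf + d * df for s, d in zip(src, dst)]

src, src_alpha = [0.2, 0.4, 0.9], 0.5   # the sprite's color and alpha
dst = [0.05, 0.05, 0.15]                # the background already drawn

normal   = blend(src, dst, 1.0, 1.0 - src_alpha)  # Blend One OneMinusSrcAlpha
additive = blend(src, dst, 1.0, 1.0)              # Blend One One: only brightens
```

With blending covered, here is the fragment shader: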

fixed4 frag(v2f IN) : SV_Target
{
    fixed4 c = fixed4(
        lerp(
            _DownColor.rgb,
            _UpperColor.rgb,
            pow(IN.texcoord.y, _Pow)
        ),
        tex2D(_MainTex, IN.texcoord).a
    );
    c.rgb *= c.a * _Alpha;
    return c;
}

This part is the so-called fragment shader, the most important part of the shader code. It takes the vertex shader's output as input, processes it, and generates a color for a fragment (a concept similar to, but not exactly the same as, a pixel). The GPU calls the fragment shader once per fragment.

In this fragment shader, the input IN is a struct of type v2f. The struct v2f contains an important member, texcoord, which represents the texture coordinate (UV). The function returns a fixed4 (a vector of four medium-precision floats) representing an RGBA color.
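Here is the same math as a plain-Python sketch (hypothetical helper names; the tex2D() sample is replaced by a single alpha value), which the walkthrough below goes through step by step:

```python
def lerp(a, b, t):
    """Linear interpolation between two RGB colors, like HLSL's lerp()."""
    return [x + (y - x) * t for x, y in zip(a, b)]

def frag(v, tex_alpha, down_rgb, upper_rgb, power, alpha):
    """v: vertical UV in 0..1; tex_alpha: _MainTex alpha sampled at this UV."""
    rgb = lerp(down_rgb, upper_rgb, v ** power)   # vertical color gradient
    rgb = [c * tex_alpha * alpha for c in rgb]    # c.rgb *= c.a * _Alpha
    return rgb + [tex_alpha]                      # RGBA
```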
The RGB part of the returned color comes from linearly interpolating (lerp) between the Down and Upper colors we defined, along the texture's Y axis (the vertical direction). The interpolant is the Y component of the fragment's texture coordinate raised to the power _Pow. The alpha part of the fragment's color is obtained by sampling the aurora sprite texture's alpha channel with the tex2D() function at the fragment's texture coordinate. Finally, the color's RGB channels are multiplied by the alpha and the given _Alpha factor, and the result is returned to the GPU. That is how the shader works. It turns this "raw", alpha-only image resource:

into something like this:

Now we finally have the desired render result. The explanation above may seem complicated, but if we draw a node graph of the given code, everything becomes clear:

To conclude: instead of using layer-based software like Photoshop, you may want to try node-based compositing software. It makes organizing and managing a large, complex project much easier. I hope this article gives you some inspiration.