Post Processing Volume, get yours for free from the Unity asset store!

Here I am, back with a new blog post about post processing volumes! I know, my post is a few weeks overdue, and I am sure you have all been waiting impatiently for it! But to make it up to all of you, I brought gifts! Amazing shiny gifts.

Post Processing Volume

We created a post processing volume, similar to what can be found in Unreal Engine, and it is available on the Unity Asset Store for free! Just for you! That means 0 euros (approximately 0 US dollars) to create fancy post processing changes as you move in and out of the volumes you place around the level. We built this asset based on the system we have in place in our own game, where post processing gives the game its look and feel. To add more impact to specific areas, we developed a way to change the effects locally. We took that system and turned it into a more general setup that you can now download and try out yourself.

How do I do this?

Here is how it works: magic! No, seriously: head over to the Asset Store, then download and import the post processing volume asset. The videos show what the volumes do and how to set them up.

For the weird 0.5% of you who prefer reading instructions over watching a video, here is what you do. Add the PostProcessVolumeReceiver to the camera. This adds the PostProcessingBehaviour script, where you assign the profile you use as the default for the level. The script also adds a collider to the camera to detect whether you have entered a volume or not.

Then you add some volumes. The idea is to place these in locations that need a different post processing effect: making tunnels darker, changing colour grading underwater, or adding excessive bloom just because you can. All you need to do is create an empty GameObject and add the PostProcessVolume script. You can choose between a sphere or a box as your volume, and set its size. The outer size defines where the changes in post processing kick in; they fade in until you are inside the inner volume, where your new values are fully in effect. You can set these new values with the sliders in the script, or load them from a profile: just add the profile and click the button to copy all its values to your volume.

A nice workflow is to create a new post processing profile, add it to the camera and change the values to your liking, so you can see the effect in the editor. Once you have the look you want inside the volume, place the profile in the slot on the volume, click the button, and you are done.

And that is it, we can't make it much simpler than that. So try it out yourself, let us know if you like it and whether we should make changes, and show us what you made with it.
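The fade between the outer and the inner volume boils down to a simple blend weight. Here is a rough sketch of that math in Python rather than C# (function names are hypothetical, not the asset's actual API): 0 outside the outer size, 1 inside the inner size, and a linear fade in between, then a lerp of each setting toward the volume's value.

```python
def volume_blend_weight(distance_to_center, inner_size, outer_size):
    """0 outside the outer volume, 1 inside the inner volume,
    and a linear fade-in between the two."""
    if distance_to_center >= outer_size:
        return 0.0
    if distance_to_center <= inner_size:
        return 1.0
    return (outer_size - distance_to_center) / (outer_size - inner_size)

def blend_setting(base_value, volume_value, weight):
    """Lerp one post-processing setting (e.g. bloom intensity)
    from the level's default toward the volume's value."""
    return base_value + (volume_value - base_value) * weight
```

Halfway between the inner and outer boundaries, a setting is halfway between its default and the volume's value, which is exactly the fade you see when flying into a volume.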

Tube lights in Antigraviator

While looking for a solution for volumetric lights in Unity, I stumbled upon the "VolumetricLighting" GitHub repository created by the team behind the Adam demo. While we ended up not using the volumetric features, we do use the so-called "tube lights". In this article I'll go over why and how we use tube lights in Antigraviator.

Lights, Lights and Tube Lights

Let’s start with why we use tube lights instead of regular lights such as point lights. One reason is that while point lights produce a small specular highlight, tube lights give us an elongated highlight on objects.
Two point lights and two tube lights at the same position, with the same intensity and range. Tube lights give an elongated highlight, making the result much more interesting.

Ambient Tube Lights

Another way to use them is as ambient lights. Depending on the situation, the specular highlight that comes with them can be a blessing or a curse. In our case it is a blessing, as can be seen on the city tracks: it gives nice highlights on the ship and on the track too.
Tube lights on and off. Gives nice highlights.
Sometimes, however, you just want a bit of ambient lighting without highlights, for example to light up a darker corner of your scene, or even to fake a bit of bounce light. Of course this is not accurate at all and it has its limits, but in some cases it can add to the scene. If the camera angle allows it, or if you don’t have shiny surfaces, tube lights are perfectly fine for this purpose, especially since you can disable the source mesh that emits the light.
Faking bounce light with tube lights. Might not be the best solution for every occasion but certainly possible.
And finally it is a quite cheap solution for area light approximation as long as they are not overlapping too much.

Tube Light Drawbacks

This however comes with drawbacks, which leads us to the cons of tube lights. One of the biggest is that they cast no real shadows. This can be somewhat mitigated by the light’s shadow plane feature, which limits or cuts off the light’s influence. You can use up to two shadow planes, which is extremely handy if you don’t want the light to shine through walls, for example. But for more complex shadows it is just not good enough.
Shadow planes are best used for blocking the light from shining through objects, since tube lights do not cast shadows.
Another thing that would be pretty useful is being able to use a mesh as the shape of the light. This would open up a bunch of ways to use these lights, neon signs being one example. Another limitation is that tube lights only work in deferred rendering, and they only affect objects that are rendered in deferred.


As I said in the beginning, we ended up not using the volumetric features, but they look very good and can add a lot to your scene, especially an indoor scene, so I highly suggest you check out the GitHub page and play around with them, especially since they are free. In my opinion these assets, together with the new Post-Processing stack, were among the biggest contributors to why the Adam demo looked so good.

One Texture Environment

When building environment textures for a game, it is always a good idea to think about their re-usability. Tiling textures and trims are great examples of this: used wisely and combined with smart shader tricks, they can greatly reduce resource usage, increase performance and save time, all at once. This of course depends on the environment style you choose. In our case, clean sci-fi is probably the most suitable for this kind of approach, mainly due to the simplicity of man-made shapes and panel layouts.

While building this menu environment scene, I wanted to go overboard and challenge myself by using one optimized texture for most of the scene. It allows for easy adjustments, since most of the environment depends on a single texture. It is a good educational exercise, although I soon realized there is almost always a need for a few custom details, such as decals and text, to break up the tone. Colour and surface variation was done through a shader.

While building a reusable texture, basically a sheet of details for an environment, it is crucial to plan ahead. It is important to have a variety of details while making sure they belong to the same style. For Antigraviator in particular, things like panels of different sizes, bolts, cables, cylindrical parts, vents, etc. can all come in handy, and the UVs of props can be mapped directly onto them.

Speaking of optimization, it is interesting to realize that each channel of any RGB(A) texture can be used as a different black-and-white mask, so entire sets of textures can be made from separate masks in these channels. As a practical example, the blue channel of a normal map contains information meant to add a bit of depth to the texture, but it has nearly no visible effect. I made use of this and replaced it with an AO map instead. A good practice is to work with masks instead of final texture information.
Such a mask can be any black-and-white image, for example one that separates different materials on a mesh, such as plastic/paint and metal parts. To use these tricks you of course need a custom shader, which is where Shader Forge or any other custom shader solution comes in handy. The workflow usually starts with making a lowpoly mesh. If you can afford a few more triangles, chamfering the edges once and using face-weighted normals is a great way to get rid of obviously sharp edges while improving the silhouette. After that is in place, it is mostly a matter of mapping the different parts of the prop onto the detail texture sheet in a logical way. For further reading, this technique was inspired by talented artists on here and here.
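The reason the blue channel of a tangent-space normal map is expendable is that a unit normal's z component can be recomputed from x and y. A small illustrative sketch (Python, hypothetical function name; a real shader would do this per pixel in HLSL):

```python
import math

def reconstruct_normal_z(x, y):
    """x and y are the tangent-space normal components remapped to [-1, 1].
    For a unit-length normal, z = sqrt(1 - x^2 - y^2), so the blue channel
    carries no unique information and can store another greyscale mask
    (in our case: ambient occlusion) instead."""
    return math.sqrt(max(0.0, 1.0 - x * x - y * y))
```

A shader that samples such a texture reads the AO mask from the blue channel and rebuilds z on the fly, at the cost of a few extra instructions but one fewer texture sample.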

State of the Game: March Edition!

In State of the Game I will not be talking tech. It is a post for our followers who want to know what we have been up to and what is coming next. If you recently became a follower of the game, this post should help you catch up on what happened in the previous month.


Let me start by thanking the Steam community for Greenlighting our game in a mere 8 days. This of course means a lot to us and goes to show that many of you see the potential of this game and might want to buy it!


Now let's talk about the game a bit and where we are with development. Our goal is to launch with 12 race tracks across 4 different worlds. Let me give you guys the scoop on these 4 worlds. The first one, which you have seen in the demo, is a city-based world. This weekend we released screenshots of the second world, a desert world. The third will be a snow/ice world, and the fourth a space station. Development-wise we are about 70% to 80% done with the first two worlds, and we just started work on the third. We have started work on an AI system, so you won't have to fly solo just trying to beat the record time. It also means that multiplayer, which we always envisioned as head-to-head, will probably support up to 6 players, and our split-screen functionality will support 4-player split-screen! Another item we worked on is vehicle customization. We don't have many components to configure your vehicle with yet, but the general framework is in.


Let's talk a bit about the future. Developing a game is not cheap. We all know that if you pay peanuts, you get monkeys. Getting funds for the game is important, if only so that we, the developers, can buy food at the end of the day. This in turn is important so we don't starve and can actually finish the game. That is why we are taking the game to Kickstarter. Around the end of April we will launch on the platform and run a 30-day campaign. With this campaign we will also release a new playable alpha, which will include 3 tracks in 3 different worlds, AI racing and 4-player split-screen. So most likely no online multiplayer yet. Well, I hope you guys are up to date again on our development. If you have feedback, things you would really like to see in the game, or want to adopt a walrus: PS: The part about adopting a walrus is a joke. Also note that in no way, shape or form are we harming walruses during the development of this game!

Shader Forge in Antigraviator

We have been using Shader Forge on Antigraviator for almost every shader in the game. In this article I'm going to talk about how we use this add-on to achieve certain effects and better performance. From an artistic point of view, Unity severely lacks a built-in material/shader editor, something similar to the one Unreal Engine has. Luckily there are some awesome people in the community who took the time and effort to make such assets for Unity. Unity recently announced that they are giving away a free Accelerator Pack with the Unity Plus subscription. This pack contains Amplify Shader, a node-based material editor. Amplify Shader is at an early stage, which means it is still under heavy development, but in exchange it is cheaper ($50) than Shader Forge ($90) if you don't have the Plus subscription. Arguably the king of these assets is Shader Forge. It is very easy to use, and fast and intuitive to work with. There is almost no limit to what you can do with it. If you are familiar with the material editor in Unreal, you are going to have an easy time learning Shader Forge.

First Steps

Let’s start with something simple. The standard shader in Unity doesn’t support RMA or grey-packed textures, so the first thing I did in Shader Forge was make a basic PBR shader that does. Grey packing means putting greyscale textures into the red, green, blue or even alpha channels of an RGB(A) image. This way we need fewer textures, which improves performance.
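The packing step itself is trivial, which is part of its charm. As a small illustration (Python, with images simplified to flat lists of 0..1 floats; the real work happens offline in a texture tool or import script), an RMA texture is just three greyscale maps zipped into one RGB image:

```python
def pack_rma(roughness_map, metallic_map, ao_map):
    """Grey-pack three greyscale images (flat lists of 0..1 floats)
    into one list of RGB pixels:
    R = roughness, G = metallic, B = ambient occlusion.
    One texture sample in the shader then yields all three values."""
    return list(zip(roughness_map, metallic_map, ao_map))
```

In the shader, a single texture read followed by three channel swizzles replaces three separate samples, which is where the performance win comes from.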

Ship shader

Let’s move on to a more complex shader. We wanted the ability to change colours, skins and other effects on the ship. For this we created a shader that uses black-and-white images to mask out the primary, secondary and decal colours on the ship. To mask out parts of the mesh we need a greyscale or, as mentioned before, a grey-packed image; then, using lerp nodes, we can blend between different colours and/or textures. Lerp is not the fastest of nodes, but we haven’t experienced any performance drop from the shader we use on the ship. If we want to blend more than two colours, we just plug the output of the first lerp node into a new lerp node.
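The chain of lerp nodes can be sketched outside the node graph too. Here is an illustrative Python version of the layering (hypothetical names; our actual graph lives in Shader Forge): each greyscale mask blends the next colour layer over the result of the previous one, which is why the blend order matters.

```python
def lerp(a, b, t):
    """Linear interpolation, exactly what a Shader Forge lerp node does."""
    return a + (b - a) * t

def shade_ship_pixel(base, primary, secondary, decal, masks):
    """Layer primary, secondary and decal colours over a base colour.
    Colours are (r, g, b) tuples; masks is a (primary_mask,
    secondary_mask, decal_mask) tuple of greyscale values in 0..1.
    Each lerp feeds the next one, so later layers win where masks overlap."""
    m1, m2, m3 = masks
    out = tuple(lerp(b, p, m1) for b, p in zip(base, primary))
    out = tuple(lerp(o, s, m2) for o, s in zip(out, secondary))
    out = tuple(lerp(o, d, m3) for o, d in zip(out, decal))
    return out
```

Where all three masks are zero the base colour survives untouched; where the primary mask is one and the others are zero, the pixel becomes exactly the primary colour.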
Another possibility is to use the channel blend node, which is basically two lerps in one. In our case that is not possible due to the way we make our masks. Now it should be clear how we build up the shader for our ship: layering colours on top of each other using lerp nodes. The only thing you have to pay attention to is the order in which you blend the masks. Another cool thing about our custom ship shader is the dissolve effect. It is a very simple shader to make, but it looks very interesting.
As you can see it is very easy to set up, but if you are struggling with it, take a look at this video by Joachim Holmér, the creator of Shader Forge. Instead of the “standard” dissolve effect, we wanted something more sci-fi, so I quickly created a hexagon pattern in Substance Designer. Using this mask, the dissolve effect has a completely different look. You can plug in any mask you want and get a different effect.
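The core of any dissolve effect is a simple threshold against the mask. A rough sketch of the per-pixel logic (Python, hypothetical names; in the real shader this is a clip/alpha test driven by the dissolve slider):

```python
def dissolve(mask_value, amount, edge_width=0.05):
    """Classic dissolve test for one pixel.
    mask_value: the greyscale dissolve mask (hexagons, noise, ...) in 0..1.
    amount: how far the dissolve has progressed, 0 = intact, 1 = gone.
    Returns (visible, on_edge): pixels below the threshold are clipped away;
    pixels just above it fall in a narrow rim band that can be tinted
    with an emissive colour for a glowing edge."""
    if mask_value < amount:
        return (False, False)          # clipped away
    return (True, mask_value < amount + edge_width)
```

Swapping the mask texture is all it takes to change the look: a noise mask gives the classic crumbly dissolve, our hexagon pattern gives the sci-fi version.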
Different dissolve effects:

Tri-Planar mapping

Another cool thing you can do with Shader Forge is tri-planar mapping. This works perfectly if we want to project color or roughness variation on a surface without having to worry about the UVs.
Tri-planar mapping basically projects textures onto a surface based on its normal direction. As I said, the good thing about it is that it is completely UV-independent. Using a greyscale texture we can easily add some colour and roughness variation to our meshes. We can change the scale, contrast and colour of the variation, and we can even add some dirt in the crevices using the ambient occlusion map. This way we can texture a lot of assets using only one trim normal texture. More about how we do this in the next article, by Dovydas.
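The heart of tri-planar mapping is computing how strongly each of the three planar projections contributes, based on the surface normal. An illustrative sketch in Python (hypothetical function name; the node graph does the same per pixel):

```python
def triplanar_weights(normal, sharpness=4.0):
    """Blend weights for the X, Y and Z planar projections.
    normal: a (unit) surface normal as an (x, y, z) tuple.
    Raising the absolute components to a power sharpens the transition
    zones where two projections overlap. The weights are normalised
    so they always sum to 1."""
    wx, wy, wz = (abs(c) ** sharpness for c in normal)
    total = wx + wy + wz
    return (wx / total, wy / total, wz / total)
```

A face pointing straight up is textured entirely by the top projection; a 45-degree slope blends two projections, with the sharpness parameter controlling how wide that blend band is.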
Here are some other tips and tricks for Shader Forge:
• Use the Set and Get nodes if you have to reuse the same node groups multiple times. It makes your graph much more organised, and it is somewhat comparable to Unreal’s material functions.
• Don’t forget to clamp your values when using add, multiply, power or subtract; this way you avoid weird-looking colours or artefacts.
• Exposed parameters can be accessed from scripts, so you can animate or change your textures as you want.
• Using an “If” node you can make a quick selector to cycle through your grey-packed textures.
• Using the Normal Direction node, you can create a simple smart shader that applies a certain material only on top of the mesh. You can rotate your mesh any way you want; the material always shows on top. Break it up with a noise map for some variation.
• Using some basic math nodes you can create a simple loading animation shader by making radial gradients.
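The "material only on top" trick from the Normal Direction tip above boils down to dotting the world-space normal with the up vector. An illustrative Python sketch (hypothetical names; in Shader Forge this is a Normal Direction node, a dot product, a clamp and a power node):

```python
def top_coverage(world_normal, up=(0.0, 1.0, 0.0), falloff=2.0, noise=0.0):
    """Blend factor for a material applied only to upward-facing surfaces
    (snow, dust, ...). The dot product of the world-space normal with the
    up vector is clamped to [0, 1] and sharpened with a power; an optional
    noise value in [0, 1] breaks up the transition."""
    d = sum(n * u for n, u in zip(world_normal, up))
    w = max(0.0, min(1.0, d)) ** falloff
    return max(0.0, min(1.0, w - noise))
```

Because the normal is taken in world space, the effect follows the mesh as you rotate it: whatever faces up gets the material.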

Winning back frames in Antigraviator

As you could read in our previous blog post, our amazing artists have done an impressive job polishing up the graphics for the release of our demo. After improving the graphics, we had to make sure Unity could actually show them on most systems at an acceptable frame rate. So, after losing all our frames to detailed models and fancy post-processing effects, we had to win them back. Luckily there is a nice tool to help you with just that: the profiler.

The profiler

Even though several Unity webpages still say it is only available in the Pro version, the profiler is also available in the Personal edition. Hidden behind the Window menu, it opens a nice look into the heart of your game. You may be familiar with the small Stats overlay in the editor, which already shows some nice information; the profiler is that stats window on steroids. When you first open it, it will look disappointingly empty. It's when you run the game in the editor with the profiler open that all the magic happens: Unity spits out a stream of data on several aspects of the game: CPU, GPU, memory, physics, even networking. If you don't see a particular one, you might have to add it by clicking Add Profiler. Each aspect is subdivided into categories, each shown in a different colour in the graph. Clicking any of the aspects gives you more detailed information in the lower half of the window.


When optimizing there are two main targets: CPU and GPU. Memory and possible network issues must not be forgotten, but they are usually not as critical. When selecting CPU Usage, we get the Hierarchy view. This table shows all the processes and functions eating up your precious CPU milliseconds: how long each one takes, what percentage of the total time that represents, and we can expand each entry to split the functions out even further. Here you can home in on your expensive functions and try to optimize your scripts. We use a spline as the basis of our tracks, and determining where the player is on the track turned out to be a very inefficient function. So rather than calling it 5 or 6 times, we now call it only once every frame, and we might lower that further depending on where the player is. This way we won back several frames. If the Hierarchy view is not your thing, you can also swap to the awesome Timeline view, where all the functions are shown chronologically and where you can also look at how the different threads are used. Two functions you might see coming back quite often are Camera.Render, which is all the work done on the CPU linked to rendering the scene, and Gfx.WaitForPresent, which is the time the CPU has to wait for the GPU to finish rendering the frame. If you see the latter turn up, you know your frame rate is GPU-bound. You can also see this in the bar above the detail view, where both CPU and GPU times are shown. There is no point in optimizing the CPU further if the GPU is already taking twice as long. If Camera.Render is costing you dearly, take a look at batching to reduce the overhead on the CPU.
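The spline fix above is a plain once-per-frame cache. A minimal sketch of the idea in Python (class and parameter names are hypothetical, our game code is C#): the expensive query runs at most once per frame, no matter how many scripts ask for the result.

```python
class TrackPositionCache:
    """Caches an expensive spline query so it runs at most once per frame,
    instead of every time a script asks where the player is on the track."""

    def __init__(self, expensive_query):
        self._query = expensive_query  # e.g. a closest-point-on-spline search
        self._frame = -1
        self._cached = None
        self.calls = 0                 # how often the real query actually ran

    def get(self, frame):
        """Return the cached result, recomputing only on a new frame number."""
        if frame != self._frame:
            self._cached = self._query()
            self._frame = frame
            self.calls += 1
        return self._cached
```

Six callers in one frame now cost one spline search instead of six; the cache invalidates itself simply because the frame counter advances.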


Clicking on GPU Usage shows in detail how much time each part of rendering the frame takes and how many draw calls are needed. Even more interesting is the Frame Debugger, which can be opened when you are standing on the current frame. It shows, step by step, how the frame is built up in the scene view. This way you can get some insight into what Unity renders and which objects are batched together and drawn in the same draw call. To optimize for the GPU we have several options: look into occlusion culling and cull distances to draw only what you need, use LODs to reduce the number of vertices the GPU needs to handle, bake lights and shadows where possible, and set up your levels intelligently. Also try to figure out whether you are limited by fill rate or by memory bandwidth. If the game runs much faster at lower resolutions, you are limited by fill rate, the number of pixels that can be rendered per second. How to solve that, I don't know yet, as we are still trying to improve on the GPU side. We did do some tests with rendering the scene to a texture and showing that texture full screen. Even though this causes some extra overhead, reducing the texture resolution improves the frame rate significantly. We might try to create a kind of dynamic resolution scaling with this technique, or look into a more ingenious approach; the first results were promising in any case. I hope some of you reading this can use it to improve the frame rate of your own games, and if you have tips and tricks of your own, please let us know in the comments.
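A dynamic resolution controller on top of that render-texture trick could be very simple. This is only a sketch of one possible approach, not something we have shipped (names and thresholds are made up): shrink the render-texture scale when the GPU frame time overshoots the target, and grow it back when there is headroom.

```python
def adjust_resolution_scale(scale, gpu_ms, target_ms=16.6, step=0.05,
                            min_scale=0.5, max_scale=1.0):
    """Naive dynamic-resolution controller for a render-texture setup.
    scale: current render-texture scale (1.0 = full resolution).
    gpu_ms: measured GPU time for the last frame, in milliseconds.
    A 10% deadband around the target avoids oscillating every frame."""
    if gpu_ms > target_ms * 1.1:       # over budget: render fewer pixels
        scale -= step
    elif gpu_ms < target_ms * 0.9:     # headroom: claw resolution back
        scale += step
    return max(min_scale, min(max_scale, scale))
```

Run once per frame, this nudges the resolution toward whatever the GPU can sustain at the target frame time, which is exactly the fill-rate relief the lower-resolution tests showed.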

Pushing Unity’s rendering capabilities in Antigraviator

As you might know, Antigraviator is being developed in the Unity game engine. The last few weeks we have been trying to improve the visual presentation of our game, and I am going to share a few tips and tricks learned along the way, mainly intended for developers. If you are familiar with the engine, you will probably agree that Unity does not look visually impressive out of the box. However, it is important to realize that it is improving, and that there are external solutions, such as plugins and scripts, that greatly help enhance the visuals.

Precomputed GI

This is a good solution for global illumination, or bounced lighting, especially for exterior areas, and it can be used in combination with standard baked GI. The advantage is that it can be manipulated in real time and does not clutter texture memory, unlike standard baked GI, which becomes an issue if you try to bake large open landscapes and structures into lightmaps. It can get quite performance heavy, but here is a good step-by-step tutorial on how to optimize the performance and precompute times of this feature.

Reflection probes / light probes

If you are making use of physically based shading/rendering, reflection probes placed around the level will help materials appear more correct, especially metallic surfaces, by providing them with reflections of nearby objects, as the name suggests. Light probes are another feature that helped us ground dynamic moving objects in the scene.

Post process

This is a crucial step that often tends to be overlooked. There are multiple solutions, but the one we found easiest to use and which produces great results can be found here. There is no universal setting for every scene; each level in our game has its own post process settings. Having no completely dark or completely white pixels on screen is a good guideline to keep in mind, unless the pixel is a light source. The tool offers anti-aliasing, SSAO, SSR, depth of field, motion blur, HDR bloom, lens dirt, tonemapping, chromatic aberration, vignette and a few more in one package.

Antigraviator Alpha Demo V1.1

This weekend we released a new demo version of the game. The main reason for releasing this demo to all of you is that we want to get the game through Steam Greenlight before the changes announced by Valve. I just updated the release on our website to version 1.1. The only changes are support for AZERTY keyboards (what can I say, we are a Belgian company) and a way to check which controllers are used by the players.

About the gameplay

The game is most fun in split-screen multiplayer. The idea is that you pick up power globes along the track; these globes fill up the power meter at the bottom left of your ship UI. It costs 2 power to boost and 4 power to activate a trap. We are still working on making this concept clearer, so bear with us for now.

The controls

Just to be sure, here is how to control the game. I will start by listing the keyboard controls (or just show you a picture). Next up are the controller controls: at this time we support PS4, Xbox One and Xbox 360 controllers. Other controllers might work, but are not guaranteed to. Another important note is the use of the triggers: these work as air brakes and will make it easier for you to navigate through the corners. That's it for now; we hope you enjoy playing this demo as much as we enjoyed making it! Head Walrus out!

The story so far…

As we are getting ready to kick the development of Antigraviator into full gear, we thought it would be good to share a bit about the history of this game. From now on we will do regular blog posts to keep you up to date on the journey that is Antigraviator. But let's get started by telling you the story so far...

Who are we?

Well, you might have read it on the front page of this website already. I guess you won't mind me repeating it? If you do, feel free to skip this part. We are a group of four students. We study Digital Arts and Entertainment at Howest in Kortrijk, Belgium. Yes, that's the country with the hellhole city somewhere in the middle. Kortrijk is far away from that hellhole, so not to worry. We are currently in the senior year of our bachelor's degree, and we have even formed a company now, called Cybernetic Walrus.

The early days

Antigraviator started out as a school project. We had the idea of doing something different than most other groups and decided to make a racing game; more specifically a futuristic racing game, but with a twist. The first idea was to have one player racing and one player activating traps on the playing field to slow the other player down, after which the roles would be reversed. However, it quickly became apparent that it was more fun with two players racing at once, so the idea changed but the traps remained. Above is a screenshot of an early prototype of the game. By pressing the correct buttons on an Xbox controller, you could close a gate, and the other player would have to avoid it. This was about a year ago. The game kept evolving and we introduced more physics-based movement. From that point on it has been all about iteration: trying new traps, track designs and many other things I am likely forgetting at the moment, till it became something like what you might have seen. Maybe you even downloaded the demo from our website and played that version.

The Future

Enough about the past! Let's look at what is to come. I would like to start by saying that this is not definite; all features discussed might change during development. I am even going to say that I hope they do, because I hope we will get good feedback from you, our community. Turning this demo into a full game means more tracks: we have 12 planned for release, set in 4 different worlds. We plan 3 types of ships. More traps, of course! Online multiplayer in addition to the split screen we have now. An AI to race against for when your friends don't have time to play, or for those who go without friends. You will be able to customize your ship with different parts and change its look with some unique skins. So keep an eye on our blog, follow us on Twitter and Facebook, and subscribe to our newsletter so you don't miss out on what is going on. Oh, and one last thing: we upgraded the visuals of our game. Head Walrus signing off!