While looking for a solution for volumetric lights in Unity, I stumbled upon the "VolumetricLighting" GitHub repository created by the team behind the Adam demo. While we ended up not using the volumetric features, we do use the so-called "tube lights". In this article I'll go over why and how we use tube lights in Antigraviator.
When building environment textures for a game, it is always a good idea to think about re-usability. Tiling textures and trims are great examples of this: used wisely and combined with smart shader tricks, they can greatly reduce resource usage, increase performance and save time all at once. How well this works depends on the environment style you choose. In our case, clean sci-fi is probably the best fit for this kind of approach, mainly due to the simplicity of man-made shapes and panel layouts.

While building this menu environment scene, I wanted to go overboard and challenge myself by using one optimized texture for most of the scene. It allows for easy adjustments, since most of the environment depends on a single texture. It is good educational practice, though I soon realized there is almost always a need for a bit of custom detail, such as decals and text, to break up the tone. Color and surface variation was done through a shader.

While building a reusable texture, essentially a sheet of details for an environment, it is crucial to plan ahead. It is important to have a variety of details while making sure they all belong to the same style. In Antigraviator specifically, things like panels of different sizes, bolts, cables, cylindrical parts and vents all come in handy, and the UVs of props can be mapped directly onto them.

Speaking of optimization, it is worth realizing that each channel of an RGB(A) texture can be used as a separate black-and-white mask, so entire sets of textures can be built from masks stored in these channels. As a practical example, the blue channel of a normal map carries information that the shader can reconstruct from the red and green channels anyway, so it can be repurposed with almost no visual impact. I made use of this and replaced it with an AO map instead. A good practice is to work with masks instead of final texture information.
This can be any black-and-white mask, for example one that separates different materials on a mesh, such as plastic/paint and metal parts. To use these tricks you of course need a custom shader, which is where Shader Forge or any other custom shader solution comes in handy. The workflow usually starts with making a low-poly mesh. If you can afford a few more triangles, chamfering edges once and using face-weighted normals is a great way to get rid of obviously sharp edges while improving the silhouette. Once that is in place, it is mostly a matter of mapping different parts of the prop onto the detail texture sheet in a logical way. For further reading: this technique was inspired by talented artists here and here.
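Why dropping the blue channel of a normal map is safe: a tangent-space normal has unit length, so Z can always be rebuilt from X and Y in the shader. A minimal Python sketch of that reconstruction (the function name is mine, not an engine API):

```python
import math

def reconstruct_normal_z(x, y):
    """Recover the Z component of a tangent-space normal from X and Y.

    Because a normal is unit length (x^2 + y^2 + z^2 = 1), the blue
    channel is redundant and can be repurposed, e.g. to store an AO mask.
    The max() guards against slightly over-length inputs from compression.
    """
    return math.sqrt(max(0.0, 1.0 - x * x - y * y))
```

This is the same trick two-channel normal compression formats use, which is why the blue channel is a natural place to hide a greyscale mask.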
In this state-of-the-game post I will not be talking tech. It is a post for our followers who want to know what we have been up to and what is coming next. If you recently became a follower of the game, this post should help you catch up on what has happened in the previous month.
Greenlight
Let me start by thanking the Steam community for Greenlighting our game in a mere 8 days. This of course means a lot to us and goes to show that many of you see the potential of this game and might want to buy it!
Antigraviator
Now let's talk about the game a bit and where we are with development. Our goal is to launch with 12 race tracks spread across 4 different worlds. Let me give you guys a scoop and tell you about these 4 worlds. The first one is the city-based world you have seen in the demo. This weekend we released screenshots of the 2nd world, which is desert based. The third world will be a snow/ice world and the fourth will be a space station. Development-wise we are about 70% to 80% done with the first two worlds, and we just started work on the 3rd one. We have started work on an AI system so you don't have to fly solo just trying to beat the record time. This also means that for multiplayer, where we always had the idea it would be head-to-head, we will probably go for up to 6 players. For our split-screen functionality it means we will also be supporting 4-player split-screen! Another item we worked on is vehicle customization. We don't have too many components to configure your vehicle with at the moment, but the general framework is in.
Kickstarter
Let's talk a bit about the future. Developing a game is not cheap; we all know that if you pay peanuts, you get monkeys. Getting funds for the game is important, if only so that we, the developers, can buy food at the end of the day. This in turn is important so we don't die of starvation and can actually finish the game. That is why we are taking the game to Kickstarter. Around the end of April we will launch on the platform and run a 30-day campaign. Alongside this campaign we will also release a new playable alpha. The alpha will include 3 tracks in 3 different worlds, AI racing and 4-player split-screen, so most likely no online multiplayer yet. Well, I hope you guys are up to date again with our development. If you have feedback, things you would really like to see in the game, or want to adopt a walrus: Info@Antigraviator.com PS: The part about adopting a walrus is a joke. Also note that in no way, shape or form are we harming walruses during the development of this game!
We have been using Shader Forge on Antigraviator for almost every shader in the game. In this article I'm going to talk about how we use this add-on to achieve certain effects and better performance. From an artistic point of view, Unity is severely lacking a built-in material/shader editor, something similar to the one Unreal Engine has. Luckily there are some awesome people in the community who took the time and effort to make such an asset for Unity. Unity recently announced that they are giving a free Accelerator Pack with the Unity Plus subscription. This pack contains Amplify Shader, a node-based material editor. Amplify Shader is at an early stage, which means it is still under heavy development, but in exchange it is cheaper ($50) than Shader Forge ($90) if you don't have the Plus subscription. Arguably the king of these assets is Shader Forge. It is very easy to use, and fast and intuitive to work with. There is almost no limit to what you can do with it. If you are familiar with the material editor in Unreal, you are going to have an easy time learning Shader Forge.
First Steps
Let's start with something simple. The standard shader of Unity doesn't support RMA or grey packing of textures, so the first thing I did in Shader Forge was make a basic PBR shader. Grey packing is putting greyscale textures into the red, green, blue or even alpha channels of an RGB(A) image. This way we need fewer textures, which increases performance.
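The packing itself is just channel interleaving. A small Python sketch of building an RMA image from three greyscale masks (helper names are illustrative, not an engine API):

```python
def pack_rma(roughness, metallic, ao):
    """Pack three greyscale images into one RGB image.

    Each input is a flat list of 0..255 values of equal length; the
    result interleaves them as [R0, G0, B0, R1, G1, B1, ...], so one
    texture fetch in the shader yields roughness (R), metallic (G)
    and ambient occlusion (B) at once.
    """
    assert len(roughness) == len(metallic) == len(ao)
    rgb = []
    for r, m, a in zip(roughness, metallic, ao):
        rgb += [r, m, a]
    return rgb
```

In the shader, each property is then read from its channel via a component mask instead of sampling three separate textures.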
Ship shader
Let's move on to a more complex shader. We wanted the ability to change colors, skins and other effects on the ship. For this we created a shader that uses black-and-white images to mask out the primary, secondary and decal colours on the ship. To mask out parts of the mesh we need a greyscale or, as I mentioned before, a grey-packed image; then, using Lerp nodes, we can blend between different colors and/or textures. Lerp is not the fastest of nodes, but we haven't experienced any performance drop from the shader we use on the ship. If we want to blend more than 2 colors, we can just plug the output of the first Lerp node into a new Lerp node.
Another possibility is to use the Channel Blend node, which is basically 2 Lerps in one. In our case that is not possible due to the way we make our masks. Now it should be clear how we build up the shader for our ship: layering colors on top of each other using Lerp nodes. The only thing you have to pay attention to is the order in which you blend the masks. Another cool thing about our custom ship shader is the dissolve effect. It is a very simple shader to make, but it looks very interesting.
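The layering above can be sketched in a few lines of Python; `layer_colors` chains Lerps exactly as the node graph does, which also makes it obvious why the blend order matters:

```python
def lerp(a, b, t):
    """Linear interpolation per channel, like Shader Forge's Lerp node."""
    return tuple(x + (y - x) * t for x, y in zip(a, b))

def layer_colors(base, layers):
    """Blend colors over a base using greyscale mask values in 0..1.

    `layers` is an ordered list of (color, mask) pairs; each layer's
    output feeds the next Lerp, so reordering the list changes which
    color wins where masks overlap.
    """
    result = base
    for color, mask in layers:
        result = lerp(result, color, mask)
    return result
```

With a mask of 0 a layer leaves the pixel untouched; with 1 it fully replaces it; fractional values blend.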
As you can see it is very easy to set up, but if you are struggling with it, take a look at this video by Joachim Holmér, the creator of Shader Forge. Instead of having the "standard" dissolve effect, we wanted something more sci-fi, so I quickly created a hexagon pattern in Substance Designer. Using this mask, the dissolve effect has a completely different look. This means you can plug in any mask you want and get a different effect.
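At its core a dissolve is a per-pixel clip test against an animated threshold, which is why swapping the mask texture (noise, hexagons, ...) changes the whole look. A minimal sketch:

```python
def dissolve(mask, threshold):
    """Per-pixel clip test for a dissolve effect.

    `mask` is a list of greyscale values in 0..1; a pixel is kept (1.0)
    while its mask value exceeds the threshold and clipped (0.0)
    otherwise. Animating `threshold` from 0 to 1 dissolves the mesh
    in the order encoded by the mask.
    """
    return [1.0 if m > threshold else 0.0 for m in mask]
```
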
Different dissolve effects:
[ms_video mp4_url="http://i.imgur.com/pJ17Dzm.mp4" ogv_url="" webm_url="" poster="" width="50%" height="50%" mute="yes" autoplay="yes" loop="yes" controls="no" class="" id=""][/ms_video]
[ms_video mp4_url="http://i.imgur.com/N1BmZ4M.mp4" ogv_url="" webm_url="" poster="" width="50%" height="50%" mute="yes" autoplay="yes" loop="yes" controls="no" class="" id=""][/ms_video]
[ms_video mp4_url="http://i.imgur.com/Jgexc4l.mp4" ogv_url="" webm_url="" poster="" width="50%" height="50%" mute="yes" autoplay="yes" loop="yes" controls="no" class="" id=""][/ms_video]
Tri-planar mapping
Another cool thing you can do with Shader Forge is tri-planar mapping. This works perfectly when we want to project color or roughness variation onto a surface without having to worry about the UVs.
Tri-planar mapping is basically projecting textures onto a surface along the three world axes and blending them based on the surface normal direction. As I said, the good thing about it is that it is completely UV-independent. Using a greyscale texture, we can easily add some colour and roughness variation to the meshes. We can change the scale, contrast and color of the variation, and we can even add some dirt in the crevices using the ambient occlusion map. This way we can texture a lot of assets using only one trim normal texture. More on how we do this in the next article by Dovydas.
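The blending part of tri-planar mapping comes down to turning the surface normal into three weights, one per projection axis. A common sketch of that step (the sharpness exponent is a typical choice, not a Shader Forge default):

```python
def triplanar_weights(nx, ny, nz, sharpness=4.0):
    """Blend weights for tri-planar projection from a surface normal.

    Each axis weight is |normal component| raised to a sharpness power
    (higher values tighten the transition between projections), and the
    three weights are normalised so they sum to 1 before the three
    projected texture samples are combined.
    """
    wx = abs(nx) ** sharpness
    wy = abs(ny) ** sharpness
    wz = abs(nz) ** sharpness
    total = wx + wy + wz
    return (wx / total, wy / total, wz / total)
```

A face pointing straight up gets all its weight from the top projection; tilted faces blend two or three projections smoothly.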
Here are some other tips and tricks for Shader Forge:
- Use the Set and Get nodes if you have to reuse the same node groups multiple times. It makes your graph much more organised. It is somewhat comparable to Unreal’s material functions.
- Don’t forget to clamp your values if you are using add, multiply, power or subtract, this way you can avoid weird looking colours or artefacts.
- Exposed parameters can be accessed in scripts so you can animate/change your textures as you want.
- Using an “If” node you can make a quick selector to cycle through your grey packed textures.
- Using the Normal Direction node, you can create a simple smart shader that applies a certain material only on top of the mesh. You can rotate your mesh the way you want, it's always going to show on the top. Break it up with a noise map to have some variation.
[ms_video mp4_url="http://i.imgur.com/dryPGvV.mp4" ogv_url="" webm_url="" poster="" width="50%" height="50%" mute="yes" autoplay="yes" loop="yes" controls="no" class="" id=""][/ms_video]
- Using some basic math nodes you can create a simple loading animation shader by making radial gradients.
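The Normal Direction tip above boils down to comparing the world-space normal's Y component against a threshold. A small sketch of such a top-facing mask (names and the hardness parameter are mine, for illustration):

```python
def top_mask(ny, noise=0.0, hardness=0.3):
    """Mask that is 1.0 where a surface faces straight up, fading to 0.0
    as it tilts away.

    `ny` is the Y component of the world-space normal; because the mask
    is computed in world space, it stays on top however the mesh is
    rotated. Adding a per-pixel `noise` value breaks up the transition
    line, and `hardness` controls how wide the fade region is.
    """
    t = (ny + noise - (1.0 - hardness)) / hardness
    return min(1.0, max(0.0, t))
```

Feeding this mask into a Lerp between the base material and a snow or dust material gives the "only on top" effect described in the tip.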
As you could read in our previous blog post, our amazing artists have done an impressive job polishing up the graphics for the release of our demo. After improving the graphics, we had to make sure Unity could actually show them on most systems at an acceptable frame rate. So, after losing all our frames to detailed models and fancy post-processing effects, we had to win them back. Luckily there is a nice tool to help with just that: the profiler.
The profiler
Even though several Unity webpages still say it is only available in the Pro version, the profiler is also available in the Personal edition. Hidden behind the Window menu, it opens a nice look into the heart of your game. You may be familiar with the small Stats panel in the editor, which already shows some nice information; the profiler is that stats window on steroids. When you first open it, it will look disappointingly empty. It is when you run the game in the editor with the profiler open that the magic happens. Unity will spit out a stream of data on several aspects of the game: CPU, GPU, memory, physics, even networking. If you don't see a particular one, you might have to add it by clicking Add Profiler. Each one is subdivided into different categories, each shown with a different color in the graph. Clicking any of the aspects gives you more detailed information in the lower half of the window.
Optimizing
When optimizing there are two main targets: CPU and GPU. Memory and possible network issues must not be forgotten, but they are usually not as critical. When selecting CPU Usage, we get the Hierarchy overview. This table shows all the processes and functions eating up your precious CPU milliseconds. It shows how long each one takes and what percentage of the total time that represents, and we can open them up to split the functions out even further. Here you can home in on your expensive functions and try to optimize your scripts. We use a spline as the basis of our tracks, and determining where the player is on the track turned out to be a very inefficient function. So rather than calling it 5-6 times per frame, we now call it only once, and we might lower that even further depending on where the player is. This way we won back several frames. If the Hierarchy view is not your thing, you can also swap to the awesome Timeline view, where all the functions are shown chronologically; you can also take a look at the different thread usage there. Two functions you might see coming back quite often are Camera.Render, which is all the work done on the CPU linked to rendering the scene, and Gfx.WaitForPresent, which is the time the CPU must wait for the GPU to finish rendering the frame. If you see the latter turn up, you know your framerate is GPU-bound. You can also see this in the bar above the detail view, where both CPU and GPU time are shown. There is no point in optimizing the CPU further if the GPU is already taking twice as long. If Camera.Render is costing you dearly, take a look at batching to reduce the overhead on the CPU.
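The once-per-frame trick described above can be sketched as a tiny per-frame cache (all names here are illustrative, not Antigraviator's actual code):

```python
class PerFrameCache:
    """Run an expensive lookup at most once per frame.

    `find` stands in for the costly spline search described in the post;
    any further calls within the same frame return the cached result.
    """

    def __init__(self, find):
        self._find = find
        self._frame = None
        self._value = None

    def get(self, frame):
        # Recompute only when a new frame number is seen.
        if frame != self._frame:
            self._frame = frame
            self._value = self._find()
        return self._value
```

In Unity the frame key would typically be `Time.frameCount`; callers that previously each ran the search now share one result.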
GPU
Clicking on GPU Usage shows in detail which part of rendering the frame takes how much time and how many draw calls are needed. Even more interesting is the Frame Debugger, which can be opened when you are standing on the current frame. It shows step by step how the frame is built up in the scene view. This way you can get some insight into what Unity renders and which objects are batched together and drawn in the same draw call. To optimize the GPU we have several options: look into occlusion culling and cull distances to draw only what you need, use LODs to reduce the number of vertices the GPU needs to handle, bake lights and shadows where possible, and set up your levels intelligently. Also try to figure out whether you are limited by fillrate or by memory bandwidth. If the game runs much faster at lower resolutions, you are limited by fillrate, the number of pixels that can be rendered per second. How to solve that I don't know yet, as we are still trying to improve on the GPU side. We did do some tests with rendering the scene to a texture and showing that texture full screen. Even though this causes some extra overhead, reducing the texture resolution improves the framerate significantly. We might try to create a kind of dynamic resolution scaling with this technique, or look into a more ingenious approach; the first results were promising in any case. I hope some of you reading this can use it to improve the frame rate of your own games, and if you have tips and tricks of your own, please let us know in the comments.
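The dynamic resolution idea mentioned above needs a controller that picks the render-texture scale from the measured GPU frame time. A very simple sketch of such a controller (thresholds and step size are arbitrary assumptions, not Unity API or our shipped values):

```python
def adjust_render_scale(scale, gpu_ms, target_ms=16.7, step=0.05,
                        lo=0.5, hi=1.0):
    """One step of a naive dynamic-resolution controller.

    Lowers the render-texture scale when the GPU frame time is over
    budget, raises it again when there is clear headroom (below 80% of
    the budget), and clamps the result to [lo, hi] so quality never
    drops below half resolution or exceeds native.
    """
    if gpu_ms > target_ms:
        scale -= step
    elif gpu_ms < target_ms * 0.8:
        scale += step
    return min(hi, max(lo, scale))
```

Real implementations usually add hysteresis or smoothing so the resolution does not oscillate frame to frame.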
As you might know, Antigraviator is being developed using the Unity game engine. The last few weeks we have been trying to improve the visual presentation of our game. I am going to share a few tips and tricks learned along the way, mainly intended for developers. If you are familiar with the engine, you will probably agree that Unity out of the box does not look visually impressive. However, it is important to realize that it is improving, and that there are external solutions, such as plugins and scripts, that greatly help enhance the visuals.
Precomputed GI
This is a good solution for global illumination, or bounced lighting, especially for exterior areas, and it can be used in combination with standard baked GI. The advantage is that it can be manipulated in real time and does not clutter texture memory, unlike standard baked GI, which becomes an issue if you try to bake large open landscapes and structures into lightmaps. It can get quite performance-heavy, but here is a good step-by-step tutorial on how to optimize the performance and precompute times of this feature.
Reflection probes / light probes
If you are making use of physically based shading/rendering, reflection probes placed around the level will help materials, especially metallic surfaces, appear more correct by providing them with reflections of nearby objects, as the name suggests. Light probes are another feature that helped us ground dynamic moving objects in the scene.
Post process
This is a crucial step that often tends to be overlooked. There are multiple solutions, but the one we found easiest to use and which produces great results can be found here. There is no universal setting that works for every scene; each of our levels has its own post-process settings. Having no completely dark or white pixels on screen, unless it is a light source, is a good guideline to keep in mind. The tool offers anti-aliasing, SSAO, SSR, depth of field, motion blur, HDR bloom, lens dirt, tonemapping, chromatic aberration, vignette and a few more in one package.
This weekend we released a new demo version of the game. The main reason for releasing this demo to all of you is that we want to get the game through Steam Greenlight before the changes announced by Valve take effect. I just updated that release on our website to version 1.1. The only changes are support for AZERTY keyboards (what can I say, we are a Belgian company) and a way to check which controllers are used by the players.
About the gameplay
The game is most fun in split-screen multiplayer. The idea is that you pick up power globes along the track. These globes fill up the power meter at the bottom left of your ship UI. It costs 2 power to boost and 4 power to activate a trap. We are still working on making this concept clearer, so bear with us for now.
The controls
Just to be sure, I am listing how to control the game. I will start with the keyboard controls (or just show you a picture). Next up are the controller controls: at this time we support PS4, Xbox One and Xbox 360 controllers. Other controllers might work but are not guaranteed. Another important note is the use of the triggers: these work as air brakes and will make it easier for you to navigate through the corners. That's it for now; we hope you enjoy playing this demo as much as we enjoyed making it! Head Walrus out!
As we are getting ready to kick the development of Antigraviator into full gear, we thought it would be good to share a bit about the history of this game. From now on we will do regular blog posts to keep you up to date on the journey that is Antigraviator. But let's get started by telling you the story so far...