Last weekend Caley and I took part in the coolest game jam we've ever attended: the room-scale Vive VR jam! Valve and HTC came by Vancouver and set up three complete VR rooms of about 12'x10'. There were about 100 attendees making VR games, and everybody shared the rooms for testing during the two days of the jam.
My personal goal for the jam wasn't to make an actual game, though. I decided I wanted to get my engine up and running with the Vive! Even though Scrap Metal was the last real 3D game we made, and Shellrazer and Viking Squad are mostly 2D, the guts of the engine are still all in 3D. Of the roughly 100 people at the jam, only 3 were working on their own tech; the rest were all using Unity or Unreal. As a programmer, I find that a bit sad.
Getting the engine to render with the Vive was actually very straightforward. The VR library that Valve provides through Steam is free to download, and it even works with the Oculus DK2! I was able to test small things using just my laptop and the DK2, and when I created a build to test on the Vive setup, it worked straight away! I had a few problems with the transform matrices coming from the controllers (different coordinate spaces), but once that was solved it was all working!
Working with the Vive, we learned a few things very quickly. First off, you have to run at 90 frames per second. Anything lower than that and you will start feeling sick very quickly. Most console games nowadays run at 1920×1080, either at 60 or sometimes at 30 fps. At 60 fps, you have about 16 milliseconds per frame to fill 2,073,600 pixels. The Vive has two screens of 1200×1080, and each needs to run at 90 fps, meaning you have 11 milliseconds to fill 2,592,000 pixels. To really put that in perspective: a 1080p game at 60 fps needs to fill just over 124 million pixels per second, while a Vive game at 90 fps needs to fill 233 million(!). So you need to think seriously about what you're rendering, and how it's being rendered!
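A quick back-of-the-envelope check of those numbers (just the arithmetic, nothing engine-specific):

```cpp
// Pixels that must be filled each second = width * height * refresh rate.
long long PixelsPerSecond(long long width, long long height, long long fps) {
    return width * height * fps;
}

// PixelsPerSecond(1920, 1080, 60)     -> 124,416,000 (1080p console game at 60 fps)
// 2 * PixelsPerSecond(1200, 1080, 90) -> 233,280,000 (two Vive panels at 90 fps)
```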
The second thing I learned was that the controllers are a HUGE part of the VR experience. I drove down to Valve to test their prototype VR setup about half a year ago, and at the time they were only showing the headset. That experience blew my mind, because I could feel my brain being conflicted about whether this was real or not. Of course I could reason 'this is not real, I am just in a room with a headset on', but at the same time every nerve in my body resisted when I was asked to step off a virtual ledge. This weekend was the first time I got to try out the controllers, and it blew my mind all over again. When you put the headset on, you can see the controllers in VR on the floor below you. It's so natural to bend your knees and grab them that you don't think about it twice. That is amazing! And suddenly you have all this interactivity in the scene: you can grab things, you can throw things, and so on. I think the controllers are an essential part of making VR actually work properly. Selling just the headset won't cut it (looking at you, Oculus!).
Alright, so what did we end up making? Well, I just made a little toy where you can play around with our particle systems. But in my own engine! :) Each controller has the same 5 different particle effects you can rotate through by pressing the touch-pad on the controller. All there is to do is to play around with the particles! Somehow it’s pretty fun and satisfying though. :) Here’s a photo of Colin Northway of Northway Games playing around with the particle systems:
I forgot to record the actual Vive output in a video, and since I don't have a Vive here, I modified the code a bit to place a few controllers and move them around in circles. So this is an example of what he might have seen, except in 3D and using the actual Vive controllers. :)
Caley made a super sweet VR tennis game in Unity where a ball launcher shoots balls at you, and you try to hit them with a racket and bounce them back over the net to score a point. He also forgot to Fraps it. Maybe he'll do another blog post in the future if we can somehow record some footage on a real Vive.
In conclusion, it was a super fun weekend, and I cannot wait to get my hands on an actual Vive to experiment with for our future projects. Thanks again to Valve, HTC, Radial Games, Cloudhead games, Nvidia, Unity and Unreal for putting this on!
This week Jesse is on a sweet trip to Japan with our good friend Ryan Clark of Brace Yourself Games, so I'll take over and do another tech blog. This one is about the texture tool I built a while ago. This tool fills a very important role in our pipeline, but because it's all behind the scenes it doesn't really get much attention. This is its time to shine!
First a bit of history so the requirements of this tool become clear. When I first started building my engine, we were making N+, a game that doesn’t have a lot of textures, so no special care was taken to manage the textures in an optimal way. Then, when we started building Scrap Metal, and later Shellrazer, the number of textures increased drastically. The need for an extra step in our pipeline became clear. We needed a tool that could convert all of the source art to a format that is more optimized for the final game, while giving us the control to convert/change/compress the textures as we saw fit.
The first step to optimize the source data is to create texture atlases. The thing with many separate textures is that each of them needs a separate draw-call, which in turn means that you get draw-call bound very quickly. A common way to fix this issue is to combine multiple textures into texture-atlases. This way you can batch all the calls that use the same texture atlas into one draw call, and therefore drastically reduce the number of calls needed to draw your frame, generally resulting in a faster frame-rate. Here’s a screenshot of our texture tool showing how it combined a bunch of textures into a texture atlas:
How do you decide which textures to combine into an atlas? If you just randomly start combining textures, there's still a chance of doing lots of draw calls: you might require a texture that's in atlas A, then a texture that's in atlas B, then another texture from atlas A, and so on. So you will want to combine textures into one atlas if they are likely to be drawn in order. In our case, our puppets use many separate textures, but they will all be rendered in one go, so they can easily be combined. Luckily, Jesse creates a new directory for each character he makes, so the first filter we added groups textures by the sub-directory they are located in.
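A minimal sketch of that grouping step, assuming slash- or backslash-separated paths (the function names are mine, not the actual tool's):

```cpp
#include <map>
#include <string>
#include <vector>

// Returns the directory portion of a path ("" if there is none).
// Handles both forward and backslash separators.
static std::string ParentDir(const std::string& path) {
    size_t slash = path.find_last_of("/\\");
    return (slash == std::string::npos) ? std::string() : path.substr(0, slash);
}

// Bucket textures so that everything in one directory (e.g. one puppet)
// lands in the same atlas and can be batched into one draw call.
std::map<std::string, std::vector<std::string>>
GroupByDirectory(const std::vector<std::string>& textures) {
    std::map<std::string, std::vector<std::string>> atlases;
    for (const std::string& tex : textures)
        atlases[ParentDir(tex)].push_back(tex);
    return atlases;
}
```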
There are also textures that generally belong together, such as textures that are used only in the world map. So we added a way to set the category of a directory (recursively), or an individual file. This allows us to easily combine all ‘user interface’ textures into the same texture atlas, so that the user interface can be drawn in a single draw call.
The last filter we need is of course how you want the texture to be compressed. Some textures require RGBA8888 (such as visual effects that do multiple layers of overdraw), and some are fine to be compressed using DXTC5. The entire atlas texture is compressed, so all the textures within that atlas require the same texture compression. The required compression is again set on a directory basis, or on an individual file basis.
Another wish for this tool was a way to easily resize textures manually. Some textures are used in ways where cutting their size in half isn't noticeable in the final result. For example, textures that were blurred can easily be reduced to 50% or even 25% of their original size. Keep in mind that all of this happens while building the game package, so none of the original art is resized or altered. The resize factor can be set manually per texture.
Because some of the art we use is actually drawn at a higher resolution than it will ever be shown at in the game, I came up with a way to try to reduce all textures to exactly the size they require on screen. The way this works is that each entity is drawn at the size it appears in the game, and the size at which each texture was rendered is recorded as this is going on. The tool keeps the average size each texture was rendered at, as well as the standard deviation and min/max values. In the end, the recorded sizes, as well as a recommended resize factor, are exported. These values are visible in the image above: the bow image is resized to 57.43% of its original size.
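A sketch of what such a render-size recorder might look like. The field names and the "resize to the largest observed scale" policy are my assumptions for illustration, not necessarily what the tool actually does:

```cpp
#include <algorithm>
#include <cmath>

// Accumulates the on-screen scale (rendered size / source size) each time
// a texture is drawn, so a resize factor can be recommended afterwards.
struct TextureSizeStats {
    double sum = 0.0, sumSq = 0.0;
    double minScale = 1e30, maxScale = 0.0;
    int samples = 0;

    void Record(double renderedWidth, double sourceWidth) {
        double scale = renderedWidth / sourceWidth;
        sum += scale;
        sumSq += scale * scale;
        minScale = std::min(minScale, scale);
        maxScale = std::max(maxScale, scale);
        ++samples;
    }

    double Average() const { return sum / samples; }
    double StdDev() const {
        double avg = Average();
        return std::sqrt(sumSq / samples - avg * avg);
    }
    // One plausible policy: shrink to the largest scale ever observed,
    // so the texture is never drawn above its stored resolution.
    double RecommendedResize() const { return std::min(1.0, maxScale); }
};
```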
When I checked the game with NVIDIA's awesome Nsight, I could see that it draws large chunks of level data in single draw calls:
The lanes are drawn using multiple separate textures, but because they were combined into a texture atlas, we were able to draw them in one draw call.
The same goes for the shield guy's puppet. One draw call is all it takes to draw the entire puppet, which consists of 30 or so individual textures.
All in all, this tool saves us large amounts of texture space, and it really optimizes the way we draw our frames.
Oh and as a reminder, because Jesse’s in Japan, we won’t do a dev-stream today. We did one yesterday though! Check it out here and here.
This week it's time for a rather mundane task: taking out the trash. Every project I've worked on seems to collect a lot of assets that were used in a test, or that are simply no longer used in the game. In most of those cases it was rather difficult to figure out what was being used and what wasn't. Often I'd delete an asset, and two weeks later somebody would suddenly notice that it was missing from an obscure part of the game.
I started thinking about how to fix this. Luckily the Slick file system uses a standard way to reference files: Every file is stored relative to a data root. For example, if a texture is located in C:\Games\VikingSquad\Data\Textures\MyTexture.png, the game code and data will always reference it as DATA:Textures\MyTexture.png. The conversion of a full path to an aliased path is done in the editor, and it guarantees that all data files are properly referencing other data files.
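A minimal sketch of that aliasing step, assuming the data root from the example above (the real editor code isn't shown in the post):

```cpp
#include <string>

// Assumed data root, matching the example path above.
const std::string kDataRoot = "C:\\Games\\VikingSquad\\Data\\";

// Convert a full path into its DATA:-relative alias, or return it
// unchanged if it lives outside the data root.
std::string MakeAlias(const std::string& path) {
    if (path.compare(0, kDataRoot.size(), kDataRoot) == 0)
        return "DATA:" + path.substr(kDataRoot.size());
    return path;
}
```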
So, I figured, I should be able to create a quick C# tool that scans every data and source file for this DATA: tag, and stores the links. It should therefore be able to see whether a file is referenced or not, and as an added bonus it should be able to see which resource is referencing another resource.
The code simply recursively scans all directories and files (excluding specific files and directories, such as PSD files, or svn directories), and for each file it scans the binary file data for the string “DATA:”. When it finds the string, it tries to read the rest of the null-terminated string to determine which file it’s referencing, and stores this reference in memory. The tool scans the entire data directory, and it also scans the entire source code directory, so that any hard-coded references are found as well. Very handy!
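The scanning step could be sketched like this; the function name and the printable-character safety net are my assumptions:

```cpp
#include <algorithm>
#include <cctype>
#include <string>
#include <vector>

// Pull every null-terminated "DATA:..." reference out of a blob of raw
// file bytes (works on binary and text files alike).
std::vector<std::string> FindDataRefs(const std::vector<char>& bytes) {
    static const std::string kTag = "DATA:";
    std::vector<std::string> refs;
    for (size_t i = 0; i + kTag.size() <= bytes.size(); ++i) {
        if (!std::equal(kTag.begin(), kTag.end(), bytes.begin() + i))
            continue;
        std::string ref;
        size_t j = i + kTag.size();
        // Read until the terminating null (or a non-printable byte,
        // as a safety net against false positives in binary data).
        while (j < bytes.size() && bytes[j] != '\0' &&
               std::isprint(static_cast<unsigned char>(bytes[j])))
            ref += bytes[j++];
        refs.push_back(ref);
        i = j;  // continue scanning after this reference
    }
    return refs;
}
```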
Well, I coded it up, and it works really great! The scanning process takes about 10 to 15 seconds to complete, but it scans every single source and data file we have (about 7000 files). The final result isn’t very pretty to look at, but it does give lots of useful information. Click on the screenshot below to see what I mean:
In this case, I've selected a background file called DEBUG_NICK. It has 1 parent (meaning one other resource uses this file) and 6 children. At the bottom you can see which resource is the parent (in this case Debug_Nick.destination), and which resources are used by this background (in this case a few PNGs, a few puppets, and a lighting setup).
To find out which resources are no longer used, you can simply sort by the NrParents column and go through all of the resources that have 0 parents. If you find a texture that looks like a test texture, you can locate it easily by typing its name in the text box at the very top, and the list will filter based on the typed name. Then you can see who is still referencing it, and fix the problem. When you've determined an asset can be removed, you can delete it right from this tool, which will call the proper SVN delete command.
In conclusion, it still takes a bit of manual labour to go through the assets and determine if they are used or not, but it’s a hell of a lot better than guessing whether it can be deleted. We can now delete with confidence.
Alright, that’s it for this week. Keep throwing out the trash!
Also, remember: As always we’ll be Dev-Streaming today at 4pm-6pm PST. We’ll be working on the game and talking to our followers and answering any questions that come up to the best of our ability! It’s been a lot of fun and really rewarding doing the dev-stream so come on down and say hey!
This week, a bit more tech stuff. When we were playtesting our game, we had a problem with the screen being too busy and attention-grabbing at times. It was hard to know what to focus on, as your eyes would get drawn to parts of the screen that didn't necessarily matter to the gameplay. A trick we used in Shellrazer was to blur the background to make the foreground elements stand out more, so I decided to add it to Viking Squad to see how it looks.
The process is fairly easy. We mark the elements in the world as ‘foreground’, ‘background’, or ‘normal’. Now that the elements are marked, we use a number of steps to get the blurred effect we want:
Step 1: Background pass. All background elements are rendered to a render target, and the render target is blurred. The render target is two-thirds the size of the full frame. The reason it's smaller is to save video memory, and once the image gets blurred you can't really tell it's at a lower resolution anymore.
Step 2: Foreground pass. All the foreground elements are rendered to a render target that is half the size of the full frame. Note that the render target is cleared with RGBA 0x00000000, so that the alpha information stays correct for later. The render target is then blurred.
Step 3: Normal pass (this pass uses MSAA). In the normal pass, the background pass texture is first rendered as a full-screen quad (z-write off). Then the entire scene is rendered on top of this. After the entire scene is done rendering, the foreground pass texture is rendered over top as a full-screen quad using alpha blending.
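The three passes can be summarized in pseudocode (all of these calls are placeholders, not the actual engine API):

```
// Pass 1: background, rendered at 2/3 frame size, then blurred
SetRenderTarget(bgRT)
Clear(bgRT)
Draw(BACKGROUND elements)
Blur(bgRT)

// Pass 2: foreground, rendered at 1/2 frame size, cleared to RGBA
// 0x00000000 so the alpha stays correct for compositing, then blurred
SetRenderTarget(fgRT)
Clear(fgRT, 0x00000000)
Draw(FOREGROUND elements)
Blur(fgRT)

// Pass 3: main scene, with MSAA, sandwiched between the blurred layers
SetRenderTarget(backBuffer)
DrawFullScreenQuad(bgRT, zwrite off)
Draw(NORMAL elements)
DrawFullScreenQuad(fgRT, alpha blend)
```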
After these steps, the rest of the post processing steps are performed, like Color Grading, bloom, etc.
The result is subtle in a static image, but when moving around it's a lot better than before! Mouse over the images below to see the before and after (click to go fullscreen).
A small addition, but the combination of all these little bits will make the game more and more pretty. Hopefully. :)
Alright, that’s it for this week. Until next time! And don’t forget our dev-stream later today!
This week it's time for another tech post. However, it won't be about a shiny new rendering technique. Instead it's about an equally important part of the process of creating a game: the build machine. In my IGS/GDC talks about developing N+ and Scrap Metal I've stressed how good it is to have a build machine that can reliably and repeatedly build your game from scratch and package it all up into the exact package you need. There are many different packages you can use to set up your build machine, but most of the ones I (admittedly briefly) tried were either too complex for my needs or couldn't do exactly what I wanted. So I built my own! :)
The Slick build system is written entirely in C#, and in essence is just a way to run tasks on a remote machine. So it can build the game, but it can also do non-build tasks such as running tests or resizing textures. The system is broken up into three main parts that communicate over a TCP/IP connection: the Master Server, the Drone, and the Client. Each of these has a different role, which I will talk about below. First, a rough overview of the build system:
The Master Server
This is the program running on the main build computer. In our case, this is a spare computer we had, running Windows 8.1. It has a few shared directories set up where it puts the completed builds, so that anybody on our work network can access them. All the master server does is wait for drones and clients to connect, and answer to their needs. It is the center point of the build system. Here's a screenshot of the master server program:
On the left you can see the currently connected drones, and because one is selected, it shows the tasks that drone can do and its queue of tasks. The queue also shows tasks that were completed in the past, including whether they failed or succeeded.
The Drone
This is the actual program that performs tasks. There can be multiple drones with different sets of tasks they can perform, all connected to the same master server. The drone is a console application, so it looks like this when it's running:
The tasks each drone can do are defined in an XML file. Below is an example of an XML file that can be used by our windows based drone to build the game in rtm (release to manufacture) mode, and it will send emails when builds succeed or fail:
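The original post showed the XML as an image; here is a hypothetical example of what such a task definition might look like (the element and attribute names, paths, and email address are invented for illustration):

```xml
<!-- Hypothetical task definition; the real schema isn't shown in the post. -->
<Task name="Build Windows RTM">
  <Step command="svn" arguments="cleanup C:\BuildMachine\Data" />
  <Step command="svn" arguments="update C:\BuildMachine\Data" />
  <Step command="svn" arguments="cleanup C:\BuildMachine\Code" />
  <Step command="svn" arguments="update C:\BuildMachine\Code" />
  <Step command="BuildGame.exe" arguments="-platform windows -config rtm" />
  <EmailOnSuccess to="team@example.com" />
  <EmailOnFailure to="team@example.com" />
</Task>
```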
If you look closely at the XML, you'll see that this build script first calls SVN to clean up and update two directories (one for data, one for code). Then it calls 'BuildGame.exe', which is a separate app I created that actually creates the builds, copies the files to the correct directory, creates a versions.txt file, and so on. Come to think of it, that is the actual build machine, really. When it's done, it copies the completed build into the shared directory on the build server. The drone is written as a console app in C#, and runs on Windows as well as on OS X (using Mono).
The Client
The Client is the interface that allows anybody to request builds. It looks like this:
When the client is started, it connects to the master server to request the list of drones. This list of drones is shown in the list view on the left. After you select a drone, the tasks that drone can perform are listed in the ‘Possible Tasks’ list, and the Queue is shown below. To request a Windows RTM build, you simply select ‘Build Windows RTM’ and click ‘Enqueue Task’. The Queue will then show the date and time the request was made, and have the status set to ‘Pending’. Once the build is complete, the Status will change to either Success or Failed, and the information can be viewed on why it might have failed.
So that’s it! We use this build machine for our regular build tasks, such as creating builds for PC and PS4, uploading new builds to Steam, and even to run our automated texture resize task, which resizes textures based on how big they appear on screen in the game (which saves TONS of memory).
Alright, that’s it for this week. Keep those build-machines building!
Welcome back followers of the fearsome! This week we’ll be taking a quick look at our home base in Viking Squad! It’s gone through quite a few changes but as we get closer to completion we wanted to really nail it down. The main hub initially only had a few options that our players could […]