The Laptop Saga

Back in Feb, I did a silly. I took my laptop to the pub in my nice PlayStation bag and took my eye off it for a split second. Within moments, a gentleman helped himself to the bag and buggered off. Luckily I was insured, so although I lost my nice Asus UX305, it wasn’t the end of the world. They couldn’t source another, so I got the money and looked for a new one.

I first replaced it with an HP x360. The laptop lasted a week. Seriously, this was quite frankly the WORST laptop I have ever had. The colours were washed out, the fan made an annoying buzz, the sound card cut out and, for £550, it was sub-par. OK, I may have had a dodgy one, but type x360 into Google and look at the HP support forums, and you will get the idea. So that went back to the shop.

I had an old, not too bad Lenovo, so I grabbed an SSHD and put it in to give the old girl a bit of life. It was not to be. Windows Update… that’s right, you heard me, WINDOWS UPDATE killed my PC. It did a very long update and then refused to boot. I put the old HDD back in and it did the same thing. Boo. Although, that laptop was six years old and had done me well.

So finally I grabbed the Acer Swift 3. I have had it a grand total of a day and so far it is great (I really hope it doesn’t break down now). Nothing super powerful, but enough to make 2D games and stylized 3D games.

  • Core i3 6100U
  • 8GB RAM
  • 128GB SSD
  • Intel HD 520

The processor is a bit worse than the UX305’s, but it does the job… and it was £150 less. So a bit of a saga, but hey, it worked out in the end.


Unity Graphics Tutorial – Introduction to Shaders and Future Tutorials/Articles

Hey look, once again I am talking about graphics. And once again, I will be talking about shaders. Why? Well, it is 2017 and it STILL seems to be an area that some game developers are scared of. We seem to be relying more and more on node-based “auto-generation” shader tools such as Shader Forge, which can create crazy complicated shaders that may look pretty but could be so much simpler. And all the books I read seem to be cookbooks that just contain recipes on how to create certain effects and never actually explain how each function is actually working.

Also, I have an objective of actively getting better at visuals in games in general, by approaching them in a way that helps get stuff to look pretty without destroying our performance. I want to be able to use these techniques from the start rather than take the “throw the most expensive technique at it, make it look good quickly, then optimize it later” attitude that I have come across a few times. It is not clever; you can quite easily back yourself into a corner. The game can look pretty as hell, but if it runs like shit, all you have is a fancy tech demo that only runs on expensive hardware.

Anyway, as part of me getting better at visuals, I am revising all my shader knowledge and going to be sharing it here. I know the R-word can give us war-time style flashbacks to exams, but actually it is a super important thing to do when you are working on a craft. And you can find time to do it. For example, I have a bit of down-time on a Sunday afternoon and I am making the best use of it rather than sitting in my pants and playing Horizon: Zero Dawn for hours on end (which admittedly is hard not to do, as that game is REALLY good; congrats Guerrilla).

Right, flavor part over, let’s get on with the show. In this article I am going to give a super high level introduction to the graphics pipeline, Vertex and Pixel Shaders and how that ties into Unity.

In the beginning

Once upon a time, many moons ago, back when Spyro was cool and not part of the whole Skylanders thing and Nintendo had this mysterious device called “The GameCube”, game developers used to use something called the Fixed Function Pipeline. As a game dev, you would take care of updating your game logic and sending your textures and triangles to the GPU, which would then do all the work on them using the FFP. This was cool, but it was fixed. You could make the GPU do things like “turn lighting off” or “show green fog”, but these were fixed functions that the GPU could perform, meaning they were not very flexible and you couldn’t customize them easily.

Stuff needs to look prettier

The fixed nature of the Fixed Function Pipeline, though, limited the techniques available to game developers. Although there was a wide range of cool stuff that could be done with the functions in the pipeline, it still wasn’t enough, especially for those chasing realism. Soon the GPU evolved from a configurable implementation of a complex fixed function pipeline into a highly programmable bit of hardware where developers could implement their own algorithms. Programmable shaders.

The Graphics Pipeline

So you have an idea of what a shader is, right? A program created by game developers to implement their own cool graphics algorithms on the GPU.

Before I jump into them in further detail though, it would be downright irresponsible of me not to talk about the Graphics Pipeline. Below is a diagram of the geometry processing and rasterization stages of a typical GPU.

Cool, huh?

There are a lot of resources out there explaining every single one of these stages, so I am not going to spend a very long time explaining what they all do (right now). I will stick to the ones we care about for this particular article.

Vertex Shader

As you may or may not know, when you load a 3D model into a game it is typically described using triangles. These triangles are in turn defined by vertices. Each individual vertex is a data structure that contains things like the position of the point in 3D space, the colour of the vertex, the texture coordinates and the normal vector. Vertex shaders manipulate this data. They can potentially change the colour, change the position in 3D space or change the texture coordinates to scroll a texture across the object. One basic example of vertex shader usage is to create a flag by displacing the vertices of a flat mesh using a sine wave.
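To give you a taste of what that flag effect looks like in practice, here is a minimal sketch of a Unity vertex/fragment shader that does the sine-wave displacement. The shader and property names (_WaveAmp, etc.) are my own, made up for illustration, not anything standard:

```hlsl
// A minimal sketch of the flag idea as a full Unity shader.
// Property names (_WaveAmp, _WaveFreq, _WaveSpeed) are illustrative.
Shader "Examples/FlagWave"
{
    Properties
    {
        _MainTex ("Texture", 2D) = "white" {}
        _WaveAmp ("Wave Amplitude", Float) = 0.2
        _WaveFreq ("Wave Frequency", Float) = 4
        _WaveSpeed ("Wave Speed", Float) = 2
    }
    SubShader
    {
        Pass
        {
            CGPROGRAM
            #pragma vertex vert
            #pragma fragment frag
            #include "UnityCG.cginc"

            sampler2D _MainTex;
            float _WaveAmp, _WaveFreq, _WaveSpeed;

            struct v2f { float4 pos : SV_POSITION; float2 uv : TEXCOORD0; };

            v2f vert (appdata_base v)
            {
                v2f o;
                // Push each vertex up and down based on its x position and
                // time, giving a travelling sine wave across the mesh.
                v.vertex.y += sin(v.vertex.x * _WaveFreq + _Time.y * _WaveSpeed) * _WaveAmp;
                o.pos = UnityObjectToClipPos(v.vertex);
                o.uv = v.texcoord;
                return o;
            }

            fixed4 frag (v2f i) : SV_Target
            {
                // The pixel shader just samples the texture; all the work
                // here is in the vertex displacement above.
                return tex2D(_MainTex, i.uv);
            }
            ENDCG
        }
    }
}
```

Slap that on a subdivided plane and it will wobble away. Note the vertex shader only changes where the vertices go; the fragment shader colours the pixels exactly as it would for a static mesh.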

Pixel Shader

Also known as a fragment shader, a pixel shader calculates the colour of a pixel on the screen based on what the vertex shader passes in, bound textures and user-added data. This works out what colour and transparency that pixel should be for the current primitive. The pixel shader can be used to take care of stuff like per-pixel lighting and bump mapping and can be used to achieve cool special effects like Depth of Field, simulating fire and creating shadows.

Shader Languages

So you can write these pixel and vertex shaders, but how do you do that? Well, it depends on the target environment, i.e. the graphics API you are using. OpenGL uses a shader language called GLSL, whereas DirectX uses a language known as HLSL. Unity uses a variant of HLSL.

What about Shaders in Unity?

Although it is useful to know about the Graphics Pipeline and Vertex and Pixel Shaders, Unity makes things a tiny bit easier for us. I briefly talked about lighting in the Pixel Shader section. Writing shaders that interact with lighting can get very complex, and Unity has a solution for this: Surface Shaders. They essentially make writing any shader that involves lighting easier than if you wrote it as a Vertex and Fragment Shader. The code is still written in HLSL, but in a way that removes a lot of the repeated code required when writing lighting in Vertex and Pixel Shaders.

In other terms, Surface Shaders have a built-in concept of how light should behave, whereas Vertex and Pixel Shaders do not. So, if you are doing stuff with lighting, you probably want to use Surface Shaders.
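To give you a rough idea of how little you write, here is a bare-bones Surface Shader sketch (the shader name is made up; the surf function and Standard lighting model are Unity’s):

```hlsl
// A bare-bones surface shader sketch: we only describe the surface,
// and Unity generates all the lighting-related vertex/pixel code for us.
Shader "Examples/SimpleSurface"
{
    Properties { _MainTex ("Texture", 2D) = "white" {} }
    SubShader
    {
        CGPROGRAM
        // Use Unity's built-in Standard lighting model.
        #pragma surface surf Standard
        sampler2D _MainTex;
        struct Input { float2 uv_MainTex; };

        void surf (Input IN, inout SurfaceOutputStandard o)
        {
            // All we say is what colour the surface is; how light hits it
            // is handled behind the scenes.
            o.Albedo = tex2D(_MainTex, IN.uv_MainTex).rgb;
        }
        ENDCG
    }
    FallBack "Diffuse"
}
```

Compare that to the fully lit Vertex/Pixel Shader you would otherwise have to write by hand and you can see why Unity went this route.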

What’s next?

We will actually make a Surface Shader in Unity. This article was here to whet your appetite and give you an inkling of what shaders are and where they fit into everything. If you want to go deeper, or get a more technical and more complex explanation of the rendering pipeline and Pixel/Vertex Shaders, then it is worth having a peek at some of the more in-depth resources out there.

Key Point Across Tutorials and Articles

Although I mainly focused on shaders in this article, one of the points I am going to keep making when we look at graphics techniques is that there are many ways to get to where you want to go. Unity can be a little bit bad for getting you sucked into using “clever” graphics techniques that are available at a tick box or from the Asset Store. As someone once said, “there is more than one way to skin a cat”. Although a graphics technique may be the next triple-A buzzword, that doesn’t mean it is necessarily correct, and graphics is full of tips and tricks for getting stuff to look pretty in a performance-friendly way. You can even drag out little tips and tricks from the PS2 days if you want. Also, thinking about these tips and tricks from the get-go, rather than just using the knowingly expensive methods, will help you not get backed into a corner later down the line.


Start of 2017 and Level Building with Tiled

So despite January being a pretty shitty month regarding personal life, with break-ups, etc. (yeah, game devs are humans too, funnily enough :P), the plus side is that I devoted a lot of time and energy to the project I have been working on for, like, a million years. “But wait!”, I hear you cry, “your game was an isometric 3D cool-looking thing. What is this pixel art thing I see above?” Well, I got to a point with the old project where I realised that, quite frankly, it was too much for something I was working on in my free time. I mean, come on, I am working on this awesome game in my day job 😉 Anyway, the pixel art is not final. The game will be top-down 2D, but this was me learning how to draw. My favorite artist is also getting involved (and not just with art; with story, design and all manner of cool things! I am even trying to teach her how to code… kind of :P), along with others, so I have a toe firmly up my backside and people who are skilled at other things to help out. Hopefully we will have something playable in a couple of weeks to see if it is fun and what we need to change to make it fun.

Anyway, on to the technical stuff. Yes, the above is still Unity, in combination with 2D Toolkit, but we are doing some cool stuff with loading levels, which also allows us to “hot reload” levels whilst the game is playing. What this means is I can give a standalone build and Tiled to a designer, and he/she will be able to hot reload the level.

So OK, what even is Tiled?

Tiled is a powerful map editor that allows you to draw maps using tile sets. It has been used in a variety of games, which I THINK includes Titan Souls and Hack and Slash. If you want to know how it works, Games From Scratch has a really good tutorial on how you can make cool tile maps. Essentially, it allows us to make layered tile maps that we can use to make cool levels. Note: what I am about to explain is the specific way that I am using Tiled and Unity. There are many ways to do it, but this is the setup for my specific project. It is also unlikely to be super detailed; it is meant to be an overview of what I am doing, not a tutorial.

Above is my setup for my really, really basic test level. As you can see, I have four layers: the background, middle and foreground grid layers, plus an object layer. The background and foreground layers are visual layers (with the foreground layer drawn in front of level characters such as the player), and anything in the middle layer is deemed a collision object, as it sits on the same level as the player and other objects. The object layer is where we set up THINGS, or objects, in the level. These also have metadata. If I select a waypoint, for example:

You will see it has a name, a type and a load of custom properties that I use to hook up the enemies.

I then export this to JSON and it looks kind of like this:

You can view the whole Tiled JSON format here.

OK, I have a load of data, so how do I actually get the game to understand it?

Obviously, we first load and parse the file as JSON. I used MiniJSON to do this and then deal with the result myself. You could do some clever auto-mapping using Unity’s in-built parser or Newtonsoft, but for this I like to have control over what data I actually need and what I am going to do with it.

First, I have a prefab that is a tile. This contains three tk2dSprites that represent the foreground, background and middle. We build a number of these tiles the correct distance apart using the width and height of the map defined in the JSON. We then go into the giant data array, convert it to a 2D array using the width and height, and use the indices to set the sprites. All of the sprite sheets/tile sets map one to one between Unity and Tiled. The tile set you see in Tiled is exactly the same as the one I use in Unity and 2D Toolkit. The indices in the sprite sheet data ALMOST match up to the 2D Toolkit sprite indices (they are out by one). Using the data for each layer, we set the correct image on the tile. Simple, huh? We also set any of the tiles that are populated in the middle layer to “non-walkable” in our A* pathfinding system.
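The data-array shuffling is easier to see in code than in prose. Here is a little sketch of the two bits of index maths involved (the real project does this in C# with MiniJSON; the function names here are made up for illustration):

```python
# Sketch of turning Tiled's flat "data" array into a 2D grid and mapping
# its tile IDs to sprite-sheet indices. The real project is C#; names
# here are illustrative.

def data_to_grid(data, width, height):
    """Convert Tiled's row-major 1D data array into a 2D list [row][col]."""
    return [data[row * width:(row + 1) * width] for row in range(height)]

def tile_to_sprite_index(tile_id):
    """Tiled uses 0 for 'empty' and 1-based tile IDs, which is why the
    sprite indices are out by one compared to the sprite sheet."""
    return None if tile_id == 0 else tile_id - 1

# A tiny 4x2 layer: 0 means no tile in that cell.
data = [1, 2, 0, 3,
        4, 0, 5, 6]
grid = data_to_grid(data, width=4, height=2)
print(grid[1][2])                        # tile ID at row 1, column 2 -> 5
print(tile_to_sprite_index(grid[1][2]))  # sprite-sheet index -> 4
```

The empty cells (tile ID 0) are also exactly the ones we would leave walkable in the middle layer for the A* pathfinding.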

The slightly more interesting thing is the Soldier. Tiled’s object system is very powerful and can be used for a load of cool things. Unfortunately for our use case, it represents object positions in pixel space rather than the tile system’s grid space. I have set rules for the designers: the objects must be one tile in size (32 x 32 in our case) and must be snapped to a tile position. You can draw objects anywhere in the object layer, at any size, however that is not the way our game is using the tool. Like I said, if you look in the image above at the player object, we get the x and y in pixel space. How do we find the grid position? Divide them by the tile width and height respectively. Using this data, the game loads prefabs and populates them in the correct places on the map. These objects then use the properties data that we set up in the metadata in Tiled:

To populate things like what waypoints belong to which enemy, what AI behavior should that enemy use, etc, etc.
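The pixel-to-grid conversion mentioned above is a one-liner; here it is as a sketch (again, the actual game code is C#, and the 32 x 32 tile size is just my project’s setup):

```python
# Sketch of converting a Tiled object's pixel position to a grid cell.
# 32 x 32 is this project's tile size, not anything universal.
TILE_W, TILE_H = 32, 32

def pixel_to_grid(x, y, tile_w=TILE_W, tile_h=TILE_H):
    """Objects are snapped to tiles, so integer division of the pixel
    position by the tile size gives the grid coordinates."""
    return x // tile_w, y // tile_h

print(pixel_to_grid(96, 64))  # -> (3, 2)
```

This only works because of the "objects must be tile-sized and snapped" rule; free-floating objects would need their positions kept in pixel space.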

Anyway, I know that was quite a bare bones overview, but I hope you found it interesting at least and a bit of help if you are using Tiled with Unity or any game engine!