Low Resolutions was my first solo attempt at making a larger-scale game. The intent was for Low Resolutions to be a 10-15 minute experience that explored difficult choices and the impact those choices have on the people making them. My aim with the project was to create as much as possible myself, pushing me to learn a range of new areas (audio and animation).
For the second project in Studio 2 this trimester our students were tasked with adapting an artwork from the Gallery of Modern Art in a week. Originally, as discussed here, I wasn’t intending to join in on the project. However, as soon as I saw Red Shift by Shane Cotton I immediately changed my mind. Red Shift caught my eye from across the room, and that ability to grab my attention made the decision for me. I would make a game based on it 🙂
Coming up with a concept
Adapting an artwork is something I had previously done with My Poor Retinas. That was quite a different scenario though. In that case there was also a reference videogame to work from. With Red Shift those additional constraints weren’t present. I could make, in theory, anything.
To work out what I wanted to make, I first considered the elements of the artwork that stood out to me. In particular:
- The dark and threatening backdrop. On the left the sheer cliff face and on the right a small flock of lightly coloured birds flying as dark stormy clouds loom above them.
- The tension inherent in the artwork. To me, Red Shift doesn’t feel happy or hopeful. It feels like the birds are holding on to survival and that their hold is doomed to fail.
Putting the player on the Precipice
Those were the elements that I wanted to capture, but just rebuilding the artwork in Unity would not recreate that same feeling of tension. Seeing an artwork in person is a very different interaction to seeing it on a computer. Red Shift is large and looms over the viewer. I needed to somehow capture that tension in the game.
In the end I chose to switch perspective to look from the top down onto a rugged and barren landscape. The birds would fly slowly in circles above the landscape. A thin layer of fog would slowly rise up obscuring the terrain. The aim was for that fog to create the sense that the birds were under threat. I didn’t feel that would create enough of a sense of threat though.
I felt that for the player to actually feel the tension they needed to have some agency in the world. The player needed a hand in the fate of the birds. My solution for this was to require the player to intervene. If the player did nothing then one by one the birds would fall and fade away. Once all the birds were gone the game would end. If the player held down space then it would reverse the birds’ fall and keep them flying forever … as long as the player didn’t let go.
Iteration on the feeling
The initial design, where the player could not just halt but reverse the descent of a bird, lessened the feeling of tension. The player could let go for a time and then always recover. During a showcase of the various games Kira suggested having the spacebar halt, but not reverse, the descent. I changed the bird behaviour accordingly and it made a noticeable difference. Now every second the player let go, the birds moved closer to their death. There was no rewinding time.
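The resulting behaviour boils down to a simple per-frame update. A minimal sketch, in Python rather than the game’s actual Unity code, with the fall speed being an assumed value:

```python
FALL_SPEED = 1.0  # height units lost per second while the key is up (assumed value)

def update_bird(height, holding_space, dt):
    """Advance one bird by dt seconds.

    Holding space halts the descent but never reverses it; every
    second the key is up, the bird falls closer to the ground.
    """
    if not holding_space:
        height -= FALL_SPEED * dt
    return max(height, 0.0)  # a bird at height 0 has faded away
```

The key point is that no branch ever adds height back: once altitude is lost, it is gone for good.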
Did it work though? For me at least the answer is yes. It immediately felt more tense. Once more people playtest it then I’ll find out how well it worked for others.
Communicating to the player
Precipice is a very minimal game. The player does a single thing, there is no win state, and the entire game can be over in just 100 seconds. That meant there was very little time to communicate to the player what was going on. I chose to use a single text prompt at the start to teach the player the one and only control. With a suitable font, that prompt felt nicely integrated with the game.
The more critical part was communicating the threat the birds were under and communicating when a bird was falling. My solution here was:
- Apply a glow effect to the active bird
- As a bird falls it moves closer to the ground and fades out (transparency increases and the colours fade to grey)
- As a bird falls the fog also rises, and rises rapidly
The glow was straightforward (I already had a glow effect asset from the Unity Asset Store) and the fog rising was a simple variable change. The transparency and fading to grey was the trickier part.
Enter the shader
In my previous postmortem I mentioned that I had avoided writing shaders and that in future I would take advantage of any opportunity to write one. Good news, past me: the opportunity presented itself with Precipice. In particular I needed a shader that could:
- Handle textures that used alpha cutoff (pixels with an alpha below 50% were transparent).
- Desaturate the texture based on a parameter that was controlled in code.
- Make the texture progressively more transparent based on the same parameter used to desaturate it, while not breaking the alpha cutoff behaviour.
I can’t post the full code of the shader here (it interacts with the glow effect which is from the Unity Asset Store) but the process I followed was:
- I created a new standard shader and then researched how to implement alpha cutoff and the desaturation. My starting point was the answers on the forum here.
- That solution handles the transparency but doesn’t fully implement the cutoff behaviour. Implementing cutoff required some more research, but I was successful: the texture was drawing correctly and could be desaturated.
- The final trick was to respect cutoff while still fading out the texture. This ended up being a simple matter of reducing the alpha only when it was above the cutoff point.
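The per-pixel logic those three steps describe can be sketched outside of shader code. Here it is in Python for illustration; the luminance weights are the standard Rec. 601 ones, and treating `fade` as a single 0-1 parameter is an assumption about how the real shader is driven:

```python
CUTOFF = 0.5  # alpha cutoff threshold: pixels below 50% alpha are discarded

def shade_pixel(r, g, b, a, fade):
    """Desaturate and fade one pixel while respecting the alpha cutoff.

    fade runs from 0.0 (bird healthy) to 1.0 (bird fully faded).
    """
    # The cutoff test runs against the texture's original alpha, so the
    # fade applied below can never break the cutoff behaviour.
    if a < CUTOFF:
        return None  # the shader equivalent is clip()/discard
    # Desaturate: blend each channel towards the luminance (grey) value.
    grey = 0.299 * r + 0.587 * g + 0.114 * b
    r = r + (grey - r) * fade
    g = g + (grey - g) * fade
    b = b + (grey - b) * fade
    # Fade out: scale alpha down only after the pixel has passed the cutoff.
    a = a * (1.0 - fade)
    return (r, g, b, a)
```

Ordering the operations this way is what makes the final trick work: the cutoff decision happens before the alpha is reduced.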
It was a slow process and it did require a lot of iteration, testing and research, but it was successful. In the end the shader worked and it communicated the status of the birds clearly to the player. Only one thing remained: audio!
Creating the audio
The final element to add was the audio. My approach here was to have multiple tracks: one ambient background track and one track per bird. As each bird fell farther, their song would fade out until only the background track remained. That part was straightforward and easily accomplished using Wwise.
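The per-bird fade amounts to mapping how far a bird has fallen onto its track’s volume. A rough sketch of that mapping, with the linear curve being an assumption (in practice the curve would be authored inside Wwise, for example on a game parameter/RTPC):

```python
def track_volume(height, max_height):
    """A bird's song volume tracks its remaining height.

    Returns 1.0 at full height and 0.0 once the bird has fully
    fallen, clamped so overshoot can't push it out of range.
    """
    return max(0.0, min(1.0, height / max_height))
```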
I created the individual audio tracks using GarageBand on my iPad. For each track I wanted it to sound tense in isolation and also to fit in with all of the other audio. The process I used for this was:
- The ambient track was created first. I used some string elements (violin, bass, etc.) to create a slow, tense track.
- The individual bird tracks used other instruments that could be made to sound tense (strings, piano and drums).
- When designing and recording each bird track I had the ambient track, as well as all of the other tracks, playing at the same time. This made it easier for me to make sure the audio ‘fit’ together suitably.
The end result was ok but I do feel it ended up a little too discordant. I think the most problematic part was re-using strings (albeit different ones) for one of the birds. In future I would have each track be simpler and also more distinct from the other tracks. That way when the tracks are combined the result would be less discordant and the loss of an individual track would be more noticeable.
Overall, I’m very happy with Precipice. I learned a lot more about shaders which was a goal I had set after My Poor Retinas. The audio, while not fully effective, was on the right track and I have a better idea of what I’ll do next time. Precipice is available on itch.io now.
Recently our Studio 1 students have been working on their first project ‘Wherefore Art Thou’. The project set them the challenge of reimagining the game Berzerk in the style of a particular artist. I decided to join in the project as well and was assigned the artist Bridget Riley. This combination resulted in My Poor Retinas (both the name of the game and a lament for what developing the game did to my eyes).
Coming up with a concept
The initial challenge was to work out what the game was going to be. My two touchstones were Berzerk and Bridget Riley’s art pieces. Let’s take a look at the art pieces first.
Bridget Riley has been one of the major contributors to the area of ‘op art’ (optical illusion based artworks). The common elements of her work include regular geometric patterns (typically black and white) that are then distorted. Many of her pieces convey a sense of reality itself being distorted and twisted. They are uncomfortable to look at because of what they do to our eyes, but that discomfort also helps, for me at least, to create their appeal. The two key elements I wanted to incorporate from Bridget Riley’s work were:
- Sense of reality itself being distorted
- The starkness of a regular geometric pattern of black and white objects
Let’s now take a look at our second touchstone, Berzerk.
In Berzerk the player is trapped in a maze of rooms. Within each room there are multiple robots with different capabilities; some can attack and others can move. The player can flee the room at any point and does not need to kill the robots. The player gains points for killing each robot and for killing all of the robots in a room. As the player’s score increases, the difficulty of the robots increases, signalled by a change in their colour.
If the player takes too long in any room then the character Evil Otto spawns. Evil Otto cannot be killed, can move through walls, and will slowly move towards the player. If it hits the player then the player is killed. The player will die from a single shot from any robot or from running into a wall. The robots can also kill each other with friendly fire, or can die from running into a wall or Evil Otto.
I wanted to keep a lot of the elements from the original game, in particular:
- Enemies that increase in capabilities (and as a result difficulty) based on the player’s score
- The dangerous nature of walls
- The neverending maze environment
- Evil Otto as a means to put the player under time pressure
Combining Berzerk with Bridget Riley’s Artwork
The result of combining these elements was the following plan of attack:
- The environment would be a black and white checkerboard
- Walls would be represented as a distortion (a pinching?) of the environment
- Enemies, the player and Evil Otto would not be visible in the conventional sense. Instead they would distort the environment under them.
- The environment would be procedurally generated
- The mechanics/behaviour of enemies from Berzerk would be preserved
This plan of attack was reasonable given the timeframe (3 weeks) but would also stretch me. The key challenges that I’ll examine more closely were:
- Procedural generation of the maze
- Creating the distortion effect for player and enemies
Procedural Generation of the Maze
I wanted to try to reproduce the same style of maze as the original Berzerk. That meant I would need to know how the original worked. Thankfully, two of my colleagues were also participating and one of them (Adrian Forest) had found the details of how the original game handled its procedural generation. I won’t repeat the details here and will instead link to this description of how it works.
I followed the same process outlined to work out where my walls would be. But how to achieve the actual distortion of the environment? For that I used the Unity Terrain system. I created a terrain and applied the checkerboard texture to it. I then procedurally generated the heightmap based on where the walls were. The general process for this was:
- For each point in the heightmap, determine if it was near enough to a wall to be affected
- If it was near enough, work out how close it was. The closer it was to the centreline of the wall, the greater the effect to apply.
- The ‘influence’ of all walls on the point was calculated as a number between 0 (no influence) and 1 (full influence). That number was then used to set the heightmap value.
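The steps above can be sketched as follows. `WALL_HALF_WIDTH`, the linear falloff, and combining overlapping walls by taking the maximum are all assumptions made for illustration; `distance_to` stands in for whatever point-to-centreline distance function the wall layout provides:

```python
WALL_HALF_WIDTH = 2.0  # how far a wall's influence extends (assumed value)

def point_influence(point, walls, distance_to):
    """Combined influence of all walls on one heightmap point, in [0, 1].

    distance_to(point, wall) returns the distance from the point to
    that wall's centreline. Influence falls off linearly with distance
    and the strongest wall wins.
    """
    best = 0.0
    for wall in walls:
        d = distance_to(point, wall)
        if d < WALL_HALF_WIDTH:  # near enough to be affected
            best = max(best, 1.0 - d / WALL_HALF_WIDTH)
    return best

def build_heightmap(size, walls, distance_to, max_height=1.0):
    """Fill a size x size heightmap from wall influence."""
    return [[point_influence((x, z), walls, distance_to) * max_height
             for x in range(size)] for z in range(size)]
```

Summing the influences and clamping would work just as well as taking the maximum; the choice only matters where walls overlap.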
A similar process was used to make the height drop off at the edges of the map as well as to have the entry/exit points clearly marked. I’ve also described the process in the video here.
This process worked and it did generate the required terrain shape … but there was a problem … the default terrain texturing 🙁 In Unity the terrain system generates texture coordinates using only the X and Z coordinates; it does not incorporate the normal. As a result, when I looked top down on the game it was not possible to see any change. It looked like the picture below:
As you can see, the distortion is not visible. I needed an alternate solution that took the slope of the terrain into account when generating the texture coordinates. The solution was a technique known as triplanar shading. Shaders are an area I am very weak in, so I chose to grab an asset (Terrain UV-Free Triplanar Texturing) rather than create my own. The result was this:
The distortion in the environment is clearly visible now. However, dodging writing a shader just because I’m not good at them was a poor choice. I’m not going to get better at them by never using them, and I should know more about them. On the next project I’ll specifically identify opportunities to work more directly with shaders.
Creating the Distortion Effect
The distortion effect was another opportunity to make use of shaders that I didn’t take advantage of. Before I talk about how I did create the effect let’s talk about how I should have done it. What I should have done is:
- Capture the environment to a texture
- Create a shader that uses the texture and the location of the player/enemies and renders the object by distorting that texture
That would have involved a single shader (perhaps two, including Evil Otto’s) and would have required only a single render to texture. This approach would have had good performance and allowed the distortions to be very tightly integrated with the terrain. It also would have been a good opportunity to learn more about shaders. I didn’t take it. I’m not particularly experienced with shaders and don’t enjoy working on them. That’s a rubbish reason for not learning more about an area of technology that is both powerful and useful. To reiterate my earlier point, I need to learn more about them.
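The distortion step of that approach boils down to warping the texture lookup around each entity. A rough sketch of one possible pinch, in plain Python rather than shader code, with the radius and strength values invented for illustration:

```python
import math

def pinch_uv(u, v, cx, cy, radius=0.1, strength=0.5):
    """Warp a texture coordinate towards an entity centred at (cx, cy).

    Inside `radius` the sample point is pulled towards the centre,
    producing a pinch; outside it the coordinate is untouched, so
    the distortion blends seamlessly into the environment.
    """
    dx, dy = u - cx, v - cy
    d = math.hypot(dx, dy)
    if d >= radius or d == 0.0:
        return u, v
    # Pull harder near the centre, fading to no pull at the edge.
    pull = strength * (1.0 - d / radius)
    return cx + dx * (1.0 - pull), cy + dy * (1.0 - pull)
```

In the single-shader version this would run per fragment against the captured environment texture, which is what would let every entity share one render instead of needing a camera each.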
So how did I solve the problem? My solution (outlined in more detail here) was to have the player, every enemy and Evil Otto each have their own camera. That camera looked down at the environment and captured it to a texture. Image Effects on the camera applied the appropriate distortion and that texture was then drawn on a quad that could only be seen by the main game camera.
This process worked but it had a number of issues:
- The distortion did not always merge seamlessly with the environment. In particular for Evil Otto.
- The camera for each entity had to be precisely positioned to match the scale
- The performance was terrible on low end devices due to the number of extra renders required.
The solution? STOP AVOIDING WRITING SHADERS 🙂
Overall, I’m really happy with My Poor Retinas. It came together well and there were no major pitfalls during the development. I’m really happy with the system for the maze generation and how the enemies behave. I am unhappy with the extent to which I avoided writing shaders. My big takeaway from this project is next time there is an opportunity to write shaders then write them. I don’t need to love them, I don’t need to be great at them, but they’re too useful for me not to learn how to be better at them.
For the second project in Studio 2 this trimester our students are making a game inspired by a piece from the Queensland Gallery of Modern Art (GOMA). Brendan and I have decided to make games alongside our students. I chose Red Shift by Shane Cotton.
Originally I wasn’t planning to make a game. I had planned to prioritise my other projects instead. However, Red Shift grabbed my attention from across the room. Shane drew inspiration from tribal lore, in this case his own tribal lore and the story of Taiamai:
“The work alludes to the story of Taiamai, a great bird that once arrived in the artist’s region of Northland, bringing so much mana (power) to the people that a rival chief attempted to capture it. The bird escaped by melting into the rock, where its spirit remains.”, Artist’s Statement for Red Shift by Shane Cotton
However, Shane did not intend for Red Shift to be a direct translation of the story into a visual format. Instead, it tries to capture a sense of tension and being adrift. The overall tone of Red Shift is dark, with a greyscale terrain on the left-hand side. The sharp cliff face there is based on photos of the Grand Canyon. It feels dark and threatening, but at the same time less threatening than the cloud and fog filled void that comprises the majority of the picture. Both the cliff and the fog serve to accentuate the height of the birds. The birds themselves are painted in very muted colours. They stand out against the background, but it feels as if the environment has drained them. For now they’re flying above it, holding on, but what if they stop flying? What happens when they run out of strength? For me this is where the tension in Red Shift truly shines through.
This tension is what I want to capture in ‘Precipice’, my interpretation of Red Shift. I want the player to feel the darkness and tension but how best to approach that? If I simply recreated the artwork and had the birds move that wouldn’t convey the same feeling. The player will be a disconnected observer unable to influence what happens. If they have no agency in the world I think they will struggle to feel that tension. My solution began with this:
For Precipice I’ve chosen to rotate the player’s perspective. Rather than a cliff on the side I wanted the player to be high above a sharp and threatening terrain. From this high viewpoint the player will see the birds slowly circling around above the dark terrain. You can see the result of this in the video below.
Where does the sense of tension come in though? The birds will slowly circle the environment, moving up and down. The player will control a single bird at a time. As long as the player holds down a key the bird won’t fall. Every second the player lets go, the bird will fall further and further down. After a short time the bird will disappear and the player is given control of one of the remaining birds. This continues until there are no birds left, at which point the game ends.
The tension here comes from the player knowing they must always hold down a key for the birds to live. The moment they let go the birds move rapidly closer to oblivion. Eventually the player will always need to let go though. As the birds fall they will desaturate as their colour/life is drained out of them.
From an audio standpoint each bird will have a unique song. As long as the bird is alive the song will play. As a bird moves closer to death its song will fade. When all the birds are gone there will be silence. All of the songs will rely on standard elements such as strings to convey that feeling of tension.
I’m hoping the audio and the visuals will be sufficient to convey the tension to the player. However, they may not be. One other possibility I’m considering is randomly changing the keys the player must hold, forcing them to continually act and stay on alert to keep the birds alive. Once the remaining elements are in, this will be something I iterate on based on feedback from playtesting.
More news on Precipice will be coming shortly 🙂
Previously, I looked at several changes which developers needed to make for their apps to work smoothly on iOS 9. That post was based on the most recent beta version at the time. Now that iOS 9 is released there are a few more changes that have surfaced which you may need to make for your apps.
In this post, I’m going to look at some of the new changes which have since emerged that can initially cause confusion.
iOS 9 is currently quite far along in its development and is generally expected to be released in September. Five beta versions have already been released to both the public and developers and the new versions have stabilised nicely. Now is the perfect time to make sure that any of your apps are iOS 9 ready. In this post I’m going to run through any changes that I have needed to make for iOS 9 in my apps. As I find any additional changes I will continue to update this post with those changes.
It’s very common when building an app (or a game) to make use of third party code. In many cases that code may have been made freely available with a requirement to display an acknowledgement/license. Apart from being generally good behaviour, doing so is also typically a legal requirement. In this post I am going to look at how to do that, plus some catches that have resulted from recent changes Apple have made. The solution here is slightly modified from JosephH’s excellent solution posted on Stack Overflow here.
The website is now (assuming you are reading this) back up and running. The problem was due to a bug in a WordPress plugin I use to combat spam. That bug took down the site. Unfortunately that coincided with some very busy weeks for me which meant I didn’t realise until a kind reader let me know via Twitter.
After some table modification via phpMyAdmin the offending plugin was disabled and the site was back up, thankfully with everything intact. My apologies to everyone for not picking up on the issue sooner.
I will try to post more regularly, if for nothing else than to ensure I spot bugs like the site going down sooner!
Security is an increasingly important consideration for many mobile apps. This is particularly true if the mobile app in question contains sensitive information (financial, legal, etc.). Providing additional protection for sensitive information is both important and expected. To this end I have added a new set of security related code to my iOSCoreLibrary (available here). The security code allows you to easily add both passcode and Touch ID authentication to your iOS app with minimal changes. In this post I will discuss how it was implemented and, as always, the code is fully available for commercial and non-commercial use.
Graphs are a common feature of many iOS applications and I use them extensively in my latest application. I needed a graphing system that could generate a variety of types of graphs (pie charts, bar graphs, stacked bar graphs and line graphs). I also needed a graphing system that was flexible enough to work on iPad, iPhone and could generate high resolution graphs for printing and export. Finally, the graphing system needed to handle, as gracefully as possible, data sets that ranged from a few data points up to a few thousand.
There are other graphing systems out there but I wanted tight control over the visual appearance. I also wanted to learn how to build a graphing framework that could handle these constraints. Which brings me to this blog post. I have made my graphing code available on GitHub here. The code is free for you to use (commercial or non-commercial) and to modify. Attribution is appreciated but not required.