Last updated 11th Nov 2018
Born and raised in England, I am currently studying IT: Computer Games Engineering at the Technical University Munich, in Germany.
Where possible, I like to spend my spare time working on passion projects, or participating in game jams. I primarily focus on game design + development using Unity, although I also have a background in sound design.
Here I have put together information on some of the projects I have worked on, including a quick summary of the gameplay and my role on each project. I have also included some background context, with quick examples of the technical and design challenges each project presented. If these are too detailed, feel free to skim over them; if they are not detailed enough, please do not hesitate to get in touch and I will be happy to elaborate further.
I also have a classic CV, if that’s what you’re into.
Thank you for taking the time to have a look at my work.
Pixel.Fockr
In development since October 2017
What it is:
Pixel.Fockr is a heavily stylised, voxel-based cyberspace shooter.
The prototype I made in my first semester was primarily focused on developing the core gameplay mechanics, namely combat and building/upgrading. Every ship consists of individual voxels (‘pixels’), which can be weapon, shield, or body pixels. The player can build their own custom ship from scratch, using pixels they have shot off enemy ships and collected. They may also upgrade existing pixels, improving weapon attributes like fire rate and bullet speed.
Both the player and enemy ships are built entirely of these pixels, allowing both sides to shoot individual pixels off each other. Each ship also has a heart, to which every pixel must in some way be connected; this allows whole sections of a ship to be destroyed if removing an individual pixel cuts off other pixels’ contact with the heart. Shooting the heart directly causes the whole ship to explode.
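The heart-connectivity rule amounts to a flood fill from the heart: after a pixel is shot off, any pixel the search can no longer reach is destroyed as well. Here is a minimal sketch of the idea in Python (the game itself is written in C# for Unity; the function names and coordinate representation are purely illustrative):

```python
from collections import deque

def surviving_pixels(pixels, heart, neighbours):
    """Return the set of pixels still connected to the heart.

    pixels     -- set of (x, y, z) voxel coordinates still on the ship
    heart      -- the heart's coordinate
    neighbours -- function yielding a voxel's adjacent coordinates
    """
    if heart not in pixels:
        return set()  # heart destroyed: the whole ship explodes
    seen = {heart}
    queue = deque([heart])
    while queue:
        current = queue.popleft()
        for n in neighbours(current):
            if n in pixels and n not in seen:
                seen.add(n)
                queue.append(n)
    return seen

def face_neighbours(p):
    """The six face-adjacent voxel coordinates."""
    x, y, z = p
    for dx, dy, dz in ((1,0,0), (-1,0,0), (0,1,0), (0,-1,0), (0,0,1), (0,0,-1)):
        yield (x + dx, y + dy, z + dz)
```

After each hit, every pixel not in the returned set can be detached and destroyed in one pass.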
Role in project:
This was a solo project in which I did everything except the temporary music. Since working on the project, I have been in close contact with a sound designer who has begun overhauling the game’s sound effects, which should be integrated soon. A few weeks ago, another programmer/designer, David, with whom I have worked before, also joined the project.
Since one of our initial assignments in our first semester was to create a space shooter, I wanted to take this genre and use what I had learnt in Unity so far to create something which hopefully brought something new to the table. I was very keen on this concept because it gives the player the agency to be creative in their ship builds and to try out different builds and tactics. There is also something clean and satisfying about the core gameplay loop of destroying enemies, picking up parts of their ships, and then using those parts to improve your own ship so you can destroy larger numbers of more powerful enemies.
While getting a functioning prototype of the core combat mechanic took a matter of days, over the following months I ended up rewriting much of the core code multiple times, each time finding cleaner and more efficient means of achieving the same goal. My initial setup would not have allowed for larger ships, or for the overhead I required to apply the post-processing which later helped make the game look less like programmer art.
This project offered plenty of interesting challenges, especially since optimisation is so important when you have a large number of objects on-screen, each consisting of many individual ‘pixel’ child objects.
Since creating the prototype in February 2017, I have returned to the project to continue development. I am currently in the process of rigorously refactoring and rewriting the entire codebase to be more efficient and, above all, more coherent, consistently structured, and commented, so that David can join the project smoothly. We currently have a roadmap for the project, and I am working on a thorough Game Design Document. We hope to develop this into a first-playable demo, and eventually a shippable product.
Take Your Pills
Originally created for Ludum Dare 40 on December 5th, full version released 4th May
What it is:
Take Your Pills is a puzzle game, in which the player must venture through randomly generated mazes containing traps and enemies, whilst taking a variety of exotic pills in order to unlock the exit. As each trial is completed, more and more pills are required to advance, with the combination of accumulated side-effects making each maze trickier than the last.
There are 20 pills with 20 side-effects in the game, including:
Switching the camera view from top-down to first-person
Switching the world view to isometric
Regenerating the labyrinth’s layout, with the same pills and enemies, but in new locations
Making the walls glow red and hurt the player on extended contact
Switching the player’s controls to tank controls (left/right rotate)
Role in project:
This project was originally created for a Ludum Dare Game Jam in a team of 5, in which we all worked on programming and designing the game. I also did the sound design and music.
Afterwards, I continued working on the game on my own, and was later joined by one of the original team members, David. Together, we brought the game to a state in which it could be released.
The original version was created to the theme of ‘the more you have, the worse it is’. Based on David’s original concept of navigating a maze and picking things up which modify the game logic in unique ways, we worked together, as a team of programmers, to create the game. The version initially submitted for the game jam had a basic, 2D form, with around 8 different pills. Before the deadline, I also spent a lot of time working on the sound design and graphics/presentation of the game.
Since I liked the concept, and wanted a new project to further my game development experience, I later decided to revisit the project and rework/extend it. This presented a number of challenges. Firstly, the code had been written by different people, under time pressure, with varying amounts of commenting. I began by analysing every line of code, in order to decide which systems could remain and which were not generalised enough for extension and would have to be revamped or rewritten from scratch.
While we already had pills which modified basic gameplay elements such as the player’s input, I knew I wanted to integrate pills which modified more core mechanics of the game, such as switching to first person or dimming the lights. Since the game was 2D and had no lighting, I set about converting all existing code and assets to work in 3D space with Unity’s lighting engine.
While coming up with ideas for new pill side-effects was straightforward, coming up with side-effects which could synergise with any combination of other side-effects, without being too unfair or undermining the effects of other pills, was very tricky.
From a programming perspective, the key challenge was making sure that each pill’s side-effect was integrated as cleanly as possible, so that it could not have unforeseen effects when used in combination with other pills. To quickly test different combinations of pills, I made a simple debugging and testing editor window.
Making sure all the 2D sprites rendered correctly in both the isometric and first-person views was also challenging. I ended up ensuring that every game object in the scene inherited from a custom script which handles how sprites are displayed from the various perspectives. It also handles the pausing and unpausing of objects, allowing me to pause the game without setting the timescale to 0 and thus losing all update functionality.
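That pausing scheme can be sketched language-agnostically: each object checks its own pause flag inside its update, so a manager can freeze gameplay without touching the global timescale. A Python sketch with illustrative names (the real project is a Unity/C# base class):

```python
class PausableObject:
    """Sketch of a base object whose update respects a per-object pause flag."""
    def __init__(self):
        self.paused = False
        self.position = 0.0
        self.velocity = 1.0

    def update(self, dt):
        # update() is still called every frame (so UI and effects keep
        # running), but simulation is skipped while this object is paused.
        if not self.paused:
            self.position += self.velocity * dt

class PauseManager:
    """Pauses every registered object without setting the timescale to 0."""
    def __init__(self):
        self.objects = []

    def register(self, obj):
        self.objects.append(obj)

    def set_paused(self, paused):
        for obj in self.objects:
            obj.paused = paused
```

The pay-off is that anything not registered with the manager, such as pause-menu animations, continues to update normally.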
Other challenges included:
Smoothly resetting any acquired side-effects over the course of 2 seconds upon level completion
Seamlessly unloading the previous level and generating the next level while the player enters the lift between levels
Ensuring that pills marked by the player using the pill-marking system were rendered after any side-effect post-processing, so symbols remained legible with, for example, the blurriness side-effect active
Integrating high-score tables using an external API (dreamlo)
Ensuring that online high scores are only updated if the new score has the same name and is submitted from the same computer
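The first of those challenges, smoothly resetting side-effects on level completion, reduces to interpolating each modified value back to its default over two seconds. A hedged sketch of that interpolation (an illustrative helper, not the game’s actual code, which would use something like Unity’s Mathf.Lerp driven by the time since level completion):

```python
def reset_progress(start, default, elapsed, duration=2.0):
    """Linearly interpolate a side-effect value from `start` back to
    `default` over `duration` seconds, clamping once the time is up."""
    t = min(elapsed / duration, 1.0)  # 0.0 at completion, 1.0 after 2 s
    return start + (default - start) * t
```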
Download link: https://shytea.itch.io/pills
Beat Puncher
Created for a university project, showcased on the 10th July
What it is:
Beat Puncher is a rhythm game, created for our university’s ‘Dance and Acting in the Virtual Studio’ module. Gameplay-wise, it is heavily inspired by the recent VR hit Beat Saber, in which the player must hit incoming notes with VR hand controllers in time with the music. Our version uses the Kinect to track the player’s motion, allowing them to punch incoming notes with their arms, as well as stomp their feet in time to the beat.
Role in project:
We were a team of 3 working on this project.
While we all contributed to the design and programming of the core gameplay mechanics, I was additionally responsible for the system that synchronises note spawning to the music, as well as for the level and UI design.
For the note synchronisation, I settled on a system which parses three MIDI tracks (left hand, right hand, feet) to determine when each note should be hit. While the feet track consists of only a single note, denoting when the player should stomp, the hand tracks consist of six different notes, denoting three different spawn positions on either side of the player. Notes mostly spawn on their respective sides, but because the hand is determined by the MIDI track and the spawn position by the MIDI note, we can occasionally cross a hand over to the other side.

The MIDI parsing system uses the external audio library ‘NAudio’ to parse each MIDI file, looking only for the pre-defined notes and converting them into note-spawning events, which are then passed on to the track manager. It also parses the BPM (tempo) of the track, which determines how fast the visuals move and how soon notes spawn before they have to be hit. Since the system uses standard MIDI files, creating note sync-tracks which are perfectly in time with the song was very straightforward. And since this can be done in any basic Digital Audio Workstation, in the future it could easily allow players to create their own custom tracks, simply by importing their own audio file and three custom MIDI files.
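The hand-track mapping can be sketched as follows. The note numbers 60–65 and the field names are assumptions for illustration only; the actual project parses MIDI via NAudio in C#, and its note assignments may differ:

```python
def midi_to_spawn_events(note_events, hand):
    """Convert raw (time, midi_note) pairs from one hand's MIDI track
    into note-spawn events.

    Assumed mapping: notes 60-62 are the three spawn positions on the
    left of the player, 63-65 the three on the right. The hand itself
    comes from which track we are parsing, so a left-hand track can
    still spawn a note on the right (a cross-over punch).
    """
    BASE = 60
    events = []
    for time, note in note_events:
        offset = note - BASE
        if not 0 <= offset <= 5:
            continue  # ignore notes outside the pre-defined mapping
        events.append({
            'time': time,
            'hand': hand,
            'side': 'left' if offset < 3 else 'right',
            'position': offset % 3,  # 0..2 spawn slot on that side
        })
    return events
```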
Having the terrain move in time to the music was achieved by sampling the track’s frequencies. I split the samples provided by Unity’s GetSpectrumData method into six frequency bands and averaged the results of each band. Smoothing is then applied to produce cleaner results, and the results are normalised to their maximum peak value. I then created a method which uses this data to generate a curved mesh, in which the height of the front row represents these values. A snapshot of this first row is taken every beat, as determined from the MIDI file. These snapshots are then slowly moved away from the camera, giving the impression of moving through some kind of cyber mountain range which directly represents the playing track.
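The band-splitting, smoothing, and peak-normalisation steps can be sketched like this (a simplified Python illustration; in the real project Unity’s GetSpectrumData supplies the raw samples, and the smoothing constant here is an assumption):

```python
def band_averages(samples, n_bands=6):
    """Split a spectrum into n_bands equally sized bands and average
    each one. (The real project may weight bands differently, e.g.
    logarithmically, since musical energy clusters in low frequencies.)"""
    size = len(samples) // n_bands
    return [sum(samples[i * size:(i + 1) * size]) / size
            for i in range(n_bands)]

class BandSmoother:
    """Per-band exponential smoothing plus normalisation to the peak
    value seen so far, so every band ends up in the 0..1 range."""
    def __init__(self, n_bands=6, smoothing=0.5):
        self.smoothing = smoothing
        self.smoothed = [0.0] * n_bands
        self.peaks = [1e-6] * n_bands  # avoid division by zero

    def update(self, bands):
        out = []
        for i, value in enumerate(bands):
            # move the smoothed value a fraction towards the new sample
            self.smoothed[i] += (value - self.smoothed[i]) * self.smoothing
            # track each band's running peak and normalise to it
            self.peaks[i] = max(self.peaks[i], self.smoothed[i])
            out.append(self.smoothed[i] / self.peaks[i])
        return out
```

The six normalised values per frame are exactly what the front row of the terrain mesh needs as heights.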
To achieve the scene’s look, I used a traditional hand-written shader to add emission/glow to the edges of the player’s model, and used Unity’s Shader Graph to add slight heat-wave deformation to the sun’s sprite. I also used bloom post-processing to contribute to the scene’s neon vibe. When the player activates ‘overdrive mode’, I shift to a post-processing profile which also includes chromatic aberration and colour toning.
Flappy Bird 2
Created in 46 hours for the Summer Semester Game Jam 18 (Friday 15th June - Sunday 17th)
What it is:
Flappy Bird 2 is a short, atmospheric, meta-narrative-driven game created to the theme ‘nightmare’. You play as a programmer in a game jam who, sleepless and desperate, slips into a nightmare in which they are forced to face their own buggy code and their self-doubt as a programmer.
Role in project:
I came up with the original concept at the beginning of the game jam, and teamed up with three others to flesh out the concept and create the game. We were a team of 2 dedicated shader programmers and 2 gameplay programmers, myself included, with all of us contributing towards game design. As well as programming, I was also responsible for sound design, including voice acting.
After the deadline, I returned to the project over a couple of days to finish some voice recording we didn’t have time for, fix bugs, and apply some overall polish.
While most projects I have worked on in the past have been centred primarily around their gameplay mechanics, with visual style and atmosphere added later if time allowed, I wanted this project to put atmosphere first and gameplay mechanics second.
Originally, I pitched the atmosphere of a programmer’s lucid nightmare, stating that gameplay would be secondary and very simple. However, after brainstorming how the core concept could manifest itself as gameplay, we came up with the idea of clicking on objects and manipulating snippets of loose code to reprogram that object’s properties. This mechanic, whilst neat, was definitely not simple, and resulted in us, rather ironically, having to push ourselves to implement everything in time. The irony was tangible as I found myself recording voice lines for the game’s antagonist, a manifestation of a programmer’s self-doubt and imposter syndrome, at 4 in the morning with the deadline fast approaching.
One challenge from a game design perspective was effectively tutorialising the gameplay mechanics, giving the player the tools they need to overcome the game’s basic puzzles without handing them the solution and denying them the satisfaction of solving the puzzles themselves. We intentionally start the player falling endlessly through the level, to make it apparent that something is wrong and that they need to take action in order to take control of the situation. Since the game makes a drastic transition from 2D to 3D during the fourth-wall break, I also added basic controls which are displayed after the antagonist’s opening monologue, so the player knows that they are now in a first-person game. It is normally at this point that they look around, see the floor constantly rushing at them, and notice that the crosshair displays the name of that object. At this point, players almost always put the information together, click the floor, and drag ‘true’ into BoxCollider.Enabled() so they can land on the floor (it is worth noting that this game was definitely made with programmers in mind as the target demographic). Our shader programmers made an especially eye-catching shader for the draggable code snippets, in order to clearly mark them as interactable.
This effectively conveyed the workings of the game’s core mechanic (dragging code snippets between gaps in objects’ code) in a way which still gives the player the satisfaction of figuring things out, rather than simply spelling out how the mechanic functions. What we could have conveyed more clearly, however, is that code snippets dragged out of an object’s code stay on the player’s screen and can be carried between pieces of code in a sort of inventory, something which is later essential to progress. If I were to redesign the opening of the game, I would try to convey this by having the player take the ‘true’ statement needed to enable the ground’s collider from another piece of code: for example, having an error message pop up when the bird crashes in the opening, containing the line ‘stability = true’, and forcing the player to drag ‘true’ out. This would trigger the fourth-wall break and the beginning of the real game, while demonstrating that code snippets can be dragged out, stored on the player’s screen, and then placed into other pieces of code.
Download link: https://shytea.itch.io/flappybird2