My project, Invision, is a sound-based puzzle game. It is a very prototype-centric piece, and I intend to take the concept forward with the aim of creating an enlightening experience.
This project is made solo by me and includes in-depth programming, technical artistry, and sound design. I used MetaSounds in Unreal Engine to manipulate the tone at which the crystal notes are played, in combination with the Niagara particle system to create the visual effect of sound waves. These sound waves are controlled via mesh placement, giving me programmatic control over their location so I can build puzzle mechanics around them.
I also animated the player character in this project using Inverse Kinematics and Unreal Engine's Control Rig, including simulated physics bones to add even more feedback for the player.
Invision Visuals and Modularity
One of the main tasks this month was getting the project ready to be shown off, and a large part of that meant updating the game's visuals. This proved harder than expected: it involved working with Unreal's rendering settings, lighting settings, post-processing effects, and shader programming alongside particle systems and Blueprints. The work had my head spinning more than once, but I finally reached a finish I was satisfied with.
What did I change? As stated, it is many different parts working together: material expressions, lighting settings, rendering settings, post-processing settings, and the particle system.
The main things that influence its look are an additive material, a point light at the base of the crystal, and a bloom post-process over the top to make it feel more glowy. The style you can see above is the result of a lot of tinkering. During this time I made sure all of the crystals update through modular processes, so every effect updates when a crystal changes colour. Here is an image of my state machine, which changes a crystal's light colour based on its material value.
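The state machine itself is a Blueprint graph, but its logic can be sketched in a few lines. This is a minimal illustration, not the project's actual code; the state names and values here are hypothetical:

```python
# Illustrative sketch of the crystal colour state machine: one modular
# update routine reads the material's colour value and returns the
# settings for every dependent effect, so changing the material keeps
# light and bloom in sync automatically.
CRYSTAL_STATES = {
    "red":   {"light": (1.0, 0.1, 0.1), "bloom_intensity": 2.0},
    "blue":  {"light": (0.1, 0.2, 1.0), "bloom_intensity": 1.5},
    "green": {"light": (0.1, 1.0, 0.2), "bloom_intensity": 1.8},
}

def update_crystal(material_colour: str) -> dict:
    """Return the point-light colour and bloom settings for a material colour."""
    if material_colour not in CRYSTAL_STATES:
        raise ValueError(f"unknown crystal colour: {material_colour}")
    return CRYSTAL_STATES[material_colour]
```

The key design point is that every effect reads from the one material value, so no crystal can end up with a mismatched light and material.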
Here is a video displaying the crystal modularity and updating:
Physicalised Sound Emission System
The fundamental mechanic required for my game concept to work is what I call the Physicalised Sound Emission System. It works by creating invisible meshes that act as guides for the particles representing the sound wave; these meshes can then be used to track the position of each wave.
This system is composed of multiple parts working together; there is the Crystal Emitter, Wave Emitter, and the Crystal parent class.
Using these classes I am able to emit waves and detect any interaction with the emitter using efficient sphere collisions, which trigger overlap events on the crystal class. Code in the crystal class then verifies that the overlap is with the specific collider and the correct class before continuing.
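The overlap-and-verify step can be sketched as follows. This is an illustrative Python rendering of the Blueprint logic, with made-up names like `WaveSphere`:

```python
# Sketch of the overlap handling: a sphere collider overlap fires an
# event, and the crystal only continues if the overlapping actor is a
# wave (correct class) hitting the designated collider.
class Wave:
    def __init__(self, source):
        self.source = source  # the emitter that created this wave

class Crystal:
    def __init__(self, name):
        self.name = name
        self.received = []

    def on_overlap(self, other, collider_name):
        # Verify both the collider and the class before running any
        # puzzle logic, mirroring the checks in the crystal Blueprint.
        if collider_name != "WaveSphere" or not isinstance(other, Wave):
            return False
        self.received.append(other)
        return True
```

Rejecting wrong colliders and wrong classes up front keeps the puzzle code from ever reacting to stray overlaps.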
See example video here:
This month has gone well; there has been no change in the plan and I am on schedule. Game jams serve as a good break from long project work and will continue to be useful for research and development too, letting me try out personal ideas and develop problem-solving skills for my portfolio. This month I continue to prototype and experiment with different ways of creating my base mechanics.
Invision Mechanic Fixes and Optimisation
One of my issues while developing the puzzle mechanics this month has been bugs and unexplained crashes in Unreal Engine. After attacking the problem from different angles, I found that what Unreal was struggling with was an undetected infinite loop. When a crystal reflected a sound, it also tried to reflect the sound it had just created, which of course caused a very fast exponential loop; a simple check that a crystal ignores its own waves fixed that case. However, when two crystals each reflected the other's waves, they still bounced waves off one another forever, creating infinitely more waves, which led to a decision. Once I discovered this I decided a slight change to the mechanics was in order, rather than what I had previously devised: reflectors would not reflect each other, because even if I stopped the loop from running infinitely it would create a sort of looping sound wave, and that is not the intention of the puzzles in my game. The intention is to use sound almost mathematically, like an electronic circuit sending signals.
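The two fixes amount to two guard conditions before any reflection happens. Here is a hedged sketch of that logic; the class and function names are mine, not the project's:

```python
# Illustrative guard logic for the reflection fix: the two rules mirror
# the two infinite loops described above.
class Node:
    def __init__(self, is_reflector):
        self.is_reflector = is_reflector

class SoundWave:
    def __init__(self, source):
        self.source = source  # the Node that emitted this wave

def should_reflect(crystal, wave):
    # Rule 1: a crystal never reflects its own wave
    # (fixes the single-crystal exponential loop).
    if wave.source is crystal:
        return False
    # Rule 2: reflectors never reflect waves from other reflectors
    # (fixes two reflectors bouncing waves at each other forever).
    if crystal.is_reflector and wave.source is not None and wave.source.is_reflector:
        return False
    return True
```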
On the optimisation front, I made sure that from now on waves come from an object pool of dynamically created meshes, keeping the game optimised enough that it does not crash lower-end devices. When a new wave is created it is added to an array of active waves; when I get rid of it, it moves to the inactive array. Before creating more waves I always check first whether there is an inactive wave to reuse, instead of creating a new one and wasting memory.
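The active/inactive pooling described above boils down to a small amount of bookkeeping. A minimal sketch, with illustrative names:

```python
# Minimal sketch of the wave object pool: waves are recycled between an
# active and an inactive list instead of being created and destroyed
# constantly.
class WavePool:
    def __init__(self):
        self.active = []
        self.inactive = []

    def acquire(self):
        # Reuse an inactive wave if one exists; only allocate otherwise.
        wave = self.inactive.pop() if self.inactive else {"position": None}
        self.active.append(wave)
        return wave

    def release(self, wave):
        # Move a finished wave to the inactive list for later reuse.
        self.active.remove(wave)
        self.inactive.append(wave)
```

The check-inactive-first rule means the pool only ever grows to the peak number of simultaneous waves, which is exactly the memory behaviour I wanted.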
Control Rig
With April came my birthday, hooray! Happy birthday to me, and what better gift to give myself than a month of tech art! First up was making a Control Rig for my custom-made player character, and then it was animating time.
So first up: creating a control rig involves creating controls that guide the character's skeleton during animation. In Unreal, a Control Rig is built using the engine's bespoke node graphs. Control rigs are typically made in dedicated animation software, but since Unreal supports advanced control rigs, and I had previous experience with the tool, I opted to use it. My most recent use was in the vertical game jam in December, and before that I studied the use of Inverse Kinematics in Unreal Engine for my dissertation research report. Suffice to say, this was my tool of choice when it came to animating.
Creating the control rig took slightly longer than usual because of the unique bat-human anatomy I created. It also meant I wasn't able to use premade animations, so custom-made animations were the only option. My solution for making the character's bat wings animate realistically was to treat each wing as the body in a Full Body IK setup, an Unreal feature that solves the bone positions across a chain to achieve a fluid result.
Useful resource: Control Rig Full Body IK: https://dev.epicgames.com/documentation/en-us/unreal-engine/control-rig-full-body-ik-in-unreal-engine
I used this in addition to the standard IK methods employed in control rigs, such as two-bone IK with a pole vector control to guide the knee position, as seen in the image of the control rig above. Once working, it made it much easier to animate the different animations required for my player character: Idle, Walk, and Run.
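For anyone curious what two-bone IK is actually doing under the hood, the core of it is the law of cosines: given the thigh and shin lengths and the distance from hip to foot target, the knee angle is fully determined, and the pole vector only decides which way the knee points. A small sketch of that maths (not Unreal's implementation):

```python
# Core two-bone IK maths via the law of cosines: solve the interior
# knee angle from the two bone lengths and the hip-to-target distance.
import math

def knee_angle(thigh, shin, hip_to_target):
    # Clamp so an out-of-reach target fully straightens the leg
    # (180 degrees) instead of producing a maths error.
    d = min(hip_to_target, thigh + shin)
    cos_knee = (thigh**2 + shin**2 - d**2) / (2 * thigh * shin)
    cos_knee = max(-1.0, min(1.0, cos_knee))
    return math.degrees(math.acos(cos_knee))
```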
Video showing my control rig working:
In addition to the control rig, I custom-made a physics asset for the player character, allowing certain bones to animate based on physics simulation instead of kinematic animation. This makes a character feel more life-like, for example with a tail that swooshes around behind them. In this instance I made the tail and one of the ears physics-driven. I would have made both ears physics-based, but due to a mistake in Blender weight painting I wasn't able to in the end, and I didn't want to re-import the mesh because it would take too much work for a small change.
Now, with the control rig and the physics bones done, I set a day aside for animation. I decided to watch some animation theory videos on YouTube as research to have it all fresh in my mind, and it helped a lot to think about the fundamentals of good animation.
Video showing animated character:
After creating the animations, I programmed easing between them using blend spaces. I also used the same smoothing technique on the player character's rotation and on the transitions between idle, walking, and running, ending up with a very clean, good-feeling control scheme. Ultimately I wanted the player to feel like a playful bat, and that is what I achieved; I am happy with it.
Audio Production and Programming
I created a sound effect by editing, distorting, modulating, and applying effects to recordings until I got something I liked. I began with a poor-quality recording of some crystal bowls, and by learning how to use Audacity I was able to make a clean crystal sound effect for my game. To my surprise, it came out as exactly a G note, as I found out by pointing the guitar tuner app on my phone at my sound effect. This worked out perfectly, as I needed my sound effect to be on key so that when I change the pitch in semitones I can switch between different notes. The ability to pitch-shift by semitones comes from the new MetaSounds in Unreal. Knowing my key and my semitones, I was able to use MetaSounds to create seven unique notes by changing the number of semitones added. For example, as I started with G, I needed to go up two semitones to reach A and four to reach B, but only five to reach C; because the intervals are uneven, I created the sound effects manually rather than writing a function.
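The arithmetic behind this is worth spelling out: shifting a sound by n semitones multiplies its frequency by 2^(n/12), and the offsets above are exactly the major scale starting from G. A quick sketch (the 392 Hz value for G4 is standard equal temperament, not something measured from my sample):

```python
# Semitone arithmetic behind the MetaSounds pitch shift: n semitones
# up multiplies the frequency by 2**(n/12).
G = 392.0  # approx. frequency of G4 in Hz (equal temperament, A4 = 440 Hz)

def shift(base_hz, semitones):
    return base_hz * 2 ** (semitones / 12)

# Major-scale offsets from G: uneven steps, hence setting them by hand.
OFFSETS = {"G": 0, "A": 2, "B": 4, "C": 5, "D": 7, "E": 9, "F#": 11}
NOTES = {name: shift(G, n) for name, n in OFFSETS.items()}
```

The unevenness of the offsets (2, 2, 1, 2, 2, 2 between steps) is why a simple "multiply by a constant per note" function would not work, and why I set the semitone counts by hand.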
Now that I knew my note was a G, I could use it in Unreal as the basis for all of my notes by changing their semitones. Next it was time to program the sounds into the project. I did this through my parent class, BP_Crystal, since my whole system works from it. I started by creating an audio component inside the crystal class; this plays sound physically, as an object in space. I then extended my previously created light-switching state machine, which changes the light colour based on the material, to also change the attached sound asset. This meant the different-coloured crystals produced different sounds when triggered. So how are they triggered? Having created the audio component, this part was easy: it was simply a matter of telling it to play as and when I needed. In my case I wanted the crystals to produce a sound whenever they created a wave, so that is what I told it to do.
I started using an ITD panner to give the audio a more realistic feel. An ITD (interaural time difference) panner delays the sound between your two ears to trick your brain into thinking the sound is coming from a specific angle. For most people the delay is imperceptible consciously; it is more of a subconscious difference.
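To give a sense of the scale of that delay, a common textbook approximation (the Woodworth spherical-head model, not necessarily what Unreal's panner uses internally) puts the far-ear delay at (r/c)(θ + sin θ) for a source at azimuth θ:

```python
# Woodworth spherical-head approximation of the interaural time
# difference an ITD panner applies to the far ear.
import math

HEAD_RADIUS = 0.0875     # metres; average head radius (assumed value)
SPEED_OF_SOUND = 343.0   # metres per second in air

def itd_seconds(azimuth_degrees):
    theta = math.radians(azimuth_degrees)
    return (HEAD_RADIUS / SPEED_OF_SOUND) * (theta + math.sin(theta))
```

Even for a source directly to one side (90 degrees), the delay comes out under a millisecond, which is why it registers subconsciously rather than as an audible echo.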
Randomising footsteps was another mechanic I wanted to implement to add more immersion, and since sound is the guiding factor of the game, it is incredibly important. To do this I sourced some Creative Commons Zero sound effects online, created an array of the footstep sounds, and used a random get to pick a sound from the array. To make it even more varied, I added some pitch variation so it sounds different every time it plays. The only thing left to do was to place the sound as an Anim Notify on an animation, so it plays on every footstep. Of course, I made the walking steps quieter than the running ones too. I also went on to add a sound for the door at the end of the level, as a way to give the player confirmation that they finished. I am happy with the result of the sounds in the end, though I would like to add more ambient sound in the future.
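The footstep logic is small enough to sketch in full. This is an illustrative version of the Blueprint, with made-up clip names and ranges:

```python
# Footstep randomisation sketch: pick a random clip from the array and
# apply a small random pitch multiplier so no two steps sound identical.
import random

FOOTSTEPS = ["step_01.wav", "step_02.wav", "step_03.wav"]

def play_footstep(volume=1.0, rng=random):
    clip = rng.choice(FOOTSTEPS)
    pitch = rng.uniform(0.9, 1.1)  # +/-10% pitch variation per step
    return {"clip": clip, "pitch": pitch, "volume": volume}
```

The Anim Notify on each footfall frame would then call this with, say, a lower volume for the walk animation than for the run.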