A lifelong fascination with video games and the technical side of their development inspired me to spend a recent abundance of free time learning how games are constructed. I started with YouTube tutorials from developers such as Brackeys, piecing together a combat scene between a Viking and an Orc. With no previous experience in C# or Unity, I experimented with simple code and settings until a coherent scene emerged. My primary focus was character and enemy movement, along with triggered events like animations and health tracking.
With my newfound [lack of] knowledge, I set out to build a pong-esque 3D tennis game. I quickly distracted myself with instantiating projectiles and sending them across the court. From a course on StackSkills, I learned how to create variation in game elements such as trajectories, then applied those concepts to produce subtle differences in sound effects, making the scene a little less sterile.
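In practice, that variation comes down to sampling small random offsets each time an element spawns. A minimal sketch of the idea (the component and field names here are illustrative, not from my actual project):

```csharp
using UnityEngine;

// Illustrative sketch: randomize a projectile's launch and its sound
// so repeated serves don't look and sound identical.
public class ServeVariation : MonoBehaviour
{
    public Rigidbody ballPrefab;     // projectile to instantiate
    public AudioSource audioSource;  // plays the serve sound
    public float baseSpeed = 12f;

    public void Serve()
    {
        Rigidbody ball = Instantiate(ballPrefab, transform.position, transform.rotation);

        // Nudge the launch direction a few degrees off-center each serve.
        Quaternion spread = Quaternion.Euler(Random.Range(-3f, 3f), Random.Range(-5f, 5f), 0f);
        ball.velocity = spread * transform.forward * (baseSpeed * Random.Range(0.9f, 1.1f));

        // Vary the pitch slightly so the same clip sounds less sterile.
        audioSource.pitch = Random.Range(0.95f, 1.05f);
        audioSource.Play();
    }
}
```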
While exploring materials, I came across a tutorial that described the fabrication of dynamic textures. I wondered whether the method could be used to create an in-game mirror, but could not quite achieve the result I was looking for. While troubleshooting, I discovered reflection probes and realized they were much more in line with what I was trying to achieve. Since a reflection depends on the geometric relationship between the viewer's eye and the reflective surface, the reflection probe has to be positioned relative to the mirror in the same way. I wrote the following [short] script to make the mirror reflection behave correctly, and showcased the effort in a simple astronaut scene.

// Express the viewer's world position in the mirror's local space.
Vector3 relativePos = mirror.transform.InverseTransformPoint(viewer.transform.position);
// Apply that point as this probe's local position so the probe tracks
// the viewer relative to the reflective surface.
transform.localPosition = relativePos;
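Wired into a full component, the fragment might look like the sketch below, assuming the probe object is parented under the mirror and that `mirror` and `viewer` are assigned in the Inspector (the names are illustrative). Using LateUpdate keeps the probe in sync after the camera has moved for the frame:

```csharp
using UnityEngine;

// Sketch: attach to the ReflectionProbe object, which is assumed to be a
// child of the mirror. Assign the mirror surface and the viewer (usually
// the main camera's transform) in the Inspector.
public class MirrorProbe : MonoBehaviour
{
    public Transform mirror;  // the reflective surface
    public Transform viewer;  // typically Camera.main.transform

    void LateUpdate()
    {
        // Express the viewer's world position in the mirror's local space,
        // then have the probe track that point every frame.
        Vector3 relativePos = mirror.InverseTransformPoint(viewer.position);
        transform.localPosition = relativePos;
    }
}
```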
After designing the mirror asset in the previous scene, I was curious what tools Unity offers for building environments. Using toolkits from Unity's Asset Store, I have been channeling my inner Bob Ross to create a happy little landscape, while experimenting with camera settings to create a realistic visual experience. It has been interesting to see how various lighting, textures, and models contribute to the style of the final scene.
The nature of touch screens allows for a huge variety of input methods, but most UIs don't take advantage of that trait before littering the display with buttons. My solution considers user ergonomics, touch fields, taps, holds, and movement as factors for intuitive commands. A touch field on the right side of the screen tracks changes in touch location and associates them with look direction; it also listens for rapid taps, executing an attack when that condition is met. A touch field on the left side of the screen controls omnidirectional movement by tracking the start and current positions of a touch, generating an angle from those coordinates, and moving the character in the direction of that angle. Movement animations are then called when that angle falls within specific ranges. The script I wrote in C# that handles touch controls can be viewed HERE.
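The left-side movement logic boils down to a two-point angle calculation. A simplified sketch using Unity's touch API (component name, thresholds, and the commented animation gate are illustrative assumptions, not the exact script linked above):

```csharp
using UnityEngine;

// Sketch of the left-side virtual stick: track a touch's start and current
// positions, derive a heading angle, and move the character along it.
public class TouchMovement : MonoBehaviour
{
    public CharacterController controller;
    public float speed = 4f;
    private Vector2 touchStart;

    void Update()
    {
        if (Input.touchCount == 0) return;
        Touch touch = Input.GetTouch(0);
        if (touch.position.x > Screen.width / 2f) return; // left half only

        if (touch.phase == TouchPhase.Began)
        {
            touchStart = touch.position;
        }
        else if (touch.phase == TouchPhase.Moved || touch.phase == TouchPhase.Stationary)
        {
            Vector2 delta = touch.position - touchStart;
            // Heading of the drag in degrees, measured from the +x axis.
            float angle = Mathf.Atan2(delta.y, delta.x) * Mathf.Rad2Deg;

            // Convert the angle back into a direction on the ground plane.
            Vector3 dir = new Vector3(
                Mathf.Cos(angle * Mathf.Deg2Rad), 0f, Mathf.Sin(angle * Mathf.Deg2Rad));
            controller.SimpleMove(dir * speed);

            // Example animation gate: drags near 90 degrees read as "forward".
            // if (angle > 45f && angle < 135f) animator.Play("RunForward");
        }
    }
}
```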