Sparking Steel

A VR game built as a final year project for my BSc in Computing. In this game, the player controls a giant robot from inside the cockpit, battling an opposing mech.

Type

VR Game

Engine

Unity

Hardware

Samsung Gear VR

Context

Final Year Project

UX for VR games

Interaction design for any game requires responsiveness, clarity of purpose, and intuitive controls. In VR, those requirements carry extra physiological weight: when something feels wrong in a VR environment, users can experience discomfort, nausea, or headaches.

Solving motion sickness through cockpit design

One of the most common sources of discomfort in VR is locomotion sickness, caused by the disconnect between movement happening on screen and the body remaining still in the real world.

My solution was to anchor the player visually inside a cockpit. From their perspective, they can see the seat around them, the structural frames of the mech, and the interior of the robot they're piloting. This physical context gives the brain a stable reference point, reducing the sense of unmoored movement. This 'anchor point' technique is used in many VR titles, such as VR roller coaster experiences.
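In Unity terms, the anchoring described above can be sketched by parenting the VR camera rig to the mech body, so the cockpit interior stays fixed in the player's near field while the world moves outside. This is a minimal illustration rather than the project's actual code; the component and field names are assumptions.

```csharp
using UnityEngine;

// Sketch: keep the player's view rigidly attached to the mech so the
// cockpit geometry (seat, structural frames) is a stable visual
// reference frame. All names and offsets here are illustrative.
public class CockpitAnchor : MonoBehaviour
{
    [SerializeField] Transform mechBody;   // the moving mech
    [SerializeField] Transform cameraRig;  // the VR camera rig

    void Start()
    {
        // Parent the camera rig to the mech body; from now on the
        // cockpit interior never moves relative to the player's eyes.
        cameraRig.SetParent(mechBody, worldPositionStays: false);
        cameraRig.localPosition = new Vector3(0f, 1.2f, 0.3f); // approximate seat position
        cameraRig.localRotation = Quaternion.identity;
    }
}
```

The key point is that near-field geometry inherits every translation and rotation of the mech, so on-screen motion is always accompanied by a stationary frame of reference.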

I validated this approach through user testing with a sample group of around a dozen people, aged 18 to 40. No participant reported motion sickness or nausea.

Stable framerates to maintain comfort

The Gear VR is capable of 60fps (frames per second), which is the minimum acceptable frame rate for a comfortable VR experience. At lower frame rates, the world begins to look jittery, which quickly becomes disorienting and uncomfortable.

Making the game comfortable to play placed strict constraints on how game resources could be used. Assets, geometry, and rendering were all carefully managed and reused wherever possible to stay within the performance budget.
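To make the budget described above concrete: 60fps leaves roughly 16.7 ms per frame for all simulation and rendering work. A simple development-time watchdog, sketched below in Unity C#, is one hedged illustration of how overruns can be caught early; it is not the project's actual tooling.

```csharp
using UnityEngine;

// Sketch: a development-time frame-budget monitor that flags any
// frame exceeding the ~16.7 ms budget required for 60fps on Gear VR.
public class FrameBudgetMonitor : MonoBehaviour
{
    const float BudgetMs = 1000f / 60f; // 16.67 ms per frame at 60fps

    void Update()
    {
        // unscaledDeltaTime ignores time-scale changes (pauses, slow-mo)
        float frameMs = Time.unscaledDeltaTime * 1000f;
        if (frameMs > BudgetMs)
            Debug.LogWarning($"Frame over budget: {frameMs:F1} ms");
    }
}
```

In practice, staying under this budget is what motivates the asset reuse mentioned above: shared materials and meshes reduce draw calls and memory pressure on mobile hardware.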

What testing revealed

User testing surfaced two consistent pieces of feedback that required design responses before the project was complete.

Controls were unclear

Several participants were unsure how to control the mech, particularly at the start of the game. In response, I added an in-game controls instruction screen, positioned inside the cockpit directly in front of the player.
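Placing that screen inside the cockpit can be sketched in Unity as a world-space canvas parented to the cockpit, so it stays readable and fixed relative to the seated player. The component, field names, and offsets below are illustrative assumptions.

```csharp
using UnityEngine;

// Sketch: anchor a world-space instructions panel inside the cockpit,
// directly ahead of the seated player. Values are illustrative.
public class InstructionsPanel : MonoBehaviour
{
    [SerializeField] Transform cockpitSeat; // anchor point inside the cockpit
    [SerializeField] Canvas panel;          // UI canvas for the instructions

    void Start()
    {
        // World-space canvases live in the 3D scene rather than on a
        // screen overlay, which is essential for VR UI.
        panel.renderMode = RenderMode.WorldSpace;
        var t = panel.transform;
        t.SetParent(cockpitSeat, worldPositionStays: false);
        t.localPosition = new Vector3(0f, 1.4f, 1.0f); // roughly eye height, 1 m ahead
        t.localRotation = Quaternion.identity;
    }
}
```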

The objective wasn't obvious

The enemy mech was initially hidden behind buildings at game start, leaving some players uncertain of where to go or what to do. I addressed this through level design changes, repositioning elements so the enemy is visible from the moment play begins, giving players immediate visual direction.

What I would change with more time

Given more time and a larger participant pool, there are two areas I would have prioritised.

  • Accessibility testing — particularly around colour contrast, to ensure visual elements such as HUD components and UI text are readable for players with colour-blindness
  • In-game options and menus — adding a persistent controls screen and customisable settings so players can reference and adjust inputs at any point during play, rather than only at the start