
XR Hand Interactions

To test hand interactions that could be used in the work environment of a ship's bridge, I created multiple prototypes using Meta Quest 2's hand tracking. The application was built with Unreal Engine's visual scripting language, Blueprints.

Tools

Unreal Engine, Hand Tracking, and Visual Scripting

Team

Solo Project

Duration

4 Weeks

Ways of Interacting with Spatial Interfaces

In this project, developed in collaboration with the talented Ocean Industries Concept Lab team, I focused on exploring and prototyping XR interaction methods for mission-critical environments such as a ship's bridge.

The work focused on hand-based interactions that let the operator quickly change an interface's level of detail and clear away visual obstructions. It also explored eye-tracking, both to ease interface navigation and to monitor operators' focus levels, along with location-based interactions such as trigger areas, which use the bridge's physical layout to make interfaces reachable from various positions. This proved highly valuable while navigating a ship; the trigger-area pattern is sketched below.
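The Blueprint graphs themselves aren't reproduced here, but the trigger-area pattern translates directly to Unreal C++. The sketch below is a minimal illustration under assumptions of mine, not code from the project: a hypothetical ATriggerAreaPanel actor whose box volume reveals a spatial panel when the operator's pawn enters it and hides it again on exit, using Unreal's standard component overlap events.

// TriggerAreaPanel.h -- hypothetical actor, a minimal sketch of the
// trigger-area pattern: a box volume on the bridge shows a spatial panel
// while the operator stands inside it and hides it when they leave.
#include "CoreMinimal.h"
#include "GameFramework/Actor.h"
#include "Components/BoxComponent.h"
#include "Components/WidgetComponent.h"
#include "TriggerAreaPanel.generated.h"

UCLASS()
class ATriggerAreaPanel : public AActor
{
    GENERATED_BODY()

public:
    ATriggerAreaPanel()
    {
        TriggerZone = CreateDefaultSubobject<UBoxComponent>(TEXT("TriggerZone"));
        TriggerZone->SetCollisionProfileName(TEXT("Trigger")); // overlap-only queries
        RootComponent = TriggerZone;

        Panel = CreateDefaultSubobject<UWidgetComponent>(TEXT("Panel"));
        Panel->SetupAttachment(RootComponent);
        Panel->SetVisibility(false); // hidden until the operator steps in
    }

protected:
    virtual void BeginPlay() override
    {
        Super::BeginPlay();
        TriggerZone->OnComponentBeginOverlap.AddDynamic(this, &ATriggerAreaPanel::OnEnter);
        TriggerZone->OnComponentEndOverlap.AddDynamic(this, &ATriggerAreaPanel::OnExit);
    }

    UFUNCTION()
    void OnEnter(UPrimitiveComponent* OverlappedComp, AActor* OtherActor,
                 UPrimitiveComponent* OtherComp, int32 OtherBodyIndex,
                 bool bFromSweep, const FHitResult& SweepResult)
    {
        Panel->SetVisibility(true); // reveal the interface for this station
    }

    UFUNCTION()
    void OnExit(UPrimitiveComponent* OverlappedComp, AActor* OtherActor,
                UPrimitiveComponent* OtherComp, int32 OtherBodyIndex)
    {
        Panel->SetVisibility(false); // clear visual clutter when leaving
    }

private:
    UPROPERTY(VisibleAnywhere)
    UBoxComponent* TriggerZone;

    UPROPERTY(VisibleAnywhere)
    UWidgetComponent* Panel;
};

Keeping the panel attached to its zone means each bridge station owns its own interface, so nothing is drawn for positions the operator isn't occupying.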

These interaction methods prioritize safety and operability: they were designed to improve performance and reliability in high-stakes environments by minimizing interface-related errors.

Leveraging Unreal Engine and the versatility of the Quest 2, I prototyped a range of interactions. This approach enabled iterative testing and refinement of each interaction, allowing me to find the most effective mode of interaction for each specific situation.
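As an example of the kind of logic these iterations converged on, the sketch below shows a simple pinch check in C++ (the actual prototypes were built in Blueprints). It assumes a hand-tracking scene component that exposes fingertip sockets; the IsPinching helper and the socket names are placeholders of mine, not confirmed identifiers from the Meta hand-tracking plugin.

// Pinch detection sketch. Assumes a hand-tracking component exposing
// fingertip sockets; the socket names below are placeholders, so substitute
// whatever bone/socket names your hand-tracking plugin actually provides.
#include "CoreMinimal.h"
#include "Components/SceneComponent.h"

// Returns true while the thumb and index fingertips are close enough to
// count as a pinch. Distances are in Unreal units (centimeters).
bool IsPinching(const USceneComponent* HandComponent,
                float PinchThresholdCm = 2.0f)
{
    if (!HandComponent)
    {
        return false;
    }

    // Placeholder socket names -- not confirmed plugin identifiers.
    const FVector ThumbTip = HandComponent->GetSocketLocation(TEXT("ThumbTip"));
    const FVector IndexTip = HandComponent->GetSocketLocation(TEXT("IndexTip"));

    return FVector::Dist(ThumbTip, IndexTip) <= PinchThresholdCm;
}

A check like this can drive the level-of-detail control described above, for example by cycling a panel between full, reduced, and hidden states on each pinch.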


This iterative methodology allowed me, as a designer, to quickly fine-tune the functionality of the spatial interfaces and optimize their usability across a variety of scenarios.
