
Proxy-Controlled VR Game

Virtual and augmented reality can change how we relate to our bodies and the people around us. In my project, I used a multiplayer setup to see how these different realities can help create a unique shared experience between two people.

 

Thank you to Matti Niinimäki for a fresh perspective on interaction design.

Embodied Interaction (2023) at Aalto University, Finland

Team

Solo Project

Course

Embodied Interactions

Duration

6 Weeks

Duo Dimension

Duo Dimension is a two-player game centered on teamwork and designed for a single VR headset. While one player immerses themselves in the digital environment through the VR goggles, the other uses a controller to interact with the same digital space. Together, they create a connection between the digital and physical worlds.

(Spoilers Ahead!)

Duo Dimension includes seven levels, among them an introduction and a central hub that links to the other stages. To mitigate FOMO for the interactor, who handles the controller, the game adopts a minimalistic visual style. The atmosphere is elevated by AI-generated skyboxes from Blockade Labs, which enrich the observer's experience. These skyboxes underscore that while the observer can explore various digital dimensions via the VR headset, the interactor remains in the same physical space.

Visuals

0. Transition - Emotional Disconnect

Designed to create an emotional and perceptual gap between roles. Intense graphics and soundscapes put the observer into a "digital trance," highlighting the emotional disconnect for the interactor.

7. Summary: Achievements

A voice-over highlights the collaborative achievements and puzzle-solving complexity, reinforcing team accomplishments.

6. Forces: Tangible Manipulation

Expands upon previous understandings, linking the interactor's controller movements to changes in a digital object. Effective communication is critical for puzzle-solving success.

5. Swapping: Role Reversal

Players switch roles to foster a mutual understanding. The new interactor, formerly the observer, benefits from pre-existing knowledge of object behavior.

4. Movement: Intro To Kinetics

Serves as a prelude to a more advanced level, teaching players about autonomous movement. The observer must now communicate dynamics, not just location, to the interactor.

3. Haptics: A Feeling Of Touch

The interactor locates an object invisible to the observer, allowing them some autonomy. The object's closeness influences the controller's vibration, a feature the observer cannot experience.

2. The Mirror: Self Awareness

Focused on the observer's self-realization when they look into a mirror and see the podium atop their head. The interactor places a ball on this podium, generating a tangible player-to-player connection via a digital cue.

1. Introduction: Gameplay Mechanics

Explains the rules and provides instructions. The background changes to signal the transition between player roles. Using an AI voice from ElevenLabs freed up development time for other aspects of the game.

7 Levels

Collisions

Collisions activate various game functions, like sound effects or level transitions. These can be object-specific (e.g., player's head with level orb) or generic (e.g., any cube collision plays a sound).

Speed

Object speed is calculated by tracking location changes over time. This is used to trigger events when speed thresholds are met, such as activating special sound effects in the "gong" level. A pre-built node for this functionality also exists.

Distance

Distance calculations between objects enable features like object respawning. This uses vector math to trigger actions based on distance thresholds, useful for objects falling off platforms.

Location

VR object locations adapt to player height for an immersive experience. Initial vector locations are stored to reset objects that fall off platforms or go out of reach.

Game Logic

Collisions

Collisions trigger various pieces of game logic, such as playing sounds, loading a new level, or destroying an object.

Since collision volumes can be invisible, specific locations and objects can be used to trigger certain actions. Sometimes a collision must be caused by a specific object (the player's head colliding with the level-loading orb); other times any object can trigger it (movable cubes play a sound when hit).
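
The collision logic was built with Unreal Engine Blueprints; as a rough C++ equivalent (class names, tags, and properties here are illustrative, not the project's actual code), the object-specific and generic cases might look like this:

```cpp
// Illustrative sketch only: ALevelOrb and AMovableCube are hypothetical actor classes.
#include "Kismet/GameplayStatics.h"

// Object-specific collision: only an actor tagged "PlayerHead" loads the next level.
void ALevelOrb::NotifyActorBeginOverlap(AActor* OtherActor)
{
    Super::NotifyActorBeginOverlap(OtherActor);
    if (OtherActor && OtherActor->ActorHasTag(TEXT("PlayerHead")))
    {
        UGameplayStatics::OpenLevel(this, NextLevelName); // NextLevelName: FName property
    }
}

// Generic collision: any overlapping actor triggers the cube's sound.
void AMovableCube::NotifyActorBeginOverlap(AActor* OtherActor)
{
    Super::NotifyActorBeginOverlap(OtherActor);
    if (ImpactSound) // ImpactSound: USoundBase* property set in the editor
    {
        UGameplayStatics::PlaySoundAtLocation(this, ImpactSound, GetActorLocation());
    }
}
```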

Speed

Comparing an object's location from one tick to the next gives the distance traveled per tick, which is used to measure its speed. By checking the speed, it is possible to trigger an action only when a speed threshold is met, as in the "gong" level.
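
In Blueprint terms this is a per-tick position comparison; a minimal C++ sketch of the same idea (member names such as PreviousLocation, SpeedThreshold, and GongSound are assumptions) could be:

```cpp
// Illustrative sketch: speed derived from the distance moved between ticks.
#include "Kismet/GameplayStatics.h"

void AGongBall::Tick(float DeltaSeconds)
{
    Super::Tick(DeltaSeconds);

    const FVector CurrentLocation = GetActorLocation();
    const float Speed = FVector::Dist(CurrentLocation, PreviousLocation) / DeltaSeconds; // cm/s
    PreviousLocation = CurrentLocation;

    // Only play the gong sound when the movement is fast enough.
    if (Speed > SpeedThreshold && GongSound)
    {
        UGameplayStatics::PlaySoundAtLocation(this, GongSound, CurrentLocation);
    }
}
```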

Distance

Distance calculations are often used to check whether an object has fallen off a platform, respawning it by resetting its position (and removing any physics movement and rotation). By taking the vector positions of two objects and subtracting one from the other, you can find the distance between them, which allows certain actions to be triggered only once the distance is short or long enough.
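
A minimal C++ sketch of this vector-subtraction check (Platform, RespawnDistance, MeshComponent, and SpawnLocation are assumed members, not the project's actual names):

```cpp
// Illustrative sketch: respawn the ball once it is too far from its platform.
void ABallActor::Tick(float DeltaSeconds)
{
    Super::Tick(DeltaSeconds);

    // Subtracting the two vector positions gives the offset; its length is the distance.
    const FVector Offset = GetActorLocation() - Platform->GetActorLocation();
    if (Offset.Size() > RespawnDistance)
    {
        // Clear any physics movement/rotation, then reset the position.
        MeshComponent->SetPhysicsLinearVelocity(FVector::ZeroVector);
        MeshComponent->SetPhysicsAngularVelocityInDegrees(FVector::ZeroVector);
        SetActorLocation(SpawnLocation);
    }
}
```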

Location

Object locations are especially important in VR because players differ in height, so the world needs to adapt accordingly. When the level-loading orb spawns in, it reads the position of the player's head (the z-axis of the VR headset) and sets its height based on that.

When the balls are first spawned into the world, their vector locations are stored in variables so an object's location can be reset if it rolls off a platform or is thrown out of reach.
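
Sketched in C++ (again illustrative; the project used Blueprints), caching the spawn position and matching the orb height to the headset might look like:

```cpp
// Illustrative sketch: store the spawn location, and place the orb at head height.
#include "HeadMountedDisplayFunctionLibrary.h"

void ABallActor::BeginPlay()
{
    Super::BeginPlay();
    SpawnLocation = GetActorLocation(); // cached so the ball can be reset later
}

void ALevelOrb::BeginPlay()
{
    Super::BeginPlay();

    // Read the tracked headset position and use its Z as the orb's height.
    FRotator HeadRotation;
    FVector HeadPosition;
    UHeadMountedDisplayFunctionLibrary::GetOrientationAndPosition(HeadRotation, HeadPosition);

    FVector OrbLocation = GetActorLocation();
    OrbLocation.Z = HeadPosition.Z; // note: position is relative to the tracking origin
    SetActorLocation(OrbLocation);
}
```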

Game Logic

Initial Exploration

Testing Unreal Engine 5
To validate the feasibility and fun factor of the concept, I developed a basic level in Unreal Engine 5.
 

First Level: Mind and Body Coordination
The level design engages both players: the 'observer' (mind) guides the 'interactor' (body) to manipulate blocks, aiming to guide a bouncing ball into a goal.
 

Sensory Feedback
Haptic cues were added to the controller to inform the interactor of block interaction. Each block emits a distinct sound upon collision with the ball. Completing the objective—getting the ball in the goal—triggers an arpeggiated melody, offering auditory feedback to the 'interactor' on their successful teamwork.
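
As a hedged C++ sketch of this feedback wiring (the prototype was built in Blueprints; BlockSound, TouchHaptic, and ArpeggioSound are illustrative properties):

```cpp
// Illustrative sketch: a distinct sound per block, a haptic pulse for the interactor,
// and the goal playing its arpeggiated melody.
#include "Kismet/GameplayStatics.h"
#include "GameFramework/PlayerController.h"

void AMovableBlock::NotifyActorBeginOverlap(AActor* OtherActor)
{
    Super::NotifyActorBeginOverlap(OtherActor);

    if (BlockSound) // each block instance is assigned its own sound in the editor
    {
        UGameplayStatics::PlaySoundAtLocation(this, BlockSound, GetActorLocation());
    }
    if (APlayerController* PC = UGameplayStatics::GetPlayerController(this, 0))
    {
        // TouchHaptic: UHapticFeedbackEffect_Base* property; short rumble on contact.
        PC->PlayHapticEffect(TouchHaptic, EControllerHand::Right);
    }
}

void AGoalFunnel::NotifyActorBeginOverlap(AActor* OtherActor)
{
    Super::NotifyActorBeginOverlap(OtherActor);
    if (OtherActor && OtherActor->ActorHasTag(TEXT("Ball")) && ArpeggioSound)
    {
        UGameplayStatics::PlaySoundAtLocation(this, ArpeggioSound, GetActorLocation());
    }
}
```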

Entry #1 - Monday 13. March

Sketching Out Initial Concept


Audio sample made in Logic. 5 block sounds and final sound.

1. Spawning ball and Physics Issue

After configuring Android SDK and NDK, I set up a ball spawner with a bouncy physics material and enabled cube collisions. However, the physics proved unreliable, and too many balls hurt performance.
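
A minimal C++ sketch of such a spawner (ABallSpawner, BallClass, and SpawnInterval are hypothetical names; the actual setup was done in Blueprints):

```cpp
// Illustrative sketch: spawn a physics-enabled ball at a fixed interval.
void ABallSpawner::BeginPlay()
{
    Super::BeginPlay();
    GetWorldTimerManager().SetTimer(SpawnTimer, this, &ABallSpawner::SpawnBall,
                                    SpawnInterval, /*bLoop=*/true);
}

void ABallSpawner::SpawnBall()
{
    // BallClass is a TSubclassOf<ABouncyBall> whose mesh uses a bouncy physics material.
    if (ABouncyBall* Ball = GetWorld()->SpawnActor<ABouncyBall>(
            BallClass, GetActorLocation(), FRotator::ZeroRotator))
    {
        Ball->MeshComponent->SetSimulatePhysics(true); // so it collides with the movable cubes
    }
}
```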

2. Sound Integration & Optimization

I created custom sounds in Logic and implemented them via Unreal's Blueprint function. The rate of ball spawning influenced sound rhythm, requiring further tweaking. To improve performance, balls were set to self-destruct after 10 seconds.
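
A rough C++ equivalent of those two tweaks (ABouncyBall, MeshComponent, and CollisionSound are illustrative; OnHit must be declared as a UFUNCTION in the header):

```cpp
// Illustrative sketch: self-destruct after 10 seconds and play a sound on each hit.
#include "Kismet/GameplayStatics.h"

void ABouncyBall::BeginPlay()
{
    Super::BeginPlay();
    SetLifeSpan(10.0f); // destroy the ball after 10 seconds to protect performance
    MeshComponent->OnComponentHit.AddDynamic(this, &ABouncyBall::OnHit);
}

void ABouncyBall::OnHit(UPrimitiveComponent* HitComp, AActor* OtherActor,
                        UPrimitiveComponent* OtherComp, FVector NormalImpulse,
                        const FHitResult& Hit)
{
    if (CollisionSound)
    {
        UGameplayStatics::PlaySoundAtLocation(this, CollisionSound, GetActorLocation());
    }
}
```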

3. Path tracing and Substepping

To ensure consistent ball movement, I visualized the balls' paths and optimized physics substepping for uniform calculation intervals. The traced paths showed some rendering issues in the left eye.
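
The path visualization could be sketched like this in C++ (illustrative only; substepping itself is toggled in Project Settings under Physics, not in code):

```cpp
// Illustrative sketch: draw a persistent line segment from the previous position
// to the current one every tick, tracing the ball's path.
#include "DrawDebugHelpers.h"

void ABouncyBall::Tick(float DeltaSeconds)
{
    Super::Tick(DeltaSeconds);

    const FVector Current = GetActorLocation();
    DrawDebugLine(GetWorld(), PreviousLocation, Current, FColor::Green,
                  /*bPersistentLines=*/true, /*LifeTime=*/-1.0f,
                  /*DepthPriority=*/0, /*Thickness=*/0.5f);
    PreviousLocation = Current;
}
```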

4. Setting Start and Finish

Using Blender, I modeled funnels to serve as start and finish lines. The next steps include implementing the 'Finish' function, triggering the final arpeggiated sound, activating controller vibrations, and halting ball spawns.

Getting familiar with Unreal Engine

Key Takeaways from the First Playtest
I had not realized this during concept creation, but both players could move around, although within a smaller digital space to keep controller tracking reliable within a 1-3 m radius.
 

Other Observations and User Feedback:

  • Light conditions affect tracking.

  • Real-time multi-sensory feedback improved communication.

  • Repeated block-moving became monotonous.

  • Collaborative effort was high, but the level of challenge was low.

  • Physics resulted in accidental goals.

  • Audio lag disrupted reliable feedback.

  • Lack of in-game guidance.
     

Solutions Moving Forward:

  1. Tracking & Space: Shrink the in-game area to improve tracking by reducing the distance between the VR headset and controllers.

  2. Multi-Sensory Feedback: Introduce diverse feedback mechanisms, including distance-based haptic signals (see the sketch after this list).

  3. Interactivity: Introduce a variety of tasks and interactions to keep engagement high and combat monotony.

  4. Challenging yet Simple Puzzles: Implement puzzles that require exploration rather than clear instructions to enhance the collaborative challenge.

  5. Ball Physics: Implement a destroy mechanism for balls that hit the floor to prevent unintended goals.

  6. Audio Delay: Investigate alternative speakers or software solutions that may have shorter audio delays.

  7. Game Intro: Add a voice-over introduction to explain the game mechanics and concept prior to starting the game.
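
For the distance-based haptics in point 2, a hedged C++ sketch (AHiddenObject, ControllerLocation, and MaxRange are assumed names, with the controller position updated elsewhere) might be:

```cpp
// Illustrative sketch: the closer the hidden object is to the controller,
// the stronger the vibration the interactor feels.
#include "Kismet/GameplayStatics.h"
#include "GameFramework/PlayerController.h"

void AHiddenObject::Tick(float DeltaSeconds)
{
    Super::Tick(DeltaSeconds);

    if (APlayerController* PC = UGameplayStatics::GetPlayerController(this, 0))
    {
        const float Distance = FVector::Dist(GetActorLocation(), ControllerLocation);
        // Map distance to amplitude: full strength when touching, zero beyond MaxRange.
        const float Amplitude = FMath::Clamp(1.0f - Distance / MaxRange, 0.0f, 1.0f);
        PC->SetHapticsByValue(/*Frequency=*/1.0f, Amplitude, EControllerHand::Right);
    }
}
```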

Entry #2 - Monday 26. March

Playtesting

Test Session with Elias: Primary issues encountered included a noticeable lack of in-game instructions, errors related to the orientation of the controller in relation to the headset, and performance lag when new levels were loaded.
 

  • Solutions:

    • To combat the lag, I introduced a lightweight transition level to unload the previous level and load the new one more smoothly (see the sketch after this list).

    • Numerous performance enhancements were applied specifically to make the game optimal on the Oculus Quest 2. This involved refining physics sub-stepping and compressing textures for better load times.
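
A simplified C++ sketch of that transition-level flow (UDuoGameInstance, the map name, and the callback are hypothetical; the real project handled this in Blueprints):

```cpp
// Illustrative sketch: open a nearly empty transition map first, then the real target
// level, so the heavy load does not happen while the previous level is still active.
#include "Kismet/GameplayStatics.h"

void UDuoGameInstance::GoToLevel(FName TargetLevel)
{
    PendingLevel = TargetLevel;                                       // remembered for step two
    UGameplayStatics::OpenLevel(this, FName(TEXT("L_Transition")));   // lightweight in-between map
}

// Called from the transition map (e.g. by a small level Blueprint) once it has loaded.
void UDuoGameInstance::OnTransitionMapReady()
{
    UGameplayStatics::OpenLevel(this, PendingLevel);
}
```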
       

Unprompted Test with Friends: Both managed to complete the levels without requiring any external guidance, validating the effectiveness of the in-game instructions and interactive cues.
 

  • Observer Feedback:

    • Felt the teleportation delay was too brief: Extended from 5 to 7 seconds.

    • Confusion about entering the portal: Introduced a pulsating sphere in the portal for increased visibility.
       

  • Interactor Feedback:

    • Experienced a sense of disorientation and lack of context: Instituted a role-swapping mechanic midway to diversify the experience.

    • Expressed a desire to visually review gameplay: Added the feature to record the session for later viewing.
       

Emotional and Psychological Observations: The interactor used the word "empty" to describe their feelings post-gameplay. This speaks volumes about the emotional disconnect caused by the very technology that aims to create a shared experience. It's clear that, despite the multisensory cues, significant work is still required to bridge this emotional gap.

Entry #3 - 7. April

Multiplayer Testing Of All Levels

Digitally Induced Cartesian Split

René Descartes was a French philosopher who argued that the human mind and body are separate. His concept of mind-body dualism holds that the mind and body are two distinct entities that interact with each other. This is known as the Cartesian split.

This idea of a separate mind and body has sparked much debate, especially in the realm of technology, where virtual and augmented realities can create a disconnect between the mind and body.

As we continue to explore the potential of technology, we must consider how the relationship between our bodies and the physical world is affected. Using technology intentionally can bridge the gap between mind and body. 


A Future Of Individual Realities

The difficulty in sharing XR experiences stems from the fact that a person in VR perceives a distinct reality compared to those in the same physical space. Unlike traditional screens that facilitate easy content sharing, this technology isolates individuals. The high cost of accessing XR technology also exacerbates the gap between these distinct realities.

Through a two-player VR experience that splits the mind and body, I want to explore this disconnect. The person interacting with the world cannot see the reality they act upon, as they don't have the VR headset. In contrast, the person seeing the world cannot interact with what they observe, as they don't have the controllers.

 

Using a single VR headset broadens access to innovative technology. Allowing multiple users to interact and observe decreases entry barriers and enables more people to experience virtual reality. Aiming for an inclusive approach promotes accessibility and fosters community, bridging the gap between virtual and physical spaces.

