Seamless Transitions

Academic Research & Spatial UX Design

Role

UX Researcher & Interaction Designer

Works With

VVISE Lab

Time

Q1-Q4 2024

📍 Project Context

Traditional VR controllers, while intuitive for some interactions, lack the precision required for detailed manipulation tasks. Users frequently need to switch between VR and desktop environments, creating workflow disruptions and usability barriers in professional VR applications.

How might we leverage users' existing familiarity with mouse interaction to create more precise and seamless VR experiences, particularly for seated, precision-focused tasks?

In response, I designed and validated a novel mouse-based VR interaction system that transitions fluidly between 2D and 3D interaction, preserving the precision of traditional desktop interfaces alongside VR's immersive benefits.

Key Insight

Our real interface wasn’t buttons or menus — it was how users moved, selected, rotated, and scaled within a vast spatial environment.

🔍 User Research & Behavioral Insights

Through preliminary interviews and observation studies, I identified key pain points:

  • Precision Gap: VR controllers suffer from hand tremor and tracking instability

  • Workflow Disruption: Constant switching between VR and desktop breaks immersion

  • Learning Curve: New VR users struggle with 3D controller paradigms

  • Professional Needs: CAD, design, and data analysis require pixel-perfect accuracy

These pain points led to three research questions:

Can mouse-based interaction outperform VR controllers in precision tasks?

How do users perceive the usability of hybrid 2D/3D interaction?

What are the optimal mappings for mouse actions in 3D space?

💡 UX Design Principles

From these insights, we established three core principles to guide all spatial interactions:

  1. Precision without friction: Enable fine-grained control directly inside VR, without leaving the headset.

  2. Continuous cognitive flow: Users should seamlessly move between broad exploration and micro adjustments.

  3. Familiar yet immersive: Leverage familiar desktop skills (mouse precision) in a spatial context to minimize learning curves.

🔬 Behavioral UX Solution

Instead of screens, we designed dynamic behaviors and adaptive rules that formed the true spatial interface:

Smart 3D ray-casting

Mouse movements were intelligently projected into 3D space, with auto-adjusting ray lengths based on collision context. Enabled pixel-level selection accuracy on intricate models.
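The core mapping can be sketched as follows (a simplified Python illustration, not the Unreal implementation; the field of view, screen size, and default ray length are assumed values):

```python
import math

def mouse_to_ray(mx, my, width, height, fov_deg=90.0):
    """Map a 2D mouse position (pixels) to a unit-length 3D ray
    direction in camera space (camera looking down +Z)."""
    aspect = width / height
    tan_half = math.tan(math.radians(fov_deg) / 2)
    # Normalized device coordinates, scaled by the view frustum
    ndx = (2 * mx / width - 1) * tan_half * aspect
    ndy = (1 - 2 * my / height) * tan_half
    length = math.sqrt(ndx ** 2 + ndy ** 2 + 1)
    return (ndx / length, ndy / length, 1 / length)

def adaptive_ray_length(hit_distance, default_len=500.0):
    """Shorten the ray to the first collision so the cursor lands on
    surfaces instead of passing through them; fall back to a default
    length in open space."""
    return min(hit_distance, default_len) if hit_distance is not None else default_len
```

A mouse at the screen center yields the straight-ahead ray (0, 0, 1); off-center positions are bent outward by the frustum before the collision check trims the ray.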

Seamless 2D ↔ 3D morphing

The cursor smoothly transitioned between acting as a traditional 2D pointer on panels and a 3D ray for spatial manipulation. No explicit mode switches; the behavior adapted based on hover targets and context.
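The context-driven switch can be sketched like this (illustrative Python; the `is_ui_panel` flag on hover targets is a hypothetical stand-in for the scene metadata the real system read):

```python
import math

def cursor_mode(hover_target):
    """Choose cursor behavior from hover context -- no explicit mode
    switch. UI panels get a flat 2D pointer; everything else,
    including open space, gets a 3D ray."""
    if hover_target is None:
        return "ray"
    return "pointer" if hover_target.get("is_ui_panel") else "ray"

def blend_morph(current, target, dt, speed=12.0):
    """Exponentially smooth the morph factor (0 = 2D pointer,
    1 = 3D ray) so the visual transition is continuous, not a snap."""
    alpha = 1 - math.exp(-speed * dt)
    return current + (target - current) * alpha
```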

Velocity-based scaling

Slow movements triggered micro-adjustments, fast movements navigated large spaces. Inspired by steering laws to support fluid multi-scale interactions.
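Conceptually this is a velocity-dependent control-display gain; a minimal sketch (the speed thresholds and gain range are illustrative, not the tuned study values):

```python
def cd_gain(speed, slow=0.02, fast=2.0, g_min=0.2, g_max=8.0):
    """Control-display gain that grows with pointer speed (m/s):
    slow motion -> sub-unit gain for micro-adjustment,
    fast motion -> amplified gain for crossing large spaces."""
    if speed <= slow:
        return g_min
    if speed >= fast:
        return g_max
    t = (speed - slow) / (fast - slow)  # linear ramp between regimes
    return g_min + t * (g_max - g_min)
```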

Head movement compensation

Real-time filtering stabilized the cursor despite natural head jitters. Reduced user fatigue and increased trust in the tool.
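One common way to realize such filtering is a first-order low-pass filter (a sketch; the cutoff frequency is an assumed value, and adaptive-cutoff variants such as the One Euro filter refine the same idea):

```python
import math

class LowPassFilter:
    """First-order low-pass filter: attenuates high-frequency head
    jitter while still tracking slower, intentional motion."""
    def __init__(self, cutoff_hz=5.0):
        self.cutoff = cutoff_hz
        self.state = None

    def __call__(self, value, dt):
        if self.state is None:      # first sample passes through
            self.state = value
            return value
        tau = 1.0 / (2 * math.pi * self.cutoff)
        alpha = dt / (dt + tau)     # smoothing factor for this timestep
        self.state += alpha * (value - self.state)
        return self.state
```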

🚀 What We Delivered

We delivered an intuitive 3D interaction system that builds on familiar 2D mouse behaviors and extends them seamlessly into spatial environments. Core features include ray-casting controlled by mouse movement with visual feedback and collision detection, smooth 2D/3D transitions using sticky edges and visual cues, and adaptive interaction techniques like velocity-based scaling and scroll-based depth control.

Technically, the system was built in Unreal with OpenXR for VR support, optimized for Oculus Quest 3, and included custom mouse-to-3D mapping and real-time hand tracking. The interaction model supports object selection, translation, rotation, depth adjustment, and teleportation with simple, familiar inputs.

We validated the system in two experimental phases:

Phase 1: A Fitts' Law pointing experiment. The primary metrics were time, accuracy, and throughput.

Phase 2: A more comprehensive evaluation of object manipulation. The primary metrics were time, rotation difference, and position difference.
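For reference, throughput and the endpoint-scatter adjustment follow the standard Shannon formulation of Fitts' law (a sketch of the computation, not the study's analysis code):

```python
import math
import statistics

def fitts_throughput(distance, width, movement_time):
    """Throughput (bits/s): ID = log2(D/W + 1), TP = ID / MT."""
    index_of_difficulty = math.log2(distance / width + 1)
    return index_of_difficulty / movement_time

def effective_width(endpoint_xs):
    """Effective target width We = 4.133 * SDx, the accuracy
    adjustment used in ISO 9241-style pointing analyses."""
    return 4.133 * statistics.stdev(endpoint_xs)
```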

🔬 User Testing & Results

Improved Precision

Error Rate ↓ ~46%

From 6.5% with controllers to 3.5% with mouse.
Users made significantly fewer mistakes during precision tasks.

Increased Throughput

Throughput ↑ 18%

Higher bits/second means smoother, more efficient input control.

Reduced Task Time

Task Time ↓ 20%

Average completion time dropped from 1.5 s to 1.2 s, enabling faster, smoother precision operations.

Selection Accuracy

SDₓ ↓ 15%

Final selection points were more tightly clustered around targets, with SDₓ reduced by 15%.

System Usability

SUS Score > 80

Indicates strong user acceptance and ease of use.

User Feedback

High Confidence

Users reported strong satisfaction and trust during precise interactions.

For more detailed results, please see: https://summit.sfu.ca/item/39151