XR INPUT STUDIES

A Human-Computer Interaction study on the performance of Oculus controllers versus Hand Tracking 2.0


Overview

It's mid-summer 2022, and Oculus has just released its Hand Tracking 2.0. Ultraleap's tracking is now embedded in VR headsets, and both smaller startups and Apple are developing their own input hardware. A race is underway to determine which input modality will become primary in the AR/VR space.

To compare these modalities, I conducted an experiment with 17 students. Each was asked to click on a blue cube or square using either hand tracking or controllers. The target appeared at a different position and distance on each turn, and participants' times and accuracy were recorded. The goal of the experiment was to determine which input modality is the most accurate, efficient, effective, and versatile, and which is preferred among Generation Z.

What I did

  • Drafted a research and testing plan
  • Prototyped on Unity with the Oculus Integration SDK
  • Collected both quantitative and qualitative data
  • Analyzed data for insights
  • Wrote suggestions for improvements based on results

"This study, filled with critical insights into gestural interface usability, establishes Lucas Thin as a thought leader in virtual and augmented reality design."

Julian Scaff, Associate Professor and Faculty Director of MDes in Interaction Design.

Key Findings

5.5x

Longer time taken by hand tracking than controllers in the worst case

2x

Increase in task duration for hand tracking caused by occlusion

63%

Of participants still preferred hand tracking

Hand tracking proved less effective than controllers, yet many participants still preferred it: they believed it would improve in the future, and they found it more enjoyable and portable.

Suggestions for Improvements

Direct Interactors for Hand Tracking

Based on the test results, I identified the following suggestions for improving users' awareness that they can directly interact with objects when using hand tracking. Note that these suggestions have not yet been tested.


Ray Interactors for Hand Tracking

The majority of testers had difficulty using the hand ray interactor, as they assumed that the ray was emanating from the fingertip. Additionally, there were several ergonomic issues with the arm shaping and moving in a circular rather than linear motion, which caused the cursor to be shaky and less precise, leading to inefficiency.
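One common remedy for the shaky cursor described above is to low-pass filter the raw cursor position. The sketch below shows exponential smoothing in one dimension; the `alpha` value and the sample readings are illustrative assumptions, not part of the study or any particular SDK.

```python
# A minimal sketch of exponential smoothing, one common way to steady a
# jittery hand-tracked ray cursor. The alpha value and the 1D sample
# readings are illustrative assumptions, not data from the study.

def smooth(samples, alpha=0.3):
    """Exponentially smooth a stream of raw cursor positions.

    Lower alpha gives a steadier cursor but more lag behind the hand,
    so the value is a trade-off between precision and responsiveness.
    """
    smoothed = []
    current = samples[0]
    for raw in samples:
        # Blend the newest reading with the running estimate
        current = alpha * raw + (1 - alpha) * current
        smoothed.append(current)
    return smoothed

# Noisy readings oscillating around a target of 1.0
raw = [1.0, 1.4, 0.7, 1.3, 0.8, 1.2]
print(smooth(raw))
```

The same blend applied per axis to a 3D ray endpoint would damp the circular arm motion the testers reported, at the cost of a slight trailing feel.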


Ray Interactor for Controllers

The controllers had the fewest issues of the three methods tested, although testers did experience some difficulty in distinguishing between the cursors for the left and right hands. There was also some slight shakiness when interacting with objects at a distance, and some testers struggled to identify which objects were interactable.


Full Test Video

This video shows the entire testing process. For the second round, each participant repeats the flow with the other input system, hand tracking or controllers, depending on which they used in the first round.

Learning Points

01. Identifying test variables and factors

02. Thoroughly planning the study

03. Onboarding testers into VR

04. Analyzing for Insights over Solutions

01.

Identifying test variables and factors

Prioritizing the interactions

Initially, I planned to test the click, drag, and release interactions, the most primitive digital actions. However, due to limited time, I focused on the click interaction and varied several factors within the test.


Limiting the factors

The initial factors were distance, size, occlusion, 2D/3D, and shape. However, as with the interactions, I had to narrow these down to distance, occlusion, and 2D/3D.

02.

Thoroughly planning the study

Defining the data collected from the testers

As a first-time researcher in Human-Computer Interaction, I was unsure what data to collect from my participants before testing. Collecting too little could undermine the results, so I identified the main variables that could affect a person's testing performance. While more data is generally better, I also had to weigh ethical considerations when deciding what to collect.


Planning the questions and order of testing

To understand how other HCI testing studies were conducted, I researched relevant papers and used the information to create two quantitative surveys, one for each input, and a few qualitative questions to conclude the study. This provided me with both the actions and thoughts of the users during the experience.


Maintaining consistency between 2D and 3D conditions

Initially, I planned to randomize the cube positions for each user, but I realized this would hurt the validity of the test: consistency is crucial to ensure that the 2D and 3D occlusions are equally challenging. So I kept the occluding cube positions constant and only slightly offset the interactable blue cube in each of the five directions.


03.

Onboarding testers into VR

Testing all interactors for each input

Prototyping every interactor for both controllers and hand tracking let users explore the full capabilities of each modality and choose the option that worked best for them.


Designing an onboarding scene before the actual test

After testing with one user, I realized it was important to provide an onboarding scene for users unfamiliar with hand tracking or VR in general, leveling the playing field with users who had prior experience. The onboarding scene included both 2D and 3D interactable blue cubes in random directions and let users try it with both controllers and hand tracking.


04.

Analyzing for Insights over Solutions

Learning to create a statistical data sheet

I recorded and timed every test, then used averages and percentages to calculate the time differences between modalities. Data was collected both inside and outside of VR.
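The core of that spreadsheet work is a mean-and-ratio calculation, sketched below. The timings are made-up placeholders, not the study's data; only the shape of the computation is the point.

```python
import statistics

# Hypothetical per-trial completion times in seconds (NOT the study's
# data), used only to illustrate the average and ratio calculations.
controller_times = [1.1, 0.9, 1.3, 1.0]
hand_tracking_times = [2.4, 5.1, 3.0, 2.7]

controller_avg = statistics.mean(controller_times)
hand_avg = statistics.mean(hand_tracking_times)

# Ratio of hand-tracking time to controller time; the study's headline
# figures (e.g. "5.5x" in the worst case) are ratios of this kind.
ratio = hand_avg / controller_avg
print(f"hand tracking took {ratio:.1f}x as long on average")
```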

Calculate and visualize the data on a whiteboard

It had been a while since I had used any statistics, so this was largely a new experience for me. I calculated the standard deviation of the post-survey results to understand the diversity of opinions among the participants. I also assigned each participant a color and charted all survey results on a whiteboard to look for commonalities. Unfortunately, there were no notable trends.
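The spread-of-opinion check can be sketched in a few lines. The 1-7 Likert ratings below are hypothetical stand-ins, not the study's responses; a large standard deviation relative to the scale signals divided opinions.

```python
import statistics

# Hypothetical 1-7 Likert ratings for one post-survey question (NOT the
# study's actual responses), illustrating the spread-of-opinion check.
ratings = [6, 2, 7, 3, 5, 6, 4, 2]

mean = statistics.mean(ratings)
spread = statistics.stdev(ratings)  # sample standard deviation

# On a 1-7 scale, a stdev near 2 indicates strongly divided opinions,
# while a stdev under 1 would indicate broad agreement.
print(f"mean={mean:.2f}, stdev={spread:.2f}")
```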


Analyzing Qualitative Data for Patterns

Finally, I coded each qualitative interview and matched them with the participants' behaviors to identify any common patterns. However, there were very few, if any, patterns that could be justified. Therefore, I focused on the general feedback I received about input modalities in the present and future of XR.

Future Improvements

  • Timing each turn with a Unity script to reduce human error
  • Prototyping all interactions in one scene to prevent lag during testing
  • Enlisting a friend to help record the data so that I could focus
    on observation
© madebylucas 2022