Meeting Key Engagement Metrics

The Immerse application is a virtual reality language-learning platform that teaches users Spanish in the Metaverse: using a MetaQuest headset, learners engage in a 3D, interactive, social learning experience.

For this project, we wanted to validate the Immerse product by identifying whether users felt engaged and satisfied after using it for four consecutive weeks. I was the lead UX researcher for this project and carried it from start to finish.

Project Overview

This project followed a group of users over four weeks of experiencing the Immerse app to explore the extent to which they felt engaged, immersed, and satisfied. We implemented pre/post-surveys and conducted focus group interviews during each session to capture users' reactions in the moment. Each user logged into the Immerse app with their MetaQuest headset and joined a live, one-hour session each week for four weeks to learn Spanish, with 5 students per class. We ran 3 cohorts at a time, with a total of 30 user participants. Our goal was to explore how user attitudes and behaviors in VR might change over time with prolonged usage of the Immerse app.


User Premise

You have enrolled in the weekly Immerse VR Spanish language course for the next four weeks. You have joined this Spanish course because of the unique affordances of VR and your interest in learning Spanish without having to leave the comfort of your home.

During the next four weeks, the research team will collect data to contextualize your experience with the Immerse platform and better understand how it might change over time. We will ask a series of scaled and open-ended questions about your feelings, emotions, and the affordances and constraints you encounter. We value your honest opinions, feedback, and journey to help improve the Immerse learning experience.

Goals & Objectives

Attitudes & Behaviors

We wanted to explore how user attitudes and behaviors in VR might change over time with prolonged usage of the Immerse app.

Tensions

We wanted to identify what frustrations users were having and dive deeper into the reasons why these problems might be occurring.

Validation

We wanted to validate that the product features, interactions, and overall experience were meeting the intended use case for the end user.

Customer Satisfaction

We wanted to identify user satisfaction with the Immerse app after four weeks of use, to determine how we might better support our users.

Engagement

We wanted to evaluate users' engagement with the product and find out what might be causing disengagement.

Scenes

We wanted to uncover which scenes were drawing users in and which types of 3D interactions were piquing users' interest.

Research Process

I led the team in a longitudinal pre/post-test survey design to track users' attitudes and behaviors over four weeks, allowing us to analyze the change in users' feelings, thoughts, and behavior across two or more time points. This design also gave users sufficient exposure to explore, play, and interact with the VR interface, so we could determine which features needed iterative improvement and which were ready for deployment. I analyzed the quantitative data with descriptive statistics, frequency analyses, and paired-samples t-tests, and we analyzed users' open-ended survey responses and focus group data to identify why they felt engaged, disengaged, satisfied, or unsatisfied.

1) Methodology

We deployed longitudinal pre- and post-surveys, collected open-ended responses, and facilitated focus group interviews after each session to conduct evaluative user research.

2) Recruitment

We selected 30 users from our internal recruitment pool, choosing a range of users from beginner to novice VR users, spanning 18 to 50+ years of age.

3) Survey Design

I led the team in survey design and development. I drafted all of the survey constructs, instruments, and open-ended items, which were reviewed by the UXR lead. I then built the survey in Qualtrics to collect the data for this study.

4) Facilitating Interviews

I facilitated focus group interviews within the Immerse VR app at the end of each class session, asking a series of evaluative and validation questions immediately after the lesson concluded. Interviewing users in the moment yielded rich, rigorous data about what they had just experienced.

5) Data Analysis

I analyzed the survey data with descriptive statistics, frequency analyses, and paired-samples t-tests. We rewatched the session recordings and analyzed the open-ended responses to determine why users did or did not feel engaged, immersed, and satisfied. Finally, we triangulated our findings to suggest practical recommendations for product iteration.

6) UX Report & Presentation

Lastly, after organizing the data, I created a UX report and slide deck to disseminate the mixed-methods findings to my cross-functional partners.

Results

My task was to evaluate to what extent and why user engagement, immersion, and satisfaction might have increased or decreased over time. By corroborating our quantitative and qualitative findings, we were able to evaluate the Immerse app in its final stage of development and provide key metrics that validated the product for launch on the Meta store.

 

Engagement

We measured engagement using a validated 12-question survey construct on a 5-point Likert scale, administered in Qualtrics at two time points: once after the first testing session and again after the fourth. We also asked open-ended questions in the survey and concluded each testing session with focus group interviews for each cohort.
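For context on how a multi-item Likert construct like this is typically scored (a sketch with hypothetical numbers, not the study's actual scoring code): each respondent's 12 item ratings are summed, giving a score between 12 and 60, and the group mean can then be expressed as a percentage of the maximum possible score.

```python
def construct_score(responses):
    """Sum one respondent's 12 item ratings (each 1-5); possible range 12-60."""
    assert len(responses) == 12
    return sum(responses)

def mean_percent(scores, max_score=60):
    """Express the group's mean summed score as a percentage of the maximum."""
    return 100 * (sum(scores) / len(scores)) / max_score

# Hypothetical summed scores for three respondents
scores = [construct_score([4] * 6 + [3] * 6), 45, 48]
print(mean_percent(scores))  # → 75.0
```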

 

Key Insights #1

Engagement (Quant)

Results indicate that, on average, 71.3% of learners (averaged across all 12 questions) feel engaged after experiencing their first VR lesson in the Immerse app.

Results indicate that, on average, 78.9% of learners (averaged across all 12 questions) feel engaged after experiencing their last VR lesson in the Immerse app.

A paired-samples t-test comparing the pre-test (M = 42.1, SD = 3.91) and post-test (M = 45.6, SD = 4.18) engagement scores indicated that there was a statistically significant increase in user engagement after four weeks, t(10) = 12.1, p < 0.001.

In summary, we found an 8.1% mean increase in users’ engagement after using the Immerse app to learn Spanish for four weeks. This finding was statistically significant and indicates that users feel engaged while learning Spanish on the VR platform. 
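To illustrate the analysis, a paired-samples t-test compares each user's pre- and post-scores by testing whether the mean of the per-user differences is different from zero. A minimal sketch in pure Python, using hypothetical engagement sums rather than the study's actual data:

```python
import math

def paired_t_test(pre, post):
    """Paired-samples t-test: t = mean(d) / (sd(d) / sqrt(n)), df = n - 1."""
    diffs = [b - a for a, b in zip(pre, post)]
    n = len(diffs)
    mean_d = sum(diffs) / n
    # Sample standard deviation of the differences (n - 1 denominator)
    sd_d = math.sqrt(sum((d - mean_d) ** 2 for d in diffs) / (n - 1))
    return mean_d / (sd_d / math.sqrt(n)), n - 1

# Hypothetical pre/post engagement sums (12 items x 5-point scale, max 60)
pre = [40, 42, 39, 44, 41, 43, 40, 45, 42, 44, 43]
post = [44, 46, 43, 47, 45, 46, 44, 48, 45, 47, 46]
t, df = paired_t_test(pre, post)
print(f"t({df}) = {t:.2f}")
```

In practice, a library routine such as `scipy.stats.ttest_rel` would also return the p-value.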

Engagement (Qual)

After analyzing the open-ended questions and focus group data, we coded each of the user responses and categorized each code into overarching themes. After sorting our data, we took a deductive approach, looking at the qualitative results to explain potential reasons for our quantitative findings. Below is an excerpt of the user commentary.

What made the VR learning experience engaging or disengaging?
Themes: Lesson Activities, VR Environment, Distractions

I really liked how we did activities where we had to talk to each other and introduce each other. I also really liked when we were at the whiteboard and brainstorming questions and I really liked the last activity where we were given another person to play.

The interactions with the objects made the learning experience really fun. I could match the newly learned Spanish words with the objects in this VR world, which helped me consolidate my understanding of these words and sentences.

I think the VR learning experience is super engaging! It is novel and fun. The well-crafted mini-world has plenty of elements that match the lesson.

The colorful scenes and objects make it engaging. Watching the other people practice is interesting. I would like to hear the sentences repeated a couple of more times. This session more effectively and clearly used the posted notes.

Sometimes the interactions with the objects, including taking photos/notes, can be distracting to achieving my learning goal. Because the mechanism of controlling those interactions was novel and fun, sometimes I might miss what the teacher was saying.

Sometimes moving closer to the whiteboard can be tricky because I don’t want to bump into any other student/the teacher, but at the same time, I would like to get close enough to see the words on the whiteboard clearly.

From these commentaries, we see that users were raving about their engagement, interactions, and learning. The overarching trend gleaned from student responses was the instructor's lesson facilitation. Users found her teaching charismatic, encouraging, supportive, and balanced; she was personable and conversational, which I suspect lowered students' pressure and made her easy to relate to. Additionally, the alignment of the lesson with the activities boosted students' engagement, specifically the self-exploration, partnering, matching, and the Pictionary game at the end. The game seemed to be a big highlight because it felt like real life and students were synthesizing everything they had learned from the lesson.

NPS & Satisfaction

We measured users' Net Promoter Score (NPS) and Customer Satisfaction Score (CSAT) using 10-point and 5-point Likert-scale questions, respectively. These questions were administered after the fourth and final testing session. Capturing these quantitative metrics allowed us to benchmark users' feelings and attitudes and get a pulse check on the Immerse product. To dig into the how and why, we analyzed the qualitative data from the open-ended questions and focus group interviews to triangulate our findings.
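This case study reports both metrics as percentages of users; for reference, the conventional formulas are slightly different: NPS is the percentage of promoters (ratings 9-10) minus the percentage of detractors (0-6), and CSAT is the percentage of "top-two-box" responses (4 or 5 on a 5-point scale). A sketch with hypothetical ratings:

```python
def nps(ratings):
    """Standard NPS: % promoters (9-10) minus % detractors (0-6) on a 0-10 scale."""
    promoters = sum(r >= 9 for r in ratings)
    detractors = sum(r <= 6 for r in ratings)
    return 100 * (promoters - detractors) / len(ratings)

def csat(ratings):
    """CSAT: percentage of 'satisfied' responses (4 or 5) on a 5-point scale."""
    return 100 * sum(r >= 4 for r in ratings) / len(ratings)

# Hypothetical responses for illustration
print(nps([10, 9, 9, 10, 8, 9, 10, 7, 9, 10]))   # → 80.0
print(csat([5, 4, 5, 5, 3, 4, 5, 4, 5, 5]))      # → 90.0
```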

 

Key Insights #2

Net Promoter Score (NPS)

The data revealed an NPS score of 88%, where 88% of users "would likely recommend this VR learning experience to a friend or colleague."

 To better understand this NPS score, we looked at what users were saying:

“I think I would recommend this. The possibilities are endless with VR learning and I think Immerse is a great example of the great things VR can be used for.”

“I absolutely would share this with my friends and I personally want to continue using VR. I think there is a real market in the retirement community for language learning and many other activities. Friends posted on facebook that they were trying to use it for travel experiences and desperately wanted some help. They want to get together soon to talk about how we can use it.”

“I probably will recommend my friends to try VR for learning in the future. but maybe after I have some basic knowledge of VR. I think being completely immersed in a virtual environment is terrifying for a beginner.”

Based on user responses, a majority of users would highly recommend the Immerse app. We found that users resonated with the app for language learning, but they also identified other potential markets and use cases where the platform might thrive. From language learners to travel enthusiasts to retirement communities, our users felt excited about the potential of the Immerse app. Capturing the customer experience through an NPS measure allowed us to benchmark where the Immerse app currently stands with our users and provides a metric to guide future iterations.

Customer Satisfaction Score (CSAT)

The data revealed a CSAT score of 91.2%, where 91.2% of users "felt satisfied with the Immerse VR app."

To better understand this CSAT score, we looked at what users were saying:

“Overall, I am very satisfied with the learning interface because the activities are fun and are well-designed for integrating the VR functions with the learning content.”

“Overall I like the VR learning interface. It feels welcoming. Everything so far as felt very simple and easy to navigate once you play around with it and I enjoy the variety of immersive activities in the settings and the different layouts we are able to interact with.”

“The design is very simple and straight forward and is not distracting while learning. Some of it can be a bit overstimulating, but you get used to it as you use it. Some of the controls were hard to grasp at first but become more natural as more time is spent in the platform and in VR in general.”

Upon analyzing the CSAT score, we discovered that a majority of users were quite satisfied with the Immerse app after four weeks. We infer that a primary reason for such a high rating was the alignment between the content and activities and the 3D interactions. From the user responses, we can see that users felt very satisfied with the interactive activities and with how the design of those activities sustained their interest. Capturing a CSAT score provided a snapshot of what our users currently think about the product, and analyzing the open-ended responses and focus group transcripts validated our product for launch.

Reflection

Having served as the lead UX researcher on this project, I had the unique opportunity to engage deeply with users through a variety of research methods, including longitudinal pre- and post-surveys with quantitative metrics and open-ended questions, focus group interviews, and contextual inquiry. One of my favorite aspects of contextual inquiry is that users are observed in an ecologically valid setting, using the product in its naturally occurring environment. This boosts the validity and rigor of the research design and reduces bias in the data.

In this way, we obtained user reactions, feelings, and attitudes after users had ample time to experience the product, tracking them for four consecutive weeks. This project validated many of our conjectures and assumptions about where our product stood and where it was headed. As we entered the last several weeks of preparation for the app launch on the Meta store, the data collected and analyzed in this project provided invaluable metrics to share with our cross-functional team members, including a final report sent to Meta before the final quality control process to publish the Immerse app on the Meta store.

This project was quite rewarding for me personally, as I led our team from beginning to end and played a pivotal role in validating our product prior to launch. From creating the survey measures and facilitating the interviews to disseminating the findings to our key partners, this experience gave me the space to work cross-functionally with my team members and collaboratively showcase impact on the business. I was able to identify tensions and validate our product with metrics that demonstrated its success.