
Enhancing Virtual Exploration for Blind and Low Vision Users: In-Place Virtual Exploration with Mixed Reality Cane

Hong Zhao, CUNY Borough of Manhattan Community College

Mentors: Hao Tang and Oyewole Oyekoya

 

Week 1:

The day before the VR-REU program officially kicked off, Dr. Wole organized a bowling icebreaker event. It was really fun! Everyone was very enthusiastic, and I was happy to meet the rest of the cohort. On Monday, we first completed an REU pre-program survey, then took a quick tour of Hunter College and met all the REU mentors on Zoom. My work officially started as I began discussing my project direction with Dr. Tang and getting a preliminary understanding of the current project code. Wednesday was the first class on VR, AR, and Mixed Reality, which primarily covered the core principles, how the technology works, and the history of VR. For the rest of the week, I reviewed some related literature and then had another discussion with Dr. Tang to finalize my research directions and complete my project proposal. Finally, on Friday, we got an introduction to ParaView and presented our project proposals.

 

Week 2:

This week, Dr. Wole introduced us to the writing tool Overleaf. He also demonstrated how to use Tableau for optimizing data visualization. 

I completed the CITI training and successfully developed the initial version of the MR Cane control system. Here is how it works:

  • When the user long-presses the screen for two seconds, the virtual character begins to move (see the sketch after this list).
  • The direction of the character’s movement is determined by the current orientation of the headset.
  • The mobile phone acts as a white cane. Users can swing their phones left or right to control the movement of the virtual cane.
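
Conceptually, the two-second long-press trigger can be expressed with a standard gesture recognizer. The Swift sketch below only illustrates the idea and is not the project's actual code; the class and flag names are made up, and the walking direction itself would be read from the headset's heading each frame.

```swift
import UIKit

// Minimal sketch of the long-press trigger (names are illustrative).
final class MoveController: NSObject {
    // Hypothetical flag that the locomotion code reads every frame.
    var isMoving = false

    // Attach a two-second long-press recognizer to the AR view.
    func attach(to view: UIView) {
        let press = UILongPressGestureRecognizer(target: self,
                                                 action: #selector(handlePress(_:)))
        press.minimumPressDuration = 2.0   // character starts moving after a 2 s hold
        view.addGestureRecognizer(press)
    }

    @objc private func handlePress(_ gesture: UILongPressGestureRecognizer) {
        switch gesture.state {
        case .began:
            isMoving = true                // finger held long enough: start walking
        case .ended, .cancelled, .failed:
            isMoving = false               // lifting the finger stops the movement
        default:
            break
        }
    }
}
```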

Initially, I used the AR Session’s camera to capture the phone’s rotation data. However, this method proved to be imprecise when the phone was parallel to the ground, leading to suboptimal performance. To address this, I switched to using the phone’s gyroscope to obtain the rotation angles. This approach has significantly improved the test results.

Here are some key points about using the gyroscope:

  • When the device is in its default orientation, the X-axis points horizontally to the right, the Y-axis points vertically upward, and the Z-axis points out of the front of the screen toward the user; the back of the screen therefore corresponds to a negative Z value.
  • As the device moves or rotates, these coordinate axes remain fixed relative to the phone.

This new method using the gyroscope has shown promising results in our tests, enhancing the accuracy and responsiveness of the MR Cane control system.
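
Here is a minimal Swift sketch of the gyroscope-based reading using CoreMotion; it assumes the rotation is read natively, and the callback and axis choice are illustrative rather than the project's actual implementation.

```swift
import CoreMotion

let motionManager = CMMotionManager()

// Poll device motion ~60 times per second and map a rotation angle
// to the virtual cane's left/right sweep.
func startCaneTracking(updateCane: @escaping (Double) -> Void) {
    guard motionManager.isDeviceMotionAvailable else { return }
    motionManager.deviceMotionUpdateInterval = 1.0 / 60.0
    motionManager.startDeviceMotionUpdates(to: .main) { motion, _ in
        guard let attitude = motion?.attitude else { return }
        // With the phone held roughly flat, a left/right sweep shows up
        // mostly in yaw; which component works best depends on the grip.
        updateCane(attitude.yaw)   // radians
    }
}
```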

Week 3:

This week, Dr. Wole led a demo seminar on topics related to immersion, presence, and reality, and also introduced us to the tools in VMD.

I developed a new method for controlling movement in place, inspired by the Meta Quest 3 controller. By using a mobile phone as the control device, users can simulate forward movement by swinging the phone up and down. It’s a more natural and intuitive way to navigate the virtual space, making the experience feel even more immersive.
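
One way to detect such a swing is to watch the gyroscope's rotation rate for a quick flick about the phone's horizontal axis. The sketch below is a simplified stand-in with a placeholder threshold and no debouncing, not the project's actual detector.

```swift
import CoreMotion

// Detect a quick downward swing from the rotation rate about the x-axis.
final class SwingDetector {
    private let motionManager = CMMotionManager()
    private let swingThreshold = 3.0          // rad/s; placeholder, needs on-device tuning

    func start(onSwing: @escaping () -> Void) {
        guard motionManager.isDeviceMotionAvailable else { return }
        motionManager.deviceMotionUpdateInterval = 1.0 / 60.0
        motionManager.startDeviceMotionUpdates(to: .main) { motion, _ in
            guard let rate = motion?.rotationRate else { return }
            // A fast rotation about the phone's x-axis reads as a wrist flick.
            // A real detector would also debounce so one swing fires only once.
            if abs(rate.x) > self.swingThreshold {
                onSwing()                      // advance the character a step
            }
        }
    }
}
```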

Additionally, I completed the development of the footstep sound management module. This module plays corresponding footstep sounds based on the material of the ground the character is stepping on. The technical details involve creating the character’s walk animation and adding a callback function to play sounds at keyframes. An animator is used to control the transitions between standing and moving states of the character.
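
In outline, the module behaves like the sketch below, which uses AVAudioPlayer as a simplified stand-in; the material names and audio file names are placeholders. The walk animation's keyframe callback would call playFootstep(onMaterial:) each time a foot lands.

```swift
import AVFoundation

// Plays a footstep clip matching the ground material under the character.
final class FootstepSoundManager {
    private var players: [String: AVAudioPlayer] = [:]

    // Preload one clip per ground material (file names are placeholders).
    init() {
        for material in ["wood", "carpet", "tile"] {
            if let url = Bundle.main.url(forResource: "footstep_\(material)",
                                         withExtension: "wav"),
               let player = try? AVAudioPlayer(contentsOf: url) {
                player.prepareToPlay()
                players[material] = player
            }
        }
    }

    // Called from the walk animation's keyframe callback when a foot lands.
    func playFootstep(onMaterial material: String) {
        guard let player = players[material] else { return }
        player.currentTime = 0
        player.play()
    }
}
```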

I also added a settings menu to allow users to switch between different movement control modes.
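
The setting itself reduces to a small enum persisted between launches; here is a minimal sketch (the mode names and storage key are illustrative).

```swift
import Foundation

// The two in-place movement modes offered in the settings menu.
enum MovementMode: String {
    case longPress
    case swingDown
}

// Persist the user's choice between launches (key name is illustrative).
struct MovementSettings {
    private static let key = "movementMode"

    static var current: MovementMode {
        get {
            let stored = UserDefaults.standard.string(forKey: key) ?? ""
            return MovementMode(rawValue: stored) ?? .longPress
        }
        set { UserDefaults.standard.set(newValue.rawValue, forKey: key) }
    }
}
```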

 

 

Week 4:

Our coursework this week was packed. Dr. Wole covered topics such as 3D tracking, scanning and animation, interaction and input devices, and an introduction to GPUs. These topics were fascinating and very helpful for my project.

This week, I mainly completed a module called Layout Learning. This module is designed to help visually impaired individuals build a mental map before exploring virtual environments. Specifically, users hold down on the screen and move a finger back and forth to explore the layout of a virtual room; when the finger touches a wall, table, chair, or another object, the system provides vibration feedback and announces the object's name. Additionally, when the finger moves outside the room, the system announces "Out Of Room." Some of these improvements came directly from user feedback during our user studies.
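
The feedback loop can be sketched roughly as follows, using standard iOS haptics and speech synthesis as stand-ins; how the object under the finger is looked up is assumed to happen elsewhere, and the method names are illustrative.

```swift
import UIKit
import AVFoundation

// Haptic and spoken feedback while the user drags a finger over the room layout.
final class LayoutFeedback {
    private let haptics = UIImpactFeedbackGenerator(style: .medium)
    private let speech = AVSpeechSynthesizer()

    // objectName: the labeled object under the touch, if any (wall, table, chair, ...).
    func touchMoved(to objectName: String?, insideRoom: Bool) {
        guard insideRoom else {
            speak("Out of room")          // finger left the room boundary
            return
        }
        if let name = objectName {
            haptics.impactOccurred()      // vibration cue on contact
            speak(name)                   // announce the object's name
        }
    }

    private func speak(_ text: String) {
        _ = speech.stopSpeaking(at: .immediate)
        speech.speak(AVSpeechUtterance(string: text))
    }
}
```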

 

Because I had been focused on development, I had fallen a bit behind on my research paper. So, on Wednesday and Thursday, I caught up and completed the introduction and methodology sections.

On Friday, we had our midterm presentations, and everyone performed very well. My own presentation also went smoothly.

 

Week 5:

This week, Dr. Wole lectured on Graphics Processing Units (GPUs) and Immersive Audio. I learned a lot from the class; the presentation introduced many technical points that can help improve rendering efficiency.

In terms of development, I focused on enhancing the tutorial module of my app. This module is designed to guide users on how to use their phone as a white cane. It provides instructions on turning and on moving forward using different gestures, such as long presses and downward swipes. The most exciting part is the use of 3D audio and variable sound playback speed to help users navigate toward their targets. The 3D audio cues indicate whether the target is on the left or right, and the playback speed increases as the user turns to face the target directly.
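
A simplified version of that audio guidance is sketched below, using plain stereo panning in place of full 3D audio; the file name, angle convention, and rate mapping are all illustrative.

```swift
import AVFoundation

// Looping cue: stereo pan hints left/right, playback rate rises when facing the target.
final class AudioGuide {
    private var player: AVAudioPlayer?

    func start() {
        guard let url = Bundle.main.url(forResource: "guide_tone",   // placeholder clip
                                        withExtension: "wav"),
              let p = try? AVAudioPlayer(contentsOf: url) else { return }
        p.numberOfLoops = -1        // loop until stopped
        p.enableRate = true         // allow changing playback speed
        p.play()
        player = p
    }

    // angleToTarget: signed angle in radians between the user's heading and the target.
    func update(angleToTarget: Float) {
        guard let p = player else { return }
        // Negative angle: target on the left; positive: target on the right.
        p.pan = max(-1, min(1, angleToTarget))
        // The more directly the user faces the target, the faster the cue plays.
        let alignment = 1 - min(abs(angleToTarget) / .pi, 1)
        p.rate = 1.0 + alignment    // 1.0x when facing away, up to 2.0x head-on
    }
}
```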

Additionally, I made significant progress in setting up the Apple developer account and successfully uploaded the app to TestFlight. This will make it easier for our testers to download and use the app, providing us with valuable feedback. I also designed a survey and created it using Google Forms to gather detailed user feedback on their experience with the app.

 

Week 6:

This week I focused on designing all the survey questions. The goal was to verify whether my Layout Learning module can help BLV (Blind and Low Vision) users build a mental map and to assess its effectiveness. I also aimed to test the operability of the two in-place movement control methods, long-press and swing-down, asking users to choose their preferred in-place movement method.

Additionally, we invited a few high school students and a BLV user to the Hiterman Hall lab for testing. The non-BLV testers performed the app tests with their eyes closed to simulate the BLV experience. After the tests, they completed the survey questionnaire.

This Saturday, the REU cohort attended a lunch cruise. We had a great time and took many beautiful photos. Moreover, Dr. Wole assigned each REU member two research papers from SIGGRAPH to review and evaluate.

 

Week 7:

On Monday, we shared the progress of our respective projects. Then, we reviewed the papers from the SIGGRAPH Asia conference. Those papers truly provided me with a lot of inspiration. On Thursday, we went to the CUNY Advanced Science Research Center, where we experienced some very cool interactive visual technologies and visited various scientific labs. I also participated in Cason’s project presentation, which was really creative.

This Friday, Dr. Wole explained some statistical analysis strategies and provided reasonable analysis suggestions for each person’s survey. I learned many new analysis methods and look forward to applying them to my project. This week, I also participated in application testing and surveys conducted by other REU students. I invited high school workshop students to test our application with their eyes closed and complete surveys. Their feedback was very helpful to us.

Additionally, I tried to invite BLV (blind and low vision) users from a blind school to test our application. Unfortunately, when I arrived at the blind school, they were in class and unable to participate in the testing. But I will continue to look for opportunities to get them involved.

 

Week 8:

I can’t believe it’s already the last week of my REU project. At the beginning of this week, I started delving into analyzing the user data we collected. I used Tableau to create some excellent charts for our presentation. I began wrapping up the remaining sections of our paper, adding results, discussion, and future work. I also prepared the slides for the final presentation on Thursday.

Wednesday was so much fun! Our REU group participated in the teacher motion capture session. Professors gave short talks while wearing motion capture equipment, and we listened to their presentations and evaluated their performance.

Thursday was a significant day: we gave our final presentations! We summarized our research projects and presented them to our fellow REU participants, mentors, and some students from Iowa State University. It was a bit nerve-wracking but also very meaningful.

On Friday, we spent the entire day at Hunter College. We finished our paper together and had a wonderful day. We also attended presentations by students from Iowa State University. On the way home, I started brainstorming ideas for the poster and planned to submit both the poster and the paper to ACM ISS 2024 by Saturday.

I had a fantastic time with everyone in the REU program. I am very grateful for this experience and the friendships I have made. I wish all the REU participants good luck in their future endeavors. Thank you all for embarking on this journey with me.

 
