Enhancing Virtual Exploration for Blind and Low Vision Users: In-Place Virtual Exploration with Mixed Reality Cane

Hong Zhao, CUNY Borough of Manhattan Community College

Mentors: Hao Tang and Oyewole Oyekoya


Week 1:

The day before the VR-REU program officially kicked off, Dr. Wole organized a bowling icebreaker event. It was really fun! Everyone was enthusiastic, and I was happy to meet the rest of the cohort. On Monday, we completed an REU pre-program survey, took a quick tour of Hunter College, and met all the REU mentors on Zoom. My work officially started as I began discussing my project direction with Dr. Tang and getting a preliminary understanding of the current project code. Wednesday brought the first class on VR, AR, and Mixed Reality, which primarily covered the core principles, how the technology works, and the history of VR. For the rest of the week, I reviewed related literature and had another discussion with Dr. Tang to finalize my research direction and complete my project proposal. Finally, on Friday, we got an introduction to ParaView and presented our project proposals.


Week 2:

This week, Dr. Wole introduced us to the writing tool Overleaf. He also demonstrated how to use Tableau to build and refine data visualizations.

I completed the CITI training and successfully developed the initial version of the MR Cane control system. Here is how it works:

  • When the user long-presses the screen for two seconds, the virtual character begins to move.
  • The direction of the character’s movement is determined by the current orientation of the headset.
  • The mobile phone acts as a white cane. Users can swing their phones left or right to control the movement of the virtual cane.
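The control flow above can be sketched in plain Python. This is only an illustrative sketch of the logic, not the project's actual code (which presumably runs in a mobile AR engine); the two-second threshold comes from the description above, while the function names and the yaw-to-vector convention (0° = facing +Z, angles increasing clockwise) are my own assumptions.

```python
import math

# Threshold from the description above: a two-second long press starts movement.
LONG_PRESS_SECONDS = 2.0

def should_start_moving(press_duration: float) -> bool:
    """The virtual character starts moving only after a long press
    of at least LONG_PRESS_SECONDS."""
    return press_duration >= LONG_PRESS_SECONDS

def movement_direction(headset_yaw_deg: float) -> tuple:
    """Map the headset's current yaw to a unit direction vector on the
    ground plane (x, z). Assumed convention: yaw 0 deg means facing +Z,
    and yaw increases clockwise, so 90 deg faces +X."""
    yaw = math.radians(headset_yaw_deg)
    return (math.sin(yaw), math.cos(yaw))
```

For example, with the headset facing straight ahead (`yaw = 0`), `movement_direction` returns `(0.0, 1.0)`, so the character walks along +Z; turning the head 90° to the right steers it along +X instead.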

Initially, I used the AR Session’s camera to capture the phone’s rotation data. However, this method proved to be imprecise when the phone was parallel to the ground, leading to suboptimal performance. To address this, I switched to using the phone’s gyroscope to obtain the rotation angles. This approach has significantly improved the test results.

Here are some key points about using the gyroscope:

  • When the device is in its default orientation, the X-axis points horizontally to the right, the Y-axis points vertically upward, and the Z-axis points out of the front of the screen toward the user; the space behind the screen therefore corresponds to negative Z values.
  • As the device moves or rotates, these axes stay fixed relative to the phone itself rather than to the world.
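Since a gyroscope reports angular velocity rather than an absolute angle, the cane's swing angle has to be accumulated over time. The sketch below shows that idea in plain Python; it is only a minimal illustration, assuming the gyroscope's Z-axis reading is in radians per second and that the cane's swing is clamped to a hypothetical ±60° range (the function name, parameters, and limit are my own assumptions, not the project's actual code).

```python
import math

def update_swing_angle(current_angle_deg: float,
                       gyro_z_rad_per_s: float,
                       dt: float,
                       max_swing_deg: float = 60.0) -> float:
    """Integrate the gyroscope's Z-axis angular velocity (rad/s) over a
    frame of length dt seconds, then clamp the resulting cane swing
    angle to [-max_swing_deg, +max_swing_deg]."""
    new_angle = current_angle_deg + math.degrees(gyro_z_rad_per_s) * dt
    return max(-max_swing_deg, min(max_swing_deg, new_angle))
```

Calling this once per frame with the latest gyroscope sample keeps the virtual cane tracking the phone's swing, while the clamp prevents the cane from rotating past a plausible wrist range even if the user spins the phone.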

This new method using the gyroscope has shown promising results in our tests, enhancing the accuracy and responsiveness of the MR Cane control system.

