
VR as a Learning Tool for Students with Disabilities – Summer 2023

Filip Trzcinka – Hunter College

Mentor: Daniel Chan

Week 1:

Before meeting with my mentor Daniel to narrow down what kind of project I would be working on, I decided to get ahead on my work and start on the Literature Review portion of my research paper. I wasn’t sure if I would be focusing on a physical, mental, or learning disability for the project, so I began to research and read about those topics to see which direction I would prefer to take. As I read more, I found myself drawn to papers that described learning and cognitive disabilities. I came up with two main research proposal ideas that I brought up to Dan when we met, and asked for his advice and any feedback he might have. After conversing and pitching my ideas, he let me choose which of the two I’d prefer to work on. After some thought, I decided my research project would focus on the creation of a driving simulation for student drivers who have Attention-Deficit/Hyperactivity Disorder (ADHD). We planned to meet again next week, after I have thoroughly researched the problems people with ADHD face when engaging in a learning activity and what methods or features I should include in my simulation to aid their learning experience. I also plan to begin mapping out and creating the simulated setting in Unity so I can get ahead on the creation portion of the project. When we pitched our proposals to the group on Friday, Dr. Wole mentioned that he could try to connect me with someone who has experience making a driving simulator, so that I could potentially build upon their work rather than start from scratch. Nevertheless, I will still check whether Unity has assets already available that I could use should the challenge of making a driving sim from scratch arise.

 

Week 2:

This week I focused on the Literature Review section of my paper. I found a paper that described research using a VR driving simulation to see if such a tool could help improve the driving skills of people with Autism Spectrum Disorder. I decided to use this paper as groundwork for how I’d like to develop my idea. Though their simulation was very basic, there was a test group whose simulation used audio feedback to remind drivers of important rules of the road, like staying under the speed limit, staying in their lanes, etc. I considered what other features could be implemented to help those using the simulation learn. I knew I would focus on ADHD, so I read through papers where a VR driving simulation was tested with people with ADHD as the test pool, but could not find any research that used an enhanced simulation instead of a basic one. Some used tools like eye trackers, but none added real software features meant to benefit the learning experience of the user. I then looked through papers on teaching techniques used to help people with ADHD learn, with an emphasis on keeping their attention. After discussing with Dr. Wole, I went back and found papers that tested ways to keep the attention of all drivers, not just people with ADHD. With what I’ve read so far, I created a list of features I’m hoping to implement in the driving simulation I create. The week finished with writing my Related Works section and posting it on Overleaf with proper citations.

 

Week 3

With the Related Works draft completed, it was time to start development of my driving simulation. Using the Unity game engine, I imported a low-polygon city asset to use as the environment. I made some edits to it: making the intersection lines clearer, adding box colliders to the buildings so the user wouldn’t just phase through them, and adding traffic lights at every intersection. Since the traffic light assets were only for show and had no real functionality, I had to add point lights for the red, yellow, and green signals, and I tried to write some scripts to make the lights change on a timer, but unfortunately I made little progress getting that to work. I will have to continue next week. I got a car asset that already includes a steering wheel object (so I would not have to create my own) and imported it into the scene. I wanted a steering wheel object because I eventually hope to have the user actually grab the wheel to steer. For the car, I removed the animations that came with the asset, added a camera to simulate a first-person view, then got to work on my scripts to allow the car to accelerate, reverse, and steer using the WASD keys (temporary inputs so I can test everything out for now), as well as a hand brake mapped to the space bar. I had to take time to adjust the float values of the car’s rigidbody, as well as the acceleration values, since the car drove pretty slowly at first. It still drives slowly, but that may actually suit a city environment. After running through the planned features with Dan, I began my work on the Methodology section of my paper and took another crack at the traffic light scripts.
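As a rough idea of what I was going for with the lights, below is a minimal sketch of a timer-based light cycle using a Unity coroutine that toggles the three point lights in a loop. The class name, field names, and timings here are placeholder assumptions for illustration, not the actual script from my project.

using System.Collections;
using UnityEngine;

// Minimal sketch of a timed traffic light cycle: a coroutine switches the
// three point lights in sequence. Durations and names are placeholders.
public class TrafficLightCycle : MonoBehaviour
{
    public Light redLight;
    public Light yellowLight;
    public Light greenLight;

    public float greenSeconds = 10f;
    public float yellowSeconds = 3f;
    public float redSeconds = 10f;

    void Start()
    {
        StartCoroutine(Cycle());
    }

    IEnumerator Cycle()
    {
        while (true)
        {
            SetLights(green: true, yellow: false, red: false);
            yield return new WaitForSeconds(greenSeconds);

            SetLights(green: false, yellow: true, red: false);
            yield return new WaitForSeconds(yellowSeconds);

            SetLights(green: false, yellow: false, red: true);
            yield return new WaitForSeconds(redSeconds);
        }
    }

    void SetLights(bool green, bool yellow, bool red)
    {
        greenLight.enabled = green;
        yellowLight.enabled = yellow;
        redLight.enabled = red;
    }
}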

 
Week 4
 
Got my traffic light system working! Though the method I used is very unconventional, unless it breaks something else I will not be touching it anymore. After getting that completed, I worked on fixing the speed of my car object during acceleration, as it was extremely slow last week. The main work done this week, however, was the implementation of my visual cue features. The first is a ring that appears over a traffic light to help grab the attention of the user. This happens when the user reaches a proximity trigger for that specific traffic light, so there aren’t too many rings active at once, which would be counterproductive to their purpose. It took some time to get that working, and unfortunately this feature exists only for certain traffic lights in my scene, so I need to go through the lights without it and add it in. The second feature is a lane alert: when the car object crosses a lane line, a red translucent bar appears to signal that the user needs to stay in their lane. With those features implemented, I was able to present a decent product for the midterm presentations that occurred this Friday. At the end of the week I finished up the draft of the Methodology section, and I eagerly await the notes Dr. Wole has for me, as I am not sure I wrote it in a conventional way. Next week I will try to deploy the game onto the Oculus Quest headset, change the input device from the WASD keys to the actual controllers, and begin implementing the audio cue features I have planned.
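For reference, the proximity ring works roughly like the sketch below: a large trigger collider around each traffic light toggles a ring object when the car enters or leaves it. The “Player” tag, the object names, and the assumption that the car carries a Rigidbody are placeholders for illustration, not my exact scene setup.

using UnityEngine;

// Minimal sketch of the proximity-based attention ring: this sits on a
// traffic light with a trigger collider, and the ring is a child object
// that starts hidden. Names and the "Player" tag are assumptions.
public class TrafficLightRingTrigger : MonoBehaviour
{
    public GameObject attentionRing;   // ring mesh hovering over this light

    void Start()
    {
        attentionRing.SetActive(false);
    }

    void OnTriggerEnter(Collider other)
    {
        // Only react to the approaching car, not other objects.
        if (other.CompareTag("Player"))
            attentionRing.SetActive(true);
    }

    void OnTriggerExit(Collider other)
    {
        if (other.CompareTag("Player"))
            attentionRing.SetActive(false);
    }
}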

 

Week 5

Unfortunately this week was not as productive as I had hoped. Trying to deploy to the Meta Quest 2 headset caused many issues. First there was the problem of unsuccessful builds: console errors that made no sense to me would pop up and cause the build process to exit unexpectedly. As is typical in computer science, this was solved by googling said errors and working through solutions others had posted online, which let me take the single step forward of getting a successful build. However, getting the simulation to run on the headset was the next, more difficult hurdle to overcome. When the Unity game tried to run on the headset, an infinite loading screen would appear with the Unity logo jittering in front of you. A colleague in the program who had successfully deployed his own game tried to help, but the same problem kept happening. Together we tried to deploy an empty scene, but still no success. I got permission to factory reset the headset and set it up as if it were my own, but then I was unable to verify my account as a developer due to a problem Meta has had for over a year with sending the SMS confirmation code for account verification. Eventually I brought the headset in to have it checked and set up by Kwame, who was able to get a previous Unity game to deploy on it. With this light at the end of the tunnel giving us hope, we tried to deploy the empty scene, which worked! And yet our final roadblock of the week appeared: my own game would still not deploy, showing the same infinite loading screen. As is typical with roadblocks, I will now have to take a few steps back in order to move forward. I will need to rebuild what I originally made inside the empty scene that I know works. This will have to be done incrementally, since I need to ensure that any progress made can still deploy to the headset, rather than rebuilding it all in one go and hitting the same issue.

On a more positive note, this week I implemented another planned feature: when the car object enters the lane collider trigger, a sound cue loops at the same time the visual cue appears. This uses both sight and sound to draw the driver’s attention to their mistake. I also edited the Methodology section of my paper to polish it up and include more specific and important information relevant to the proof-of-concept paper. Week 6 will definitely require me to go the extra mile, as I am currently behind everyone else with three weeks left in the program, yet oftentimes diamonds are formed under pressure.
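The combined cue itself is simple in practice; here is a minimal sketch, assuming the lane line has a trigger collider, a translucent red bar object, and an AudioSource set to loop. All of the names are placeholders rather than my actual scripts.

using UnityEngine;

// Minimal sketch of pairing the lane-departure visual with a looping audio
// cue. Assumes a trigger collider on the lane line, a warning bar object,
// and an AudioSource on the same GameObject. Names are placeholders.
[RequireComponent(typeof(AudioSource))]
public class LaneDepartureAlert : MonoBehaviour
{
    public GameObject warningBar;      // translucent red bar shown on departure

    AudioSource alertSound;

    void Start()
    {
        alertSound = GetComponent<AudioSource>();
        alertSound.loop = true;
        warningBar.SetActive(false);
    }

    void OnTriggerEnter(Collider other)
    {
        if (!other.CompareTag("Player")) return;
        warningBar.SetActive(true);
        if (!alertSound.isPlaying) alertSound.Play();
    }

    void OnTriggerExit(Collider other)
    {
        if (!other.CompareTag("Player")) return;
        warningBar.SetActive(false);
        alertSound.Stop();
    }
}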

 

Week 6

This will be a short blog post, as throughout this entire week I have just been working to recreate everything I had in the empty scene that was able to deploy to the headset. Since I wasn’t too sure what exactly caused the original deployment problems, any time I added something new to the scene I would deploy to check that it still worked. This, alongside the fact that what you see in Unity on your laptop is different from what you see with the Quest 2 headset on, led to a very repetitive process: add something or make a change, build and run to the headset, check how it looks, notice something is a bit off, go change it, build and run to the headset again, rinse and repeat. Though tedious, I was able to get almost everything I had earlier deployed and working. The only thing that still needs to be completed is the player input manager, so the car can accelerate, decelerate, steer, and brake through the player’s button presses. My suspicion is that last week’s roadblock was caused by a mistake I made handling player input, so I am a tad nervous that I will make a mistake again and break it, especially since I’m not familiar with how Unity handles Quest 2 controller inputs, whereas with a keyboard it’s just Input.GetKey(“w”). In the meantime, I implemented my final feature idea: when the user has not been looking forward for two seconds, an audio cue plays until their view is focused on the road again. With just the player input left to go, I’m excited to start player testing next week and to finish the Data Collection and Analysis portion of my paper.
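The look-away cue only needs the headset camera’s forward direction, so the idea looks roughly like the sketch below, assuming the headset view is the main camera and “looking forward” means staying within some angle of the car’s forward direction. The two-second grace period, the 45-degree threshold, and the field names are my own illustrative assumptions.

using UnityEngine;

// Minimal sketch of the look-away audio cue: if the head camera points too
// far away from the car's forward direction for long enough, loop a sound
// until the user looks back. Thresholds and names are placeholders.
[RequireComponent(typeof(AudioSource))]
public class GazeAttentionCue : MonoBehaviour
{
    public Transform headCamera;          // the XR camera (the user's view)
    public Transform carForwardReference; // the car body
    public float maxAngle = 45f;          // beyond this counts as "not looking forward"
    public float graceSeconds = 2f;

    AudioSource cueSound;
    float lookAwayTimer;

    void Start()
    {
        cueSound = GetComponent<AudioSource>();
        cueSound.loop = true;
    }

    void Update()
    {
        float angle = Vector3.Angle(headCamera.forward, carForwardReference.forward);

        if (angle > maxAngle)
        {
            lookAwayTimer += Time.deltaTime;
            // Start the cue once the user has looked away long enough.
            if (lookAwayTimer >= graceSeconds && !cueSound.isPlaying)
                cueSound.Play();
        }
        else
        {
            // Eyes back on the road: reset the timer and stop the cue.
            lookAwayTimer = 0f;
            if (cueSound.isPlaying) cueSound.Stop();
        }
    }
}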

 

Week 7

I was able to complete the button inputs for the game on Monday this week. It’s not exactly what I had planned, but since we’re stretched for time it will have to do. That same evening I created my Google Form questionnaire, then had my older brother and my father test out the game for two and a half minutes each. They filled out the questionnaire and gave me more feedback face to face, which I took down in my notes. On Tuesday I had another person I know test the game and complete the questionnaire, and on Wednesday I had anyone in the program who was willing try out the game, making sure I maintained consistency in how I ran the user study. That led to a total of eleven people for my control group. That same day, one person who I knew had been professionally diagnosed with ADHD also tested the game and filled out the questionnaire, completing my actual testing. Thursday night I finished the “Testing” section of my research paper, and this Friday and over the weekend I will continue to work on the “Data Results and Analysis” and “Conclusion” sections.

 

Week 8

What a week. As the REU approached its end, everyone in the program scrambled to get to the finish line. I for one spent Monday and Tuesday getting my paper’s draft finished. I sent the draft to my mentor Dan, who gave it back with some extremely helpful notes. This allowed me not only to fix my paper but also to know what I needed to prep for our Symposium on Thursday, and let me tell you, that presentation was stressful for me. I had not given a presentation in over five years and was severely out of practice. Thankfully Dan and a couple of friends from the REU helped me prepare. I still stuttered and stumbled my way through it, but I received many interesting questions about my project that made me think about it more deeply. Friday was spent finalizing my paper while also helping my colleagues with theirs. It was definitely a bittersweet ending to an amazing program experience.

Final Paper:
Filip Trzcinka, Oyewole Oyekoya, and Daniel Chan. 2023. Students with Attention-Deficit/Hyperactivity Disorder and Utilizing Virtual Reality to Improve Driving Skills. In Companion Proceedings of the 2023 Conference on Interactive Surfaces and Spaces (ISS Companion ’23). Association for Computing Machinery, New York, NY, USA, 1–4. https://doi.org/10.1145/3626485.3626529 – pdf
