Amaya Keys, Howard University
Week One:
As a pleasant start to the program, I met the other students I’d be spending the next eight weeks with over a fun night of bowling. On Monday morning, we briefly met with our mentors via Zoom following an in-depth tour of Hunter College’s facilities. My mentor, Mr. Daniel Chan, recommended that we keep the scope of the project in mind and not overextend ourselves with a wildly complex project for only eight weeks. To me specifically, he suggested that I pick one specific disability, research it thoroughly, and from there decide how a VR/AR application might help. I met with him one-on-one twice during the week to receive guidance on all of the jumbled thoughts racing through my mind. After bouncing from idea to idea, I finally landed on the development of a virtual multi-sensory stimulation room for adolescents with ASD who experience anxiety. To close out the week, we completed a short lab on ParaView, decided on a conference that we would submit our research to at the end of the summer, briefly reviewed our updated project proposals, and finally got a sneak peek into the VR lab. Now that my project is solidified, I hope to dive into experimenting with Unity, as I know I have a steep learning curve ahead of me.
Week Two:
To start off the week, I finished the remaining modules of my CITI training so that I could focus solely on writing my literature review. I also received one of Dr. Wole’s Meta Quest VR headsets to take home and begin experimenting with. I attempted to set it up with my computer and phone on my own, as well as deploy a Unity build, but ran into many challenges. Later in the week, Kwame assisted me with the setup process, and we ran an escape room demo to confirm the headset was working. Following the successful setup, I worked to enable hand tracking on the headset and test another demo but ran into further difficulties that I am still working to resolve.
I spoke with my mentor at our scheduled Tuesday meeting time, where he made sure I was feeling confident and secure in the direction my project was heading. Throughout the week I continued to take notes on various studies and weeded out the ones I felt were unnecessary to include in my paper. The first draft of my literature review was unnecessarily long and contained too many sources that weren’t directly related to my research, but by Thursday night, I had finalized a version that I was happy with. I sent it to my mentor for feedback, and while he had a few comments about potential changes, he was overall quite satisfied with how it looked. On Friday, we completed a brief lab on Tableau and updated Dr. Wole on the status of our projects.
Next steps for me include solidifying my methodology and making headway on the development of my sensory room. I intend to search the Unity Asset Store and TurboSquid to hopefully find some ready-made elements for my project. Before anything else, though, I definitely want to fix the errors with the hand-tracking functionality, as it will be a significant element of my application.
Week Three:
Now that the introduction and related works sections are out of the way, I used this week to focus on developing my application in Unity. I got off to a pretty slow start, working through which packages I needed to import and how to enable hand tracking. My next goal was to create 3D bubbles that the user can pop as they rise from the floor. I looked through the Unity Asset Store and TurboSquid but couldn’t find any pre-made model that fit my needs. I followed a tutorial to create a basic bubble GameObject and wrote a script for the bubbles to multiply and continuously respawn. I am still trying to equip them with a poke interaction so that they “pop” when touched. I then created a light panel on one wall of my room and am working to make each circle change colors when tapped. Now that I have a little more experience navigating and troubleshooting in Unity, I hope to be more productive with development next week.
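For anyone curious what the spawning script looks like, here is a minimal sketch of the behavior I described. The class and field names are my own, and the poke-interaction wiring from the Meta SDK is simplified down to a single `Pop()` method that the interaction event would call:

```csharp
using UnityEngine;

// Hypothetical sketch: spawns bubble prefabs that rise from the floor and
// continuously respawn as earlier ones are popped. Attach to an empty
// "BubbleSpawner" object positioned at floor level.
public class BubbleSpawner : MonoBehaviour
{
    public GameObject bubblePrefab;    // the bubble GameObject from the tutorial
    public int maxBubbles = 10;
    public float spawnInterval = 1.5f;
    public float riseSpeed = 0.4f;

    private float timer;

    void Update()
    {
        timer += Time.deltaTime;
        if (timer >= spawnInterval && transform.childCount < maxBubbles)
        {
            timer = 0f;
            // Spawn at a random spot near the spawner so bubbles spread out.
            Vector3 pos = transform.position + new Vector3(
                Random.Range(-1f, 1f), 0f, Random.Range(-1f, 1f));
            GameObject bubble = Instantiate(bubblePrefab, pos, Quaternion.identity, transform);
            bubble.AddComponent<RisingBubble>().speed = riseSpeed;
        }
    }
}

// Moves a bubble upward each frame; Pop() can be hooked to a poke-interaction
// event so the bubble is destroyed (and later respawned) when touched.
public class RisingBubble : MonoBehaviour
{
    public float speed = 0.4f;

    void Update() => transform.Translate(Vector3.up * speed * Time.deltaTime);

    public void Pop() => Destroy(gameObject);
}
```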
Week Four:
This week I focused on making as much technical progress in Unity as possible while also thinking through my methodology to prepare for our midterm presentations on Friday. I decided that the five elements I want in the environment are a light panel, poppable bubbles, 3D object play, glowing interactive stars, and an alphabet/number board. By attaching a script to the Interactable Unity Event Wrapper from the Meta Interaction SDK, I was able to make the buttons on the light panel change to random colors when touched. With that baseline in place, it became much easier to add interactive components to other objects in the scene. I also decided to make the light panel resemble the LED hexagon panels commonly used by individuals with autism. I then modified a map and pins from one of the Meta Interaction SDK samples and imported models from Sketchfab and CGTrader to recreate a magnetic alphabet board. The user can drag letters and numbers from the tray and place them freely on the board. The third element of focus this week was importing 3D shapes and enabling the components needed for the user to scale, rotate, position, and throw them with their hands. I now have a bubble-popping animation clip but was still unsuccessful in getting the bubbles to pop when poked, which will be a focus for next week. Another issue I’m running into is that the game starts at a different camera angle every time I run it, and I am not sure why. I have worked around it for testing purposes, but it is something I will have to resolve later so that there is no confusion when participants put on the headset. As a relieving end to a stressful week, midterm presentations went well, and it was interesting hearing the updates on everyone’s projects. It’s refreshing to know we’re all dealing with different challenges and working through them at around the same pace.
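The script attached to the event wrapper is tiny; a sketch of it looks something like the following. The method name is my own, and it assumes the wrapper’s “When Select” UnityEvent (exposed in the Inspector on the Meta Interaction SDK component) is pointed at `RandomizeColor()`:

```csharp
using UnityEngine;

// Hypothetical sketch: assign RandomizeColor() to the "When Select" UnityEvent
// on the Interactable Unity Event Wrapper so a touched light-panel button
// changes to a random color.
public class PanelButtonColor : MonoBehaviour
{
    private Renderer rend;

    void Awake() => rend = GetComponent<Renderer>();

    public void RandomizeColor()
    {
        // A random hue at full saturation and brightness keeps the LED-panel look.
        rend.material.color = Random.ColorHSV(0f, 1f, 1f, 1f, 1f, 1f);
    }
}
```

Once this pattern was working for one button, the same component could simply be dropped onto every other interactable object in the scene.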
Week Five:
This week was my most productive yet on the technical side. While I unfortunately was not able to get the bubble animation working in Unity, I did enable a simple popping functionality where the bubbles are destroyed on touch. The final activity I created was a series of interactive stars in the sky that have paths drawn between them when “activated,” or lit. I then imported a counter surface for my activities to be displayed on and created a simple dome in Blender as the exterior. The new models caused some technical problems with the existing objects, so a lot of my time was spent debugging those issues.
I also worked on completing the documents required for IRB approval. However, by the end of the week, no prospective participants had reached out, so Dr. Wole and I discussed whether it was worth going through the IRB process at all. He brought up the idea of writing a work-in-progress research paper instead of conducting a user study. This would give me the opportunity to either turn this into a long-term project or allow another researcher to pick it up.
Week Six:
On Tuesday, Dr. Wole, Mr. Chan, and I discussed the direction of my project and decided it would be best to follow through with the work-in-progress paper and put the IRB process on hold for now. With that said, I don’t have much to write about this week, as I just focused on making tweaks to my virtual environment and editing my experimental procedure. The room now has a welcome console with options for the user to control audio input, as well as text instructions beside each activity. I still need to record a short demo of each activity in action to be played above the instructions for those who may want a visual. I also added an additional constellation for users to interact with. I am having many technical difficulties with the letter/number board and may need to start brainstorming alternative methods for how it will function. I would like the user to be able to drag letters from the tray and place them on the board, which will then “lock” the object to the board surface as if it were magnetic. This is proving to be quite difficult, so instead, the user may simply have to touch the letter or number they’d like, and it will appear on the board. I’d like to have everything finished by early next week so I can receive feedback on the environment from others in my cohort. It will not be a traditional user study, but it is still feedback that can ultimately be included in my paper.
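For reference, the “magnetic” locking behavior I’m attempting could be sketched roughly as below. Everything here is an assumption about my own scene setup (a trigger collider tagged “Board” on the board, a Rigidbody on each letter), and release detection from the grab system is simplified away:

```csharp
using UnityEngine;

// Hypothetical sketch of the magnetic snap: when a letter sits inside the
// board's trigger volume, freeze its physics and pin it flush to the board.
public class MagneticLetter : MonoBehaviour
{
    private bool onBoard;

    void OnTriggerStay(Collider other)
    {
        if (!onBoard && other.CompareTag("Board"))
        {
            onBoard = true;
            GetComponent<Rigidbody>().isKinematic = true;  // stop physics so it stays put

            // Keep the letter's position along the board face, but flatten it
            // against the surface (assuming local Z is the board's normal axis).
            Vector3 local = other.transform.InverseTransformPoint(transform.position);
            local.z = 0f;
            transform.position = other.transform.TransformPoint(local);
            transform.rotation = other.transform.rotation;
        }
    }
}
```

The tricky part in practice is that the grab interaction and the snap fight over the object’s transform, which is why the simpler touch-to-place fallback may win out.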
Week Seven:
This week I modified my experimental design and found an anxiety scale that can be used to quantify my results. The Glasgow Anxiety Scale is a 27-item questionnaire that measures anxiety in people with intellectual disabilities. I modified and shortened it to 11 questions to make it relevant for my study. After discussing my options with my mentor and Dr. Wole, we decided it would be best to conduct a pilot study for now, include the preliminary results in my paper, and note the time constraint as a limitation that prevented a full-scale study. I finalized my pre- and post-intervention survey and prepared to immerse my peers in the environment. On Friday, we briefly went over statistical analysis with Dr. Wole, and I was advised to perform a Friedman test for the survey ratings and a one-way repeated measures ANOVA for the anxiety scale. While I wasn’t familiar with SPSS Statistics, or with performing any kind of statistical analysis for that matter, I was confident I could figure it out with the help of the links and tutorials Dr. Wole posted, since my data was fairly simple. After the session, I ran the study with seven of my peers who were able to stay back and help me. Even at this point, I still wasn’t completely satisfied with how the room looked and functioned, but with only a week left I had no choice but to move forward. I really hope this is a project I can continue working on next semester so I can finally explore its benefits for the target group!
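As a quick sketch of what the Friedman test looks like outside of SPSS: SciPy has an equivalent call, and the ratings below are entirely made up, just to show its shape with seven participants rated under three conditions (the condition labels are hypothetical, not my actual survey items):

```python
# Invented example data: 7 participants' 1-5 ratings under three conditions.
from scipy import stats

cond_a = [4, 3, 5, 4, 3, 4, 5]
cond_b = [3, 3, 4, 3, 2, 3, 4]
cond_c = [2, 2, 3, 2, 2, 2, 3]

# Friedman test: non-parametric repeated-measures comparison across the
# three related samples, appropriate for ordinal survey ratings.
stat, p = stats.friedmanchisquare(cond_a, cond_b, cond_c)
print(f"Friedman chi-square = {stat:.2f}, p = {p:.4f}")
```

The one-way repeated measures ANOVA for the interval-scale anxiety scores follows the same logic but on the parametric side, which is what the SPSS tutorials walked me through.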
Week Eight:
I can’t believe it is the final week! This week honestly felt like a blur; one day I was gathering my stats and writing my results, and the next I was giving my presentation. I finalized the methodology and results sections to send to my mentor early in the week and then began drafting my discussion and conclusion. On Wednesday, we participated in the teacher motion capture session to help with Dr. Hayes and Dr. Wole’s research study. I thought it was a really cool project and found many of the presentations to be quite interesting. My favorites were Dr. Hayes’ talk on XR for education and Mr. Lichtman’s talk on Inclusive Game Design. After a long day, I came back and worked alongside a couple of my peers to finish the slides for Thursday afternoon’s final presentation. On the day of presentations, I was very nervous and didn’t feel as prepared as I had for midterms, simply because of how fast everything seemed to be moving these past two weeks. I still feel it went well, despite a few mess-ups, and was relieved to have it over with. Lots of my family and friends joined the Zoom to watch and support me, and I was immensely grateful for that. The Iowa State students also seemed very interested in our projects and asked some great questions. I appreciated them sticking with us through all the technical difficulties and construction noises. The following day, we listened to their presentations after our morning paper-writing session. I was impressed with all they were able to develop in Unity in a group setting with such a short turnaround time. We wrapped up by going over final conference submission details, and with that, the summer was officially over! I had such a great time with my cohort and can say they definitely made my first research experience a memorable one! I hope all our papers are accepted to our respective conferences and that I can attend their talks and poster presentations when the time comes.