
Category Archives: VR-REU 2024

Exploring Perceptions of Structural Racism Through 3D Visualizations

Exploring Perceptions of Structural Racism, How It Contributes to Climate, and Its Influence on Respiratory Diseases Through 3D Visualizations

Rhia Kumar, Stony Brook University

Week 1:

Before the internship began, there was a social gathering at Bowlero to facilitate introductions with our peers and Dr. Wole. Unfortunately, I didn’t catch the memo and missed my first opportunity to connect with my fellow interns. The next day, I had the opportunity to meet my peers and Dr. Wole. We explored Hunter College and concluded the day by connecting with our mentors. I scheduled an individual session with my mentors to delve into the potential structure of my project. I started examining the relationship between housing valuations and climate, considering them as pivotal elements for my research. Additionally, I began a literature review on structural racism and how it has historically been visualized to enrich my understanding. Navigating the various directions for my project left me feeling slightly overwhelmed. During the remainder of the week, I completed my project proposal, delineating what makes it distinctive. We concluded the week by familiarizing ourselves with ParaView and delivering our project proposals.

Week 2:

The week began with an introduction to the fundamental principles of writing a scientific research paper and to Overleaf for composing our documents. Additionally, Professor Wole delivered VR lectures on immersive visual and interactive displays, alongside 3D geometry. The next day, I met with Dr. Wole, Professor Cogburn, and Lisa to discuss the next steps and what I should have accomplished by the end of the week. We decided it was best to explore the visualization tools further and begin data collection to identify the specific years I should analyze. Of all the tools, Mapbox and Unity seem the most feasible. They also mentioned that once I had done this, my implementation of VR in my research would be discussed and finalized beginning next week.
Professor Wole then took the time to go over everyone’s technical implementations and progress with their research. I used this time to download Unity and integrate Mapbox. I began watching videos on how to use Mapbox so I could familiarize myself with it. It didn’t go that well, but I continued my literature review and was able to add my introduction and related works sections to Overleaf.
Thursday was when I felt extremely overwhelmed. While collecting data from the online data portal relating to climate, specifically heat index, and health (particularly asthma-related ED visits), I came across several roadblocks. All the portals with access to this data prevented me from analyzing heat index over particular years; instead they would just say “heat index of 1 out of 5” for a given neighborhood, or they provided data for only one year (2018). However, I was able to find data on heat events, which kill more people than all other extreme weather events combined. According to the portal, 100-degree days are a significant heat event, so I was able to compile data on heat events citywide, yet again I ran into the issue of finding specific data for the two neighborhoods (Riverdale and Soundview). Hence, I zoomed into the two historically redlined areas for asthma-related visits, because these heat events are known to exacerbate asthma. I decided to look at asthma-related visits because asthma is a chronic condition: once someone has it, they always have it.
We ended the week exploring Tableau, which I found very interesting. We created GitHub IDs and updated the to-do, in-progress, and done sections for our projects. Subsequently, everyone gave Dr. Wole updates on our projects, and I was able to describe the errors I came across the day prior with data collection. He advised that if there is data on specific boroughs, I could use those as a comparison instead of two specific neighborhoods. We ended the day with a chance to look into the virtual reality headsets, which was so darn cool!
Here is the data I collected on Thursday, which will need to be updated per Dr. Wole’s feedback:

Week 3:

This week started with Professor Wole’s lecture on immersion, presence, and reality, along with 3D tracking, scanning, and animation. We then watched a behind-the-scenes video for the Avatar movie, which I found really fascinating. I continued exploring different functions in Mapbox Studio and tried to follow a game-building tutorial on YouTube by Mapbox. I ran into compilation errors in Unity that prevented me from entering play mode and constructing maps. Lisa suggested a solution: after creating the Unity app, import everything for Mapbox SDK except GoogleArCore, MapBox AR, and UnityARInterface. This worked!
The next day, I created two demo models for the neighborhoods I’m examining, Riverdale and Soundview, but hit a roadblock. Every time I try to import my maps into Unity, I get the error “Transform tile provider – no location marker specified.” In a one-on-one meeting with Lisa, she mentioned she would contact someone who might know the solution. We also discussed future steps and potential questions for my user study, which I wrote down based on Lisa’s advice. We ended the week by exploring the VMD Interface and updating GitHub with our project progress.
Attached below are my two demo models I created:

Week 4:

The week began with Dr. Wole covering the topics “3D Tracking, Scanning and Animation” and “Interaction and Input Devices” in his lectures. I spent some time experimenting with Mapbox and Unity but soon felt frustrated as nothing seemed to be working. The following day, I reached out to Richard Yeung for help, knowing he had experience with Mapbox and Unity. He resolved my initial error, but I encountered a new problem: every time I imported my map into Unity, it displayed as a black map. Richard suggested I review the tutorials again, but this did not resolve the issue.
The next day, I met with my mentors, who advised me to temporarily set Unity aside and focus on working with my maps. I also discussed my challenges with climate (heat index) data with Professor Cogburn. Despite lacking disaggregated data, she noted that I knew Riverdale had a heat index of 1 out of 5 compared to Soundview’s 4 out of 5. We also talked about my upcoming midterm presentation and how to address my progress despite the setbacks.
The next day, I prepared my presentation, creating various maps to highlight the underlying inequities contributing to asthma prevalence in the two neighborhoods. These maps included factors such as air quality, housing conditions, and green spaces. I also illustrated socioeconomic status by showing income levels and access to healthcare facilities. Additionally, I compiled some data into a spreadsheet to quantify my findings. The week concluded with my midterm presentation, during which I received feedback from Dr. Wole.
Attached below are my data spreadsheet and some maps I created:
 

Week 5:

The week started with Dr. Wole’s lecture “Introduction to GPUs,” which I found quite interesting for anyone who has a passion for possibly building their own one day. The next day, I felt slightly behind in updating my paper, so I took the time to add, change, and edit all of my previous work, since I had made some changes to the format of my maps. I am still having issues with Unity, and Dr. Wole mentioned that if the issue persists, I will have to leave my maps in Mapbox. Afterwards, we went over Dr. Wole’s other lecture on “Immersive Audio.” Additionally, Lisa gave me the following suggestions for my maps: I may need to change how I visualize roads and highways, the socioeconomic maps, and definitely the access-to-healthcare map (potential extrusions, or reducing green-space colors on non-green-space maps so that the map highlights highways and access-to-healthcare symbols). Thus, I took the initiative to fix my roads and highways map that day by breaking up the different types of nearby roads instead of classifying all black lines as roads/highways. Thursday was the Fourth of July, so I took the day to myself and went to the beach to enjoy the fireworks show with my friends. On Friday, we had a self-paced graph/network visualization session with Gephi and did a weekly update with Dr. Wole. Along with that, I fixed my access-to-healthcare map per Lisa’s suggestions. Lastly, the week ended with a social outing on a lunch cruise with my peers, which was very enjoyable; it was nice to hang out with the group!
Attached below are some of the maps I updated:

Week 6:

This week began without any in-person meetings, which allowed me to connect with Lisa for a much-needed catch-up session. We focused on addressing issues in two out of my five models and collaborated on refining potential survey questions. Lisa shared her survey as a helpful template for me to structure my own. We wrapped up by scheduling our next meeting and setting a goal to finalize my survey for her review and potential distribution by the weekend. The following day, I continued to diligently work on refining the remaining models. Midweek, I found Dr. Wole’s lecture on perception, VR sickness, and latency to be particularly fascinating and insightful. Later in the week, I took Lisa’s feedback to heart and iteratively improved my survey structure. My aim was to present a polished version to Dr. Wole by Monday, ensuring it was ready for data collection. As the week drew to a close, I dedicated time to reviewing articles for SIGGRAPH Asia.
Attached below are sample questions from my survey:

Week 7:

The week started by reviewing the SIGGRAPH Asia feedback from the previous week. We also held our weekly check-in meeting, which was rescheduled because we missed it last Friday. At the end of the day, I sought feedback from Dr. Wole on my survey before distributing it. He suggested modifying my last question from a checkbox grid to a multiple-choice grid. The following day, I focused on sending out my survey to various individuals and servers to gather as many responses as possible. On Wednesday, I took some personal time to enjoy a fun outing with friends in the city. The next day, we had an enlightening field trip to the ASRC, where I learned a lot of new things and got to test-run Cason’s fascinating project. The week concluded with an unexpected family emergency, which prevented me from attending the meeting in person. However, I used this time to make updates to my paper and began working on the data analysis section.

Week 8:

The final week was here at last! This was an intense and overwhelming experience, but I’m thrilled to say I made it through! The journey was short yet challenging, with moments where I felt uncertain about the project’s direction. The week began with a Zoom meeting to discuss SIGGRAPH Asia logistics. Following that, I dedicated time to analyzing my results and writing the final sections of my paper. The next day, I focused on refining my paper by enhancing the discussions and conclusions. Wednesday was entirely consumed by a comprehensive teacher MoCap session, which was both exhausting and incredibly insightful. Afterward, I returned home to prepare for my presentation by building upon my midterm presentation. Before I knew it, the final day arrived. I completed a post-program survey, added references to my paper, and submitted it to the competition. I am so grateful for all the experiences of these last eight weeks and couldn’t be more thankful for meeting such a wonderful group of people!
 

Amateur Confidence in Creativity with the Community Game Development Toolkit

Lance Cheng (he/him), University of Massachusetts Amherst

Week 1: Mon 06/03 – Sun 06/09

Hi! A little bit about myself: I’m Lance, I use he/him pronouns, and I’m a native New Yorker. I just finished my first year at UMass, where I study data science, CS, public interest technology, and comparative literature. Besides academics, I also love volunteering as a notetaker, working as a TA and at UMass’s queer resource center, learning languages, and playing guitar.

This summer, I’ll be working on the Community Game Development Toolkit with Professor Daniel Lichtman. The Toolkit is a set of tools for the Unity game engine that allows you to make collage scenes, and it was particularly developed so people without technical game development skills could still create games – otherwise, we’d miss out on so many of their unique perspectives! I hope this blog can be useful to future applicants to the REU who want to see what the experience is like or future students who work with Dan and want something to reference.

I spent most of the latter half of the week doing some literature review and coming up with different experimental designs, with the goal of the experiments being to determine if the Toolkit’s features help people feel more creative and in touch with themselves. It was great meeting Dr. Wole (who organizes this REU), the mentors, and the other interns so far, and I’m excited to work more with all of them in the coming weeks! I’m also excited to bring together the artistic and quantitative aspects of computation and figure out how to design something that maximizes creative possibilities.

Week 2: Mon 06/10 – Sun 06/16

Second week completed! The biggest event of this week was finalizing the basis of the experiment I’ll be running. Dan wanted to see how the Toolkit could help diverse communities tell stories about themselves, and to make that benchmark a little more measurable, I’ve decided to investigate if the Toolkit’s collage-style approach makes people more confident in their creativity compared to other tools. Most of my time was spent brainstorming experiment structures, doing even more literature review, and drafting the introduction and related works sections of my paper. This was also my first time using LaTeX, which was much easier than I thought it would be, thankfully.

Something else that’s been really helpful: reaching out to Dan’s former interns! Two people have worked with Dan before me (Amelia Roth and Habin Park, both of whose publications are linked on this REU home page), and both of them are lovely people who gave thoughtful advice when I discussed some of the problems I was running into. It sounds obvious, but to anyone in the future, please do reach out to past REU cohorts; it made me feel much less isolated to know they encountered the same issues and published successfully despite that.

Week 3: Mon 06/17 – Sun 06/23

Come on and slam and welcome to the (game) jam. I’ve been reflecting on itch.io’s visual novel and narrative game community, which is largely made up of amateurs who want to tell stories about themselves – exactly the goal of the Toolkit. Some of my favorite itch creators include graeme borland, Angela He, and Nicky Case! I’m also happy to announce that the first few trials (i.e., people I will be making into my Unity guinea pigs) will be run next Tuesday! They’ll be asked to take a “before” survey, perform a creative task using Unity and the Toolkit, and then take an “after” survey.

Week 4: Mon 06/24 – Sun 06/30

I’ve finished drafting the first few sections of the paper (Abstract, Introduction, Related Works, and Methods). For the next couple weeks before I have enough data to analyze, most of my work will consist of brute-forcing my way into finding experiment subjects.

Week 5: Mon 07/01 – Sun 07/07

This week, I performed my first trial! The one subject I’ve worked with picked up Unity and the Toolkit a lot faster than I expected, and this person was on the less technically inclined side, so it can only be up from here. Plus, I now have at least a couple of images of that subject’s creation for the paper.

I have thirteen (!!!!) more subjects lined up, as well as some candidates I need to get in touch with, so I’m feeling a lot more optimistic about my sample size and confidence levels! In a perfect world, I’d like to have somewhere in the ballpark of 25 subjects, but honestly, fourteen isn’t too bad.

Immersive Content for Interdisciplinary STEM Education

Cason Allen, Florida A&M University

Week 1: 

This REU opened with a social event where Dr. Wole took our cohort bowling. It was fun, as I had the opportunity to meet the other students in the program. The following day was the first official day of the REU, where we toured Hunter College and were introduced to some of the mentors. The week progressed with meeting my mentor, Kendra Krueger, with whom I started formulating ideas for the project I would be engaging in this summer. With my project relating to how the Advanced Science Research Center (ASRC) is involved in STEM education and outreach, it was nice to see how field trips at the CUNY Graduate Center are conducted and how they aim to teach students in the short time they have. This set the stage for my project proposal, as I spent much of the rest of the week researching different ways to go about the project and brainstorming methods to conduct it. In addition, we ran through the first couple of lectures in Dr. Wole’s Introduction to Virtual Reality class and gained an introduction to the software ParaView. Seeing how much we have done in such a short amount of time, I am excited about what the next seven weeks of this REU have in store.

 

Week 2: 

This week, I met with Kendra to further discuss my project’s expectations. Since this project is primarily a visualization tool for various instruments within the ASRC, it must be broadly accessible to the public. Therefore, I aim to create a desktop visualization program and, if time allows, a VR version. For this tool, I have started finding objects to build the virtual object library and have scratched the surface of what can be done in Unity. Unfortunately, I could not explore as much as I needed to because I got sick mid-week, so moving forward, I will have to catch up and stay on track. In addition, I have been working on my literature review, gaining a couple more sources on augmented reality in STEM education and assessing student learning outcomes. I continued my paper by adding this to Overleaf, and moving forward, I will start my methodology and meet with a researcher at the ASRC to see how some of the instruments operate. If I cannot find digital models for these instruments, I may have to recreate them in SolidWorks.

 

Week 3: 

This week, I was met with some complications, dealing with licensing issues with SolidWorks and trying to get over the learning curve with Unity. I have been able to create a sample scene for a user to navigate through to set up my workflow for the final project, as I need to go into a lab of the ASRC and replicate a lab setup there. Focusing on user interaction with stand-in objects, I have been trying to figure out a method for a pop-up screen to appear and show information on the interacted object. I have been slowly working on my methodology as well as adding sample models to my virtual library to plug into the final virtual environment.

Week 4:

This week consisted of quite a bit of development for the lab walkthrough game. Learning how to script player actions, raycasting, and interactions was challenging as I have run into countless issues, but nevertheless entertaining. Much more will need to be done, including expanding on character interactions, creating a more game-like approach to the project, and eventually deploying this project on some type of web application, but I am confident much will be completed given how this week has gone. In addition, we had to meet the milestone of completing our methodology and presenting our midterm presentations. For my presentation, I was able to reveal a demo of what I had done for the game. Next will be completing the user study section for the paper paired with creating surveys to assess the participants of this study.

Image of the demo where the user is able to interact with a laser diode

Week 5:

This week, I was fortunately able to get a bit of technical work done. Having created interactions with the objects, the next step is to perfect the pop-up informational slides that appear upon interaction. At least one object has been added for each of the disciplines of Environmental Science, Structural Biology, Photonics, and Neuroscience, with the remaining discipline being Nanoscience. In the process of adding the objects, I decided to change the environment the user was in to distinguish each of the disciplines from one another by assigning their rooms a specific color that is easily associated with the given discipline. In addition, I added sound effects for walking, inspecting objects, and leaving inspection mode, and I am in the process of remodeling an instrument from the ASRC instrument database to put into the environment. Overall, I have been trying to make the program feel more fun and game-like while developing everything. Moving forward, I will have to personalize the pop-ups for each object, finish modeling the nanofabrication device, and create an incentive for the user to learn each of the facts.

Image of the new environment, showcasing the difference in lighting and some of the instruments in the Structural Biology space.

Week 6:

This week, I completed the educational game and the Google Form to assess participants. The main hurdle was the pop-up window that gives the user information on the selected instrument; it took the most time to figure out, as the first step was to learn how to customize the player UI, then to program the panel to turn on when interacting with an object. Following that, the actual informational page needed to be displayed instead of a black screen, and finally the informational page needed to be specific to each item. Miraculously, I solved this toward the beginning of the week, leaving the rest of the week to prioritize web deployment and creating the survey. The web deployment took some time to understand, as it took many trials to finally load and render correctly, but I now have a fully working link to the game through my GitHub page and a survey to assess participants who use the game for educational purposes.
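The last step in that chain, making the pop-up content specific to each item, boils down to a lookup from the interacted object to its own informational text, with a fallback so the panel never renders blank. A minimal sketch of that logic (in Python for illustration; the actual game implements this in a Unity C# script, and the instrument names and descriptions below are placeholders, not the project’s real data):

```python
# Hypothetical per-instrument info -- in the real game this data lives on
# the Unity objects themselves; these entries are illustrative placeholders.
INSTRUMENT_INFO = {
    "laser_diode": "Photonics: converts electrical current into laser light.",
    "cryo_em": "Structural Biology: images frozen samples at near-atomic scale.",
}

def popup_text(instrument_id):
    """Return the informational text for the selected instrument.

    A default message is returned for unknown ids so the pop-up panel
    always displays something instead of a blank (black) screen.
    """
    return INSTRUMENT_INFO.get(instrument_id, "No information available yet.")
```

The same fallback idea is what prevents the “black screen” problem described above: the panel is only ever fed text that is guaranteed to exist.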

Image of game in use with informational pop-up
Image of informational game before clicking on object to show informational pop-up

Week 7:

After changing my survey slightly, I sent the link to my mentor to be reviewed and approved. Now that all development was complete, I was able to start beta testing. On Tuesday, I went to the ASRC to give a presentation to high school students on a field trip, recounting my journey from high school to doing research at Hunter College. Following that, the students explored the Illumination Space, where they were also the first group to test the game I created. After giving them the survey, I received great feedback on their satisfaction with the game and what could be added or reduced to make the game more fun and usable. Later in the week, the VR-REU cohort I am in went to the ASRC as a group on Thursday. During the visit, they also tested the game, gave feedback, and completed the survey, helping me with my data collection and analysis. Following that, we went on a tour, exploring the labs of the facility. This week has been slightly calmer than most of the previous weeks, but as this program comes to a close, I know the next week will be loaded with work.

Intricate Eye Movement and Its Effect on Perceived Realism in Look-Alike Avatars

Deshanae Morris, Farmingdale State College

Week 1:

The first week of the REU program began on the 2nd of June where I got to meet the group I would be working with for the next 8 weeks over a fun game of bowling. Everyone was extremely welcoming and although I utterly failed the game that day I still enjoyed myself. The following day, the group got to know their mentors and toured Hunter College. For the remainder of the week, I dedicated my time to brainstorming unique ideas and diligently working on drafting and finalizing my research proposal. I also conducted a literature review to ensure that my proposal was unique. While there is still a significant amount of work ahead, I am excited about the journey and look forward to achieving much more in the upcoming weeks.

Week 2:

During the second week of the REU program, I continued to develop my proposal, literature review, and methodology. This past week has been dedicated to understanding the technical aspects of my study to begin implementing my methodology. Dr. Wole has introduced the REU group to various applications and concepts related to 3D modeling and virtual reality systems, enhancing our familiarity with these tools. I have been practicing more with Reallusion as I will need it for my research paper, but I still have much more work to do to produce my final product.

Week 3:

This past week, I refined my methodology and concentrated on the technical aspects of my research. I focused on creating the first of two look-alike avatars required for the study and successfully developed a realistic avatar that closely resembles me. I will continue using Character Creator in Reallusion to see if I can make my avatars even more similar to their models. In the upcoming week, I plan to complete the second avatar, a male look-alike, but I am pleased with the progress thus far. Additionally, I experimented with the Facial Puppet and Live Face features in iClone to achieve the desired emotions and scenarios; it was fascinating to see my look-alike avatar come to life. Dr. Wole also went over immersion, presence, and reality with the group, further familiarizing us with reality systems and immersion. In the following week, I aim to begin recording the videos needed for my survey, complete the audio recordings, and develop the survey so I can start collecting data for my research.


Week 4:

This week, we concentrated on finalizing our methodology. I continued to refine my skills with Reallusion, Character Creator, and iClone to develop two avatars, one of myself and one of Dr. Wole, that I felt confident with. Utilizing the facial puppet feature in iClone, I began varying the eye movements in these look-alike avatars. However, I will continue to explore the applications and features to enhance their eye movements and facial expressions further.

The group also dedicated time to preparing for the midterm presentation scheduled for the end of the week. We each created slideshows and demos to showcase our progress, ensuring that everyone was informed about the individual projects and the overall direction of our work. I produced several demo videos to present to the group, aiming to see if they could accurately identify the different eye movements in my avatars. This exercise provided valuable insights into how participants in my survey might respond to my research. Regarding my research objectives, I plan to conduct a survey evaluating realism, emotion/intention, perception, and comfortability using a self-introduction scenario, specifically seeing how these eye movements play a role in an observational social context.

Week 5:

This week, our entire group concentrated on developing our surveys. I did start with Google Forms, but it wasn’t able to achieve what I hoped to capture with my data so I moved over to Qualtrics. I added most of my questions and videos to the survey through Qualtrics, but Dr. Wole and I agreed that adding cropped videos of my avatars would be a compelling element so I had to add those in too. As I continue editing my survey I gain more and more confidence in my research and I am excited about what I am planning to contribute to the research community. Throughout the week, we continued expanding our knowledge of virtual reality and 3D modeling with Dr. Wole. It’s always exciting to learn something new in the tech world.

This weekend, we’re planning a lunch cruise as we approach the end of the REU. It’s a bittersweet moment, but I’m looking forward to having fun and getting to know my group even better. By the end of the weekend, my survey should be finalized, and I should have some collected data to start adding to Overleaf. I can’t wait to see how the participants respond to my work!

Week 6: 
I took the time to finalize my survey, and although there were lots of revisions, I ended up with a final product I was really satisfied with. I used Qualtrics, which took a little time to get used to because of my lack of familiarity with the software, but it ended up being rewarding when I got it to work. I enjoyed the process of making the survey, and below I display the survey flow as a short insight into my survey and a component needed to create a survey using Qualtrics. I stumbled into a roadblock when developing my survey because my collected data was not displaying properly in the results section. It was hard to find a solution, as this data is valuable to my research, but after importing my data into a copied version of my survey, I managed to fix the problem. Additionally, Dr. Wole had the REU group review two demo research papers for SIGGRAPH. It was refreshing to finally be on the other side, reviewing other papers instead of receiving feedback on my own. The end of the REU is approaching quickly; therefore, I have to start adding to and updating my research paper on Overleaf so I can present a final product.
Week 7: 
This week, we finished data collection and started exploring how the responses could be analyzed for the topics we investigated. The results I collected were pretty interesting because the comfort and realism rankings differed across all my variables: avatar (female vs. male), eye movement (normal, random, static), and framing (full video vs. cropped video). Through some basic analysis, I was able to see how drastically each variable affected the realism and comfort rankings of my avatars, but I still have more statistical analysis to do, as I plan to conduct ordinal regression on my variables to evaluate the significance and the cause-and-effect relationships between them. Over the weekend I should finish my results and analysis so I can move on to the discussion and conclusions. I have lots of data and variables, so analyzing the data might prove extensive, but Dr. Wole has helped me break down the information to figure out what I need to find from it.
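Before fitting an ordinal regression (which would use a dedicated statistics package, such as statsmodels’ OrderedModel), a useful first pass is simply comparing the median rating per condition. A sketch of that step with entirely made-up ratings, since the real survey data is not reproduced here:

```python
from statistics import median

# Hypothetical 1-5 realism ratings grouped by eye-movement condition.
# These numbers are placeholders, not the study's actual results.
ratings = {
    "Normal": [4, 5, 4, 3, 4],
    "Random": [2, 3, 2, 3, 2],
    "Static": [1, 2, 2, 1, 3],
}

def median_by_condition(data):
    """First-pass comparison: median rating for each condition.

    Medians (not means) are appropriate here because Likert-style
    ratings are ordinal, which is also why the full analysis uses
    ordinal regression rather than ordinary least squares.
    """
    return {cond: median(vals) for cond, vals in data.items()}
```

This kind of summary helps sanity-check the direction of effects before the regression quantifies their significance.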
Week 8: 
The last week was very busy, as I, along with my participants, spent a lot of time finishing up the final components of our papers. Additionally, this week we participated in a teacher MoCap session where the group and I viewed multiple presentations from professors. One really interesting presentation was about people telling their stories or showcasing their culture through games; I found the game that was displayed to be a funny way of showcasing a real-world struggle. To be more specific, the game was about a woman of color swatting away the hands of those who want to touch her hair without permission. I was glad to support these teachers with their research, and I found it very rewarding. Furthermore, the group prepared to give our final presentation on Thursday to those at Iowa State, the group mentors, and the other REU students.

Enhancing Virtual Exploration for Blind and Low Vision Users: In-Place Virtual Exploration with Mixed Reality Cane

Hong Zhao, CUNY Borough of Manhattan Community College

Mentors: Hao Tang and Oyewole Oyekoya

 

Week 1:

The day before the VR-REU program officially kicked off, Dr. Wole organized a bowling icebreaker event. It was really fun! Everyone was very enthusiastic, and I was happy to meet the rest of the cohort. On Monday, we first completed an REU pre-program survey, then took a quick tour of Hunter College and met all the REU mentors on Zoom. My work officially started as I began discussing my project direction with Dr. Tang and getting a preliminary understanding of some of the current project code. Wednesday was the first class on VR, AR, and Mixed Reality, which primarily covered the underlying principles, how the technology works, and the history of VR. For the rest of the week, I reviewed some related literature and then had another discussion with Dr. Tang to finalize my research direction and complete my project proposal. Finally, on Friday, we got an introduction to ParaView and presented our project proposals.

 

Week 2:

This week, Dr. Wole introduced us to the writing tool Overleaf. He also demonstrated how to use Tableau for optimizing data visualization. 

I completed the CITI training and successfully developed the initial version of the MR Cane control system. Here is how it works:

  • When the user long-presses the screen for two seconds, the virtual character begins to move.
  • The direction of the character’s movement is determined by the current orientation of the headset.
  • The mobile phone acts as a white cane. Users can swing their phones left or right to control the movement of the virtual cane.

Initially, I used the AR Session’s camera to capture the phone’s rotation data. However, this method proved to be imprecise when the phone was parallel to the ground, leading to suboptimal performance. To address this, I switched to using the phone’s gyroscope to obtain the rotation angles. This approach has significantly improved the test results.
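
As a rough illustration of the gyroscope approach: angular-velocity samples can be integrated over time to recover a rotation angle, independent of the camera pose. The actual app is built in Unity; this Python fragment is only a minimal sketch of the math, assuming a fixed sampling interval, and all names and values are invented.

```python
# Hypothetical sketch: integrating gyroscope angular-velocity samples
# (rad/s) into a yaw angle, as one might do when the AR camera pose is
# unreliable with the phone held parallel to the ground.

def integrate_yaw(samples, dt):
    """samples: angular velocities about the phone's Y-axis (rad/s),
    taken every dt seconds. Returns the accumulated yaw in radians."""
    yaw = 0.0
    for omega in samples:
        yaw += omega * dt  # simple Euler integration
    return yaw

# A steady swing of 0.5 rad/s sampled at 100 Hz for 2 s gives a
# one-radian turn.
angle = integrate_yaw([0.5] * 200, 0.01)
```

A real implementation would also need drift compensation, since small gyroscope biases accumulate under pure integration.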

Here are some key points about using the gyroscope:

  • When the device is in its default orientation, the X-axis points horizontally to the right, the Y-axis points vertically upward, and the Z-axis points out of the front of the screen toward the user. The back of the screen therefore corresponds to negative Z values.
  • As the device moves or rotates, these coordinate axes remain fixed relative to the phone.

This new method using the gyroscope has shown promising results in our tests, enhancing the accuracy and responsiveness of the MR Cane control system.

Week 3:

This week, Dr. Wole lectured on topics related to immersion, presence, and reality, gave a demo seminar, and introduced us to the tools in VMD.

I developed a new method for controlling movement in place, inspired by the Meta Quest 3 controller. By using a mobile phone as the control device, users can simulate forward movement by swinging the phone up and down. It’s a more natural and intuitive way to navigate the virtual space, making the experience feel even more immersive.
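
The swing-to-step idea could be pictured as a threshold detector on the phone's pitch velocity with simple debouncing. This is an illustrative Python sketch, not the project's Unity code; the threshold value and the re-arming rule are assumptions.

```python
# Hypothetical sketch: register one forward "step" when the phone's
# downward pitch velocity exceeds a threshold, then ignore further
# samples until the swing reverses (debouncing).

def count_steps(pitch_velocities, threshold=2.0):
    steps = 0
    in_swing = False
    for v in pitch_velocities:
        if not in_swing and v > threshold:   # downward swing begins
            steps += 1
            in_swing = True
        elif in_swing and v < 0:             # swing reversed; re-arm
            in_swing = False
    return steps

# Two distinct downward swings separated by an upswing count as two steps.
n = count_steps([0.1, 2.5, 3.0, -1.0, 0.2, 2.8, -0.5])
```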

Additionally, I completed the development of the footstep sound management module. This module plays corresponding footstep sounds based on the material of the ground the character is stepping on. The technical details involve creating the character’s walk animation and adding a callback function to play sounds at keyframes. An animator is used to control the transitions between standing and moving states of the character.
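
The material-to-sound mapping described above might look like the following sketch. The real module uses Unity's animation keyframe callbacks and an AudioSource; this Python version only mirrors the lookup logic, and the clip names are invented.

```python
# Hypothetical sketch of the footstep-sound module: a lookup from
# ground material to a sound clip, called from an animation keyframe
# callback. All clip names here are invented for illustration.

FOOTSTEP_CLIPS = {
    "wood":   "footstep_wood.wav",
    "carpet": "footstep_carpet.wav",
    "tile":   "footstep_tile.wav",
}

def on_footstep_keyframe(ground_material, play=print):
    """Called at each 'foot down' keyframe of the walk animation."""
    clip = FOOTSTEP_CLIPS.get(ground_material, "footstep_default.wav")
    play(clip)  # in Unity this would be an AudioSource.PlayOneShot call
    return clip
```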

I also added a settings menu to allow users to switch between different movement control modes.

 

 

Week 4:

Our coursework this week was very rich. Dr. Wole covered topics such as 3D tracking, scanning and animation, interaction and input devices, and an introduction to GPUs. These topics were very interesting and have been very helpful for my project.

This week, I mainly completed a module called Layout Learning. This module is designed to help visually impaired individuals build a mental map before exploring virtual environments. Users hold down the screen and move a finger back and forth to explore the layout of a virtual room; when the finger touches a wall, table, chair, or other object, the system provides vibration feedback and plays the corresponding object name. When the finger moves outside the room, the system alerts "Out of Room." Some of these improvements actually came from user feedback during user studies.
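
The touch-to-feedback behavior can be pictured as a lookup into a coarse occupancy grid. This is a hypothetical Python sketch; the real system queries the 3D scene in Unity, and the room contents below are invented.

```python
# Hypothetical sketch of the Layout Learning lookup: a coarse 2D grid
# maps a touch position to the object under the finger, so the app can
# vibrate and speak the object's name, or warn when outside the room.

ROOM = [
    ["wall", "wall",  "wall"],
    ["wall", None,    "table"],
    ["wall", "chair", "wall"],
]

def probe(row, col):
    """Return what the finger is touching at this grid cell."""
    if not (0 <= row < len(ROOM) and 0 <= col < len(ROOM[0])):
        return "Out of Room"
    obj = ROOM[row][col]
    return obj if obj else "open floor"
```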

 

Due to focusing on the development process previously, I fell a bit behind on my research paper. So, on Wednesday and Thursday, I caught up on my research paper progress and completed the introduction and methodology sections.

On Friday, we had our midterm presentations, and everyone performed very well. I also completed my presentation smoothly. 

 

Week 5:

This week, Dr. Wole lectured on Graphics Processing Units (GPUs) and Immersive Audio. I learned a lot from the class. The presentation introduced many technical points that help us improve rendering efficiency. 

In terms of development, I focused on enhancing the tutorial module of my app. This module is designed to guide users on how to use their phone as a white cane. It provides instructions on turning and on moving forward using different gestures, such as long presses and downward swipes. The most exciting part is the use of 3D audio and sound playback speed to help users navigate toward their targets. The 3D audio cues indicate whether the target is on the left or right, and the playback speed increases when the user is facing the target directly.
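
The guidance logic, left/right cues plus speed scaling, can be sketched as a function of the angle between the user's heading and the target. This is an illustrative Python sketch with an assumed speed formula, not the app's actual implementation.

```python
# Hypothetical sketch of the audio guidance cue: the signed angle
# between the user's heading and the target picks the audio channel,
# and playback speed rises as the user faces the target more directly.

def audio_cue(heading_deg, target_bearing_deg):
    # signed angular difference, wrapped into (-180, 180]
    diff = (target_bearing_deg - heading_deg + 180) % 360 - 180
    channel = "left" if diff < 0 else "right"
    # assumed mapping: speed 1.0 when facing away, 2.0 when facing it
    speed = 1.0 + (1.0 - abs(diff) / 180.0)
    return channel, round(speed, 2)

# A target 90 degrees to the right yields a right-channel cue at a
# moderately increased playback speed.
cue = audio_cue(heading_deg=0, target_bearing_deg=90)
```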

Additionally, I made significant progress in setting up the Apple developer account and successfully uploaded the app to TestFlight. This will make it easier for our testers to download and use the app, providing us with valuable feedback. I also designed a survey and created it using Google Forms to gather detailed user feedback on their experience with the app.

 

Week 6:

This week I focused on designing all the survey questions. The goal was to verify whether my Layout Module can help BLV (Blind and Low Vision) users build a mental map and to assess its effectiveness. I also aimed to test the operability of the two in-place movement control methods, long-press and swing-down, by asking users to choose their preferred method.

Additionally, we invited a few high school students and a BLV user to the Hiterman Hall lab for testing. The non-BLV testers performed the app tests with their eyes closed to simulate the BLV experience. After the tests, they completed the survey questionnaire.

This Saturday, the REU cohort attended a lunch cruise. We had a great time and took many beautiful photos. Moreover, Dr. Wole assigned each REU member two research papers from SIGGRAPH to review and evaluate.

 

Week 7:

On Monday, we shared the progress of our respective projects. Then, we reviewed the papers from the SIGGRAPH Asia conference. Those papers truly provided me with a lot of inspiration. On Thursday, we went to the CUNY Advanced Science Research Center, where we experienced some very cool interactive visual technologies and visited various scientific labs. I also participated in Cason’s project presentation, which was really creative.

This Friday, Dr. Wole explained some statistical analysis strategies and provided reasonable analysis suggestions for each person’s survey. I learned many new analysis methods and look forward to applying them to my project. This week, I also participated in application testing and surveys conducted by other REU students. I invited high school workshop students to test our application with their eyes closed and complete surveys. Their feedback was very helpful to us.

Additionally, I tried to invite BLV (blind and low vision) users from a blind school to test our application. Unfortunately, when I arrived at the blind school, they were in class and unable to participate in the testing. But I will continue to look for opportunities to get them involved.

 

Gamification of Food Selection and Nutrition Education in VR

Student: Caroline Klein, Vassar College

Mentor: Margrethe Horlyck-Romanovsky, Brooklyn College

 

Week 1

After a hectic move-in and learning to use the subway on the first day, I was happy to meet Dr. Wole and all my wonderful cohort members at bowling. Monday was mostly settling in to Hunter and learning about the structure of the program, but later in the week we dove into discussing research conferences and VR, including a brief introduction to ParaView. I also met with my mentor twice to discuss research ideas and develop my project. I spent most of the week researching, brainstorming, and writing my proposal in my room, using the previous REU publications as reference. I also dedicated a significant amount of time to completing my CITI certification to ensure proper research practices.

I am currently planning to develop a virtual buffet environment and design an experiment to see how incorporating gamified elements like nutrient-based point incentives influences people’s food selections in the simulation. It was challenging gauging how much development I could accomplish in 4-5 weeks since I’ve never worked with Unity or VR, but I am open to adjusting my project as needed once I am familiar with the software and have a better sense of what I can accomplish. Although things were a bit unclear at first, I’m excited for the following weeks as I take the next steps with my project and begin learning Unity.

 

Week 2

I spent most of the first half of the week setting up and connecting the MetaQuest headset and learning how to deploy a game from Unity. Kwame was very helpful in this area, and spent time during Wednesday’s class helping me and Amaya get a handle on the basics. I followed some introductory Unity tutorials and played around with the VR development environment on my own as well. For the second half of the week, I completed my literature review and drafted the Introduction, Related Work, and References sections of my research paper. I also completed my CITI certification and tried out Tableau in class on Friday. On my own time, I was excited to attend a women in computing event at Bank of America and get a NY Public Library Card after finally getting our student IDs. 

Next week, my goal is to jump into development and make significant progress on the VR game because the headset logistics took longer than I expected this week and I have not started implementing the actual buffet simulation yet. I aim to finish the game in the next 3 weeks so I have enough time to collect data, analyze results, and finalize the paper. I will also start planning the specifics of the buffet options and gathering information about their nutritional value using the USDA database as my mentor suggested.

 

Week 3

It was challenging resolving runtime errors on my own and piecing together tutorials to develop the features I wanted, but I feel like I made real progress in Unity this week as I began building the VR simulation. I was able to model a simple version of the buffet setting, and implemented movement controls, hand animations, grabbing and ray interactors, text boxes, and updating the overlaid image of a point bar when a food item is selected. These components will lay the foundation of my game. I also conferred with my mentor and explored the available assets on the Unity store to plan the food options, and started writing the methodology section of my research paper. Most of my week was spent on technical development at home, but we also learned more about VR concepts and technologies like immersion, presence, and VMD in class.

 

Image of test simulation

 

One setback I had this week was that something went wrong when I was working on preventing food items from falling through the floor that made the program very glitchy, and I couldn’t figure out how to fix it. Luckily, I had backed up an early version of the project to GitHub so I was able to retrieve that and redo my work. This was a lifesaver and I will definitely continue to utilize version control on GitHub as I continue to implement the program. However, I am still finding Unity’s errors confusing and sometimes don’t know what is causing them. In the coming week, I hope to expand on my Unity project to have a working prototype of my game with a limited selection of options, before expanding to accommodate all food options and nutrient points in the final version.

 

Week 4

I started off the week with some technical progress as I gathered assets from the Unity Store and arranged them in a 3D scene to make the buffet setting for participants to interact with. One issue I ran into was that we had planned to have several salad options as part of the food selection, but strangely there were no salad prefabs in the asset store. Luckily, I was able to find a lettuce leaf prefab which I used in combination with a bowl, tomato slices, and dumplings (hidden among the leaves to look like chicken) to make the desired salads. Now that the virtual setting is more expansive, I am starting to experience motion sickness when testing the game since I am spending longer in it and moving around more, but it hasn’t been too bad.

 

Point system activated      Buffet setting

 

After meeting with my mentor on Tuesday, we decided to amend the methodology and drop the two-group design because of the challenge of validating differences between the groups and the time cost of recruiting many participants, since each has to complete the VR simulation. I quickly updated the research paper to reflect this, but plan to rework each section more fully to reflect the change. I feel like I'm having trouble pinning down what type of results and conclusions I want to draw from the experiment now that I'm not comparing how food choices varied with the gamified intervention, but I hope to meet with my mentor soon to develop this more.

I continued game implementation throughout the week, adding more point bars and the food list to the overlay. I spent Thursday afternoon trying to integrate a SQLite database but kept running into errors and was ultimately unable to add it. I've found debugging in Unity very frustrating, since it reports build errors but gives no indication in the log of where they are coming from. This necessitates building every few minutes to check for errors, but each build takes about 30 seconds to run and still doesn't tell you exactly what is wrong.

I also prepared a slideshow and demo videos for our midterm presentations on Friday. In this presentation, I briefly went over my proposed idea, literature review, methodology, and technical progress, showcasing the virtual scene and interaction mechanisms. It was also helpful to hear about how everyone else’s projects were developing. I finished up Friday by working on the verbal instructions and Google Forms to give participants before and after the experiment. Next week I hope to combat some of the limitations I identified this week and finish up the technical implementation aspect so the game is ready for user trials.

 

Week 5

This week was heavy in technical implementation as I worked on incorporating feedback from the midterm presentation and finishing up the game. Technical accomplishments include:

  • Restructuring the point system code to make it more organized and easier to scale out by associating each food option with a FoodItem object (with attributes for mass, nutrition values, etc.) that I can pass into the add and remove functions.
  • Adding ray interactor functionality so that when you hover over the different point bars you can see health information about each nutrient.
  • Adding the ability for players to change their food selections and remove items from their plate. This was recommended by my mentor to make participants feel empowered to make informed decisions after seeing the effects of their choices.
  • Expanding the existing functionality and details I had prototyped with the burger selection demo to all point bars and food items, including accurate point assignments based on nutrition label data.
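
As a rough illustration of the restructuring described above, each food option might be modeled as a small object with nutrient attributes that shared add and remove functions consume. The real project is in Unity/C#; this Python sketch uses invented names and point values.

```python
from dataclasses import dataclass

# Hypothetical sketch of the restructured point system: each food is a
# FoodItem with nutrient values, and one pair of add/remove functions
# updates the running totals shown on the point bars.

@dataclass
class FoodItem:
    name: str
    mass_g: float
    fiber: float       # illustrative points per serving
    vitamin_e: float   # illustrative points per serving

totals = {"fiber": 0.0, "vitamin_e": 0.0}

def add_item(item):
    totals["fiber"] += item.fiber
    totals["vitamin_e"] += item.vitamin_e

def remove_item(item):
    """Lets players undo a selection, per the mentor's recommendation."""
    totals["fiber"] -= item.fiber
    totals["vitamin_e"] -= item.vitamin_e

salmon = FoodItem("teriyaki salmon", 150.0, fiber=1.0, vitamin_e=4.0)
add_item(salmon)
```

Keeping the nutrient data on the item rather than hard-coded in each handler is what makes the system easy to scale out to new foods.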

I also decided to replace the sushi roll option with a teriyaki salmon and broccoli dish in order to have a dish rich in Vitamin E since that is one of the nutrients I am highlighting. In addition to development in Unity, I worked on planning the pre- and post-surveys for the experiment. My mentor recommended using an established NEMS-P Food Environment survey for reference, so I will plan to incorporate some of those questions and add similarly structured questions about specific nutrients for the knowledge quiz. She also recommended adding an indicator for the recommended daily amount of each nutrient and nutrition labels for each food to the game, so I will look into incorporating these components, too. On campus this week we continued learning about VR topics like GPUs, audio, telepresence, and VR sickness, and played around with Gephi for graph/network visualizations.

 

Week 6

After seeing more of NYC and connecting with the other REU participants on a lunch cruise over the weekend, I jumped into preparing for the research trials. Since I have human subjects testing the game, most of the week was spent preparing for the IRB application. I finished all the required documentation, such as consent forms, and got a faculty member from my home institution to sign off on the Reliance Agreement. I also finalized the Google Form questionnaires and had them approved by my mentor. On the technical side, I added some features to enhance the game experience, such as sounds and an end screen. Adding the background music was simple, but I struggled for a little longer with debugging event-triggered sound effects for food selections.

 

End screen       Hover nutrient information panels

 

I also spent some time reviewing SIGGRAPH Asia XR submissions, which exposed me to some cool research being done. At this point, the VR game and pre- and post-study questionnaire forms are finalized, so I am mostly waiting on IRB approval to start running the study with participants.

 

Week 7

I started the week with some initial usability testing and incorporating user feedback into the game (including fixing typos!). One of the main complaints was motion sickness, so I slowed down the movement controls to help with that. After a final review of the questionnaire, I began the official user study and was able to collect data from 10 participants. Some motion sickness was still reported, but it was much less and did not interfere with the experience. I updated my research paper in Overleaf to reflect the current state of the project.

At Hunter this week, we started off by sharing our progress on our projects and going over our SIGGRAPH Asia conference paper reviews from last week. On Thursday, we took a group trip to the CUNY Advanced Science Research Center, where we got to try some cool interactive visual technologies and tour the various science labs. We ended the week discussing strategies for statistical analysis and our plan for final presentations. Throughout the week, I also participated in all the other REU students’ studies, which involved filling out surveys and testing their applications.

I feel like I still have a lot to do in the last week with data analysis and finishing the research paper, but I’m glad I have the technical implementation and user trials done, and I’m excited to put it all together.

 

Week 8

At the beginning of the week, I analyzed my user response data, identifying the appropriate statistical tests based on which assumptions were met and following Laerd tutorials to process the data. I used IBM SPSS software to evaluate the significance of the increase in knowledge scores and perceived nutrient importance. Although I found a mean increase in knowledge quiz scores, it wasn’t statistically significant according to the Wilcoxon Signed Ranks Test. However, there was a statistically significant increase in the reported consideration of dietary fiber, lutein and zeaxanthin, and vitamin E after the intervention, though these were based on subjective self-reported scores. I also calculated summary statistics to assess changes in awareness of nutrients and to identify which aspects of the game experience were effective. I updated the results section of my paper to reflect my statistical analysis and wrote the discussion and conclusions sections.
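
For reference, the Wilcoxon signed-rank statistic that SPSS reports can be sketched in a few lines: paired differences are ranked by absolute value (zeros dropped, ties given averaged ranks), and the smaller of the positive and negative rank sums is the test statistic W. This is an illustrative stdlib sketch, not the analysis code actually used; the example scores are invented.

```python
# Hypothetical stdlib sketch of the Wilcoxon signed-rank statistic
# used to compare paired pre/post knowledge-quiz scores.

def wilcoxon_w(pre, post):
    # keep only nonzero pre/post differences
    diffs = [b - a for a, b in zip(pre, post) if b != a]
    ranked = sorted(diffs, key=abs)
    # assign average ranks to ties in |d|
    ranks = {}
    i = 0
    while i < len(ranked):
        j = i
        while j < len(ranked) and abs(ranked[j]) == abs(ranked[i]):
            j += 1
        avg = (i + 1 + j) / 2  # mean of ranks i+1 .. j
        ranks[abs(ranked[i])] = avg
        i = j
    w_plus = sum(ranks[abs(d)] for d in diffs if d > 0)
    w_minus = sum(ranks[abs(d)] for d in diffs if d < 0)
    return min(w_plus, w_minus)  # test statistic W
```

The statistic alone is not enough in practice; SPSS also converts W to a p-value, which is where the (non-)significance conclusion comes from.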

By Wednesday, I had a finished draft of my paper, but I had to revise it because two more people participated in my study, bringing the total to 12 participants. Fortunately, most general results and conclusions remained the same, but I had to spend a few hours redoing the statistical analysis and updating the paper accordingly. The REU group also participated in a Motion Capture event on Wednesday, where different teachers gave short presentations while wearing motion capture technology and we evaluated their performance. It was a long day, but it was exciting participating in an external research study and fun listening to the various talks. At this point, I was in the process of getting feedback from my mentors and revising the paper.

We had our final presentations on Thursday, which consisted of a 10-minute slideshow summarizing our research projects. We presented to our REU peers, mentors, and some Iowa State University REU students. In return, we listened to their research presentations on Friday. The rest of Friday, the group spent the entire day at Hunter finalizing our papers. I created my one-page poster, reviewed submission guidelines, and clarified some details with Dr. Wole. We also completed the end-of-program survey. It was a fun day, as we all worked in the same room, talked over lunch, and helped each other navigate the submission process.

 

Final presentations Zoom

 

I’m not fully done with my project at this point because I haven’t submitted yet (almost no one has), but I feel like I am in a good place. My mentor is in Europe right now and has limited availability, but I want to give her a chance to read it over one last time before I submit. I will be submitting a Poster Submission to ACM ISS Poster 2024, which is not due until August 15th, but I hope to send it in early next week. Overall, the Hunter REU program was a very enjoyable experience that I was happy to be a part of. Even if our research didn’t feel the most impactful, I definitely feel more confident with the process of writing and publishing a research paper, which I knew essentially nothing about at the beginning of the program. Beyond that, I loved all my fellow REU participants and couldn’t have asked for a better cohort group. I wish everyone the best of luck in their future endeavors, and hope this blog convinces someone to apply for the REU.

VR as a sensory stimulation tool for adolescents with ASD and anxiety

Amaya Keys, Howard University

Week One:

As a pleasant start to the program, I met the other students I'd be spending the next eight weeks with over a fun night of bowling. On Monday morning, we briefly met with our mentors via Zoom following an in-depth tour of Hunter College's facilities. My mentor, Mr. Daniel Chan, recommended that we keep in mind the scope of the project and not overextend ourselves with a wildly complex project for only eight weeks. To me specifically, he suggested that I pick one specific disability, do thorough research on it, and from there decide on how a VR/AR application might help. I met with him one-on-one twice during the week to receive guidance on all of the jumbled thoughts racing through my mind. After bouncing from idea to idea, I finally landed on the development of a virtual multi-sensory stimulation room for adolescents with ASD who experience anxiety. To close out the week, we completed a short lab on ParaView, decided on a conference that we would submit our research to at the end of the summer, briefly reviewed our updated project proposals, and finally got a sneak peek into the VR lab. Now that my project is solidified, I hope to dive into experimenting with Unity, as I know I have a steep learning curve ahead of me.

Week Two:

To start off the week, I finished the remaining modules of my CITI training so that I could focus solely on writing my literature review. I also received one of Dr. Wole's Meta Quest headsets to take home and begin experimenting with. I attempted to set it up with my computer and phone on my own and deploy a build from Unity, but experienced many challenges. Later in the week, Kwame assisted me with the set-up process, and we ran an escape room demo to confirm it was working. Following the successful set-up, I worked to enable hand tracking on the headset and test another demo, but ran into further difficulties that I am still working to resolve.

I spoke with my mentor at our scheduled Tuesday meeting time, and we just ensured that I was feeling confident and secure in the direction my project was heading. Throughout the week I continued to take notes on various studies and worked to weed out the ones I felt were unnecessary to include in my paper. The first draft of my literature review was unnecessarily long and contained way too many sources that weren’t directly related to my research, but by Thursday night, I had finalized a version that I was happy with. I sent it to my mentor for feedback and while he had a few comments about potential changes, he was overall pretty satisfied with how it looked. On Friday, we conducted a brief lab on Tableau and updated Dr. Wole on the status of our projects.

Next steps for me include solidifying my methodology and making headway on the development of my sensory room. I intend to search through the Unity assets store and TurboSquid to hopefully find some ready-made elements for my project. Before anything else though, I definitely want to fix the errors with the hand tracking functionalities, as this will be a significant element of my application.

Week Three:

Now that the introduction and related works sections are out of the way, I used this week to focus on the development of my application in Unity. I got off to a pretty slow start, working through which packages I needed to import and how to enable hand tracking. My next goal was to create 3D bubbles that the user can pop as they rise from the floor. I looked through the Unity Asset Store and TurboSquid but couldn't find any pre-made model that would fit my needs. I followed a tutorial to create a basic bubble GameObject and wrote a script for the bubbles to multiply and continuously respawn. I am still trying to equip them with a poke interaction so that they "pop" when touched. I then created a light panel on one wall of my room and am working to make each circle change colors when tapped. Now that I have a little more experience with navigating and troubleshooting in Unity, I hope to be more productive with development next week.


Week Four: 

This week I focused on making as much technical progress in Unity as possible while also refining my methodology to prepare for our midterm presentations on Friday. I decided that the five elements I wanted in the environment are a light panel, poppable bubbles, 3D object play, glowing interactive stars, and an alphabet/number board.

By attaching a script to the Interactable Unity Event Wrapper from the Meta Interaction SDK, I was able to enable random color changes of the buttons on the light panel when touched. With that baseline in place, it became much easier to add interactive components to other objects in the scene. I also decided to make the light panel resemble an LED hexagon panel commonly used by individuals with autism. I then modified a map and pins from one of the Meta Interaction SDK samples and imported models from Sketchfab and CGTrader to recreate a magnetic alphabet board; the user can drag letters and numbers from the tray and place them freely on the board. The third element of focus this week was importing 3D shapes and enabling the components that allow the user to scale, rotate, position, and throw them in their hands.

I now have a bubble-popping animation clip but was still unsuccessful in getting the bubbles to pop when poked, which will be a focus for next week. Another issue I'm running into is that the game starts with a different camera angle every time I run it, and I am not sure why. I have worked around it for testing purposes, but I will definitely have to resolve it later so there is no confusion when participants put on the headset. As a relieving end to a stressful week, midterm presentations went well, and it was interesting hearing the updates on everyone's projects. It's refreshing to know we're all dealing with different challenges and working through them at around the same pace.

 

Week Five: 

This week was my most productive on the technical side. While I unfortunately was not able to get the bubble animation working in Unity, I did enable a simple popping functionality where the bubbles are destroyed on touch. The final activity I created was a series of interactive stars in the sky that have paths drawn between them when "activated," or lit. I then imported a counter surface for my activities to be displayed on and created a simple dome in Blender as the exterior. The new models caused some technical problems with the existing objects, so a lot of my time was spent debugging those issues.

 

 

I also worked on completing the documents required for IRB approval. However, by the end of the week no prospective participants had reached out to me, so Dr. Wole and I discussed whether it was worth going through the IRB process at all. He brought up the idea of designing a work-in-progress research paper instead of conducting a user study. This would give me the opportunity to either turn this into a long-term project or allow another researcher to pick it up.

Week Six:

On Tuesday, Dr. Wole, Mr. Chan, and I discussed the direction of my project and decided it would be best to follow through with the work-in-progress paper and put the IRB process on hold for now. With that said, I don't have much to write about this week, as I just focused on making tweaks to my virtual environment and editing my experimental procedure. The room now has a welcome console with options for the user to control audio input, as well as text instructions beside each activity. I still need to record a short demo of each activity in action to be played above the instructions for those who may want a visual. I also added an additional constellation for users to interact with.

I am having many technical difficulties with the letter/number board and may need to start brainstorming alternative methods for how it will function. I would like the user to be able to drag letters from the tray and place them on the board, which would then "lock" the object to the board surface as if it were magnetic. This is proving quite difficult, so instead users may simply touch the letter or number they'd like and have it appear on the board. I'd like to have everything finished by early next week so I can receive feedback on the environment from others in my cohort. It will not be a traditional user study, but it is still feedback that can ultimately be included in my paper.

VR-REU 2024

Or Butbul

 

Week 1

 

After landing in a sunny New York and unpacking my bags, I headed down to the lobby to meet the group of people who would be living in the same residence as me. We all went bowling with the rest of the group and had a lot of fun! The next day was spent touring the college, meeting the professors, and starting to talk about proposals. Initially, I was drawn to a project focusing on motion capture, but after reading the bulk of the articles on it, I decided to shift my focus toward graphics. My proposal was accepted after the class we had on Wednesday, and I spent Thursday refining it. Friday was spent learning ParaView and exploring the city.

 

Week 2

This week focused on completing the preliminary material. I have been working on the CITI training materials as well as the literature review for the beginnings of my paper. I have been having an issue with my home university account that has been preventing me from connecting to the internet here and limiting what I can do while we meet. I hope to resolve the issue before the beginning of next week, as well as set up a remote desktop so I will be able to access a faster computer to render my virtual humans. Outside of the project, I have been trying many new restaurants and getting to know my cohort. Talking about each other’s projects has given us clarity about our ideas and goals.

 

Week 3

I managed to create the virtual humans last weekend. The main goal of this week was to connect to my home computer remotely so I could work in Unreal and access my MetaHumans. I connected to my home computer on Wednesday and downloaded the mesh of my first avatar. This weekend I plan to get the mesh of my other avatar, along with the textures for both. Choosing Blender over Unreal was a big decision for me: Unreal Engine has its own rendering pipeline better suited to the realism of the MetaHumans; however, Blender gives me more information on render time and memory usage, which is vital to the project. The goal for next week is to get all the renderings of the avatars so the survey can be prepared. As an aside from my work this week, I had some fun opportunities to meet my group for pizza and shopping at Chelsea Market. I also met some of my old friends who live in the city now.

 

Week 4

This week focused on preparing the material we needed for our midterm presentations. I spent the bulk of my week refining my literature review and creating a more detailed methodology for the project itself. It was difficult to connect to my home computer this week, so I did not have much time to work in Unreal, but I did have some work I could do away from it. Namely, I could prepare animation data so my avatars could move in the survey and survey takers would have a better reference for the avatar’s level of realism. Next week I plan to get the renderings of my avatars at their different levels of detail.

This is an example MetaHuman with an idle animation found in MetaHuman Creator, Unreal Engine’s web-based platform for avatar creation.
Outside of work, I went to a festival over the weekend in Brooklyn and I was able to see many amazing singers and hear a lot of great music.
Week 5

This week held many difficulties for me in capturing the renderings of my virtual human. Capturing animation from MetaHuman Creator was not possible, so a custom idle animation had to be recorded using Live Link Face. Importing that animation into Unreal and getting it to work with the avatars was a long ordeal as well. I was able to change the level of detail of my avatar in the scene and modify its texture maps, which will be helpful for the study. Unreal also provides me with the render time of each frame and the total render time, which is helpful for the project. Lastly, I had a big issue getting a specific render engine to work with my MetaHumans. I wanted to use Unreal Engine’s path tracing as my chosen engine; however, that engine has very little support for MetaHumans, and renderings could not be achieved at a sufficient level of detail. I have decided to use the detailed lit render mode to replicate a lower-performance system’s render engine, while still maintaining a sufficient level of detail overall.

My survey questions have been prepared; I plan to render the avatars as soon as I can and import those videos into the survey, hopefully finishing the survey before the start of next week.
Week 6

This was a great week to get work done! All the MetaHumans were animated and rendered. The image sequences were rendered out as PNG sequences and then assembled into MP4 videos in Blender. I then tried to import the videos into Qualtrics to create the survey; however, the videos were too large. I used Giphy to downscale them and turn them into GIFs, which seemed to work with the survey. The surveys should be completed over the weekend, and distribution will likely happen at the beginning of next week.

Enhancing Trust in Telepresence: The Influence of Familiarity and Varied Eye Contact on Trust in Look-Alike Avatars

Kriti Kalary, SUNY University at Albany / SUNY Upstate Medical School

Week 1:

I arrived in New York City on Sunday and we kicked off the REU program with an icebreaker social event. I met the rest of my REU cohort while bowling (I wasn’t very good!). All too soon, we jumped right into work. I met with my mentor, Dr. Wole, and started brainstorming ideas for my proposal due on Friday. I worked to narrow down my field of interest and land on a unique yet interesting idea for my project—I had to toss out a lot of ideas before ending up with something I was happy with. This week also started off our first few classes of Dr. Wole’s VR, AR and MR course which has been incredibly interesting so far. I spent the rest of the week furiously searching for relevant articles to include in my literature review, trying to round out my rationale and support my research proposal. For fun this week, I met a friend in Central Park and tried some great bagels!

Week 2:

Week two was equal parts work and fun! I had a couple of road bumps this week where I had to alter my proposal a bit to make it more unique. Luckily, my older research for my literature review from last week was still helpful and I was able to collect all the information that I needed for my paper without too much trouble. This week, I finished my introduction and related works section of my paper, created a figure for the theoretical model of my paper and started outlining my methodology.

I also downloaded Reallusion’s Character Creator and tried to get it to work on my MacBook through Parallels, but it took too long to load. The Headshot plugin worked well, so I will likely end up working on the software aspect of my project next week on the computers at CUNY Hunter. For fun this week, Or, Asmita and I went to the Chelsea Market, some of my friends came up to visit me, and we explored Central Park. I also listened to a performance by the NY Philharmonic and saw fireworks!

Week 3:

This week saw some pretty solid progress on the technical front! I created my two avatars: one familiar to participants, modeled after my own face, and the other modeled on a headshot of a random person I found online. The avatars look fairly realistic. I imported them into iClone and tested out the Live Face app to try various methods of eye contact/gaze behavior, and it worked out well.

The first avatar is the unfamiliar avatar, while the second is the familiar avatar (I still have a few edits to do for the former).

I also finalized my methodology this week, so I’ll be in a good spot next week to start working on my survey. For fun this week, I went to see a 24kGoldn concert and explored Brooklyn with a friend! I also went thrifting and visited some cute bookstores.

Week 4: 

I revised my methodology and technical implementation this week! I ended up completely refreshing my avatars, since I wasn’t super happy with my original versions of the unfamiliar and familiar avatars from last week. I also worked on my slide deck for the midterm presentation on Friday and created demos to share with my classmates to update them on my progress. Here are pictures of my new refreshed avatars (unfamiliar and familiar, respectively).

iClone worked well without any issues for my survey video clips, and I was able to standardize the eye movements for each eye contact level to maintain a control variable for each avatar. The plan for the following week is to start working on the survey and send it out to begin collecting data.
For fun this week, I met up with a friend and went to Roosevelt Island, visited the Guggenheim museum and tried out some new restaurants with friends! I also got a chance to catch up on some of my shows (The Boys S4 and Invincible S1) and got time to get through some books as well (A Room with a View by E. M. Forster and Notes from the Underground by Fyodor Dostoyevsky).
Week 5:

This week I edited my clips for the unfamiliar and familiar avatars and created the survey to start collecting some data. I encountered some roadblocks in randomizing the order of my questions across all three parts of my experiment: the demographic section, the trust game section, and the trust and familiarity section. I ended up reorganizing my survey in Google Forms, moving each avatar video clip from its own individual section into one combined section. I also struggled with attaching my video clips to the form and to individual questions, so instead of embedding the videos I simply placed a YouTube link in the stem of each question for easier user access. I hope to finalize my survey, send it out, and start collecting preliminary data next week.

For fun this week, I went to the Met and spent time with my friends and cousins who came to visit me. I went to the vegan night market with my colleagues and tried some fantastic bagels. I also got a chance to watch the fireworks on July 4th and had incredible Korean fried chicken!
Week 6:

I continued to edit my survey and sent it out to collect preliminary data. Using feedback from the initial 5-10 responses as a gauge, I improved my question phrasing and edited my survey accordingly. I then sent out the updated survey to collect data. I now have 33 responses with the following characteristics.

Next week, I hope to collect some more responses and make headway into data analysis. I aim to make some figures and start analyzing my data with a focus on looking for significance. For fun this week, I met up with friends for dinner at Red Lobster, went to a specialty bookstore, and explored Times Square. I also finished reading Battle Royale by Koushun Takami and finished watching Invincible S2!

Utilizing VMD to Visualize and Analyze Two PDB Files of FNDC1

Asmita Deb

Week 1

This week we started by bowling and getting to know everybody we would be working with for the next 8 weeks. It was a great team-bonding experience, and by Monday, when we stepped into Hunter College, everything went smoothly! We went up to a conference room, met with the mentors of everyone in the REU, and then took a tour of the college itself. We came back up, were told our tasks for the week, and then left. The next few days were dedicated to creating a proposal for our project that would satisfy our mentor, Dr. Wole, and ourselves. On Wednesday we submitted and discussed our drafted proposals and then finalized them to be submitted on Overleaf. Throughout this we also watched many lecture videos that introduced us to VR/AR and the research tools we would be using. The project I am planning to focus on is the use of Paraview, a scientific visualization software, to analyze and visualize a protein called FNDC1, whose overexpression is associated with multiple cancers.

Week 2

This week I worked on my literature review. I found about 4-5 sources on the protein FNDC1, as well as on using visualization software and how it could benefit the research of understudied proteins. This was my first time using Overleaf, but it was easy to get the hang of. I also played around with Paraview all week and downloaded both of my files. I’ve been editing them, but I will also try VMD next week, just to see the differences. I also figured out what type of user study I want to conduct and what I specifically want to write about in my paper. Throughout the week I also enjoyed good food, fun activities, and bonding with the other interns!

Week 3

This week was more focused on the methodology and moving forward in our projects. I inserted both of my PDB files of the two different structures into VMD, but many of the analysis tools in VMD require some sort of trajectory or movement file. To obtain one, I had to download a separate program called GROMACS, which is widely used for molecular dynamics and trajectory generation. I didn’t have much previous experience with Homebrew, CMake, or terminal coding, but through this experience I learned a lot. I got my movement files and will now load them onto my VMD protein frames. I have also been working on my research paper on Overleaf and watching the class videos.


Week 4

This week I worked on my midterm presentation, which we would be showing to our peers and the other mentors. The presentation included an introduction to our project, why we were conducting this research, our literature review (papers and works related to our project with significance to our research), our methodology, future work, and a video that served as a demo of what we’ve done so far. It was a very stress-free presentation, used for feedback and help on our projects. In the next week I hope to gather all my data so that I may start working on my survey and send it out by the beginning of Week 6, to then start collecting survey data to include in my paper.

Week 5

This week I noted down all of the data I have accumulated between the two PDB files. I have also started narrowing down the five questions I want to ask in my survey for each protein model, which means I will have 10 questions total. I have started writing down the comparisons and similarities between the two files in a way that is presentable in a research paper. I have been updating Overleaf as well to keep up with the progress I’m making. My hope is that by the beginning of Week 7 I will send out the survey and continue to edit my paper while I await responses. This would mean that in Weeks 7 and 8 I am just adding my survey data and finishing up my paper (fingers crossed). It was also the 4th of July this week, so it was an odd week, but very fun!

Week 6

 

This week, I asked my friend to act as a guinea pig for my first and very preliminary survey. The feedback I got was to ask fewer quantitative questions and focus more on comfort with the software, as not everyone has a biology background. It is easier to ask whether participants are able to use the video in my survey to answer the questions than to judge whether they answered correctly. I have been updating Overleaf and my blog posts (trying to figure out the images). I am going to send out the survey this week and start analyzing my data as it comes in. For fun this week, my best friend from home came to visit, and we had a great time exploring the city, as it was her first time here!

Week 7

Wow! We are hitting the last week of this internship and I feel so many emotions: happy, scared, and excited for the future. This week I focused on finalizing the data and information I wanted to include in my paper and what I really wanted to write as the conclusion of my research. I also finished up my survey this week, sending it to Dr. Wole for a quick run-through and then sending it out to everyone and anyone to get the maximum participants I could. As I wait for more results, I will start to finish up my paper in Overleaf and discuss the results I’ve gained so far, as there seems to be a trend. I’ve attached an image below of the trend I’m seeing. For fun, my best friend from college visited this weekend, and we had a really great time, also with some of my co-interns, as we celebrated our last weekend in NYC!

Hunter College
City University of New York
HN-1001T
695 Park Ave
New York, NY 10065

Telephone: +1 (212) 396-6837
Email: oo700 at hunter dot cuny dot edu
