
Author Archives: VR-REU Student

Enhancing Trust in Telepresence: The Influence of Familiarity and Varied Eye Contact on Trust in Look-Alike Avatars

Kriti Kalary, SUNY University at Albany / SUNY Upstate Medical School

Week 1:

I arrived in New York City on Sunday and we kicked off the REU program with an icebreaker social event. I met the rest of my REU cohort while bowling (I wasn’t very good!). All too soon, we jumped right into work. I met with my mentor, Dr. Wole, and started brainstorming ideas for my proposal due on Friday. I worked to narrow down my field of interest and land on an idea for my project that was both unique and interesting; I had to toss out a lot of ideas before ending up with something I was happy with. This week also marked the first few classes of Dr. Wole’s VR, AR, and MR course, which has been incredibly interesting so far. I spent the rest of the week furiously searching for relevant articles to include in my literature review, trying to round out my rationale and support my research proposal. For fun this week, I met a friend in Central Park and tried some great bagels!

Week 2:

Week two was equal parts work and fun! I had a couple of road bumps this week where I had to alter my proposal a bit to make it more unique. Luckily, my older research for my literature review from last week was still helpful and I was able to collect all the information that I needed for my paper without too much trouble. This week, I finished my introduction and related works section of my paper, created a figure for the theoretical model of my paper and started outlining my methodology.

I also downloaded Reallusion software and tried to get Character Creator to work on my MacBook through Parallels, but it took too long to load. The Headshot plugin worked well, so I will likely end up working on the software aspect of my project next week on the computers at CUNY Hunter. For fun this week, Or, Asmita and I went to the Chelsea Market, some of my friends came up to visit me and we explored Central Park. I also listened to a performance by the NY Philharmonic and saw fireworks!

Week 3:

This week saw some pretty solid progress on the technical front! I created my two avatars: one familiar to participants, modeled after my own face, and the other modeled on a headshot of a random person I found online. The avatars look fairly realistic. I imported them into iClone and used the Live Face app to test out various methods of eye contact/gaze behavior, and it worked out well.

The first avatar is the unfamiliar avatar, while the second is the familiar avatar (I still have a few edits to do for the former).

I also finalized my methodology this week, so I’ll be in a good spot next week to start working on my survey. For fun this week, I went to see a 24kGoldn concert and explored Brooklyn with a friend! I also went thrifting and visited some cute bookstores.

Week 4: 

I revised my methodology and technical implementation this week! I ended up completely refreshing my avatars since I wasn’t super happy with my original versions of the unfamiliar and familiar avatars from last week. I also worked on my slide deck for the midterm presentation on Friday and created demos to share with my classmates to update them on my progress. Here are pictures of my new refreshed avatars (unfamiliar and familiar, respectively).

iClone worked well without any issues for my survey video clips, and I was able to standardize the eye movements for each eye-contact level to maintain a control variable for each avatar. The plan for the following week is to start working on the survey and send it out to start collecting data.
For fun this week, I met up with a friend and went to Roosevelt Island, visited the Guggenheim museum and tried out some new restaurants with friends! I also got a chance to catch up on some of my shows (The Boys S4 and Invincible S1) and got time to get through some books as well (A Room with a View by E. M. Forster and Notes from the Underground by Fyodor Dostoyevsky).
Week 5:
This week I edited my clips for the unfamiliar and familiar avatars and created the survey to start collecting some data. I encountered some roadblocks in randomizing the order of my questions across all three parts of my experiment: the demographic section, the trust game section, and the trust and familiarity section. I ended up reorganizing my survey in Google Forms, moving each avatar video clip out of its own individual section and into one combined section. I also struggled with attaching my video clips to individual questions, so instead of embedding the videos I simply placed a YouTube link in the stem of each question for easier access. I hope to finalize my survey, send it out, and start collecting preliminary data next week.
For fun this week, I went to the Met and spent time with my friends and cousins who came to visit me. I went to the vegan night market with my colleagues and tried some fantastic bagels. I also got a chance to watch the fireworks on July 4th and had incredible Korean fried chicken!
Week 6: 
I continued to edit my survey and sent it out to collect preliminary data. I used feedback from the initial 5-10 responses as a gauge to improve my question phrasing and edited my survey accordingly. I then sent out the updated survey to collect data. I now have 33 responses with the following characteristics.
Next week, I hope to collect some more responses and make headway into data analysis. I aim to make some figures and start analyzing my data with a focus on testing for significance. For fun this week, I met up with friends for dinner at Red Lobster, went to a specialty bookstore, and explored Times Square. I also finished reading Battle Royale by Koushun Takami and finished watching Invincible S2!

STEM Education on Structural Biology through an Immersive Learning Environment

Sabrina Chow, Cornell University

Week 1: Introduction and Project Proposal 🎳

The first few days in NYC for the REU started with an introduction to the rest of the cohort and the facilities. I went bowling with Dr. Wole and the other REU students, which was a lot of fun and very competitive. The next day, we all gathered at Hunter College and toured the building. We met some of the mentors, including my own: Kendra Krueger. The day after was the start of the class “CSCI49383 – VR, AR, Mixed Reality,” where I learned about the core principles of VR, how it works, and its history. After class, I worked with some of the other students to brainstorm for our proposals over poke and boba.

Later, I met with Kendra to write up the details of my project proposal. Kendra is the STEM Outreach and Education Manager at the Advanced Science Research Center (ASRC), and from our conversation, I can tell that she is truly an educator at heart. I’m really excited to work on this project, which will enhance the learning experience for K-12 students visiting the Illumination space in the ASRC. Kendra gave me two different paths to go down, but ultimately, I have decided to focus on structural biology instead of neuroscience. It’s a subject I’m more comfortable with and I think I can create a good STEM education project about it. Finally, on Friday, I met with the rest of the cohort, and we got an introduction to ParaView and presented our project proposals.

A snapshot of the ParaView tutorial we went through.

Week 2: Working at the Advanced Science Research Center 🧪

I started the week by getting set up at the ASRC and being introduced to the other high school/undergraduate researchers working there over the summer. I got to talk more with Kendra about my project and briefly met Eta Isiorho, a researcher at the ASRC whose expertise in structural biology and crystallization I will be relying on. Then, I attended a lab safety training session over Zoom so that I’ll be able to enter Eta’s lab. I also used the time to complete CITI training since the world was on fire and it wasn’t safe to go outside (see photo below).

Smoky air outside the dorm, caused by wind blowing smoke down from the Canadian wildfires. The AQI was almost 300.

Towards the end of the week, I attended the SciComms conference at the ASRC with the rest of the VR-REU cohort. The format was that each presenter gave an informal presentation of their research followed by a more formal, scientific version. It was really interesting to hear about the wide variety of projects going on around us, and I think attending will really help prepare me for our symposium at the end of these 8 weeks.

Part of the science/research art project at the SciComms conference. The question was, “What about research inspires you?” For me, it’s my love for animals and therefore, biology.

For my project, I continued to compile sources and take notes for my literature review. I was hoping to create a mock-up for the project, but after meeting with Kendra and Eta, I think I will need to readjust my project to fit both of their expectations.

Week 3: Making Progress 📱

This week, I started to get into the meat of the project. Since this is a large project, I knew I had to break it down into smaller pieces. First, I made a mockup of what I wanted my application to look like using Figma (see below).

This is my general idea for the application that I’m developing. Students will be able to use their devices to see molecules and more through AR.

Second, I began to work with Xcode to create the real app. This took a little longer than I was expecting since I am getting used to Xcode and Swift again, but I have the general layout.

The first look at my application in the Xcode storyboard.

Looking forward, I will need to work on the functionality of the application. That will be the most difficult part of the project, but I’ve found many YouTube tutorials that will help me understand how Apple’s RealityKit works, so I am hopeful. Another issue that I’ve been considering is how I will share my application. If I go through the official Apple App Store, I will need to submit the app for review and prepare it with the proper certificates, etc.

Outside of my project, I also met a couple more times with Eta. She showed me the crystallization lab at the ASRC and taught me more about the software she uses. I’m hoping to use some of that software to create videos of the molecules. In addition, I attended the CUNY Graduate Sciences Information Session and learned more about the process of applying to grad school. Finally, towards the end of the week, Dr. Wole taught us about VMD.

Pictures of the VMD interface: one showing overlaid structures, and one showing a molecule with a selected functional group.

Week 4: Application Framework 🛠️

For this week, I created the structural framework of my application in Xcode. I finished the storyboard for the application and began to make ViewControllers. The vast majority of the week was spent on implementing the collection tab. In hindsight, I think there are still ways that I could have made the code more efficient. For example, I made three separate UICollectionViews instead of just using the built-in sections version. Switching to the sections approach would require adding custom section handling, though, so I will most likely not change this unless I have spare time at the end of the project.

The implemented version of the collections tab for my application.
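For anyone curious, here is a rough sketch of what the built-in sections approach could look like. This is just an illustration with placeholder section titles, items, and cell identifier, not my actual code:

```swift
import UIKit

// A single UICollectionView data source with multiple sections,
// instead of three separate collection views.
// Section titles and items below are placeholders for illustration.
class MoleculeCollectionDataSource: NSObject, UICollectionViewDataSource {
    let sections = ["Protein Structure", "X-ray Crystallography", "ASRC Scientists"]
    let items = [["Molecule A", "Molecule B"], ["Molecule C"], ["Molecule D"]]

    func numberOfSections(in collectionView: UICollectionView) -> Int {
        return sections.count
    }

    func collectionView(_ collectionView: UICollectionView,
                        numberOfItemsInSection section: Int) -> Int {
        return items[section].count
    }

    func collectionView(_ collectionView: UICollectionView,
                        cellForItemAt indexPath: IndexPath) -> UICollectionViewCell {
        // "MoleculeCell" would be registered in the storyboard.
        return collectionView.dequeueReusableCell(withReuseIdentifier: "MoleculeCell",
                                                  for: indexPath)
    }
}
```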

I also worked on implementing the pop-up page that appears when a molecule is selected from the Collection tab. Each molecule will have more detailed information about what the image is showing and why it is relevant (in general and to the ASRC’s scientists).

This is the pop-up tab that shows more details about a selected molecule from the Collection tab.

The only things left to do regarding these parts are:

  • The actual game part. Users will need to unlock the molecules through the AR camera. That means that they should not be able to be clicked on until after the user has scanned a particular code.
  • The molecules. The image files used were random examples taken directly from RCSB PDB. I will need to find relevant molecules and their images – hopefully from Eta.
  • The descriptions. I will need to write the different blurbs and have Kendra look over them. My goal for the little descriptions is that they will be informational without having too much scientific jargon.

I think for this upcoming week, I will reach out to Eta and Kendra about getting files. Other than that, I will be focusing on implementing the AR part because I suspect that will be the most difficult. Once I have the files, I will also need to convert them from .pdb/.xyz/etc. to a 3D compatible format. Fingers crossed!

Week 5: Plateau-ing 🥲

This week, I started out by figuring out how to convert between file formats. Most protein files are saved as .PDB (old) or .mmCIF (new). First, I needed to change from those formats to .OBJ, a standard 3D file format. VMD and PyMOL are both supposed to have native converter tools, but I found that when I tried to convert files using these two programs, the resulting files were almost or completely empty. Eventually, I found that the Chimera program works best to convert the .PDB/.mmCIF files to .OBJ. Second, I would have to go from .OBJ to .USDZ, Apple’s 3D file format based on Pixar’s USD. The newest version of the program, ChimeraX, was the best for creating a .OBJ compatible with Apple’s RealityConverter tool, which takes 3D files and converts them to .USDZ. The final file did not have color, which is definitely not ideal, but I think I will deal with that later.

A snapshot of RealityConverter taking in a .OBJ file and creating this .USDZ file.
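Once a molecule is in .USDZ form, it can be loaded in RealityKit with just a couple of lines. Here is a minimal sketch (the file name is a placeholder, not one of my actual models):

```swift
import RealityKit

// Load a converted molecule model (e.g. "molecule.usdz" bundled with the app)
// as an entity that can later be attached to an anchor in the AR scene.
// The file name is a placeholder for illustration.
do {
    let moleculeEntity = try ModelEntity.loadModel(named: "molecule")
    print("Loaded model with bounds:", moleculeEntity.visualBounds(relativeTo: nil))
} catch {
    print("Failed to load model:", error)
}
```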

Next, I worked on implementing the actual game functions. This required setting up ‘communication’ between the different ViewControllers. I tried many different methods, but I found that the best way was to have functions that change the items within the Collection class, and to make instances of the Collection class in the different ViewControllers that needed to use those functions.
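A stripped-down sketch of this pattern might look something like the following. The names and the shared-instance detail are illustrative assumptions, not my exact code:

```swift
import Foundation

// Keep the unlock state in one Collection object that every tab can reach.
// Using a shared instance is one common way to do this; the class and
// property names here are placeholders.
class Collection {
    static let shared = Collection()

    private(set) var unlockedItems: Set<String> = []

    func unlock(_ itemName: String) {
        unlockedItems.insert(itemName)
    }

    func isUnlocked(_ itemName: String) -> Bool {
        return unlockedItems.contains(itemName)
    }
}

// In the AR ViewController, after a marker is recognized:
//   Collection.shared.unlock("moleculeA")
// In the Collection tab, before showing the detail pop-up:
//   if Collection.shared.isUnlocked("moleculeA") { /* show details */ }
```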

Finally, I’ve been trying to learn and use the basics of RealityKit in the app. Specifically, I want to use the image tracking feature. I need to track multiple images, and each image should display a specific model. I have an idea of how to do it, but I have not been able to test it. Also, I still need the actual files that I will use in the application.
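The rough idea for the image tracking setup is below. The reference images would live in an asset catalog group; the group name is a placeholder and this is untested, so treat it as a sketch rather than working code:

```swift
import ARKit
import RealityKit

// Configure ARKit to track multiple reference images (the printed codes).
// "MoleculeMarkers" is a hypothetical asset catalog group name.
func startImageTracking(on arView: ARView) {
    guard let referenceImages = ARReferenceImage.referenceImages(
        inGroupNamed: "MoleculeMarkers", bundle: nil) else {
        print("Could not load reference images")
        return
    }

    let configuration = ARImageTrackingConfiguration()
    configuration.trackingImages = referenceImages
    configuration.maximumNumberOfTrackedImages = referenceImages.count

    arView.session.run(configuration)
}
```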

Week 6: Everything is Looking Up 🥳

I began the week knowing that I would need to get the AR function implemented. My goal is to do user testing next week, and I can’t do that with an app that doesn’t have AR, since the whole point of this program is to use XR in an innovative way. I was starting to panic as the deadline was quickly approaching.

As a result, I worked on setting up the AR. I finally began to test on my iPhone instead of the built-in simulator on my laptop. On the first tab of my application, there is an ARView. My first issue was that this ARView was just showing up as a black screen. Eventually I got it to work with the camera by adding the camera usage permission (NSCameraUsageDescription) to the app’s .plist (property list) file.

The first tab of my application with the working camera.

My application uses images to track where the model should be placed. Therefore, using my phone camera allowed me to actually see the model from different perspectives. I made a couple of sample scenes in Apple’s RealityComposer and then imported the project into my Xcode project. In RealityComposer, I was able to display the model by scanning the image (as seen below), so I assumed that it would work in Xcode. It did not.

This was the sample molecule model in RealityComposer using my phone camera. The model is sitting on top of the QR code.

I ran into a complete roadblock with the AR in the middle of the week. I found that my Scene was loading correctly but was not connecting to the Anchor. I honestly think I spent 12 hours searching for a solution. I kept adding and testing code over and over. What was the solution, you might ask? Deleting everything but the code that loads in the models… Sometimes the simplest answer is the solution. As a result, the models correctly showed up when their respective images appeared. (I’m updating this blog in the middle of the week because I need to share that I succeeded 🥹.)

Using ARView to scan the QR codes and place the models from RealityComposer on their respective QR code.
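For reference, the working version really is just the RealityComposer scene-loading code and nothing else. A sketch of what that looks like, where the project and scene names stand in for my actual RealityComposer project (Xcode generates these loader functions automatically from the imported .rcproject):

```swift
import UIKit
import RealityKit

class ARViewController: UIViewController {
    @IBOutlet var arView: ARView!

    override func viewDidLoad() {
        super.viewDidLoad()
        // "Molecules" / "loadMoleculeA" are placeholder names for the code that
        // Xcode generates from the imported .rcproject. Each scene is already
        // anchored to its image in RealityComposer, so appending the loaded
        // anchor to the scene is all that is needed for the model to appear.
        do {
            let moleculeScene = try Molecules.loadMoleculeA()
            arView.scene.anchors.append(moleculeScene)
        } catch {
            print("Failed to load RealityComposer scene:", error)
        }
    }
}
```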

Then, I worked on the ‘unlocking’ feature. This required reworking my Item class and learning about how Substrings work in Swift. Thankfully, it was not nearly as difficult as the AR stuff. I also spent some time downloading QR codes for the image tracking. Finally, I worked on downloading files from the Protein Data Bank and writing descriptions for them.
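In case the Substring part sounds mysterious, the idea is just pulling an item identifier out of the tracked image’s name. A tiny sketch, assuming markers are named with a numeric suffix (that naming scheme is an illustrative assumption, not my exact setup):

```swift
// Assume tracked reference images are named like "marker_03".
// Pull the numeric suffix out to decide which item to unlock.
func itemNumber(fromMarkerName name: String) -> Int? {
    guard let underscoreIndex = name.lastIndex(of: "_") else { return nil }
    let suffix = name[name.index(after: underscoreIndex)...]  // this is a Substring
    return Int(suffix)                                        // e.g. 3
}

print(itemNumber(fromMarkerName: "marker_03") ?? -1)  // prints 3
```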

Week 7: Final Stretch 🏃🏻‍♀️

This week consisted of fixing a lot of small things before doing user testing. For example, one issue I had was that I needed to load the arrays of my Item class before ‘capturing’ a picture of the Item’s model in AR. My solution was to switch the positions of the Collection and AR tabs. This meant the Collection tab would load first. I also think logically it makes more sense for the Collection to be first as an introduction to the application.

Another major part was fixing the QR codes I was using. I originally generated the QR codes online using QRFY. Then I used a photo editor to add a blank space in the middle with the QR code’s number. The issue I ran into was that the QR codes were too similar: Apple’s AR system kept mixing them up with each other, resulting in tons of overlapping models. At first, I thought the issue was that I was testing them on a screen; however, I printed them out and they were still glitching. I then spent a couple of hours in the photo editor adjusting the QR codes by hand until Apple’s tools accepted that they were different enough.

I also learned how to export the files with color. Instead of using the .OBJ file format, I started using the .GLB/.GLTF format. The command in ChimeraX looks like: “save filename.glb textureColors true”.

I then collected all of the files that I would need. I decided on three major subjects to talk about: 1) protein structure, 2) x-ray crystallography, and 3) scientists at the ASRC. All of the protein molecules were downloaded from the RCSB Protein Data Bank. I wanted to use 3D models for the x-ray crystallography process, but I realized I did not have enough time to make them myself. Therefore, I used images from a presentation that Eta had sent me. For the scientist spotlights, I went through the faculty page of the ASRC and read a ton of papers until I found two more faculty members (in addition to Eta) who work with proteins. Once I had all the images and converted the files, I put them into my RealityComposer project. Then, I wrote captions for each one.

What my RealityComposer project looked like with the correct models and QR codes.

Finally, on Thursday of this week, I did user testing. It was a bit nerve-wracking, especially because I learned that I could not install the application on everyone’s phones. Apparently Apple only lets you install onto three devices, and for some reason, I could only get it onto my phone and one other person’s phone. I even tried to sign up for the paid developer program ($100 fee…) but it would take 2-3 days to get approved, and it was the morning of user testing. Ultimately, I decided to split the group into two and have everyone share the two phones.

The testing itself went pretty well! I was pleasantly surprised by how invested everyone was in finding all of the QR codes. Everyone was also quite impressed by the AR. The models stay still on the QR code, so moving around in real life allows the user to see a model from different perspectives.

Another part of the Thursday tour was my speech. I was invited to give a 30-minute talk to my cohort about something related to research. My topic of choice was “Surviving Academia 101.” This is something I feel pretty strongly about, since I am still figuring out my path through academia. To be honest, it certainly was not the most well-rehearsed speech, but I think (and hope) that my passion for the subject made up for it. I talked about my experience with wanting to do research but not feeling like I belonged.

A photo of me giving a speech to my cohort about research.

Week 8: Saying Goodbye 🥲

Over the weekend, I started to write my final paper on Overleaf. I had already set up the general structure and some of the sections. Thanks, earlier me! The main thing that I did was updating my methods section and starting to look at the results. Although I did not get as much user data as I would have liked, I definitely had enough to consider my work a preliminary study.

On Monday, I went in to help test some of the other students’ projects. It was really impressive seeing what everyone else had accomplished in just 8 weeks! Dr. Wole also helped me with some questions I had about how to visualize my data. Later that night, all of us students met up for dinner. It was so much fun.

2023 VR-REU students dinner

For the rest of the week, I spent most of my time grinding out the paper and preparing my slides for the symposium. I honestly did not have too much trouble with using LaTeX. My real issue was finding the right words to summarize everything I did. There were parts where I wanted to overshare about the process (specifically to complain about all the problems I had run into). There were also parts where I had no idea what to write. Still, by writing a couple sentences and then switching when I ran into a mental roadblock, I began to make significant progress. Writing my paper while also working on the presentation helped a lot as well, since I could just take the information from the paper and simplify it for the presentation. Before long, the slide deck for my presentation was done.

My presentation on what I had spent the last 8 weeks doing. It was 10 minutes long with another 2 minutes for questions.

Thursday morning was the VR-REU Symposium. One by one, we presented our projects, talking about how our projects took shape, what challenges we had faced, and our results. Even though I’d seen everyone’s projects by that point, it was quite interesting to hear about how they addressed issues with their projects.

Finally, Friday arrived. Our last day! It’s hard to believe that time passed by that quickly. I went to our usual classroom in Hunter and finished up my paper. I submitted it to the ACM Interactive Surfaces and Spaces poster session. Then, we said our goodbyes.

For my last night in NYC, I went for a quick walk through Central Park and reflected. I’m so grateful that I got to be a participant in this REU. I’ve learned so much from Dr. Wole, Kendra, and my fellow students. I challenged myself with a project that was all my own, and I am very proud of how it turned out. Wishing everyone else the best with their future endeavors! I know you all will do amazing things!

Final Paper:
Sabrina Chow, Kendra Krueger, and Oyewole Oyekoya. 2023. iOS Augmented Reality Application for Immersive Structural Biology Education. In Companion Proceedings of the 2023 Conference on Interactive Surfaces and Spaces (ISS Companion ’23). Association for Computing Machinery, New York, NY, USA, 14–18. https://doi.org/10.1145/3626485.3626532 – pdf

Amelia Roth: The Community Game Development Toolkit

Amelia Roth, Gustavus Adolphus College

Project: The Community Game Development Toolkit – Developing accessible tools for students and artists to tell their story using creative game design

Mentor:  Daniel Lichtman

About Me: I’m majoring in Math and Computer Science at Gustavus Adolphus College in Saint Peter, Minnesota.

Week 1: I began the week by working on several tutorials in Unity to better learn the application and how I can use it to improve the accessibility of the Community Game Development Toolkit (CGDT). When meeting with my mentor earlier this week, we decided that improving the accessibility and functionality of the CGDT was our main goal during these 8 weeks. We would like to shift the creation of visual art and stories from the Unity editor to an interactive in-game experience. Over the next few weeks, I plan to work on the in-game editor.

 

Week 2: This week I was able to use my new Unity skills to start using the toolkit and understanding the scripting behind it. I created a GitHub account and soon, I’ll have access to the code for the CGDT! I met with my mentor at the beginning of this week and we created a to-do list for me over the next couple of weeks. To create an in-game editor, my first steps will be working on selecting objects, moving objects, and most importantly, making sure that any in-game changes will be saved once the user leaves play mode in Unity.

Here’s a bit of art I made in the CGDT, check it out!

 

Week 3: I spent this week working on the code for the CGDT and uploading it to the repository on GitHub so that my additions are documented in the CGDT. I met with my mentor twice this week to get help on the coding I needed, and we got quite a bit done! As of now, while in play mode, a user can select an object (which highlights to show it has been selected), move that object in a circle around them, towards and away from them, and up and down. They can also make the object smaller and bigger. Since this took less time than expected, I can move on to the next step, which is allowing the user to save changes in play mode instead of just in editor mode. I also think it would be helpful to add functions that allow the player to rotate objects on the object’s axes, so I’ll ask my mentor if he thinks we have time to add this in.

I also began writing my paper this week. The REU started a co-writing session on Tuesdays that we plan to keep up indefinitely, so that we students have time set aside specifically to work on our papers and can ask questions in real time if we need help. I’ve been looking for previous research that relates to my project, and one very interesting thing I found is called the Verb Collective. With the Verb Collective, different verbs such as “to scatter”, “to drop”, and “to spell” have functions attached to each of them, which in turn can call other verbs and their functions. I think it’s related to my project in that the CGDT is meant to be a storytelling tool. Both the Verb Collective and the Community Game Development Toolkit are interested in exploring VR as a way to see the world from a new perspective.

Week 4: This week hasn’t had as many satisfying results as last week, but I’m in the middle of working on several things that should hopefully be done next week. One of the things I’m currently working on is movie textures. Right now, the CGDT has an automatic importer for textures that turns images into usable sprites, but no such script for movie textures. I have learned how to do it manually, though, and if you look closely at the cube in the image below, you’ll see it has a video of Grand Central Terminal attached!

Dan and I are also working on saving the changes we make in play mode so that users’ hard work doesn’t go to waste! Some of the necessary functions are a bit over my head, so Dan is lending a helping hand. I’ve also started working on some documentation for the CGDT, so that it’s easy to find a tutorial for exactly what you’re trying to learn how to use.

The writing for the paper is going well: I’ve got a solid related works section and a good start on my introduction. Our midterm REU presentation is also tomorrow, and I’m excited to share the work I’ve done on the CGDT with my fellow REU students!

 

Week 5: As with all things, this project has its ups and downs in terms of how much I get done in a week. This week was one of the slower ones. Saving and loading automatically is turning out to be trickier than expected, and building the CGDT project I have in Unity to my Quest 2 is producing a whole slew of errors so far. But the work I’ve done this week has progressed my understanding of these problems, and I feel confident that I can finish them up in week 6. I also created some new documentation for the CGDT this week on downloading and installing Unity, and on how to use assets that were originally IRL art in the virtual setting.

In the next week, besides finishing up the lingering tasks of week 5, I plan to adapt the code I’ve written for moving, rotating and scaling objects so that they can be controlled in VR through joysticks instead of with a keyboard on the computer. Some of the original code of the CGDT might have to be adapted as well, such as Player Movement, which is also currently done with the keyboard. Looking further ahead, once I feel the CGDT has all the features I’d like it to have, I’ll test the usability of these functions in a small study. Once all these pieces are in place, I’ll be able to finish my paper!

Week 6: Week 6 had one of the most rewarding experiences of this REU so far: figuring out how to make automatic saving and loading work! It was very exciting to leave play mode, enter play mode again, and see the changes I had previously made saved. Even if I restart Unity and reopen the project, the changes remain. In my opinion, this is the most important aspect I’ve added to the CGDT. Without automatic saving and loading, the tools for moving, rotating and scaling objects aren’t very useful. Ideally, I’d love to add an inventory in play mode, so that dragging and dropping objects from the project window isn’t necessary, and a way to delete objects in play mode as well. Both of those things are definitely possible in the time I have left, but finding a way to save those changes as well might end up being beyond the scope of the project.

Week 6 also brought another change of plans. I’ve been trying to build the CGDT to my Quest 2 so I can work with it on a headset. Unfortunately, I’m still getting a lot of errors. It may have something to do with the new scripts I’ve added to the CGDT. However, Dr. Wole and I agreed that working on deployment issues, especially when they’ve taken up a lot of time already, is probably not the best way to spend my remaining weeks of the REU. Although seeing the CGDT on a headset would have been very cool, I actually think working more on the desktop version is truer to the mission of the CGDT. The CGDT is meant to be accessible for students, artists, and non-game developers in general, and a lot more people own computers than VR headsets. At this point, though, I’ve learned to never say never, so who knows what Week 7 will bring!

Week 7: Success building to the headset! There were a few scripts in the CGDT that were editor-specific and were therefore causing the build problems. I was able to remove those scripts, and once I did, my scene built to the Quest 2. However, it doesn’t have any of the new capabilities that the desktop version of the CGDT has, so I’ve spent the last couple of days figuring out what needs to change for the headset version. I created a new prefab for the CGDT that is VR-specific so that it relies on an OVRCameraRig instead of a Camera. Once this prefab is added, the user is able to fly around in the scene they’ve created, moving forward in the direction they’re facing, and rotating if they wish. I’d also like to move objects with raycasting, the same as I did for the desktop version, so I’ve added the raycasting laser, although it isn’t able to grab anything yet.

There was some other great stuff I did this week for the program. I participated in a user study for another student’s project, and the whole program went ziplining at the Bronx Zoo together, which was really fun! The deadline for the paper is also coming up quickly, so I’ve been polishing my abstract and working on the implementation section of my paper. I was recommended a few applications to use to draw some illustrations of what my functions do, so I’ll be adding those illustrations to my paper in our final week.

Week 8: The final week! Everything I worked on this week was related to polishing my paper and creating my presentation for the final day of the REU, today. I’ve learned a lot during this REU, both in terms of programming tools and skills like writing and presenting. I think my presentation went well, and I look forward to putting the finishing touches on my paper today. I wish I could have gotten more done on the VR version of the CGDT, but as this is an 8-week program, I’m really happy with everything I was able to accomplish. Thanks to all of my mentors for making this such a great experience!

Final Report was submitted and accepted as a 2-page paper (poster presentation) at VRST 2022:
Amelia Roth and Daniel Lichtman. 2022. The Community Game Development Toolkit. In 28th ACM Symposium on Virtual Reality Software and Technology (VRST ’22), November 29-December 1, 2022, Tsukuba, Japan. ACM, New York, NY, USA, 2 pages. https://doi.org/10.1145/3562939.3565661 – pdf
