
VR-REU 2024

Or Butbul


Week 1


After landing in sunny New York and unpacking my bags, I headed down to the lobby to meet the group of people who would be living in the same residence as me. We all went bowling with the rest of the group and had a lot of fun! The next day was spent touring the college, meeting the professors, and starting to talk about proposals. Initially I was drawn to a project focusing on motion capture, but after reading the bulk of the articles on that topic I decided to shift my focus toward graphics. My proposal was accepted after the class we had on Wednesday, and I spent Thursday refining it. Friday was spent learning ParaView and exploring the city.


Week 2

This week focused on completing the preliminary material. I have been working on the CITI training materials as well as the literature review for the beginnings of my paper. I have been having an issue with my home university's account that has been preventing me from connecting to the internet here and limiting what I can do while we meet. I hope to resolve the issue before the beginning of next week, and also to set up a remote desktop so I can access a faster computer to render my virtual humans. Outside of the project, I have been trying many new restaurants and getting to know my cohort. Talking about each other's projects has given us clarity about our ideas and goals.


Week 3

I managed to create the virtual humans last weekend. The main goal of this week was to connect to my home computer remotely so I could work in Unreal and access my MetaHumans. I connected to my home computer on Wednesday and downloaded the mesh of my first avatar. This weekend I plan to get the mesh of my other avatar, along with the textures for both. Choosing Blender over Unreal was a big decision for me: Unreal Engine has its own rendering pipeline better suited to the realism of the MetaHumans, but Blender gives me more information on render time and memory usage, which is vital to the project. The goal for next week is to get all the renderings of the avatars so the survey can be prepared. Aside from my work this week, I had some fun opportunities to meet my group for pizza and shopping at Chelsea Market. I also met some of my old friends who live in the city now.


Week 4

This week focused on preparing the material we needed for our midterm presentations. I spent the bulk of my week refining my literature review and creating a more detailed methodology for the project itself. It was difficult to connect to my home computer this week, so I did not have much time to work in Unreal, but there was some work I could do away from my computer. Namely, I could prepare animation data so my avatars could move in the survey and survey takers would have a better reference for the avatars' level of realism. Next week I plan to get the renderings of my avatars at their different levels of detail.

This is an example MetaHuman with an idle animation found in MetaHuman Creator, Unreal Engine's web-based platform for avatar creation.
Outside of work, I went to a festival over the weekend in Brooklyn and I was able to see many amazing singers and hear a lot of great music.

Week 5

This week held many difficulties in capturing the renderings of my virtual humans. Capturing animation from MetaHuman Creator was not possible, so a custom idle animation had to be recorded using Live Link Face. Importing that animation into Unreal and getting it to work with the avatars was a long ordeal as well. I was able to change the level of detail of my avatar in the scene and modify its texture maps, which will be helpful for the study. Unreal also provides the render time of each frame and the total render time, which is helpful for the project. Lastly, I had a big issue getting a specific render engine to work with my MetaHumans. I wanted to use Unreal Engine's path tracing as my chosen engine, but that engine has very little support for MetaHumans, and renderings could not be achieved at a sufficient level of detail. I have decided to use the Detailed Lit rendering mode to replicate a lower-performance system's render engine while still maintaining a sufficient level of detail overall.
My survey questions have been prepared. I plan to render the avatars as soon as I can, import those videos into the survey, and hopefully finish the survey before the start of next week.

Week 6

This was a great week to get work done! All the MetaHumans were animated and rendered. The image sequences were rendered out as PNG sequences and then assembled into MP4 videos in Blender. I then tried to import the videos into Qualtrics to create the survey, but the videos were too large. I used Giphy to downscale them and turn them into GIFs, which seemed to work with the survey. The surveys should be completed over the weekend, and distribution will likely happen at the beginning of next week.
