
REU Projects

Proposed Projects

Research mentors will be paired with students and a graduate mentor. If you are a CUNY research staff/faculty interested in being a research mentor, please contact the PI. Faculty from other colleges and universities are also welcome.


Immersive Remote Telepresence and Look-Alike Avatar Project


Research Mentor and PI: Oyewole Oyekoya, Associate Professor, Computer Science, Hunter College, CUNY
Immersive remote telepresence, which facilitates shared collaborative experiences with one or more interlocutors in distant locations, represents the frontier of VR/AR technology. Immersive telepresence can reduce costs for users because they do not have to travel physically to remote locations; instead, it can simulate the experience of being in, or traveling through, a remote location using virtual reality technologies such as immersive viewing and rendering systems, look-alike avatars, and 3D interaction metaphors for travel and manipulation. Such systems represent the next generation of business travel, meeting and online teaching systems. However, there are outstanding questions regarding the fidelity of the look-alike avatar’s animation (the number and accuracy of the sensors), immersive display and rendering quality, and latency, and their impact on sense of presence, communication content and understanding. Research in immersive telepresence will benefit tremendously from this REU site, as the technological challenges are demanding. This proposed REU project will enable investigations into telepresence challenges such as facilitating audio feedback, gaze communication, facial animations, body/hand gestures, touch cues, and how users perceive scanned look-alike avatars.
Research Area: 3D Modeling, VR, AR, MR, Unity3D
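A minimal sketch of one measurable piece of such a system, assuming a Unity implementation: a receiver applies a remote user's streamed head pose to their look-alike avatar and logs the end-to-end latency of each update. The PoseMessage format, the networking layer that would call OnPoseReceived, and the assumption of synchronized clocks are illustrative, not part of the proposal.

```csharp
using UnityEngine;

// Sketch: apply a remote user's streamed head pose to their look-alike avatar
// and log the end-to-end latency of each update. The message format and the
// transport that delivers it are assumptions for illustration only.
public struct PoseMessage
{
    public double sentTimeSeconds;   // sender's clock (assumed synchronized with the receiver)
    public Vector3 headPosition;
    public Quaternion headRotation;
}

public class RemoteAvatarHead : MonoBehaviour
{
    public Transform avatarHead;     // head joint of the look-alike avatar

    // Called by the (not shown) networking layer whenever a pose update arrives.
    public void OnPoseReceived(PoseMessage msg)
    {
        avatarHead.SetPositionAndRotation(msg.headPosition, msg.headRotation);

        double latencyMs = (Time.realtimeSinceStartupAsDouble - msg.sentTimeSeconds) * 1000.0;
        Debug.Log($"Avatar pose latency: {latencyMs:F1} ms");
    }
}
```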


MicroRNA-1205 as a regulator of cell lineage plasticity

Research Mentor: Olorunseun Ogunwobi, Barnett Rosenberg Professor of Biochemistry and Molecular Biology, Department of Biochemistry and Molecular Biology, Michigan State University.
Cell lineage plasticity is broadly defined as the ability of determined cells to differentiate phenotypically into another cell type. Although this mechanism is not common in normal microenvironments, early evidence of cell lineage plasticity was demonstrated in Drosophila during embryogenesis. Moreover, it is often observed under stressful conditions such as diseased states, including cancer and diabetes. Many signaling pathways involved in cell lineage plasticity have been described. The most notable example is the addition of the OSKM (Yamanaka) factors to fibroblasts to induce them into pluripotent stem cells. OSKM refers to Oct4, Sox2, Klf4 and c-MYC, the transcription factors required to induce pluripotency. Beyond transcription factors, other biological pathways have been implicated in inducing cell lineage plasticity, including other functional proteins and even noncoding RNAs such as microRNAs. MicroRNAs are small non-protein-coding RNAs that induce targeted mRNA degradation. Because microRNAs are involved in numerous developmental pathways, these small molecules are now being investigated as master regulators of cell lineage plasticity. One example is the miR-200 cluster, which is involved in reprogramming mouse embryonic fibroblasts into induced pluripotent stem cells (iPSCs). Immersive visualization tools such as VMD and ParaView can enable further understanding of how microRNAs may be involved in reprogramming, helping to accelerate the clinical use of iPSCs in medicine. Students will conduct research on how to visualize microRNAs on VR headsets. This project would fit a student majoring in any computational science, e.g. computational biology.
Research Area: Scientific Visualization, VMD, Immersive Paraview


Explore Virtual Environments Using a Mobile Mixed Reality Cane Without Visual Feedback
Research Mentor: Hao Tang, Associate Professor and Deputy Chair, Computer Information Systems, Borough of Manhattan Community College, CUNY
Although virtual reality (VR) has matured considerably in recent years, the general public still cannot fully enjoy its benefits, especially blind and visually impaired (BVI) users. Current VR accessibility applications have been developed either on expensive head-mounted displays or with extra accessories and mechanisms, which are neither accessible nor convenient for BVI individuals. We propose a mobile VR app that enables BVI users to access a virtual environment on an iPhone in order to build their skills in perceiving and recognizing the virtual environment and the virtual objects within it. The app uses an iPhone on a selfie stick to simulate a long cane in VR, and applies Augmented Reality (AR) techniques to track the iPhone’s real-time pose in an empty space in the real world, which is then synchronized to the long cane in the VR environment. Because it integrates VR and AR (mixed reality), we call it the Mixed Reality cane (MR Cane); it provides BVI users with auditory and vibrotactile feedback whenever the virtual cane comes into contact with objects in VR. Thus, the app allows BVI individuals to interact with virtual objects and identify the sizes and locations of objects in the virtual environment. The MR Cane concept can be extended to new applications in navigation, training and entertainment for BVI individuals without significant additional effort.
Research Area: VR, AR, MR, Unity3D
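As a rough illustration of the interaction loop described above, assuming a Unity implementation (the component below and its fields are hypothetical, not the project's actual code), the tracked phone pose can be mirrored onto the virtual cane, with contact triggering auditory and vibrotactile cues:

```csharp
using UnityEngine;

// Illustrative sketch: mirror the AR-tracked phone pose onto a virtual cane and
// give auditory and vibrotactile feedback when the cane touches virtual objects.
// Assumes the AR layer exposes the tracked device pose as a Transform, and that
// the cane has a trigger Collider plus a kinematic Rigidbody so trigger events fire.
public class MrCaneController : MonoBehaviour
{
    public Transform trackedPhone;   // pose supplied by the AR tracking layer
    public AudioSource contactAudio; // tap sound played at the moment of contact

    void Update()
    {
        // Synchronize the virtual cane with the phone's real-world pose.
        transform.SetPositionAndRotation(trackedPhone.position, trackedPhone.rotation);
    }

    void OnTriggerEnter(Collider other)
    {
        contactAudio.Play();  // auditory cue
        Handheld.Vibrate();   // vibrotactile cue on the phone
    }
}
```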


The Community Game Development Toolkit — Developing accessible tools for students and artists to tell their story using creative game design

Image Credit: Millicent Encarnacion, New Media Art student, Baruch College

Research Mentor: Daniel Lichtman, Visiting Assistant Professor in Digital Studies, School of General Studies, Stockton University.
Video game design and technology present many of today’s most advanced experiences of interactive storytelling. But making creative, visually compelling games and interactive 3D environments usually requires a host of specialized skills: advanced computer programming, 3D modeling, character animation, and more. The Community Game Development Toolkit is a set of game design tools that make it easy and fun for students, artists and community members to create interactive 3D environments and visually rich games without code or other specialized skills. The toolkit seeks to provide new tools for diverse communities to tell stories in their own visual language, and for students and artists to explore their own traditions, rituals and heritages through creative game design. To make visual scene and object design quick and creative, the toolkit draws on designers’ own sketches, photos, paintings and other creative materials. To create interactivity without the complexity of coding, the toolkit additionally provides drag-and-drop game components that implement features such as collecting inventory, non-player characters, changing scenes/levels and more. This REU project will expand the functionality of the toolkit and increase its accessibility to diverse student bodies without previous technical experience in game design and programming. Research will include conducting a requirements analysis of students’ needs; designing and programming (using C#) additional game components for interactivity; further exploration of representing 2D brushstrokes and 2D textures in 3D space; and integration of toolkit functionality with AR, VR and MR.
Research Area: VR, AR, MR, Unity3D
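To give a flavor of the kind of drag-and-drop C# component such research might produce (a minimal sketch under assumed names, not the toolkit's actual API), a collectible that adds itself to a simple inventory could look like this:

```csharp
using UnityEngine;

// Minimal sketch of a drag-and-drop "collectible" component in the spirit of
// the toolkit: attach it to any object with a trigger Collider and the object
// is added to a simple inventory when the player walks into it.
// Names (CollectibleItem, SimpleInventory) are illustrative, not the toolkit's API.
public class CollectibleItem : MonoBehaviour
{
    public string itemName = "Keepsake";

    void OnTriggerEnter(Collider other)
    {
        if (other.CompareTag("Player"))
        {
            SimpleInventory.Add(itemName); // record the pickup
            gameObject.SetActive(false);   // remove the object from the scene
        }
    }
}

// Bare-bones inventory shared across scenes.
public static class SimpleInventory
{
    static readonly System.Collections.Generic.List<string> items =
        new System.Collections.Generic.List<string>();

    public static void Add(string item) => items.Add(item);
    public static int Count => items.Count;
}
```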


Immersive Content for Interdisciplinary STEM Education

Research Mentor: Kendra Krueger, STEM Education & Outreach Manager, Advanced Science Research Center, CUNY Graduate Center
The Advanced Science Research Center (ASRC) is CUNY’s premier interdisciplinary research institute, built for scientific discovery and education. The IlluminationSpace is an immersive learning environment featuring interactive, motion-sensing exhibitions that highlight the fundamentals of each research initiative at the ASRC (Nanoscience, Photonics, Structural Biology, Neuroscience and Environmental Science). For this REU project, the student will work with the Outreach and Education team, along with resident scientists, to create immersive STEM education content. This content will help explain the key concepts and the interdisciplinary nature of the research happening at the ASRC for middle school, high school and undergraduate students. Content can be developed for VR, AR or projection-screen platforms. This video explains more about the IlluminationSpace and its programs.
Research Area: VR, AR, MR, Unity3D, Scientific Visualization

VR as a learning tool for students with disabilities

Research Mentor: Daniel Chan, School of Professional Studies, CUNY
Daniel started experimenting with VR for students with disabilities in 2017 while at Hunter College, hoping that, through its ability to build association and stimulation, it would help students who are tactile and/or visual learners improve skills in areas such as biology, physiology and art. At first the technology attracted only a handful of students, but it became more popular over time and engaged faculty members from many academic departments, such as Chemistry, Computer Science, Education, ICIT, Film and Media Studies and ACERT (the Academic Center for Excellence in Research and Teaching), who wanted to explore VR features, create assignments or dedicate a class session to the lab. VR creates a multisensory experience that leads to greater engagement by students who may struggle with inattention and/or social interactions, and by those with physical limitations who otherwise may not be able to engage in certain tasks. For instance, students who struggle with changes in the environment (such as those on the autism spectrum) can plan ways to decrease stress. Research shows that the use of VR helps with visual-motor skills and visual-spatial attention and provides many other benefits that may not be directly related to classroom learning, such as improving job interview skills. The proposed REU project will enable further research on the benefits of VR and look-alike avatars in providing immersive and effective learning experiences for students with disabilities.
Research Area: VR, AR, MR, Unity3D


Virtual Reality and Public Health Project: Nutrition Education
Research Mentor: Margrethe Horlyck-Romanovsky, DrPH., Assistant Professor, Department of Health and Nutrition Sciences, Brooklyn College, CUNY
Project 1: Nutrition education at point of purchase
Choosing healthy options at the supermarket can be difficult. Food labels are not user-friendly and require a lot of reading and analysis. We propose the creation of an augmented reality app that visually interprets the nutrition information on the labels of real packaged foods based on the dietary recommendations for the individual.
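One way the interpretation step could be sketched, assuming the label text has already been recognized (the nutrient fields, thresholds and targets below are illustrative assumptions, not actual dietary guidance), is to compare a serving against the individual’s daily targets and return a traffic-light color for the AR overlay:

```csharp
using UnityEngine;

// Illustrative sketch of the label-interpretation step: compare nutrients read
// from a scanned label against a user's daily targets and produce a simple
// traffic-light colour that an AR overlay could render next to the package.
// Field names, thresholds and targets are assumptions for illustration only.
public struct NutritionLabel
{
    public float sugarsGrams;
    public float saturatedFatGrams;
    public float sodiumMilligrams;
}

public static class LabelInterpreter
{
    // Returns green/yellow/red depending on how much of the user's daily
    // allowance a single serving uses up.
    public static Color Rate(NutritionLabel label, float dailySugarG, float dailySatFatG, float dailySodiumMg)
    {
        float worstShare = Mathf.Max(
            label.sugarsGrams / dailySugarG,
            label.saturatedFatGrams / dailySatFatG,
            label.sodiumMilligrams / dailySodiumMg);

        if (worstShare < 0.05f) return Color.green;   // low share of daily allowance
        if (worstShare < 0.20f) return Color.yellow;  // moderate share
        return Color.red;                             // high share
    }
}
```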
Project 2: Nutrition education and Look-Alike avatars
Modeling desirable behavior in a virtual setting, e.g. shopping for healthy foods, preparing healthy foods and eating healthy foods, has been shown to increase those healthier behaviors in real life. We propose a gamified experience in which look-alike avatars shop for healthy food in a food desert, where healthy food is sparse and unhealthy foods are overabundant. Incentives, point systems and actual monetary rewards for food shopping could be components.
Research Area: VR, AR, MR, Unity3D


Data Visualization, Virtual Reality and Structural Racism project
Research Mentor: Courtney Cogburn, Associate Professor, School of Social Work, Columbia University
Through VR/AR/MR, we aim to visualize structural racism, i.e. the ways in which multiple systems across societal domains produce racial inequity and racial advantage. This project will use interactive, narrative storytelling in conjunction with data visualization to capture and visualize people’s racial experiences in relation to various social domains, such as housing, health, policing and gentrification, over time.
Research Area: VR, AR, MR, Unity3D, Data Visualization, Transdisciplinary Design


Previous Projects


Jump into the Black Box through Virtual Reality: Visualization of a State-of-the-Art Deep Learning Model Architecture – Transformers
Research Mentor: Lei Xie, Professor, Computer Science, Hunter College, CUNY
The assigned student will work under the supervision of Professor Xie’s graduate student, Beatrice Cai.
Neural networks are often criticized for being black boxes: their “unbelievably” high predictive accuracy alone cannot win trust, and so they are not widely adopted in sensitive applications such as medical diagnosis, drug discovery and judicial reasoning. The reluctance, and even repulsion, comes down to the fact that a neural network, as a model, is transparent only at its input and output ends. The only four widely recognized types of interpretable models are linear regression, decision trees, directed acyclic graphs and rule-based models. They are considered “interpretable” because a human can see how an input flows through the model and comes out as an output. With the support of Virtual Reality (VR), we can jump into the “black box” to observe the internal information flow, which could lead to a solution for the interpretability dilemma. This project aims to open up the most famous and notoriously huge deep learning model architecture, the Transformer. A user can take the view of the input and follow it through the processing within the Transformer to explore the inside world of the black box.
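As a hedged sketch of how the “follow the input” experience might be staged in Unity, assuming attention weights have already been exported from the model as a per-layer matrix (the component and field names are illustrative), one layer’s attention could be drawn as weighted lines between token markers:

```csharp
using UnityEngine;

// Sketch: draw one layer's attention weights as lines between token markers so
// a VR user can follow the information flow. Assumes the attention matrix has
// been exported from the model beforehand; tokenPositions are the world-space
// anchors of the token labels placed in the scene.
public class AttentionLayerView : MonoBehaviour
{
    public Transform[] tokenPositions;  // one marker per input token
    public Material lineMaterial;

    public void Show(float[,] attention) // attention[from, to], rows sum to ~1
    {
        int n = tokenPositions.Length;
        for (int from = 0; from < n; from++)
        {
            for (int to = 0; to < n; to++)
            {
                float w = attention[from, to];
                if (w < 0.05f) continue; // hide negligible connections

                var go = new GameObject($"attn_{from}_{to}");
                var line = go.AddComponent<LineRenderer>();
                line.material = lineMaterial;
                line.positionCount = 2;
                line.SetPosition(0, tokenPositions[from].position);
                line.SetPosition(1, tokenPositions[to].position);
                line.startWidth = line.endWidth = 0.01f * w; // thickness encodes weight
            }
        }
    }
}
```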


Sonifying the Microbiome: 365 days in 360º
Research Mentor: Andrew Demirjian, Assistant Professor, Department of Film and Media Studies, Hunter College, CUNY
This proposed project explores the affordances of sound as an additional tool to help identify patterns and subtle changes in data that are difficult to trace by viewing spreadsheets or graphs. Scientists are increasingly finding important relationships between mental and physical health and the state of the microbiome. The patterns that emerge from the development of an infant microbiome may have intriguing implications for a wide range of mental and physical attributes later in life. Scientists can now use DNA sequencing to quantitatively monitor all of the different bacteria in a person’s microbiome. This project maps into sound the evolution and fluctuations of microbes in the microbiomes of fourteen infants from the day of their birth until they are one year old. Different aural characteristics, such as pitch, note duration, timbre and location in the sonic field, are used to identify variations in the presence, quantity and type of bacteria changing over time. The outcome of the research will be a 360º listening experience that complements a visualization of the same data, enabling listening and viewing from multiple perspectives. The user will be able to choose to hear/view one, multiple or all of the infants in the study simultaneously, as well as toggle through different filtering and grouping taxonomies of bacteria.
Research Area: VR, AR, MR, Spatial Audio
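A minimal sketch of the abundance-to-sound mapping described above, assuming the daily abundances for one bacterial taxon are already loaded (the pitch, volume and spatial mappings are illustrative choices, not the project’s actual design):

```csharp
using UnityEngine;

// Illustrative mapping from a bacterium's daily relative abundance (0..1) to
// pitch, loudness and position, played back one "day" per second through a
// spatialized AudioSource with a sustained, looping clip assigned.
public class MicrobiomeSonifier : MonoBehaviour
{
    public AudioSource voice;          // spatialized source for one bacterial taxon
    public float[] dailyAbundance;     // 365 values in [0, 1]
    public float secondsPerDay = 1f;

    int day;
    float timer;

    void Update()
    {
        timer += Time.deltaTime;
        if (timer < secondsPerDay || day >= dailyAbundance.Length) return;
        timer = 0f;

        float a = dailyAbundance[day++];

        // Higher abundance -> higher pitch and louder voice.
        voice.pitch = Mathf.Lerp(0.5f, 2.0f, a);
        voice.volume = Mathf.Lerp(0.1f, 1.0f, a);

        // Move the taxon around the listener so position also carries meaning.
        float angle = day / (float)dailyAbundance.Length * 2f * Mathf.PI;
        voice.transform.localPosition = new Vector3(Mathf.Cos(angle), 0f, Mathf.Sin(angle)) * 2f;

        if (!voice.isPlaying) voice.Play();
    }
}
```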


Arab Data Bodies: Social Media in Mixed Reality

Research Mentor: Laila Shereen Sakr, Assistant Professor, Department of Film and Media Studies, University of California, Santa Barbara

Arab Data Bodies is a 360-degree, first-person immersive, virtual and augmented reality (VR/AR) documentary of the Egyptian Uprising of 2011. This data-driven project will use nearly one hundred million social media posts in thirty languages (from Twitter, Facebook, Instagram, YouTube, and other popular sites) harvested by the R-Shief media system from 2011 to 2013. It adapts data visualization into documentary mixed realities. The project will extend the representation of R-Shief’s historical, eleven-year-old archive of social media from two-dimensional data visualizations (the R-Shief Dashboard) into an immersive VR/AR/MR experience.

Hunter College
City University of New York
HN-1001T
695 Park Ave
New York, NY 10065

Telephone: +1 (212) 396-6837
Email: oo700 at hunter dot cuny dot edu
