
REU Projects

Proposed Projects

Research mentors will be paired with students and a graduate mentor. If you are CUNY research staff or faculty interested in serving as a research mentor, please contact the PI. Faculty from other NYC colleges and universities are also welcome.

Immersive Remote Telepresence


Research Mentor and PI: Oyewole Oyekoya, Associate Professor, Computer Science, Hunter College
Immersive remote telepresence, which facilitates shared collaborative experiences with one or more interlocutors in distant locations, represents the frontier of VR/AR technology. Immersive telepresence can reduce costs for users, who no longer have to travel to remote locations, and can simulate the experience of being in or traveling through a remote location using virtual reality technologies such as immersive viewing and rendering systems, self-avatars, and 3D interaction metaphors for travel and manipulation. Such systems represent the next generation of business travel, meetings, and online teaching systems. However, outstanding questions remain regarding the fidelity of the self-avatar's animation quality (number and accuracy of sensors), immersive display and rendering quality, and latency, and their impact on the sense of presence, communication content, and understanding. Research in immersive telepresence will benefit tremendously from this REU site, as the technological challenges are demanding. This proposed REU project will enable investigations into telepresence challenges such as facilitating audio feedback, gaze communication, facial animations, body/hand gestures, and touch cues, and how users perceive scanned avatars.
Skills gained: Unity3D, OpenGL, Cloud Computing

VR as a learning tool for students with disabilities

Research Mentors: Sudi Shayesteh and Daniel Chan, Office of AccessABILITY, Hunter College
The Office of AccessABILITY started experimenting with VR in 2017, hoping that through its ability to build association and stimulation, they would be able to help students who are tactile and/or visual learners improve skills in areas such as biology, physiology and art. At first the technology attracted only a handful of students, but it became more popular over time and engaged faculty members from many academic departments such as Chemistry, Computer Science, Education, ICIT, Film and Media Studies and ACERT (Academic Center for Research and Teaching) who wanted to explore VR features, create assignments or dedicate a class session to the lab. VR creates a multisensory experience leading to further engagement by students who may struggle with inattention and/or social interactions, and those with physical limitations who otherwise may not be able to engage in certain tasks. For instance, students who struggle with changes in the environment (like those on the autism spectrum) can plan ways to decrease stress. Research shows that use of VR helps with visual motor skills and visual spatial attention and provides many other benefits that may not be directly related to classroom learning, such as improving job interview skills. The proposed REU project will enable further research on the benefits of VR and self-avatars in immersive and effective learning experiences for students with disabilities.
Skills gained: Unity3D, VR/AR/MR headsets

MicroRNA-1205 as a regulator of cell lineage plasticity

Research Mentor: Olorunseun Ogunwobi, Director, Hunter College Center for Cancer Health Disparities Research (CCHDR), Associate Professor of Biological Sciences, Hunter College
Cell lineage plasticity is defined broadly as the ability of determined cells to differentiate phenotypically into another cell type. Although this mechanism is not common in normal microenvironments, early evidence of cell lineage plasticity was demonstrated in Drosophila during embryogenesis. Moreover, it is often observed under stressful conditions such as diseased states, including cancer and diabetes. Many signaling pathways have been shown to be involved in cell lineage plasticity. The most notable example is the addition of the OSKM (Yamanaka) factors to fibroblasts to induce pluripotent stem cells. OSKM stands for Oct4, Sox2, Klf4 and c-MYC, the transcription factors required to induce pluripotency. Beyond transcription factors, other biological pathways have been implicated in inducing cell lineage plasticity, including other functional proteins and even noncoding RNAs such as microRNAs. MicroRNAs are small non-protein-coding RNAs that induce targeted mRNA degradation. Because microRNAs are involved in numerous developmental pathways, these small non-protein-coding molecules are currently being investigated as master regulators of cell lineage plasticity. An example is the miR-200 cluster, which is involved in reprogramming mouse embryonic fibroblasts into induced pluripotent stem cells (iPSCs). Immersive visualization tools such as VMD and ParaView can enable further understanding of how microRNAs may be involved in reprogramming, accelerating the clinical use of iPSCs in medicine. Students will conduct research on how to visualize microRNAs on VR headsets.
Skills gained: VMD, Immersive Paraview, HTC Vive VR headsets

Sonifying the Microbiome: 365 days in 360º

Research Mentor: Andrew Demirjian, Assistant Professor, Department of Film and Media Studies, Hunter College, CUNY
This proposed project involves an exploration of the affordances of sound as an additional tool to help identify patterns and subtle changes in data that are difficult to trace by viewing spreadsheets or graphs. Scientists are increasingly finding important relationships between mental and physical health and the state of the microbiome. The patterns that emerge from the development of an infant microbiome may have intriguing implications for a wide range of mental and physical attributes later in life. Scientists can now use DNA sequencing to quantitatively monitor all of the different bacteria in a person's microbiome. This project maps data on the evolution and fluctuations of microbes in the microbiomes of fourteen infants, from the day of their birth until they are one year old, into sound. Different aural characteristics like pitch, note duration, timbre and location in the sonic field are used to identify variations in the presence, quantity and type of bacteria changing over time. The outcome of the research will be a 360º listening experience that complements a visualization of the same data, enabling listening and viewing from multiple perspectives. The user will be able to choose to hear/view one, multiple or all of the infants in the study simultaneously, and to toggle through different filtering and grouping taxonomies of bacteria.
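The mapping described above can be sketched in a few lines. This is a hypothetical illustration, not the project's actual parameters: the taxon name, pitch range, duration range and azimuth mapping are all assumptions.

```python
# Hypothetical sonification mapping: one abundance reading for one taxon
# becomes a note with a pitch, a duration, and a position in a 360-degree
# sound field. All ranges here are illustrative assumptions.

def abundance_to_note(abundance, day, max_abundance=1.0):
    """Map one abundance reading to simple note parameters."""
    # Pitch: scale relative abundance onto a MIDI-like range (48-84).
    pitch = 48 + round(36 * (abundance / max_abundance))
    # Duration: rarer taxa get shorter notes (0.1 s to 1.0 s).
    duration = 0.1 + 0.9 * (abundance / max_abundance)
    # Azimuth: spread the 365 days of the study around the sonic field.
    azimuth = (day / 365) * 360
    return {"pitch": pitch, "duration": duration, "azimuth": azimuth}

# A made-up abundance trajectory for one taxon on three sample days
series = {"Bifidobacterium": [(30, 0.2), (180, 0.7), (364, 0.5)]}
notes = [abundance_to_note(a, d) for d, a in series["Bifidobacterium"]]
```

Each note dictionary could then be rendered with any spatial-audio engine; the point of the sketch is only that every perceptual dimension (pitch, duration, azimuth) carries one dimension of the data.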

Jump into the Black Box through Virtual Reality: Visualization of a State-of-the-Art Deep Learning Model Architecture – Transformers

Research Mentor: Lei Xie, Professor, Computer Science, Hunter College, CUNY
The assigned student will work under the supervision of Professor Xie’s graduate student, Beatrice Cai.
Neural networks are often criticized as black boxes: their "unbelievably" high predictive accuracy cannot win trust on its own, so they are not widely adopted in sensitive applications such as medical diagnosis, drug discovery, and judicial reasoning. This reluctance, even repulsion, comes down to the fact that a neural network, as a model, is transparent only at its input and output ends. The four widely recognized types of interpretable models are linear regression, decision trees, directed acyclic graphs, and rule-based models; they are considered "interpretable" because humans can see how inputs flow through the model and emerge as outputs. With the support of Virtual Reality (VR), we can jump into the "black box" to observe the internal information flow, which could lead to a solution to the interpretability dilemma. This project aims to open up one of the most famous and notoriously large deep learning models, the Transformer. A user can take the viewpoint of the input and follow it through the processing within the Transformer to explore the inside world of the black box.
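One concrete piece of "internal information flow" a VR visualization could surface is the attention matrix of a single head, which records how strongly each token attends to every other token as it passes through a Transformer layer. The sketch below is an assumed, minimal illustration of scaled dot-product attention, not code from the project:

```python
# Minimal sketch of one attention head's weights: the (seq_len x seq_len)
# matrix below is exactly the kind of internal state that could be rendered
# as a heat map or flow diagram inside a VR scene.
import numpy as np

def attention_weights(Q, K):
    """Return the row-stochastic softmax attention matrix for one head."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)               # query/key similarity
    scores -= scores.max(axis=-1, keepdims=True)  # numerical stability
    weights = np.exp(scores)
    return weights / weights.sum(axis=-1, keepdims=True)

# Toy example: 3 tokens with 4-dimensional embeddings
rng = np.random.default_rng(0)
Q = rng.normal(size=(3, 4))
K = rng.normal(size=(3, 4))
W = attention_weights(Q, K)  # each row sums to 1
```

Stacking one such matrix per layer gives a layer-by-layer trace of where information moves, which is what "following the input through the model" would visualize.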

Explore Virtual Environments Using a Mobile Mixed Reality Cane Without Visual Feedback

Research Mentor: Hao Tang, Associate Professor and Deputy Chair, Computer Information Systems, Borough of Manhattan Community College, CUNY
Though virtual reality (VR) has matured considerably in recent years, the general public still cannot enjoy its benefits, especially the blind and visually impaired (BVI) population. Current VR accessibility applications have been developed either on expensive head-mounted displays or with extra accessories and mechanisms, which are neither accessible nor convenient for BVI individuals. We propose a mobile VR app that enables BVI users to access a virtual environment on an iPhone, building their skills of perceiving and recognizing the virtual environment and the virtual objects in it. The app mounts the iPhone on a selfie stick to simulate a long cane in VR, and applies Augmented Reality (AR) techniques to track the iPhone's real-time pose in an empty space in the real world, which is then synchronized to the long cane in the VR environment. Because it integrates VR and AR (mixed reality), we call it the Mixed Reality cane (MR Cane); it provides BVI users auditory and vibrotactile feedback whenever the virtual cane comes into contact with objects in VR. Thus, the app allows BVI individuals to interact with virtual objects and identify their sizes and locations in the virtual environment. The MR Cane concept can be extended to new navigation, training and entertainment applications for BVI individuals without significant additional effort.
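The core loop described above (tracked phone pose → virtual cane tip → contact check → feedback) can be sketched as follows. This is a hedged illustration: the cane length, the spherical object shapes, and all names are assumptions, and a real implementation would use the AR framework's pose data and the platform's audio/haptics APIs.

```python
# Sketch of the MR Cane feedback loop: project the cane tip from the phone's
# tracked pose, then report which virtual objects (modelled here as spheres)
# the tip is touching, so the app can trigger sound and vibration.
import math

CANE_LENGTH = 1.2  # metres from handle (phone) to tip; assumed value

def cane_tip(phone_pos, yaw_deg, pitch_deg):
    """Project the cane tip position from the phone's position and orientation."""
    yaw, pitch = math.radians(yaw_deg), math.radians(pitch_deg)
    dx = CANE_LENGTH * math.cos(pitch) * math.sin(yaw)
    dy = CANE_LENGTH * math.sin(pitch)   # negative pitch points the cane down
    dz = CANE_LENGTH * math.cos(pitch) * math.cos(yaw)
    x, y, z = phone_pos
    return (x + dx, y + dy, z + dz)

def contacts(tip, objects):
    """Return the names of virtual objects the cane tip is touching."""
    hits = []
    for name, centre, radius in objects:
        if math.dist(tip, centre) <= radius:
            hits.append(name)  # each hit would trigger audio + vibrotactile cues
    return hits

# Hypothetical scene: phone held at 1 m height, cane angled 45 degrees down
objects = [("trash_can", (0.0, 0.2, 0.9), 0.3), ("wall", (5.0, 0.0, 0.0), 0.1)]
tip = cane_tip((0.0, 1.0, 0.0), yaw_deg=0.0, pitch_deg=-45.0)
hit_names = contacts(tip, objects)
```

In the actual app the pose would come from the AR tracker every frame and the contact test from the engine's physics, but the synchronization idea is the same.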

The Community Game Development Toolkit: Developing accessible tools for students and artists to tell their stories through creative game design

Research Mentor: Daniel Lichtman, Adjunct Professor, Department of Fine and Performing Arts, Baruch College, CUNY

Image Credit: Millicent Encarnacion, New Media Art student, Baruch College

Video game design and technology offer many of today's most advanced experiences of interactive storytelling. But making creative, visually compelling games and interactive 3D environments usually requires a host of specialized skills: advanced computer programming, 3D modelling, character animation, and more. The Community Game Development Toolkit is a set of game design tools that make it easy and fun for students, artists and community members to create interactive 3D environments and visually rich games without code or other specialized skills. The toolkit seeks to provide new tools for diverse communities to tell stories in their own visual language, and for students and artists to explore their own traditions, rituals and heritages through creative game design. To make visual scene and object design quick and creative, the toolkit draws on designers' own sketches, photos, paintings and other creative materials. To create interactivity without the complexity of coding, the toolkit additionally provides drag-and-drop game components that implement features such as collecting inventory, non-player characters, changing scenes/levels and more. This REU project will expand the functionality of the toolkit and increase its accessibility to diverse student bodies without previous technical experience in game design and programming. Research will include conducting a requirements analysis of students' needs; designing and programming (in C#) additional game components for interactivity; further exploration of representing 2D brushstrokes and 2D textures in 3D space; and integration of toolkit functionality with AR, VR and MR.
Skills gained: Unity 3D, programming in C#, VR/AR/MR
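The idea behind a drag-and-drop game component is that the designer attaches a small, self-contained behaviour to an object and never writes code. A language-agnostic sketch of one such component's logic follows (in Python for brevity; the toolkit itself is built in Unity/C#, and all class and method names here are illustrative assumptions):

```python
# Sketch of a no-code "collectible" component: attaching CollectibleItem to
# an object means that touching it adds it to the player's inventory exactly
# once, with no scripting required from the designer.

class CollectibleItem:
    """Attachable component marking an object as collectible."""
    def __init__(self, name):
        self.name = name
        self.collected = False

class Inventory:
    def __init__(self):
        self.items = []

    def on_player_touch(self, item):
        # Called by the engine when the player contacts the object.
        if not item.collected:
            item.collected = True
            self.items.append(item.name)

inventory = Inventory()
key = CollectibleItem("rusty key")
inventory.on_player_touch(key)
inventory.on_player_touch(key)  # a second touch is ignored
```

In Unity this would be a MonoBehaviour reacting to a trigger collision; the sketch only shows the behaviour a designer gets by dragging the component onto an object.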

Hunter College
City University of New York
695 Park Ave
New York, NY 10065

Telephone: +1 (212) 396-6837
Email: oo700 at hunter dot cuny dot edu
