Sonifying the Microbiome: 365 days in 360°

Deborah Rudin, University of Minnesota Twin Cities

Project: Sonifying the Microbiome: 365 days in 360°

Mentor: Andrew Demirjian

About Me: I’m a Computer Science and Theater Arts double major at the University of Minnesota Twin Cities. My hobbies include reading, dancing, creating art, listening to music, and cooking. I currently work as an Audio/Media Technician in the Sound Shop of my university’s Theater department.

Week 1:

After meeting with my mentor and deciding what direction we wanted to take the project, I settled into learning how to use Max MSP. Max MSP is a visual programming language for music and multimedia design, and it’s what I’ll be using for the majority of the project. Going through the various tutorials and experimenting with the patches gave me a basic understanding that I will build on as the summer progresses. I also took a look at the data we’re working with and found the minimum and maximum of each feature in order to establish a range of values. Each feature corresponds to one of the taxonomic groups found in the infants’ microbiomes; we’re only working with the top ten groups, as identified in the study the data originally comes from. I then noted the periods between measurements for each infant so we could understand the sampling intervals, especially whether they were consistent across infants. Later on, we may use these intervals to affect the duration of notes or other aspects of the sonification. I also wrote up a project proposal and submitted it to Dr. Wole. At the end of the week, I attended the CUNY SciComs Symposium, which was highly interesting.
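A rough sketch of that data-exploration step is below, assuming a hypothetical CSV layout with one row per measurement and columns “infant”, “day”, plus one column per taxonomic feature (the filename and column names are placeholders, not the actual dataset’s):

```python
# Sketch of the Week 1 exploration: per-feature ranges and sampling intervals.
# Filename and column names ("infant", "day") are assumptions for illustration.
import pandas as pd

df = pd.read_csv("microbiome_top10.csv")  # hypothetical filename
feature_cols = [c for c in df.columns if c not in ("infant", "day")]

# Minimum and maximum of each feature, to establish value ranges
# for scaling the sonification later.
ranges = df[feature_cols].agg(["min", "max"])
print(ranges)

# Days between consecutive measurements for each infant, to check
# whether the sampling intervals are roughly consistent across infants.
intervals = (
    df.sort_values(["infant", "day"])
      .groupby("infant")["day"]
      .diff()
      .dropna()
)
print(intervals.describe())
```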

On the more extracurricular side of things, I’ve started exploring the city with my cohort. So far we’ve gone to the High Line and Chelsea Market!

Below: Data on the Measurements and Features of the Infants

Week 2:

This week, my mentor and I started diving deeper into using Max MSP for our project. I first separated our data so that it was organized per infant, and transposed it so the values for each feature could be read easily in Max MSP. Then, I worked on creating a patch that steps through the data for one infant and outputs each feature’s values separately. A patch is essentially a visual program in which you ‘patch’ the outputs of objects into the inputs of others, much like connecting audio equipment in real life. After that, I used the patch to make a basic sonification of the first feature’s values for the first baby. This worked, although the result wasn’t exactly pleasing to the ear, so I then worked on learning how to route the features through various synths. My mentor and I figured out how to generate notes and chords in Max MSP, and I created a basic generative music piece by randomizing the steps between the notes. We also decided to create a room in Mozilla Hubs to showcase our work in addition to the in-person black-box setting for our presentation.
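A minimal sketch of that pre-processing step (done outside Max MSP), assuming the same hypothetical CSV layout as above: write one transposed file per infant so each row holds a single feature’s time series, which is easy to step through in a patch.

```python
# Sketch of the per-infant split and transpose described above.
# Filename and column names are assumptions for illustration only.
import pandas as pd

df = pd.read_csv("microbiome_top10.csv")  # hypothetical filename
for infant_id, group in df.groupby("infant"):
    per_feature = (
        group.sort_values("day")
             .drop(columns=["infant", "day"])
             .T  # rows become features, columns become successive measurements
    )
    per_feature.to_csv(f"infant_{infant_id}_features.csv", header=False)
```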

As a cohort, my peers and I have been learning how to use various visualization tools as well. We have so far worked with both ParaView and Tableau. We also attended a dissertation presentation over Zoom on the benefits of using VR/MR for Chronic Pain Self-Management.

Week 3:

As a cohort, we worked with VMD and on our research papers in Overleaf. I wrote a draft of my abstract and introduction, and will continue working on my paper over the course of the summer. In my project, I worked on forming chords from the data, using the values to generate notes and then routing them through MIDI synths. I played around with creating two chords, one on a basic piano and one through a pizzicato string synth. However, the patch doesn’t yet run completely correctly, so I still have a lot to develop there. I also researched the different taxonomic families we have data on in order to figure out how they might shape their use in the sonification. I’m currently playing with the idea of using Gram-negative and Gram-positive bacteria for different chords or features of the sonification process. We are also considering using certain features for velocity, others for pitch, and perhaps some for frequency filtering. At the moment there are a lot of different possibilities we can work with, and we’re considering all of them.
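As an illustration of the kind of mapping being considered (not the actual Max patch), here is a small sketch of scaling a feature’s abundance value into a MIDI pitch, with another value optionally driving velocity; the note and velocity ranges are arbitrary choices for the example.

```python
# Illustrative value-to-MIDI mapping: abundance scaled into a pitch range,
# with a second mapping for velocity. Ranges are assumptions, not the
# project's final settings.
def scale_value(value, lo, hi, out_lo, out_hi):
    """Linearly rescale value from [lo, hi] into [out_lo, out_hi]."""
    if hi == lo:
        return out_lo
    return out_lo + (value - lo) * (out_hi - out_lo) / (hi - lo)

def to_midi_note(abundance, lo, hi):
    # Map abundance into roughly two octaves above middle C (MIDI 60-84).
    return int(round(scale_value(abundance, lo, hi, 60, 84)))

def to_velocity(abundance, lo, hi):
    # Map abundance into a soft-to-loud velocity range.
    return int(round(scale_value(abundance, lo, hi, 40, 110)))

print(to_midi_note(0.35, 0.0, 1.0), to_velocity(0.35, 0.0, 1.0))
```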

On the extracurricular side of things, I went to the Museum of Modern Art (MoMA) and had afternoon tea at a lovely cafe I found.

Week 4:

This week, I focused heavily on my research paper. I finished my introduction and related works sections, which leaves me to start writing up my methodology next. We now have research paper writing sessions on Zoom on Tuesdays. As I researched the related works, I found the information fascinating! Pythagoras’ Harmony of the Spheres is something I had never heard of before, and I definitely want to learn more about it. Coding-wise, I fixed up the patch and got it running, with Gram-positive bacteria forming one chord and Gram-negative bacteria forming another. However, the sound isn’t exactly what I want yet, so my mentor and I are going to keep experimenting with which features control what. I also started working in Mozilla Hubs to get a feel for how it works. Once I have the visual aids placed where I want them, I’ll work on figuring out how the audio zoning works.

As a cohort, we worked on integrating R and Python with Tableau. On my own, I attended a concert at Palladium Times Square!

Week 5:

With the second half of the program starting, much of this week’s focus was on developing a first model in Mozilla Hubs to see how everything works. I arranged objects and gave them corresponding mp3 files in order to figure out how the audio zoning functions. We’ve created a system in which each feature corresponds to a note in a scale, which is then pitch modulated according to the data values. Each baby is also assigned a different synth, which keeps the streams distinct while still making patterns identifiable. We’re still finalizing our scale and which instruments to use. With this setup, we send everything through Ableton Live, which reads the MIDI notes directly from Max MSP and lets us render them into the mp3 files we need. I’ve continued working on my research paper, now focusing on my methodology.
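A small sketch of the feature-to-scale idea described above, standing in for the Max MSP patch: each feature gets a fixed degree of a scale (a C major run is assumed here), and the data value bends that base pitch up or down. The scale, bend range, and function names are illustrative assumptions.

```python
# Sketch of mapping a feature index to a scale degree and a data value to a
# pitch modulation around that degree. Scale choice and bend range are assumed.
C_MAJOR = [60, 62, 64, 65, 67, 69, 71, 72, 74, 76]  # ten degrees, MIDI numbers

def feature_pitch(feature_index, value, lo, hi, bend_semitones=2.0):
    """Return a (possibly fractional) MIDI pitch for one feature's value."""
    base = C_MAJOR[feature_index]
    if hi == lo:
        return float(base)
    norm = (value - lo) / (hi - lo)               # 0..1
    return base + (norm - 0.5) * 2 * bend_semitones  # low values bend down, high bend up

print(feature_pitch(3, 0.8, 0.0, 1.0))
```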

This week we also toured the ASRC, or Advanced Science Research Center, which was fascinating. I was most interested in their Neuroscience and Photonics research.

Week 6:

This week was spent putting together the first draft of the final project. I arranged the Grogu baby objects in a circle in Mozilla Hubs, each with its corresponding audio file. First, we used a test audio track to make sure the objects were at least roughly in sync with each other. After that, we swapped in the sonifications we had created. We used string synths to create a sound more pleasing to the ear, and modified the durations of the notes to correspond with the time between measurements: shorter at the beginning and getting longer towards the end. Next, we want to try different spatializations within Mozilla Hubs. For my paper, I finished my abstract and methodology, and now just need to write my results and conclusion.
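A minimal sketch of the duration mapping: the gap in days until the next measurement is rescaled into a note length, so early, closely spaced samples play as short notes and later, sparser samples ring longer. The millisecond bounds and example gaps are assumptions for illustration.

```python
# Sketch of mapping measurement gaps (days) to note durations (milliseconds).
def note_duration_ms(gap_days, min_gap, max_gap, short_ms=250, long_ms=2000):
    if max_gap == min_gap:
        return short_ms
    norm = (gap_days - min_gap) / (max_gap - min_gap)
    return int(short_ms + norm * (long_ms - short_ms))

gaps = [3, 7, 14, 30, 60]  # example intervals; not the study's actual values
print([note_duration_ms(g, min(gaps), max(gaps)) for g in gaps])
```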

On my own, I went to Spyscape, which is an interactive spy museum.

Below: Baby and Sound objects in Mozilla Hubs

Week 7:

This week has pretty much been crunch time. As a cohort, we’ve started testing each other’s programs and running user studies. I set up the new spatialization for the sonification, and I much prefer this version. I placed it in a geodesic dome preset scene, which allows a circular setup with much more room between the babies as well as in the middle. This lets a listener hear them all at once or move around the edge to focus on one or a few at a time. The volume of each audio source is also adjustable, so someone who wants to hear only one baby can turn the others all the way down. Some small stumbling blocks my mentor and I dealt with were initially having a set of duplicate audio files instead of separate unique ones, and making sure the duration of the notes matched the periods between measurements. Luckily, these were easily resolved, and we were able to move on to our final implementation. After I put everything into Mozilla Hubs, I also labeled each baby so that our observations would be unambiguous. Unfortunately, Mozilla Hubs does not have a labeling system, so I resorted to using the Pen object to create a drawing and then turning it into a pinnable 3D object.
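For the circular layout, a small sketch of how evenly spaced positions around the dome could be computed; the radius, height, and number of objects here are assumptions, and in practice the coordinates would be entered by hand when placing each object in the Hubs scene.

```python
# Sketch of evenly spaced positions on a circle, e.g. for ten baby objects.
import math

def circle_positions(n, radius=6.0, y=1.5):
    """Return (x, y, z) positions spaced evenly around a circle."""
    positions = []
    for i in range(n):
        angle = 2 * math.pi * i / n
        positions.append((radius * math.cos(angle), y, radius * math.sin(angle)))
    return positions

for i, (x, y, z) in enumerate(circle_positions(10)):
    print(f"baby {i}: x={x:.2f}, y={y:.2f}, z={z:.2f}")
```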

On a less work-intense note, we took a group trip to the Bronx Zoo to do the Treetop Adventure Zipline. That was a lot of fun, even in the boiling heat. I also tried a NYC Restaurant Week restaurant, which was incredible, as well as a magical afternoon tea at The Cauldron.

Below: Baby objects in the Geodesic Dome with the Audio Debugger showing the Audio Zoning in Mozilla Hubs


Week 8:

As the last week of the program, this week was focused on getting our papers done and submitted. Our abstracts were due Monday, and the papers themselves had to be submitted by Friday. For me, this meant having participants experience the sonification and then fill out a response form so that data could be collected. This data collection allowed me to analyze my results and write the results section of my paper. Once that was done, I was able to write my conclusion and finish the paper. Before turning it in, I checked it over with my mentor so I could add anything he thought was necessary. That done, I submitted my short paper! Afterwards, I developed my slides for Friday’s presentation. On Friday, all of us presented our projects in a hybrid in-person and Zoom session, with about 25 minutes each. The session also included a Zoom recording, made on Tuesday, of us REU participants discussing computing, STEM, and VR/AR/MR, which Dr. Wole presented as an easy way to share that conversation. Breakfast and lunch were both provided during the presentation session, which was a very nice addition to our last day of the program.

Outside of working on finishing up my project, I saw Phantom of the Opera on Broadway, which was incredible. I really enjoyed working in NYC this summer, and I’m so glad I had the chance to participate in this REU.

Final Report

VR-REU 2022: New Research Experience Program to Broaden Participation in Computing

Ten students will be attending the VR-REU research program at City University of New York (CUNY), Hunter College from June 6 – July 29, 2022.

Aisha Frampton-Clerk
CUNY Queensborough Community College

Amelia Roth
Gustavus Adolphus College

Ari Riggins
Princeton University

Deborah Rudin
University of Minnesota Twin Cities

Diego Rivera
Iona College

Mustapha Bouchaqour
CUNY New York City College of Technology

Nairoby Pena
Cornell University

Olubusayo Oluwagbamila
Rutgers University New Brunswick

Talia Attar
Cornell University

Zhenchao Xia
Stony Brook University

New York, NY — June 3, 2022 — The VR-REU program is a Research Experience for Undergraduates (REU) program sponsored by the National Science Foundation (NSF) that enables undergraduate students to undertake multidisciplinary research projects in Immersive 3D Visualization and VR/AR/MR (Virtual, Augmented and Mixed Reality) at Hunter College. The intended impact is to use the creative potential of Immersive 3D technology to attract and broaden participation of women and underrepresented minorities in STEM and computing fields. Participants will also be provided training in immersive 3D visualization tools and technologies. This experience will include excursions and social events.

“I believe that virtual, augmented and mixed reality technology helps to unleash the creativity of students and increases their interest in technology,” Wole Oyekoya, Associate Professor of Computer Science at Hunter College, said. “This program also provides the opportunity to be involved in building the metaverse with diversity, equity and inclusion in mind”.

The objective of the program is to inspire participating students to consider STEM as a career path and pursue STEM careers at the graduate level, specifically targeting the participation of women and underrepresented groups. The program identified research mentors with immersive visualization needs and pairs each student with a research mentor. Research mentors include faculty members not just from Computer Science but also from fields including film and media, arts, health, journalism, social sciences and biological sciences. They come from eight CUNY colleges, Columbia University and the University of California, Santa Barbara.

Students will participate in mentored research projects in data driven research areas, including scientific visualization and visual analytics. Some of the planned research projects include: VR for aiding students with learning disabilities, MicroRNA (Ribonucleic Acid) as a regulator for cell lineage plasticity, immersive remote telepresence, sonifying the microbiome, and visualization of deep learning model architecture.

In addition to the exposure to cutting edge research, the students selected to participate will receive a travel stipend for one round trip to and from New York City, a housing allowance, a weekly stipend to cover living expenses and access to research faculty and VR/AR/MR resources to help facilitate their success in the program.

Two of our students (Aisha Frampton-Clerk and Diego Rivera) were also selected to receive a $10,000 fellowship through a new program, Last Mile Fellowship to Broaden Computing-Related REU Participation, that aims to improve diversity in computing-related research.

The VR-REU program is sponsored by NSF Grant No. 2050532 and is directed by the PI, Dr Wole Oyekoya. This funding supports Hunter College and CUNY in its mission to continue to provide higher education to a largely disadvantaged population that includes women and underrepresented minorities, immigrants or the children of immigrants and first-generation college students. This year’s program supports a diverse student cohort that is 70% female, 30% male, 40% African American, 20% Hispanic, 60% White, and 10% Asian.

Participants will present the results of their research on July 29, 2022 at Hunter College. Please contact the PI for more information.

Visualization of the food consumed during COVID

Visualization of the food consumed during COVID by Paul Grant, CUNY Hunter College

Frontiers in VR Paper

Our Paper, “Exploring First-Person Perspectives in Designing a Role-Playing VR Simulation for Bullying Prevention: A Focus Group Study” has been accepted in the Frontiers in Virtual Reality Journal.

ACM SUI Paper

Our paper, “Usability Evaluation of Behind the Screen Interaction” has been accepted at the ACM Symposium on Spatial User Interaction, SUI 2021.

PhD Student Position

I am seeking a PhD student to join my research lab in Fall 2022. More information available here.

NSF REU Award

Prof. Wole has received a National Science Foundation (NSF) award.

Title: REU Site: Research Experience for Undergraduates in Immersive 3D Visualization
Abstract: This Research Experiences for Undergraduates (REU) Site award funds a new site to enable ten undergraduate students each year to undertake Immersive 3D Visualization research projects at Hunter College, City University of New York (CUNY). The intended impact is to use the creative potential of Immersive 3D technology to attract and broaden participation of women and underrepresented minorities in STEM and computing fields.

ACM ICVARS Paper

Our paper, “A Comparative Study of Smartphone, Desktop, and CAVE systems for Visualizing a Math Simulation” has been accepted at the ACM International Conference on Virtual and Augmented Reality Simulations, ICVARS 2021.

Immersion – Night-time Forest

Noman Ahmad, Victor Huang, Michelle Lucero, Alyssa Ma

This project presents a virtual reality experience created to emphasize environmental immersion. The goal was to combine lighting, sound, and user directional choice to simulate the most immersive experience possible. We chose a night-time forest as our environment setting in order to deliver this experience.

Poacher Protection Survival Game

Hannan Abid, Steffen Loh, Joseph Ruocco

This is a “Poacher Protection” game where users can explore the endogenous (food and reproduction) and exogenous (poachers) factors affecting an ecosystem. The main objective when we first conceived the game centered around an exploratory animal-protection experience, in which the user can take part in stopping poachers who are trying to kill the animals and see the consequences the poachers have on the habitat. The goals are to (1) create a day-night environment and “ecosystem” of randomly spawning wolves that roam the environment and “consume” the grass objects, (2) provide a fence-building mechanism so users can create enclosures to place wolves into, and (3) implement a light-pointing mechanism that can ward off predators that inhabit the environment during the night phase.

Hunter College
City University of New York
HN-1001T
695 Park Ave
New York, NY 10065

Telephone: +1 (212) 396-6837
Email: oo700 at hunter dot cuny dot edu
