
Sonifying the Microbiome: 365 days in 360°

Deborah Rudin, University of Minnesota Twin Cities

Project: Sonifying the Microbiome: 365 days in 360°

Mentor: Andrew Demirjian

About Me: I’m a Computer Science and Theater Arts double major at the University of Minnesota Twin Cities. My hobbies include reading, dancing, creating art, listening to music, and cooking. I currently work as an Audio/Media Technician for my University’s Theater department’s Sound Shop.

Week 1:

After meeting with my mentor and deciding what direction we wanted to take our project, I settled into learning how to use Max MSP. Max MSP is a visual programming language for music and multimedia design, and it’s what I’ll be using for the majority of the project. Going through the various tutorials and experimenting with the patches gave me a basic understanding that I’ll build on as the summer progresses. I also took a look at the data we’re working with and found the minimum and maximum of each feature in order to establish a range of values. Each feature corresponds to a taxonomic group found in the infants’ microbiomes; we’re only working with the top ten taxonomic groups, as identified in the study the data originally comes from. Then, I noted the periods between measurements for each infant so we could understand the sampling intervals, especially whether they were at all consistent across infants. Later on, we may use these intervals to affect the duration of notes or other aspects of the sonification. I also wrote up a project proposal and submitted it to Dr. Wole. At the end of the week, I attended the CUNY SciComs Symposium, which was highly interesting.
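Since the project itself lives in Max MSP, here is a rough Python equivalent of that first data pass, just to make the idea concrete. The CSV filename and column names ('infant_id', 'day', plus one column per taxonomic group) are assumptions, not the study’s actual schema:

```python
# Minimal sketch: per-feature ranges and per-infant sampling intervals.
# Assumes a CSV with hypothetical columns: 'infant_id', 'day' (day of life),
# and one abundance column per taxonomic group.
import pandas as pd

df = pd.read_csv("microbiome.csv")  # hypothetical filename
taxa = [c for c in df.columns if c not in ("infant_id", "day")]

# Range of each feature, used later to scale values into a sonifiable range.
ranges = df[taxa].agg(["min", "max"])
print(ranges)

# Days elapsed between consecutive measurements, per infant.
intervals = (
    df.sort_values(["infant_id", "day"])
      .groupby("infant_id")["day"]
      .diff()
      .dropna()
)
print(intervals.describe())
```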

On the more extracurricular side of things, I’ve started exploring the city with my cohort. So far we’ve gone to the High Line and Chelsea Market!

Below: Data on the Measurements and Features of the Infants

Week 2:

This week, my mentor and I started diving deeper into using Max MSP for our project. I first separated our data so that it was per infant, and transposed it so the data for each feature is easy to access in Max MSP. Then, I worked on creating a patch that steps through an infant’s data and outputs the values for each feature separately. A patch is essentially a visual program in which you ‘patch’ different objects to the inputs and outputs of others; in that respect, the paradigm resembles wiring together physical audio equipment. After that, I used the patch to make a basic sonification of the first feature’s values for the first baby. It worked, though not yet in a way pleasing to the ear, so I then learned how to route the features through various synths. My mentor and I figured out how to generate notes and chords in Max MSP, and then I created a basic generative music piece by randomizing the steps between the notes. We also decided to create a room in Mozilla Hubs to showcase our work in addition to the in-person black box setting for our presentation.
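A rough Python sketch of that reshaping step, reusing the hypothetical column names from the earlier sketch (again, not the study’s actual schema):

```python
# Minimal sketch of the per-infant reshaping: one file per infant, transposed
# so each row is a feature -- which makes it easy to step through feature by
# feature in Max MSP. Column names here are assumptions.
import pandas as pd

df = pd.read_csv("microbiome.csv")  # hypothetical filename
taxa = [c for c in df.columns if c not in ("infant_id", "day")]

for infant_id, group in df.groupby("infant_id"):
    per_feature = group.set_index("day")[taxa].T  # rows = features, cols = days
    per_feature.to_csv(f"infant_{infant_id}.csv")
```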

As a cohort, my peers and I have been learning how to use various visualization tools as well. So far we have worked with both ParaView and Tableau. We also attended a dissertation defense over Zoom on the benefits of using VR/MR for Chronic Pain Self-Management.

Week 3:

As a cohort, we worked with VMD and started on our research papers in Overleaf. I wrote a draft of my abstract and introduction, and will keep working on the paper throughout the summer. In my project, I worked on forming chords from my data, using the values to generate the notes and then putting them through MIDI synths. I played around with creating two chords, one on a basic piano and one through a pizzicato string synth. However, the patch doesn’t run completely correctly yet, so I still have much to develop there. I also researched the different taxonomic families we have data on in order to figure out how they’ll be used in our sonification. I’m currently playing with the idea of using Gram-negative and Gram-positive bacteria for different chords or features of the sonification process, roughly as sketched below. We are also considering using certain features for velocity, others for pitch, and perhaps some for frequency filtering. At the moment there are a lot of different possibilities, and we’re considering all of them.
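Here is a non-authoritative Python analogue of what the patch aims to do; the chord voicings, synth assignments, and value-to-velocity scaling are illustrative guesses, not our final mapping:

```python
# Rough analogue of the Max MSP patch logic: Gram-positive taxa feed one
# chord, Gram-negative taxa another, with data values scaled to MIDI velocity.
# Both voicings below are illustrative placeholders.

GRAM_POSITIVE_CHORD = [60, 64, 67]   # C major triad (e.g. piano synth)
GRAM_NEGATIVE_CHORD = [57, 60, 64]   # A minor triad (e.g. pizzicato strings)

def value_to_velocity(value, vmin, vmax, lo=30, hi=110):
    """Scale a feature value into a usable MIDI velocity range."""
    if vmax == vmin:
        return lo
    return int(lo + (value - vmin) / (vmax - vmin) * (hi - lo))

def chord_events(value, vmin, vmax, gram_positive=True):
    """Return (note, velocity) pairs for one feature's current value."""
    chord = GRAM_POSITIVE_CHORD if gram_positive else GRAM_NEGATIVE_CHORD
    vel = value_to_velocity(value, vmin, vmax)
    return [(note, vel) for note in chord]

print(chord_events(0.35, 0.0, 1.0, gram_positive=False))
```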

On the extracurricular side of things, I went to the Museum of Modern Art (MoMA) and had afternoon tea at a lovely cafe I found.

Week 4:

This week, I focused heavily on my research paper. I finished my introduction and related works sections, which leaves the methodology to write next. We now have research paper writing sessions on Zoom on Tuesdays. As I researched the related works, I found the information fascinating! Pythagoras’ Harmony of the Spheres is something I had never heard of before, and I definitely want to learn more about it. Coding-wise, I fixed up the patch and got it running, with gram-positive bacteria making one chord and gram-negative bacteria making another. However, the sound isn’t exactly what I want yet, so my mentor and I are definitely going to play around with which features do what. I also started working in Mozilla Hubs, feeling out how it works. Once I have the visual aids placed where I want them, I’ll work on figuring out how the audio zoning works.

As a cohort, we worked on integrating R and Python with Tableau. On my own, I attended a concert at Palladium Times Square!

Week 5:

As the start of the second half of the program, much of this week’s focus was on developing a first model in Mozilla Hubs to see how everything works. I arranged objects and gave them corresponding mp3 files in order to figure out how the audio zoning functions. We’ve created a system wherein each feature corresponds to a note in a scale, which is then pitch modulated according to the data values. Each baby is then assigned its own synth, which keeps the voices distinct while still making patterns identifiable. We’re still finalizing our scale and which instruments to use. With this setup, we send everything through Ableton Live, which reads the MIDI notes directly from Max MSP and lets us render them into the mp3 files we need. I’ve continued working on my research paper, now on the methodology.
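To give a sense of the feature-to-note mapping, here is a minimal Python sketch; the C major scale and the modulation depth are assumptions, since the real mapping lives in the Max MSP patch:

```python
# Sketch of the mapping described above: each feature owns one scale degree,
# and the data value offsets that pitch. Scale choice and bend depth are
# placeholders, still being finalized in the actual project.

C_MAJOR = [60, 62, 64, 65, 67, 69, 71, 72, 74, 76]  # one degree per feature

def feature_note(feature_index, value, vmin, vmax, bend_semitones=2.0):
    """Base pitch from the feature's scale degree, offset by the data value."""
    base = C_MAJOR[feature_index % len(C_MAJOR)]
    norm = 0.0 if vmax == vmin else (value - vmin) / (vmax - vmin)
    return base + norm * bend_semitones  # fractional part = pitch-bend amount

print(feature_note(3, 0.8, 0.0, 1.0))
```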

This week we also toured the ASRC, or Advanced Science Research Center, which was fascinating. I was most interested in their Neuroscience and Photonics research.

Week 6:

This week was spent putting together the first draft of the final project. I arranged the Grogu baby objects in a circle in Mozilla Hubs, each with its corresponding audio files. First, we used a test audio track to make sure the objects were at least relatively in sync with each other. After that, we used the sonifications we’d created. We used string synths to create a sound more pleasing to the ear, and modified the durations of the notes to correspond with the time between measurements: shorter at the beginning and getting longer towards the end. Next, we want to try different spatializations within Mozilla Hubs. For my paper, I finished my abstract and methodology, and now just need to write my results and conclusion.
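A small sketch of how such a duration mapping could look; the scaling constants are illustrative, not the values used in the final piece:

```python
# Sketch of the duration mapping: note lengths track the days between
# measurements (dense sampling early, sparse later), clamped to sane bounds.
# All constants below are placeholders.

def interval_to_duration_ms(days_between, ms_per_day=120, floor_ms=150, cap_ms=2000):
    """Longer gaps between samples -> longer notes, within bounds."""
    return max(floor_ms, min(cap_ms, int(days_between * ms_per_day)))

for gap in [1, 3, 7, 30]:
    print(gap, "days ->", interval_to_duration_ms(gap), "ms")
```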

On my own, I went to Spyscape, which is an interactive spy museum.

Below: Baby and Sound objects in Mozilla Hubs

Week 7:

This week has pretty much been crunch time. As a cohort, we’ve started testing each other’s programs and running user studies. I set up the new spatialization for the sonification, and I much prefer this version. I set it in a geodesic dome preset scene, which allows a circular setup with much more room between the babies as well as in the middle. This lets one listen to them all at once or walk around the edge to focus on one or several at a time. The volume of each audio source is also adjustable, so if you wish to hear only one baby, you can turn the others all the way down. Some small stumbling blocks my mentor and I dealt with were originally having a bunch of duplicate audio files instead of separate unique ones, and making sure the durations of the notes matched their periods of measurement. Luckily, these were easily resolved, and we were able to go on with our final implementation. After I put everything into Mozilla Hubs, I also labeled each baby so that our observations could be attributed correctly, without room for confusion. Unfortunately, Mozilla Hubs does not have a labeling system, so I resorted to using the Pen object to create a drawing and then turn it into a pinnable 3D object.
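For illustration, here is a tiny sketch of how evenly spaced ring positions could be computed; in practice I placed the objects in Hubs by hand, and the radius and count here are placeholders:

```python
# Sketch of the circular layout used in the geodesic dome scene: evenly
# spaced positions on a ring, leaving open space in the middle to hear the
# full mix at once.
import math

def ring_positions(n_objects, radius=6.0):
    """(x, z) floor coordinates for n objects evenly spaced on a circle."""
    return [
        (radius * math.cos(2 * math.pi * i / n_objects),
         radius * math.sin(2 * math.pi * i / n_objects))
        for i in range(n_objects)
    ]

for x, z in ring_positions(10):
    print(f"({x:+.2f}, {z:+.2f})")
```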

On a less work-intense note, we took a group trip to the Bronx Zoo to do the Treetop Adventure Zipline. That was a lot of fun, even in the boiling heat. I also tried a restaurant during NYC Restaurant Week, which was incredible, as well as a magical afternoon tea at The Cauldron.

Below: Baby objects in the Geodesic Dome with the Audio Debugger showing the Audio Zoning in Mozilla Hubs


Week 8:

As the last week of the program, this week was focused on getting our papers done and submitted. Our abstracts were due Monday, with the papers themselves needing to be submitted by Friday. For me, this meant getting participants to experience the sonification and then fill out a response form so data could be collected. That data let me analyze my results and write the results section of my paper. Once that was done, I was able to write my conclusion and finish off the paper. Before I turned it in, I checked it over with my mentor so that I could add anything he thought was necessary. That done, I was able to submit my short paper! After that, I worked on developing my slides for my Friday presentation. On Friday, all of us presented our projects in a hybrid in-person and Zoom session, with about 25 minutes each. Included was a session Dr. Wole set up in which he presented a Zoom recording, made on Tuesday, of us REU participants discussing computing, STEM, and VR/AR/MR; recording it in advance made it easy to present. Breakfast and lunch were both provided during the presentation session, which was a very nice addition to our last day of the program.

Outside of working on finishing up my project, I saw Phantom of the Opera on Broadway, which was incredible. I really enjoyed working in NYC this summer, and I’m so glad I had the chance to participate in this REU.

Final Report
