
Jacovia Cherry: Campus Basemap

About Me:

I am a rising junior here at Clemson studying Computer Engineering.

Project:

Campus Basemap

Mentors:

Patricia Carbajales-Dale and Blake Lytle

Week 1:

This first week was spent getting familiar with the other interns participating in the REU and with my mentors, and learning how the summer will go. Since my project is to create a map of Clemson’s campus, I also spent some time familiarizing myself with CityEngine through the ArcGIS site and its tutorials. This was very helpful for learning basic essential skills.

 

Week 2:

This week was spent attending seminars to learn more about research computing using the Palmetto cluster, scientific visualization, navigating Linux and the command line, and GIS.

After finishing all of the tutorials about CityEngine, I was able to start making some progress on building the foundation for my map. This week I was able to import the terrain and the buildings that were already made. However, the terrain isn’t as appealing as I had hoped. Next week, my focus will be making the streets and sidewalks.

 

Week 3:

This week’s focus was bringing in streets and sidewalks. However, things didn’t go as well as planned. I ran into multiple obstacles with the CityEngine software. I was able to bring in the streets, but not able to modify them to my liking. This task will have to be finished in the first few days of next week. After more complications, I was able to load my scene onto the Gear VR for the sprint review.

Additionally, this week was spent working on the intro for the research paper. This along with streets will be continued next week. I will also be preparing for my midterm presentation.

Week 4:

This week I had the pleasure of working with a new GIS intern. She is very knowledgeable when it comes to CityEngine, and we were able to collaborate on adding the remaining street networks to the project. This week also consisted of adding athletic fields and trees to the project.

 

I had my midterm presentation this week which went pretty well. This presentation was very helpful in determining how far the project has come and what the next steps will be. Next week will be spent adding grassy areas and bodies of water to the project to be ready to present at the ESRI conference.

 

Week 5:

This week was spent adding the final touches to the project before the ESRI conference. We were able to successfully add the grassy areas, bodies of water, and parking lots to the project. Ellie, the newest addition to the GIS team, added a multi-use facility to the project that she will be presenting at the conference.

Additionally, I spent some time researching articles to continue working on my research paper.  These last few weeks will mainly be dedicated to working on my research paper.

 

Week 6:

This week we visited the ITC Research Park in Anderson to learn a little more about the Palmetto Cluster that some of the REU students had been working with this summer.

This was the beginning of the push to the end. I spent some time writing rules to improve the grassy areas and parking lots in the project. I also analyzed more articles for my final report.

 

Week 7:

This week we attended a few leadership sessions to sharpen our skills in the workplace. One was designed to help with our elevator pitches and the other to identify our “color” in the workplace. These sessions were very beneficial; they gave a more introspective view of how we work with others and gave me more confidence in communicating my skills and accomplishments to others.

This week I experienced many technical issues. I spent a lot of time reapplying rules to all the models in the project after CityEngine continued to crash and it seemed as though some of my work was lost. I also had problems aligning the shapes to one set terrain after modifying the rules applied to the grassy areas and Clemson area boundaries.

 

Week 8:

This final week was dedicated to the fine tuning of the final presentation and the final report. I was able to meet with Dr. Wole for some last words of wisdom regarding the final report. I added to my midterm PowerPoint, and we had a chance to practice presenting these as a group for feedback. We also had a group dinner at Sole on the Green. It was a great opportunity to relax from the stress of the final project and spend some time together.

 

Final Report

 

 

Andrew Tompkins: Capturing our world through interactive virtual reality field trips

 

Project: Capturing our world through interactive virtual reality field trips

Abstract: A wide range of emerging technologies are available to create virtual reality experiences that can take us on journeys to explore regions of the world that we might never otherwise be able to visit – ranging from polar ice caps to tropical forests. In particular, 360° imagery is becoming increasingly easy to capture, edit, and annotate to engage people in deeper, interactive virtual field experiences. This project will evaluate different techniques and methods for creating interactive virtual reality field trips, including experiences that integrate imagery and other content within the setting of the virtual world with an emphasis on educational field experiences.

Mentor: Stephen Moysey

About: I am a rising senior currently studying Digital Arts at Stetson University. I am also pursuing a minor in Computer Science and Business Systems Analytics. I enjoy creating interactive experiences and want to use this summer to learn more about the techniques used in the creation of VR experiences.

Weekly Reports

Week 1: The first week, I wanted to set up my work environment and begin familiarizing myself with Unity. I created a basic scene that allowed me to pick up objects and teleport. Next I explored custom interactions which allowed me to snap objects to certain orientations and spawn items that are automatically grabbed.

Teleportation – Snap Grab

The teleporter uses a raycast laser to find a location and projects a reticle at that location. Once the location is selected, teleporting is a simple transform to the spot the laser is pointing at.

The Snap Grab required me to create a class that overrides the built-in Vive controller functionality. Once the override class was made, I simply created special interactions that depend on the object being interacted with. To create the arrow snap effect, we transformed the arrow’s location to that of the controller, with an offset to compensate for its size and desired orientation, and parented it to the controller.

Week 2: I met with my advisor to get a better idea of what is expected of me. Finished the project charter, which outlined what I would be doing during the summer. Started the literature review and collected all of the sources that will be used. Formatted this web page and added the images. The project has shifted from creating a project to researching ways to make future projects and how the mechanics used in virtual reality can be used in geologic education.

Week 3: Continued to gather sources for my research paper and created the research introduction. Broke up the aspects that I will be researching which are movement, interactivity, and imagery. I have started on the movement category and have been collecting examples of mechanics being used.

Week 4: Finished gathering all of the reports needed to start the analysis for the movement section of the paper. Started working on the Midterm presentation and worked on the methodology section of the research paper.

Week 5: Finished gathering information for the imagery section of the paper and started to write out the bulk of the paper. The interactivity section of the paper was lacking in information, which will be my focus next week while working on the geoscience part.

 

Week 6: Finished all the research necessary and started to write it all out in the research paper. Started the research poster, which acts as a summary of what is contained inside of the research paper. Created a demo using 360 imagery in Unity. Started working on a photogrammetry demonstration: went to the location and captured the room. Next week will have the first draft of the poster and paper done. I will also try compiling the images into a 3D mesh using Photoscan.

Week 7: Continued to work on the research paper, and started the final demo. The final demo is a platform inside of the Grand Canyon that allows users to pick up rock samples and see which layer the rock sample belongs to.

Week 8: Last week of the REU, finished the Grand Canyon demo and created a video demonstration for it. Updated the midterm presentation to include all of the work that has been done since then.

Final Report

Jackson Fletcher: Systems Biology Visual Analytics

About Me:

I am an undergraduate at Mississippi State University. I’m studying Computer Engineering there and will be a third-year senior this fall. I have a passion for sports of any kind but I also enjoy optimizing daily life to be more efficient.

About My Project:

My research mentors, Dr. Alex Feltus and Dr. Melissa Smith, have teamed up to research genetics using GPU-optimized algorithms. My role in the project is to create an interface between an actively-developed program, BioDep-Vis, and its users, geneticists. BioDep-Vis is a gene clustering program that clusters genes that are expressed at similar times. It can display the gene networks of multiple organisms at the same time to show development from a common ancestor. We explored the use of various interaction methods to control the BioDep-Vis interface, such as a Microsoft Kinect and a mobile touch interface.

Week 1

Week 1 was pretty chill despite getting off to a quick start. The first couple days consisted of a general briefing on the program, a meeting with our project mentors, and an orientation-esque tour of the campus. A normal day consists of either a planned educational session or a tour of some facility in the morning and then a meeting/work with mentor(s) after lunch. This week’s morning sessions included a session on writing a research paper (hint: LaTeX), a “Screen Capture and Video Editing” session (namely, Adobe Premiere Pro and Camtasia), and a tour of the Watt Family Innovation Center on campus.

On the project side of things, I became somewhat familiar with the current infrastructure. However, I have been set back by an inability to build and run BioDep-Vis due to access restrictions on the Palmetto cluster and the cluster’s untimely maintenance. I spent most of Thursday downloading programs and compilers (along with trying to obtain cluster access). On Friday, the Kinect adapter finally arrived, so I had an opportunity to play around with that while the cluster was down. It’s a really cool piece of hardware, but I don’t really have any idea how to program with it yet. That’ll have to wait for next week.

Weeks 2 – 8

Current Issues
1. What is the protocol to connect to a computer?
* For now, I’m experimenting with Unified Remote, the most popular remote desktop control app. Unified Remote uses either Wifi or Bluetooth to control the computer. Various controllers are available for specific applications and the mouse and keyboard control seems to be very good. I’m not sure if it is possible to add a visual that can be dragged around or not as discussed in the meeting.
2. How to best design the controller to manipulate the 3D space?
* 3D box model
* actual graph

Potential Improvements
Eventually, the controller application could interact directly with the remote server without needing to interface with the intermediate display computer. However, Palmetto does not support the opening of ports needed for this protocol.

Final Report

Charles Hockaday, Jr.: The Open Microscopy Environment (OMERO) for Digital Marine Organism Imaging

About me: I am Charles Hockaday, Jr. I am a junior at Elizabeth City State University studying Computer Science.

Project: The Open Microscopy Environment (OMERO) for Digital Marine Organism Imaging

Mentor: Andrew Mount

Weekly Report

Week 1: During the first week at Clemson, I learned how to use two pieces of software, Paraview and OMERO, both of which work with large amounts of data and are used to make 3-D images. After learning about the two, I needed to install them both in order to use them and play around with them before receiving biological data and putting it in. Now I just have to configure Python with OMERO and I’ll be squared away as week one comes to an end.

Week 2: I was finally able to download OMERO and Paraview onto my PC. After that was completed, I had to become accustomed to using both pieces of software and configure Python with OMERO. I was also going to try to connect Paraview to the Palmetto cluster, but an update held us back on that task, so I do not know if that can be done at the moment. I received images from Vera, toyed around with them in OMERO, and have been able to make some progress on using the software.
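As a rough sketch of what the Python/OMERO connection looks like with the omero-py library (the host name and credentials below are placeholders, not the lab’s actual server):

```python
# Minimal sketch: connect to an OMERO server and list the available images.
# Host, port, and login details are placeholders.
from omero.gateway import BlitzGateway

conn = BlitzGateway("username", "password", host="omero.example.edu", port=4064)
if conn.connect():
    for image in conn.getObjects("Image"):
        print(image.getId(), image.getName())
    conn.close()
else:
    print("Could not connect to the OMERO server")
```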

Week 3: Still have not been able to connect Paraview to the Palmetto cluster. This week I was able to convert the images that I received from LIF to TIFF format. With that complete, I am now in the process of trying to turn that image stack into 3-D. Along with that, I had to download more material onto the computer so that I can use the Python shell to make the 3-D image.

Week 4: We finally decided not to use the Palmetto cluster. I continued to work with OMERO and tried to connect it to Paraview using the software’s Python shell. Along with this process, I am to make a PowerPoint presentation of our work so far. I will discuss the basics of how to set up OMERO and how it works, and I will also discuss the setup of Paraview and how that software is used. Before I began working on the PowerPoint, I tagged along with Dr. Mount and Vera to collect oysters in Georgetown so they can collect data from them for future work.

Week 5: During the middle of this week I had to give a presentation on what I had done so far for the midterm presentation. I only discussed the basic setup of OMERO and how to use it, and I did the same thing for Paraview. Overall the presentation seemed to be a success. Afterwards I continued to work on the paper, but I am having problems setting up LaTeX on my laptop so that I can work from home. I’m still figuring out how to transfer images and get everything to work smoothly; the process is taking its time, but everything will get done as soon as possible.

Week 6: After doing more research into OMERO, we were able to figure out how to visualize an image in Paraview. It was a simple process that we had to go through in both ImageJ and Paraview: import the data into ImageJ and record the important information, such as the height, width, and depth of the image. A user then inputs this information into Paraview in order to accurately get the image to display on the screen. This process started taking much less time once we understood everything that was going on within the ImageJ and Paraview interfaces.
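As a rough illustration of the “get the dimensions, then feed them to Paraview” step described above, the snippet below reads a multi-page TIFF stack with Python and prints the width, height, and depth that Paraview’s image reader asks for (the filename is a placeholder):

```python
# Sketch: read a TIFF z-stack and report its dimensions.
# "stack.tif" is a placeholder filename.
import tifffile

volume = tifffile.imread("stack.tif")      # a z-stack loads as (depth, height, width)
depth, height, width = volume.shape[:3]
print(f"width={width}, height={height}, depth={depth}")
```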

Week 7: This week I proceeded to work on the final report for the entire project. Along with the paper, I have been working on the final presentation as well: I took the PowerPoint from the midterm and added some slides to it after gaining more information and a better understanding of what I was doing. In addition to the paper and the final presentation, I also had to make a poster. On the poster I planned to include the abstract, introduction, conclusion, future work, and images of the data that had been collected and then visualized.

Week 8: During this last week of the summer REU I was able to finish my final PowerPoint presentation, along with the paper and poster. For the past 8 weeks, not only was I able to finish everything that was assigned, but I was able to learn things that I probably would have never attempted to learn while in school. I am glad to have received the opportunity to enjoy this experience and work with the people that were with me from the beginning of this program. For future occasions I wouldn’t mind experiencing an internship like this again. There is always a chance to learn something new when it comes to summer research.

Final Report

Ryan Canales: Engaging Students in Bullying Prevention Efforts through Visualization

Project: Engaging Students in Bullying Prevention Efforts through Visualization

Mentors:  Jan Urbanski, Susan Limber, Wole Oyekoya

About Me: I study Computer Graphics, Art, and Mathematics at Texas A&M University. I am also a guitarist and I like to sing for fun.

Here’s my rudimentary portfolio: http://ryancanales.wixsite.com/portfolio

And some music I made a few years ago:  https://soundcloud.com/ryan-canales

Week 1:

After meeting with my mentors about the direction the project should go, I decided that building character models would be the best place to start. This week, I began making some iterations for a generic character model. I will more or less stick with the basic topology and then modify it as needed (i.e., to change it to male, female, or adult, and make other little adjustments). I also UV unwrapped the model to begin texture tests in Unity for next week. Finally, I built a simple rig to be used on all characters and began animation testing within Unity.

Here’s the low poly character mesh (with previous iterations on the left). I will add more images as progress is made!

 

Week 2:

This week I went to workshops on Linux, using the Palmetto cluster here, and several scientific visualization programs. I also met with one of my mentors to show her the progress I made. So far this week, I’ve added animations to the character and written scripts for character and animation control and camera control. I also began building the environment and added collision detection for walls. I also began an implementation for character customization; as of now the player can change the skin color and switch between male and female. I also learned a bit about shader programming for Unity and implemented a custom shader for the character, allowing for the outlined, flatter look. I plan on finalizing the environment and the characters by the end of next week.

P.S. I will add hair to the character!

 

Week 3:

This week, I learned some more about data visualization and how to get access to census information from multiple countries using IPUMS. I also met with my mentors and devised a quick schedule for the rest of the project. By the end of the week, I received a basic script for the bullying scene from my mentors (with dialogue suggested by a 6th grader) and I will begin implementing it into the game this next week. The plan is to have a working, albeit rough, demo of the scenario so that we can begin user tests for the target age group by the 5th or 6th week. Here is a video of what I’ve done so far, running on my phone (Android):

Not shown is the randomization of the characters in the scene (it would take more than one play). I anticipate this next week to be a very busy and productive one now that I’ve got a lot of the basics (including an animation plan) covered.

 

Week 4:

Most of the changes I made were to the environment and UI. I baked ambient occlusion onto most assets in the scene (to save from doing a global illumination pass at runtime) and created models for a computer, shelf, cabinets, and window. I also created the UI button images in Illustrator and implemented their functions. All UI element positions are determined in a script, so their positions change based on the aspect ratio of the device the game is playing on. I had to update the shader program to allow for transparency in textures, so that I could have multiple materials on a single mesh. This will allow me to implement a more modular character customizer. I’ve added the shirt swapping feature to show this in action; I will add more options later. I also enabled toggling between male and female characters while retaining the customized options. Another important feature I implemented was my own method to create paths for the other characters to follow, which will be utilized for the bullying scenario scene. There were additional (subtle) changes that needed to be made to both the male and female character models and rigs, which I spent some time on (I don’t want to get into the details, but retaining paint weights was a small issue in Maya). Finally, I fixed some bugs and made some optimizations to minimize time and memory spent on rendering.

Besides just adding features and touching up the game, I also had a midterm presentation rehearsal and set a date to have a complete demo for testing and feedback from middle school kids. My mentors have created a dialogue script and suggested another feature for the game, so I will focus on implementing these things during the next 2 weeks. I am aiming to have the working demo by July 16th.

  • I only realized after exporting this video that I didn’t add the blob shadow below the male character model in the customization scene (It’s fixed now).

 

Week 5:

This week we had a break for Independence Day on July 4th and then midterm presentations on July 5th. I met with my mentors on Friday morning and came up with an alternate way to play the game. My work for this week and next will be implementing the new gameplay.

The new idea is simply allowing the player to roam the class as they wish. There will be a clear goal that the player will have to meet, such as getting to their desk before class starts or turning in their homework. While waiting for class to start, the bullying happens in the back of the class. The player can choose to talk to one of the students or to the teacher. What they say depends on whether the bullying has started: if it hasn’t, they just say something about the class or the homework, and if it has, they mention the bullying in some way. The plan is to also implement ways to interact with the bullying situation. Other than just dialogue cues from the other students, there will also be a visual representation of where the bullying is happening, based on how far they are from it. This can be incorporated as a mechanic for points or some other reward. The game finishes once the initial goal is met, or once both the initial goal is met and they have helped the victim of the bullying. Afterwards, their score is based on several factors including their reaction time, how long it took to complete the initial goal, the order of events, and their choice to help or not. There will then be a lesson scene that will ask the player if they can identify who played what role in the bullying and teach them about the different roles bystanders can play. Here’s some progress:

 

Week 6:

This week, we toured Clemson’s data center and learned some fun facts about their supercomputing cluster, Palmetto. We also toured their digital media and production arts labs. As for my project, I added in some dialogue based on what the player chooses to do. I also added in the lesson at the end, which I plan to clean up later. There are some other less obvious things that I implemented, including smooth interpolation between target points in my path maker, so that the characters walking around have more natural changes in direction. Other features include responses, a goal seat for the player to go to, waiting for the player to be in the vicinity of the bullying before the bullies do anything (that way the player can see what happens), a bar at the top left that indicates how close or far they are from the bullying, and some other fixes and modifications. Here’s what it is so far:

 

Week 7:

This week was largely getting the game ready to demo to a small number of students. I added in some features including a rudimentary scoring system, some more assets (books on the shelf, stuff outside), some sound, and I tweaked the character selection menu. There are some other features I worked in and tweaked. The largest tweak was completely reworking the dialogue system I had, so now it’s way easier to manage and add to. This is also the first week I made builds for iOS. The video I made this time was recorded on an iPad.

 

Week 8:

This final week I just had some meetings focused on writing the paper for this project in the IEEEVR format and making my final presentation.

Final Report

Tony Sun: Dynamic 3D Visualization of Blood Flow in Brain Arteries


Home Institution
University of Missouri – Columbia
Columbia, MO
Contact Email: tksqk6@mail.missouri.edu

Clemson Research Mentors
Dr. Ulf D. Schiller
Department of Material Science and Engineering
Mehrdad Yousefi
Department of Material Science and Engineering

Clemson Visualization Mentor
Dr. Wole Oyekoya
Visualization Director

About me: Tony Sun

Born and raised in Columbia, MO, I elected to attend my hometown institution in the hopes of strengthening ties to the community that has raised and shaped my formative years. I study mathematics and psychology and plan on attending graduate school in a related field. I am primarily interested in how socialization in competitive environments can enhance critical skill formation in areas that require quantitative competence. I would love to find a program that utilizes my interests in a diverse range of topics.

In my free time, I enjoy the outdoors, reading, and listening to music. I am also peripherally learning about football and basketball.

Project Description

My mentor is Dr. Ulf D. Schiller who has a focus in material science and engineering. He is interested in soft tissues as a research topic and is developing a method to visualize medical diagnostic tests for use in educational and clinical settings. In particular, commonly used MRI scans can only display 2D slices of an object. There are severe limitations for laypeople and trained professionals alike for whom these images only provide a snapshot of the current state. More troubling is the lack of knowledge about volatile conditions, such as intracranial aneurysms, where little concrete evidence exists for diagnosis and subsequent treatment options.

Our hope is to interpolate the data from the scans into a more accessible model with existing patient-specific clinical parameters to improve patient prognosis. We will do this by creating a virtual reality environment where an observer can step into a model of a brain to see hemodynamic blood flow and physical wall tension and interact with it in real-time to make decisions about these volatile conditions.

To do this, we must first convert the raw data from the scans into a form that can be accessed through virtual reality capable programs. Then we need to construct a model from the data. Finally, we must then program that data to be viewed through a virtual reality headset. You will find weekly documentation of my project progress below.
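As a very rough sketch of the first two steps, Paraview’s Python interface can open a dataset and pull out a surface that could then be exported for a VR viewer; the filename, isosurface value, and output name below are placeholders, not the project’s actual data or pipeline:

```python
# Sketch using Paraview's Python API (paraview.simple). The input file and
# isosurface value are placeholders; the real pipeline was still being designed.
from paraview.simple import OpenDataFile, Contour, Show, Render, SaveData

reader = OpenDataFile("artery_flow.vtu")   # simulation or scan-derived data
surface = Contour(Input=reader)            # extract one isosurface
surface.Isosurfaces = [0.5]
Show(surface)
Render()
SaveData("artery_surface.vtp", proxy=surface)   # export the surface for later use
```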

Week 1 Overview

Week 1 was a tumultuous time. I arrived and immediately began familiarizing myself with the lab’s technologies and culture. We were allowed to test out various virtual reality displays (HTC Vive) using current lab programs before being introduced to our mentors and our own projects.

After our orientation, we met with other visualization-affiliated professionals and were trained in IRB for social and behavioral studies. The next day we went on a tour of the campus and worked on an outline for this program. We had a couple presentations on writing research papers, screen captures, and video editing. On Friday, we were taken on a tour of the Watt Family Innovation Center, where detailed attention was placed on the union of advanced visualization technologies and interdisciplinary research in a facility designed for maximizing digital communications.

This week in the lab was spent building comfort with these technologies, many of which are new to me. I have read through the existing knowledge base and am beginning the modeling soon. I will be using Paraview and OpenVR as well as Python to complete this assignment.

In addition, I’m very excited to see how everyone’s projects end up.

Week 2 Overview

This week began with a crash course in the command line interface commonly found in Linux to access the computational strength of the Palmetto cluster. We then spent some time reviewing current scientific visualization software and resources found here at Clemson and across other academic institutions. We finished the week up with an introduction to information visualization and visual analytics. Software used included OpenRefine, Tableau, SAS Visual Analytics, Gephi, and D3. There was an optional session on the GIS technologies ArcCatalog and ArcMap which I found to be quite interesting.

At the beginning of week 2, I was determined to work with the Virtual Reality headsets. Before this could be done, we needed to find a way to access the simulation data with our scientific visualization program, ParaView. We accomplished this through the use of an open source extension of the software. However, animations were not supported in this extension, proving a major roadblock to utilizing a majority of our real-time data. Next steps include importing the data set into this extension and looking for animation support. I am also beginning a literature review on the implementation of virtual reality technologies in the hospital setting.

Week 3 Overview

Another week, another workshop. We covered more advanced topics this week in our seminars. Monday and Tuesday were spent learning Python. Python is a popular programming language thanks to both its accessibility and its readability. Newer programmers may have trouble with the logical flow, as with learning any language, but more experienced programmers will find the syntax relatively easy to pick up. In addition, an emphasis is placed on limiting the extraneous punctuation and spacing issues of older languages. I found myself really enjoying the sessions on Python, finding it relatively easy to learn, not to mention useful. I’ve seen many applications that utilize Python scripts, and a quick Google search shows that it is currently and consistently ranked among the top 10 programming languages in use today. We also had a presentation on the capabilities of Unity as a game development platform. Unity allows for more interactive experiences with VR and AR, as it supports a wide array of avatar functions. We also had a presentation on current and future uses of LiDAR technologies. LiDAR sensors detect the light returning from emitting units, and these returns can be combined with other data to determine ranges as points. These points can then be used to map a corresponding area. There are very interesting applications for UAVs.

The workshop on IPUMS ranks one of the most interesting thus far. This is a collection of international census data that is available for multidisciplinary scholarly pursuits. I would have liked to spend more time familiarizing myself with this dataset but it is beyond the scope of my current project.

Speaking of my current project, this week had slow progress. As we gear up towards our midterm presentations, I was a little disappointed in my current progress. I have spent a lot of time researching and learning new programming languages and visualization tools, yet I still have hardly anything to show for my own project. That said, we were able to demo the use of the headsets on Wednesday and are closer than ever to making Paraview compatible with the cluster. I am currently writing a draft of my paper, but the overall progress of my project, and consequently my paper, is limited. Although this may be discouraging, I look forward to beginning the next week with renewed vigor.

Week 4 Overview

Finally, something I have a foundation in! The week’s workshops were on R statistical software. I have been using R for a couple semesters now on independent research projects. Our crash course was quite useful for compiling all my experience into a cohesive whole. We learned how to use R to make graphs and write to images as well as a multitude of stylistic choices.

My project progress this week was focused on steps moving forward for the project. Ideas include the following: changing the peripherals of the clipping plane, creating a demo video, and building in real-time data visualization with the use of Unity.

Week 5 Overview

This week we had presentations. We were hosted in the Watt Family Innovation Center, which we toured earlier in the program. I must admit that I was nervous for the presentations, but I was also excited to see the progress of the rest of my program members. We all have such interesting and diverse applications united under big data visualization concepts so there are new ideas with enough overlap to be intriguing. I hope to learn from the other VizREU students during the presentations. Dr. Wole even managed to get live twitter action shots during the presentations.

We only had one workshop this week on editing 2D images. I was interested in the section on creating VR environments with 360 video through Premiere Pro. I was able to talk to Dr. Brian Adam Smith about emerging technologies regarding VR and AR, in particular, the use of AR for medical research.

Week 6 Overview

This week, I am working on implementing the Paraview Unity Plugin for Dynamic Data Visualization.

We were able to visit the ITC data center and view the Palmetto cluster. I felt like I was in a movie surrounded by the computing behemoth.

Week 7 Overview

We ran into a problem with building Paraview to communicate with Unity. We will spend the rest of this week looking for a solution.

This week’s workshops were about Leadership. The power of presence and building relationships through networking are very powerful industry skills. I’m grateful for the chance to practice them.

Week 8 Overview

This is the final week. We are presenting on Friday, so I need to finish my poster, video, and presentation before then. I was extremely stressed out, but with the support of the lab I was able to successfully complete the tasks.

This has been a tremendous experience for me. I was able to gain practical skills through the workshops and research. I was able to develop relationships with those around me and learn from their skills. I was able to take my academic research experience as an undergrad to the next level, benefiting my future academic endeavors.

I would say that this summer has been an overall positive experience. There were moments where I felt like giving up, that I would never accomplish anything I set out to do and that I was going to fail.

Thanks to Dr. Wole Oyekoya, Dr. Schiller, and Mehrdad Yousefi, I was able to complete my project and present at the Watt Innovation Center. Thanks again for their tireless efforts to put together the REU and for their ability to challenge, motivate, and inspire us undergraduate students. It was a pleasure working with the Clemson Computing Division and the Visualization Lab, and it is something I would definitely consider doing again.

Final Report

Bayley Meyer: Visualizing Watt Energy Consumption

Home Institution
Westminster College
Salt Lake City, UT
Contact Email: bcm0926@westminstercollege.edu 

Clemson Research Mentors
Dr. David White
Research Professor of Electrical and Computer Engineering
Tim Howard
Project Manager

Clemson Visualization Mentor
Dr. Wole Oyekoya
Visualization Director

 

About Me:

I’m currently a senior at Westminster College studying Mathematics with minors in Data Science and French. After graduation I hope to become a data analyst or statistical consultant. I’m from Seattle, WA but moved to Utah for school.  I plan to move back to Seattle after I graduate in order to be closer to the tech industry.

Project Description:

The goal of my project is to visualize the energy data for the Watt Innovation Center, in order to discover and address potential inefficiencies in the building. I will be using SAS Visual Analytics in order to show the data for the following subsets of the building’s consumption: receptacle, isolated ground, media lights, lighting, and HVAC. Visualizing this data will help to reduce the utility costs for Watt. Originally, we were planning on creating a prediction for future hourly consumption of Watt, but we changed the goal of the project in week 6.

Week 1:

This week I met with my mentor, Dr. David White, and defined my goal for summer research: create dashboards for the resources that are consumed by the Watt Innovation Center, including a prediction of future hourly consumption. I started getting familiar with SAS Visual Analytics in order to analyze the data next week. As a group, we had sessions on writing the paper and using Adobe Premiere software. We also took a tour of the Watt building to see the advantages of an innovation center.

Week 2:

I began drafting dashboards for the different subsets of the Watt center; unfortunately, I haven’t been able to create any of the dashboards yet due to a technical issue with getting full access to SAS Visual Analytics. So I began working in R in order to look at the many options for creating a predictive, time-dependent model. I began researching different methods that have previously been used for creating time-predictive models both generally, and specifically for smart buildings. As a group we had demos with the Linux command line, the Palmetto cluster, GIS, and many of the options for 3D visualization, visual analytics, and information visualization.

Week 3:

Things finally started to come together this week since I was able to upload the Watt data into SAS Visual Analytics and begin drafting dashboards for the different energy subsets of the building. Unfortunately, SAS is not quite as user-friendly as I’d hoped, so I’m having to manually create categories such as weeks and hours of operation in order to get the program to graph my data. I was also given the code that was used for the original lighting dashboard and I’ve begun to see if that’ll help me to construct a predictive model for the Watt data. The group sessions that I attended this week were about Python, Unity 3D, and retrieving and graphing census data. Attending these sessions has been really useful because it is helping me to understand many of the different programs that are available for visualizing all different kinds of data.

Week 4:

After going to the R sessions that were held this week I was able to collapse my data into hourly readings instead of readings every 15 minutes. Since this reduced my data to a quarter of its size, SAS was able to read the time column to allow for drilling down on specific time periods and animating consumption use over time. Once I finished making the interactive dashboards I focused on preparing for the midterm presentation that is coming up. I’m focusing my presentation on how commercial buildings (including Watt) consume 70% of the US’ electricity, and how prediction methods can be used to reduce total usage. Through my project I hope to find a predictive algorithm that will help to reduce energy consumption in Watt, and spread awareness of the amount of energy being consumed on campus. Next week I’ll continue using R to create the predictive aspect of the dashboards.
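The aggregation itself was done in R, but as an illustrative equivalent only, collapsing 15-minute readings into hourly totals might look like this in Python/pandas (the file and column names are placeholders):

```python
# Illustrative only: the project used R for this step. This pandas sketch
# sums 15-minute meter readings into hourly totals; "watt_meter.csv",
# "timestamp", and "kwh" are placeholder names.
import pandas as pd

readings = pd.read_csv("watt_meter.csv", parse_dates=["timestamp"])
hourly = (readings.set_index("timestamp")["kwh"]
                  .resample("H")
                  .sum())
hourly.to_csv("watt_hourly.csv")
```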

Week 5:

This week started off with the midterm presentations which allowed us to share our progress with the rest of the group and our mentors. I thought that mine went fairly well; however, I quickly realized that I did not have a strong understanding of all that the Watt center does. During my mentor meeting Tim Howard, the project manager, gave me a tour of the building that focused on all the technology that the building uses. It helped me to understand what makes this building an innovation center, and why some of the readings are so large. I was able to see the meters that are retaining the information I’ve been graphing, along with the systems that I connect to when using SAS Visual Analytics. The end of this week was focused on drafting the related works and methods sections of the research paper. I’ve started looking into the most efficient and accurate way to create the predictive piece of this project. ARIMA (auto-regressive integrated moving average) models seem to be the best way to forecast the Watt data, so next week will be focused on finding the best parameters for fitting one to the data.

Week 6:

At the beginning of this week we took a tour of the ITC Data Center which holds the Palmetto Cluster. I found it especially interesting to compare this building’s consumption to the Watt Center. We were also given tours of the animation, digital production arts, and virtual environment labs. This gave us an inside look of how much work goes into animating movies and video games, along with studies that will hopefully improve future VR environments.

As for my project, I’ve worked on plotting time series in R, with the goal of plotting prediction data on the same plot in order to have a good estimate for the future. The three forecasting functions that I looked at were forecast(), HoltWinters(), and arima(). ARIMA models were the most informative of the three, but after meeting with David and Tim we decided a better direction would be to drill down on some of the largest power consumers in the building. I was given three things to look into: the relationship between HVAC and outside temperature, the relationship between receptacle power usage and occupancy, and lastly the cost differences when changing some aspects of the lighting. So far I’ve done correlation and regression analysis on HVAC and temperature and found a moderate correlation. Next week I’ll look into the cost differences for making lighting more efficient, and hopefully I’ll receive the occupancy data, so that I can explore that as well.
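For readers unfamiliar with the forecasting step, here is a minimal fit-and-forecast sketch; the models above were built in R with arima() and forecast(), so this statsmodels version is only an illustrative equivalent on a hypothetical hourly series:

```python
# Illustrative equivalent of the R arima()/forecast() workflow, using
# statsmodels on a hypothetical hourly consumption series.
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA

hourly = pd.read_csv("watt_hourly.csv", index_col="timestamp",
                     parse_dates=True)["kwh"]
model = ARIMA(hourly, order=(1, 0, 1))   # order chosen only for illustration
result = model.fit()
print(result.forecast(steps=24))         # forecast the next 24 hours
```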

Week 7:

The sessions that were held this week were about the power of presence and building relationships; we created and rehearsed elevator talks, as well as identified which “color” we were in the workplace. This helped us to have a better understanding of what to say to potential employers if given the chance to have a short conversation with them. The color identifiers were entertaining, mostly because 6 of the REU students (including myself) were in the blue, detail-oriented group.

It’s been a very productive week for my project. I’ve completed the dashboards for HVAC and temperature correlation, as well as receptacle data correlated with occupancy, and cost reduction dashboards for the lighting and the media lights. At the meeting with my mentors we discussed final touches to the dashboard along with a couple more things to add before the end of the program. I was also able to complete a draft of my research paper and get it formatted in LaTeX. On Friday my mentor, Tim, and I went over the format for my final presentation so that I can take full advantage of the hyper-wall in the Watt auditorium. Next week I’ll focus on getting my presentation together and planning a poster or demo for the breaks during the Visualization Symposium. Only one week left!

Week 8:

No group sessions were held this week so that everyone could have time to finish their papers, posters, and presentations. However, we did have a group dinner at Solé on the Green to celebrate the (near) completion of our REU. I decided to not make a poster, but to display my dashboard that I’ve been making these past 8 weeks instead, since I think that will give viewers a better understanding of what I did throughout this REU. I’ve had time to edit my paper a few times and am satisfied with the end product. Luckily, I was able to rehearse my presentation in Watt Auditorium a few times since we decided to use a dual screened approach which involves a lot of interaction with the hiper-wall.

Visualization Symposium:

The presentations were a success! I thought that all the REU students did a fantastic job, and it was really great to see everyone progress throughout these 8 weeks. It was fun to explore the other demos that were set up during the poster sessions, and using the entire hiper-wall worked very well for my presentation. A huge thank you to Dr. Wole for the nomination to the NSF REU Conference, and for the wonderful experience that I’ve had here at Clemson.

Final Results and Remarks:

By analyzing the Watt Innovation Center energy consumption data, we found that $22,485 is saved annually by having LED overhead lighting with motion and ambient light sensors. The current minimum consumption for the media lights is 10 kWh. If we were able to completely turn off the media lights 12pm-6am on weekdays and all weekend, utility costs could be reduced by nearly $4,000 annually. This cost reduction does not include the amount that would be saved by the ability to turn off the HVAC in the rooms where the media lights are controlled (these rooms get overheated if not controlled by the HVAC system). Outside temperature and HVAC power have a moderate linear correlation with an r-squared value of 0.28. This knowledge can be used to further align the HVAC systems with the outside temperature to avoid unnecessary surges in energy use. Lastly, we saw that receptacle usage and occupancy data had a strong linear relationship. The second floor had the highest correlation, with an r-squared value of 0.64. The first floor had the lowest r-squared value of 0.45; this is because the first floor has a cafe with appliances that run nonstop. These findings can hopefully be used in the future to reduce total utility costs and to even have Watt run its own utilities without intervention.
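The correlations reported above come from simple linear regression; as a sketch, an r-squared value like the ones quoted can be computed from two aligned series as follows (the file and column names are placeholders):

```python
# Sketch: compute r-squared for two aligned series, e.g. HVAC power versus
# outside temperature. File and column names are placeholders.
import pandas as pd
from scipy.stats import linregress

data = pd.read_csv("hvac_vs_temp.csv")
fit = linregress(data["outside_temp_f"], data["hvac_kwh"])
print(f"slope={fit.slope:.3f}, r_squared={fit.rvalue ** 2:.2f}")
```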

This REU program has been very beneficial to me; I was able to learn many new skills (SAS, Linux, GIS, LaTeX, Python, etc.) and had the opportunity to expand my knowledge base in visualization. The biggest takeaway for me is my improved ability to search for solutions with programs where my knowledge base is limited. Being able to self-learn is something that can be beneficial in any field, and I’m happy that this experience has helped improve my ability to do so.

Final Report

Rebecca Butler: Seismic Hazard Mapping

About Me:

I am a junior at Seattle Pacific University majoring in Mathematics and Physics and minoring in Biology. I am also in the University Scholar’s program. I am the President of the Society of Physics Students and am a learning assistant for the introductory physics sequence. In my free time I enjoy hiking, rock climbing, and exploring the Seattle area. In the future I hope to obtain a PhD in either pure mathematics.

Project Description:

This summer I will be developing methods to model tsunami inundation using ArcGIS. The resulting map will visually show which areas of Las Ventanas, Chile, have a higher chance of flooding in the event of a tsunami.

Week 1:

This week I met with my mentors and learned more about the details of my project. I have been reading technical articles on seismic hazard analysis and seismic hazard mapping. I was also introduced to LaTeX and OpenQuake. This week we attended sessions on writing research papers, using Adobe Premiere, and using Camtasia. We also went on a campus tour and a tour of the Watt Family Innovation Center. Overall, this week was a great introduction to Clemson and the Visualization Lab.

Week 2:

This past week we have had demos with the Linux command line, the Palmetto cluster, Paraview, VMD, VisIt, CUDA, OpenRefine, Tableau, and GIS. The GIS tutorial was especially helpful for my project as this is the software I will be using to map the data obtained from OpenQuake. I don’t have a lot of experience with coding or programming, but these demos have taught me what I need to be able to do my project.

In terms of progress on my project, I downloaded the OpenQuake software and got several demos to run. I had to do quite a bit of reading in the user’s manual before I was finally able to get it to work. I also read a bunch of articles on seismic hazard and risk analysis. 

Week 3:

Our tutorials this week covered the basics of Python programming, VR and AR in Unity 3D, Visualizing world census and survey data, and LIDAR in GIS. While I won’t be working with 3D data for my project, the LIDAR seminar helped me become more familiar with GIS, which is the program I will use to map the OpenQuake data.

I wanted to get OpenQuake running this week, but there have been problems installing it on the Palmetto Cluster. While waiting for this, I have been writing the introduction to my paper and learning how to use LaTeX. I have also been working to plot a small set of sample data in GIS. I should have this data plotted by Monday or Tuesday.

Week 4:

This week we only had two tutorials which covered the basics of Programming in R. While this was interesting, I won’t be using R for my project. Most of my time this week was spent getting ready for my midterm presentation. We had a rehearsal on Friday which went fairly well and helped me get more comfortable with presenting.

My project changed slightly this week. Instead of mapping seismic hazard through all of South America, I will be mapping projected tsunami inundation off the coast of Chile over the next 200 years. The basic process is the same as before, but with some added steps. I will still run OpenQuake to obtain the expected earthquake magnitudes, and from these magnitudes I can calculate the expected tsunami wave heights. From this, I can map the expected inundation.

Week 5:

This week felt pretty short as we had Tuesday off for the 4th. I spent Monday mostly doing presentation prep and a bit more research on inundation modeling. On Wednesday, we had our midterm presentations. It was nice to give a presentation in the Watt conference room before the final presentation. Thursday I had a mentor meeting which really helped me get a better grasp of the scope of what I will be doing in the final three weeks. Thursday and Friday were mostly spent writing my paper introduction, related works, and methodology. I’ve had some issues with LaTeX on my laptop, so I have just been borrowing someone else’s for now.

Week 6:

I have spent this week working with ArcMap to plot the projected inundation depth of 9 meters in Las Ventanas, Chile. The tool in ArcToolbox that would have made this process fairly simple is not installed on the GIS lab computers and the GIS staff is out of town this week at a conference, so I could not get their help with this issue, but I have sent them an email. As an alternative, I have been working in an open source program called QGIS. Instead of having smooth contours as I expected, I am getting an extremely pixelated image. I think this is because of the resolution of the digital elevation model I am using, but I cannot find one with a better resolution. I am trying to find a way to fill in the elevation contours with color, but this is still difficult in QGIS.
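Conceptually, the raster step amounts to flagging DEM cells at or below the projected inundation height; a minimal Python sketch with rasterio is shown below (the DEM filename is a placeholder, and the actual work was done in QGIS/ArcMap):

```python
# Sketch: mark DEM cells at or below a 9 m projected inundation height.
# "dem.tif" is a placeholder; the DEM's resolution drives how pixelated
# the result looks.
import numpy as np
import rasterio

with rasterio.open("dem.tif") as src:
    elevation = src.read(1)
    profile = src.profile

inundated = (elevation <= 9.0).astype(np.uint8)   # 1 = flooded, 0 = dry

profile.update(dtype=rasterio.uint8, count=1)
with rasterio.open("inundation_9m.tif", "w", **profile) as dst:
    dst.write(inundated, 1)
```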

This week, we went on tours of the ITC Data Center and the School of Computing Labs. It was nice to see more of Clemson’s campus and some of the larger research projects currently underway. I have also been working on adding to and revising my paper.

Week 7:

This week, we attended several sessions on elevator speeches and building relationships. These sessions were helpful for thinking about how I can use this research experience to build relationships and market my abilities in future opportunities.

I spent most of this week writing my paper and working on finalizing my models. I have developed three methods to model tsunami inundation, all of which have their own advantages and disadvantages. I got access to the ArcMap tools I needed this week. The image still comes out pixelated when doing a raster calculation, which is because of the low resolution of the DEM. I figured out how to smooth out the contours, but I am still having trouble coloring them in ArcMap.

Week 8:

This week has been pretty busy. I have finished my paper and arranged all the images in LaTeX. I also made my presentation. We had a rehearsal on Wednesday, which went fairly well. I had to make a poster and assemble all files as well. Mostly, this week has been spent finishing up the project itself and putting together all presentation materials.

Final Report

Arjun Talpallikar: Visualization as the Interface for Machine Learning

Introduction

I am a second year undergraduate at The University of Texas at Austin.

I am working as a summer REU student at the Clemson University Visualization Lab, working on a project entitled “Visualization as the Interface for Machine Learning”.

Here, I will maintain a brief weekly blog describing my research intern experience. I will document both the events we’ve attended as a group, as well as the progress I’ve made on my individual project.

Project Overview

While Virtual Reality (VR) research has a long history, VR technology has only become commonplace very recently. As a result, little work has been done to describe and classify VR users. Specifically, while some users struggle to adapt to a VR environment, others adapt with relative ease. I will use clustering algorithms to group user data into discrete groups, with the aim of improving understanding of human interaction with VR systems.

Such machine learning techniques are rapidly gaining in popularity, but their high dimensionality and black-box nature make human comprehension and understanding of machine learning outputs difficult, diminishing the usefulness of such techniques. Dimensionality reduction algorithms, such as Sammon Mapping, seek to address this problem in different ways.

In this project, I will apply machine learning algorithms to the previously described clustering and subsequent visualization problems in VR research.

I am working with several phenomenal members of the Clemson faculty: Dr. Wole Oyekoya is our REU mentor and supervisor, and Dr. Jerome McClendon and Dr. Andrew Robb are my faculty mentors for this project.

Week 1: Preparation

Events

This week, we toured the campus and attended lectures focused on preparing us for presenting and publishing research work. I became familiar with Adobe Creative Cloud products, specifically the video-editing software Premiere Pro, as well as with LaTeX typesetting software, bundled through the MiKTeX distribution.

Project

This week, my work was centered on setting up my development environment and getting the hang of the tools I’ll use for the remainder of the research project. I tried out the HTC VIVE VR system, produced a simple game using Unity, started working with C#, and explored various options in Axis Neuron, the software we will use for motion capture.

Looking Ahead

Next week, I hope to have finished the data pipeline from our raw motion capture data to the data processing environment (Perception Neuron -> Axis Neuron -> Unity -> C# -> CSV (tentatively) -> Python 3), and to begin analysis on preliminary data. I’m also looking forward to checking out the Palmetto Cluster, Clemson’s high-performance computing environment, on Tuesday!

Week 2: Development

Events

Events this week were focused on two areas: research computing and visualization tools. Early in the week, we reviewed BASH and then learned how to access the Palmetto Cluster. In the latter portion, we studied several visualization tools, including scientific visualization software, data visualization tools, information analytics tools, and GIS systems.

Project

I completed my goal of setting up a data pipeline and formatting. The project repo is also now online, so feel free to git-checkout! I also studied the theory and implementation of several machine learning algorithms for cluster analysis on our newly collected data. Finally, I’ve written most of the introduction for my project’s final report, and have become more comfortable working with LaTeX.

Looking Ahead

Next week, I hope to begin data collection with real people. I also hope to select a dimensionality reduction method for the final portion of the project.

Week 3: Development Cont.

Events

Our morning events this week were varied in nature. Early in the week, we were invited to attend Python workshops, targeted at students without Python experience. On Wednesday, Joe James, a previous REU student and Clemson Mechanical Engineering graduate who is serving as our logistics coordinator, presented his past work in Unity VR programming. As the week came to a close, we learned to use IPUMS, a powerful census and survey database for social science research, and ArcGIS, a commonly used GIS software package.

Project

While I had initially intended to spend this week on data capture, I spent most of this week working on dimensionality reduction instead.

A quick primer – dimensionality reduction is the problem of reducing data from a high-dimensional space to a lower-dimensional space, and it is generally accomplished through feature selection or feature extraction. These techniques serve two important purposes: (1) creating data that machine learning algorithms can consume, and (2) visualizing processes in high-dimensional space.

Specifically, I’ve been implementing several dimensionality reduction algorithms, including Principal Component Analysis (PCA), Sammon Mapping, and t-Distributed Stochastic Neighbor Embedding (t-SNE). You can check out this part of the project here.
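If you would like a concrete picture of what these algorithms do, the sketch below projects a high-dimensional matrix down to 2-D with PCA and t-SNE via scikit-learn. The input data is synthetic, and Sammon Mapping is left out here because scikit-learn has no built-in implementation of it.

```python
# Sketch: reduce a high-dimensional matrix to 2-D with PCA (linear) and
# t-SNE (nonlinear) using scikit-learn. The input matrix is synthetic;
# Sammon Mapping is omitted because scikit-learn does not provide it.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.manifold import TSNE

rng = np.random.default_rng(1)
X = rng.normal(size=(500, 60))  # placeholder: 500 frames x 60 channels

X_pca = PCA(n_components=2).fit_transform(X)  # top two principal components
X_tsne = TSNE(n_components=2, perplexity=30,
              init="pca", random_state=1).fit_transform(X)  # neighbor-preserving embedding

print(X_pca.shape, X_tsne.shape)  # both (500, 2)
```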

The change in plans was necessary because of the high dimensionality of my motion capture dataset. Working on dimensionality reduction earlier also keeps the scope of the project under control and increases the chance that I will have useful results by the end of the summer.

I also spent a few days working on report formatting and contents in LaTeX.

Looking Forward

I’ll continue working on studying and implementing dimensionality reduction techniques into next week. By the end of next week, I hope to have implemented the three algorithms described above on motion capture data to see which technique works best for my specific use-case.

Week 4: Gearing up for Presentations

Events

Our morning workshops this week were minimal – I attended a couple of R workshops, but that was about it.


Project

I spent much of this week continuing my work with dimensionality reduction. I learned how to use some of the many machine learning libraries in Python, so I could stop implementing everything from scratch. I also spent some time preparing for our midterm presentation.


Looking Forward

My goal for next week is to have working t-SNE, PCA, and Sammon’s Mapping models on real motion capture data – at the moment, my models have only been successful on toy datasets.


Week 5: Midterm Presentation

Events

Because of the July 4th holiday, this was a short week – work only began on Wednesday, when we had our midterm presentations. After that, we didn’t have too many of our normal events, though we did meet as a group to discuss our projects and progress.

Project

While this was a shorter week, I was able to make a lot of progress. I implemented t-SNE on motion capture data, did a lot of data refining and reshaping, and then implemented it a few more times with different results. Here’s my most recent result, a t-distributed stochastic neighbor embedding projection of 700 or so frames (recorded at about 30 fps) of a tai chi master.

You can see how the points located closely together in time (points with the same color) are generally clustered together in the mapping.
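For anyone wanting to reproduce that kind of plot, here is a rough sketch of coloring a 2-D embedding by frame index so that frames close in time share a color. The embedding array here is synthetic and simply stands in for the output of the t-SNE step described above.

```python
# Sketch: scatter-plot a 2-D embedding with points colored by frame index,
# so frames that are close in time get similar colors. The embedding below
# is synthetic and stands in for the real t-SNE output.
import numpy as np
import matplotlib.pyplot as plt

embedding = np.random.default_rng(2).normal(size=(700, 2))  # placeholder (n_frames, 2)
frame_idx = np.arange(len(embedding))                       # time order of frames

plt.scatter(embedding[:, 0], embedding[:, 1], c=frame_idx, cmap="viridis", s=10)
plt.colorbar(label="frame index (~30 fps)")
plt.title("t-SNE projection of motion capture frames")
plt.show()
```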

Looking Forward

For next week, I need to learn and implement a few more algorithms – specifically, I’m looking at growing neural gas and one of its variants, a self-organizing neural network.


Week 6: The Final Push Begins!

Events

We began this week with a tour of Clemson’s data center and the Palmetto Cluster. The sheer scale of the university’s cyber-infrastructure was impressive, and it was cool to see how, at that scale, even relatively simple issues can require creative and robust solutions (for instance, replacing conventional wiring with ducts filled with copper plates).

Project

I made a significant amount of progress in two areas this week.

First, I completed most of the proof-of-concept experiments I’ve been running for analyzing and visualizing motion capture data.

Second, Dr. Robb and I completed the remaining work for the final experimental protocol.

Looking Forward

While many of the experiments of this project will be completed after my REU work, I do need to produce a final report and presentation, so next week I’ll spend most of my time working on those. On a related note, our REU group will also be attending several sessions in the coming weeks about presenting with power and related skills.

In addition, now that we are ready to conduct trials with participants, the methods and software I’ve used for visualization and analysis need to be updated to work with the new data formats, so I’ll spend a lot of time on that as well.

Finally, the phenomenal Tania Roy, a PhD student in Human Centered Computing, is going to present on her dissertation work, so I’ll be there, too. Tania and the rest of the graduate students I’ve met this summer have really helped make me feel at home here at Clemson.


Week 7: It’s Crunch Time!

Events

This week, our events were focused on providing us with skills to succeed in academia. We worked on elevator speeches, management styles, networking techniques, and understanding the power of presence.

These workshops, while different from the technical training we had received earlier, were still valuable, especially for our group of fairly quiet, unassuming computer science research interns.

Project

Sadly, I fell sick on Sunday, so my progress has really slowed.

However, I did conduct a pilot trial, as well as two full trials with participants! I’ve had to redo large parts of the system I’ve developed over the summer to account for all of the modifications and updates that Dr. Robb and I have put into place while iterating on the original experiment protocol.

After making those changes, I’ve now started working on re-implementing all of my past proof-of-concept experiments on the new production data.

Finally, I’ve also continued to work on the final report, as well as the presentation and poster for the project.

Looking Forward

Next week is our last week, so I’ll be working hard to finish up. By the end of next week, I hope to have finished:

  1. Collecting more production data
  2. Refining and processing production data
  3. Visualizing and analyzing that data
  4. Completing the final report
  5. Completing the project poster
  6. Preparing for the final presentation

Week 8: Wrapping Up

This was our last week at Clemson, and it was a rough one!

I completed several more trials for my experiments, analyzed that data, and produced a paper, a poster, and a final presentation. We presented at the Visualization Symposium, organized by Dr. Wole, and the presentations went well. I’m looking forward to extending the work I’ve done at Clemson this summer, and I hope to produce at least one publication.


Final Report

Blessing Leonard: Visualization in the Cloud

Home Institution
University of Baltimore
Baltimore, MD
Contact Email: blessingleonard@yahoo.com
Clemson Research Mentors
Dr. Kuangching Wang
Associate Professor of Electrical and Computer Engineering
Mehrdad Yousefi
Visualization Graduate Research Assistant
Clemson Visualization Mentor
Dr. Wole Oyekoya
Visualization Director

About Me

I was born and raised in Lagos, Nigeria. After getting my Associate’s degree in Cybersecurity, I transferred to the University of Baltimore in Maryland, where I major in Applied Information Technology with a focus on security. At the end of this experience, I hope to gain some insight into the relationship between cloud architecture and data accessibility.

Project Abstract

This research focused on the resource impact of visualization workloads on two Linux machines, each equipped with a different Nvidia GPU. The study used two Nvidia GPUs (the K5000 and the K2200), ParaView 5.3.0 for the visualization processes, and Nvidia’s command-line monitoring tool for recording the amount of resources being used.

Week 1

We received a general briefing on what the research program entails during orientation on Monday, after which we met with our mentors. At the initial meeting with Dr. Wang, we established that the experience would combine theoretical and practical approaches, and we went over existing technologies, the issues that gave rise to the research project, and the ideal output of the research experience. We also identified tasks to be completed by the end of the week, which included drafting a research proposal, researching Nvidia’s role in cloud visualization with a focus on the available GPUs, and setting up my hands-on working environment, which includes CaptureSDK and ParaView. The week ended with me having a basic understanding of visualization and of how cloud visualization can be beneficial.

Week 2

Over the weekend, my roommates (Becca and Bayley) and I explored the botanical garden, where I discovered my new favorite plant name: “formal attire”. We fell in love with the turtles and made an amazing discovery on campus: a snack vending machine! With all these discoveries, I began the week slightly more confident in my understanding of the subject matter; however, with this week came the challenges of research. For the hands-on aspect of the project, I focused on finding software that could measure the impact that running resource-intensive applications such as ParaView has on the GPU; unfortunately, that proved more tedious than expected. Instead of finding software for application-specific benchmarking, I kept finding tools that were preconfigured with some graphics and that measured the overall capacity of the GPU. The other obstacle was finding datasets for the benchmarking process that ParaView would accept. The week ended with me trying to find papers on GPU streaming benchmarks for both clients and servers.

Week 3

Week 3 started on a high note: I found some datasets from the RCSB Protein Data Bank in a format that ParaView could read. Also, rather than finding a dedicated benchmarking application, Dr. Wole and I discovered some commands that could be used in the Linux CLI for logging resource usage. Thanks to these accomplishments, I was able to do some test runs and get a good picture of the initial output and of what my final output should be. I am currently in the process of setting up the streaming component required for the project. The streaming setup will be in a client-server fashion with two implementations: the first will be a client-server relationship between two Linux machines in the visualization lab, while the second will be between a Linux machine and the Palmetto Cluster (Clemson’s own high-performance computing resource, made up of 2,021 compute nodes and 23,072 CPU cores). Hopefully, the streaming component will be established and fully functional before the midterm rehearsal next Friday; once this is taken care of, I can fully dive into the benchmarking phase.

Week 4

More tests were conducted, and data on GPU memory usage were collected for both Linux machines. However, some changes were made to the environment setup: instead of streaming between two Linux machines, ParaView was run on each of them independently. I am working with the following setup: one Linux machine with an Nvidia K2200 GPU and the other with an Nvidia K5000 GPU. Due to some complications with the software upgrade on the Palmetto Cluster, I have been unable to set up the cloud architecture required for the cloud streaming portion of the research. So far, I have collected data on both Linux machines using datasets of different sizes, the smallest being 11.9 MB and the largest 7.5 GB. I have also been able to extract the data from the command-line output into Excel spreadsheets; the next step is to visualize the data, and any detected relationships, in Tableau.
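For readers curious how this kind of logging can be scripted, here is a rough Python sketch that samples GPU utilization and memory with nvidia-smi’s CSV query mode and appends the readings to a file. It is only an illustration of the general approach, not necessarily the exact commands or fields used in this study.

```python
# Sketch: periodically sample GPU utilization and memory via nvidia-smi's
# CSV query mode and append the readings to a log file. Illustrative only;
# not necessarily the exact command or fields used in this study.
import subprocess
import time

QUERY = ["nvidia-smi",
         "--query-gpu=timestamp,utilization.gpu,utilization.memory,memory.used",
         "--format=csv,noheader,nounits"]

def log_gpu(path="gpu_usage_log.csv", interval_s=1.0, samples=60):
    with open(path, "a") as log:
        for _ in range(samples):
            reading = subprocess.run(QUERY, capture_output=True,
                                     text=True, check=True).stdout.strip()
            log.write(reading + "\n")
            time.sleep(interval_s)

if __name__ == "__main__":
    log_gpu()  # writes one CSV row per sample
```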

Week 5

There’s not much to report for this week, as the July 4th holiday made it a little more relaxed than the previous weeks. I’m still working on the data extracted into Excel; I noticed some information was missing for some of the datasets, so parts of Wednesday and Thursday were spent finding and filling in the missing cells. Preliminary steps toward visualizing the collected data in Tableau have been taken, but conclusions are yet to be drawn. Progress was also made on the Palmetto Cluster: we were able to load the 11.9 MB dataset successfully. Although the setup is still a little finicky, there’s hope for advancement soon.

[Figure: A snippet from one of the tests. Top: streaming multiprocessor use in % (sm%). Bottom: GPU memory use in % (mem%).]

Week 6

Armed with the initial depictions of memory usage from the completed tests in Tableau, the search for trends in resource use began. I also worked on updating my research paper draft, and I’m currently working with Ruben on merging the data from a test that was run on both machines so we can compare the readings and draw conclusions. Given the Palmetto Cluster’s restrictions, we decided to use the IBM Power8 server (a smaller cluster run by Dr. Wang’s graduate crew) to implement cloud rendering so tests could continue to be run. However, this move also came with its challenges: we almost immediately discovered some compatibility issues between OpenGL and Power8, and finding documentation about the two technologies has proven almost futile thus far. Feedback from an IBM representative confirmed that the current driver does not support rendering and OpenGL, although a new driver with the needed features is expected to be released at the end of the month (July 2017). This, however, will be too late, as the REU program wraps up at the end of the month, so the most logical move given the circumstances is to return to Palmetto.
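As a rough sketch of what merging the readings from the two machines can look like, the snippet below lines up per-sample logs with pandas so the K2200 and K5000 columns sit side by side. The file names and the shared sample_index column are hypothetical placeholders.

```python
# Sketch: combine per-sample GPU readings from the two test machines for a
# side-by-side comparison. File names and the sample_index column are
# hypothetical placeholders.
import pandas as pd

k2200 = pd.read_csv("k2200_run.csv")  # hypothetical log from the K2200 machine
k5000 = pd.read_csv("k5000_run.csv")  # hypothetical log from the K5000 machine

merged = pd.merge(
    k2200, k5000,
    on="sample_index",                # hypothetical shared sample counter
    suffixes=("_k2200", "_k5000"),
)
merged.to_csv("merged_gpu_readings.csv", index=False)
```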

Week 7

This week involved running more detailed tests for credibility purposes. To achieve this, the measurements were taken in two forms: one set was taken at the application of each filter, and the other came from a continuous test with no interruptions. The two were compared to find patterns that could explain the varied measurements and durations of the tests on each GPU. I was also able to complete working drafts of my final report and presentation.

Week 8

Made final changes to both my paper and presentation slides, and started and completed my poster. Had a run-through with Dr. Wang on Wednesday morning and the presentation rehearsal in the afternoon. Spent part of Wednesday and Thursday incorporating the feedback from both run-throughs into the paper and slides. Friday is all about the final presentation and poster sessions.

Project Conclusion and Future Work

Based on the results of the study, it can be inferred that a GPU’s memory and core capabilities alone should not be relied on when determining its ability to handle visualization of large datasets. Future work includes improving real-time collaboration on, and the availability of, streamed media stored in the cloud (which is usually resource-intensive) on mobile devices and, eventually, virtual reality headsets.

Future REUs

Wondering if you are smart enough to do this? Well, you never know until you try. The cool thing is no one expects you to know everything about your project, just be open-minded and willing to learn; plus Dr. Wole is always there if you need him. Also, if you are stressing about the lab culture and dress code like I was, it’s chill: pack what you would for college but bring a couple dressy outfits for the midterm and final presentation. Good Luck!

Final Report
