
Category Archives: VR Portfolio

Driving Simulation Using Leap Motion

Driving simulation created in Unity 3D and viewed through an Oculus Rift headset. The headset has a Leap Motion sensor mounted on it that captures the user's hand positions in real time.

[Image: driving simulation]

Interactive Basketball Experience

Basketball simulation in VR

[Image: basketball simulation]

Virtual Reality Anatomy Showroom


[Images: anatomy showroom screenshots 1–4]

Bullying Scenario


[Image: bullying scenario]

Jacovia Cherry: Campus Basemap

About Me:

I am a rising junior here at Clemson studying Computer Engineering.

Project:

Campus Basemap

Mentors:

Patricia Carbajales-Dale and Blake Lytle

Week 1:

This first week was spent getting familiar with the other interns participating in the REU and my mentors, and learning in general how the summer will go. Since my project is to create a map of Clemson's campus, I also spent some time familiarizing myself with CityEngine through the ArcGIS site and its tutorials. This was very helpful for learning the essential basic skills.

 

Week 2:

This week was spent attending seminars to learn more about research computing using the Palmetto cluster, scientific visualization, navigating Linux and the command line, and GIS.

After finishing all of the tutorials about CityEngine, I was able to start making some progress on building the foundation for my map. This week I was able to import the terrain and the buildings that were already made. However, the terrain isn't as appealing as I had hoped. Next week, my focus will be adding the streets and sidewalks.

 

Week 3:

This week's focus was bringing in streets and sidewalks. However, things didn't go as well as planned. I ran into multiple obstacles with the CityEngine software. I was able to bring in the streets, but not to modify them to my liking. This task will have to be finished in the first few days of next week. After more complications, I was able to load my scene onto the Gear VR for the sprint review.

Additionally, this week was spent working on the intro for the research paper. This, along with the streets, will be continued next week. I will also be preparing for my midterm presentation.

Week 4:

This week I had the pleasure of working with a new GIS intern. She is very knowledgeable when it comes to CityEngine, and we were able to collaborate on adding the remaining street networks to the project. This week also consisted of adding athletic fields and trees to the project.

 

I had my midterm presentation this week, which went pretty well. This presentation was very helpful in determining how far the project has come and what the next steps will be. Next week will be spent adding grassy areas and bodies of water to the project to be ready to present at the ESRI conference.

 

Week 5:

This week was spent adding the final touches to the project before the ESRI conference. We were able to successfully add the grassy areas, bodies of water, and parking lots to the project. Ellie, the newest addition to the GIS team, added a multi-use facility to the project that she will be presenting at the conference.

Additionally, I spent some time researching articles to continue working on my research paper.  These last few weeks will mainly be dedicated to working on my research paper.

 

Week 6:

This week we visited the ITC Research Park in Anderson to learn a little more about the Palmetto Cluster that some of the REU students had been working with this summer.

This was the beginning of the push to the end. I spent some time writing rules to improve the grassy areas and parking lots in the project. I also analyzed more articles for my final report.

 

Week 7:

This week we attended a few leadership sessions to sharpen our skills in the workplace. One was designed to help with our elevator pitches and the other to identify our "color" in the workplace. These sessions were very beneficial; they gave a more introspective view of how we work with others and gave me more confidence in communicating my skills and accomplishments to others.

This week I experienced many technical issues. I spent a lot of time reapplying rules to all the models in the project after CityEngine continued to crash and it seemed as though some of my work was lost. I also had problems aligning the shapes to one set terrain after modifying the rules applied to the grassy areas and Clemson area boundaries.

 

Week 8:

This final week was dedicated to fine-tuning the final presentation and the final report. I was able to meet with Dr. Wole for some last words of wisdom regarding the final report. I added to my midterm PowerPoint, and we had a chance to practice presenting these as a group for feedback. We also had a group dinner at Sole on the Green. It was a great opportunity to relax from the stress of the final project and spend some time together.

 

Final Report

 

 

Andrew Tompkins: Capturing our world through interactive virtual reality field trips

 

Project: Capturing our world through interactive virtual reality field trips

Abstract: A wide range of emerging technologies are available to create virtual reality experiences that can take us on journeys to explore regions of the world that we might never otherwise be able to visit – ranging from polar ice caps to tropical forests. In particular, 360° imagery is becoming increasingly easy to capture, edit, and annotate to engage people in deeper, interactive virtual field experiences. This project will evaluate different techniques and methods for creating interactive virtual reality field trips, including experiences that integrate imagery and other content within the setting of the virtual world with an emphasis on educational field experiences.

Mentor: Stephen Moysey

About: I am a rising senior currently studying Digital Arts at Stetson University. I am also pursuing a minor in Computer Science and Business Systems Analytics. I enjoy creating interactive experiences and want to use this summer to learn more about the techniques used in the creation of VR experiences.

Weekly Reports

Week 1: The first week, I wanted to set up my work environment and begin familiarizing myself with Unity. I created a basic scene that allowed me to pick up objects and teleport. Next, I explored custom interactions, which allowed me to snap objects to certain orientations and spawn items that are automatically grabbed.

Teleportation – Snap Grab


The teleporter uses a raycast laser to find a location and projects a reticle at that point. Once the location is selected, teleporting is a simple transform of the player to the position the laser hit.

The Snap Grab required me to create a class that overrides the built-in Vive controller functionality. Once the override class was made, I simply created special interactions that depend on the object being interacted with. To create the arrow snap effect, we transformed the arrow's location to that of the controller, with an offset to compensate for its size and desired orientation, and parented it to the controller.
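To make those two mechanics concrete, here is a minimal sketch of the underlying math, written in Python for brevity (the actual project code is a Unity C# script; the floor-plane intersection, grip offset, and all names below are illustrative assumptions, not the project's values):

```python
# Minimal sketch (not the project's Unity C# code) of the math behind the
# teleport and snap-grab mechanics described above. All values are illustrative.
import numpy as np
from scipy.spatial.transform import Rotation as R

def teleport_target(ray_origin, ray_dir, ground_y=0.0):
    """Intersect the controller's pointing ray with a flat floor plane;
    the reticle (and the teleport destination) sits at this point."""
    ray_dir = ray_dir / np.linalg.norm(ray_dir)
    t = (ground_y - ray_origin[1]) / ray_dir[1]   # distance along the ray to y = ground_y
    return ray_origin + t * ray_dir if t > 0 else None

def snap_to_controller(controller_pos, controller_rot_xyzw, local_offset):
    """Place a grabbed object (e.g. the arrow) at the controller, shifted by a
    local-space offset compensating for its size/orientation; parenting then
    keeps it locked to the controller every frame."""
    return controller_pos + R.from_quat(controller_rot_xyzw).apply(local_offset)

# Example: point down-forward from roughly head height, then snap an arrow.
print(teleport_target(np.array([0.0, 1.6, 0.0]), np.array([0.0, -1.0, 1.0])))
print(snap_to_controller(np.array([0.2, 1.0, 0.3]),
                         [0.0, 0.0, 0.0, 1.0],          # identity rotation (x, y, z, w)
                         np.array([0.0, 0.05, -0.1])))  # hypothetical grip offset
```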

Week 2: I met with my advisor to get a better idea of what is expected of me. Finished the project charter, which outlined what I would be doing during the summer. Started the literature review and collected all of the sources that will be used. Formatted this web page and added the images. The project has shifted from creating a single project to researching ways to make future projects, and how the mechanics used in virtual reality can be used in geologic education.

Week 3: Continued to gather sources for my research paper and created the research introduction. Broke up the aspects that I will be researching, which are movement, interactivity, and imagery. I have started on the movement category and have been collecting examples of mechanics being used.

Week 4: Finished gathering all of the reports needed to start the analysis for the movement section of the paper. Started working on the Midterm presentation and worked on the methodology section of the research paper.

Week 5: Finished gathering information for the imagery section of the paper and started to write out the bulk of the paper. The interactivity section of the paper was lacking in information, which will be my focus next week while working on the geoscience part.

 

Week 6: Finished all the research necessary and started to write it all out in the research paper. Started the research poster, which acts as a summary of what is contained inside of the research paper. Created a demo using 360° imagery in Unity. Started working on a photogrammetry demonstration: went to the location and captured the room. Next week I will have the first draft of the poster and paper done. I will also try compiling the images into a 3D mesh using PhotoScan.

Week 7: Continued to work on the research paper, and started the final demo. The final demo is a platform inside of the Grand Canyon that allows users to pick up rock samples and see which layer the rock sample belongs to.

Week 8: Last week of the REU: finished the Grand Canyon demo and created a video demonstration for it. Updated the midterm presentation to include all of the work that has been done since then.

Final Report

Ryan Canales: Engaging Students in Bullying Prevention Efforts through Visualization

Project: Engaging Students in Bullying Prevention Efforts through Visualization

Mentors:  Jan Urbanski, Susan Limber, Wole Oyekoya

About Me: I study Computer Graphics, Art, and Mathematics at Texas A&M University. I am also a guitarist and I like to sing for fun.

Here’s my rudimentary portfolio: http://ryancanales.wixsite.com/portfolio

And some music I made a few years ago:  https://soundcloud.com/ryan-canales

Week 1:

After meeting with my mentors about the direction the project should go, I decided that building character models would be the best place to start. This week, I began making some iterations for a generic character model. I will more or less stick with the basic topology and then modify it as needed (i.e., to change it to male, female, or adult, and make other little adjustments). I also UV unwrapped the model to begin texture tests in Unity for next week. Finally, I built a simple rig to be used on all characters and began animation testing within Unity.

Here’s the low poly character mesh (with previous iterations on the left). I will add more images as progress is made!

 

Week 2:

This week I went to workshops on Linux, on using the Palmetto cluster here, and on several scientific visualization programs. I also met with one of my mentors to show her the progress I made. So far this week, I've added animations to the character and written scripts for character and animation control and camera control. I also began building the environment and added collision detection for walls. I began an implementation for character customization as well; as of now the player can change the skin color and switch between male and female. I also learned a bit about shader programming for Unity and implemented a custom shader for the character, allowing for the outlined, flatter look. I plan on finalizing the environment and the characters by the end of next week.

P.S. I will add hair to the character!

 

Week 3:

This week, I learned some more about data visualization and how to get access to census information from multiple countries using IPUMS. I also met with my mentors and devised a quick schedule for the rest of the project. By the end of the week, I received a basic script for the bullying scene from my mentors (with dialogue suggested by a 6th grader) and I will begin implementing it into the game this next week. The plan is to have a working, albeit rough, demo of the scenario so that we can begin user tests for the target age group by the 5th or 6th week. Here is a video of what I've done so far, running on my phone (Android):

Not shown is the randomization of the characters in the scene (it would take more than one play). I anticipate this next week to be a very busy and productive one now that I’ve got a lot of the basics (including an animation plan) covered.

 

Week 4:

Most of the changes I made were to the environment and UI. I baked ambient occlusion onto most assets in the scene (to avoid doing a global illumination pass at runtime) and created models for a computer, shelf, cabinets, and window. I also created the UI button images in Illustrator and implemented their functions. All UI element positions are determined in a script, so their positions change based on the aspect ratio of the device the game is running on.

I had to update the shader program to allow for transparency in textures, so that I could have multiple materials on a single mesh. This will allow me to implement a more modular character customizer. I've added the shirt swapping feature to show this in action; I will add more options later. I also enabled toggling between male and female characters while retaining the customized options.

Another important feature I implemented was my own method to create paths for the other characters to follow, which will be utilized for the bullying scenario scene. There were additional (subtle) changes that needed to be made to both the male and female character models and rigs, which I spent some time on (I don't want to get into the details, but retaining paint weights was a small issue in Maya). Finally, I fixed some bugs and made some optimizations to minimize time and memory spent on rendering.
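As a small illustration of the aspect-ratio-aware layout idea, here is a hedged sketch in Python (the project's implementation is a Unity C# UI script; the anchor and margin values here are assumptions, not the game's):

```python
# Illustrative only: compute UI pixel positions from normalized anchors so the
# layout adapts to whatever aspect ratio the game is running on.
def place_ui(screen_w, screen_h, anchor_x, anchor_y, margin_px=16):
    """Pixel position for an element anchored at normalized (anchor_x, anchor_y),
    e.g. (1.0, 1.0) is the top-right corner, nudged inward by a margin."""
    x = anchor_x * screen_w + (margin_px if anchor_x < 0.5 else -margin_px)
    y = anchor_y * screen_h + (margin_px if anchor_y < 0.5 else -margin_px)
    return round(x), round(y)

# The same normalized anchor lands consistently on different devices.
print(place_ui(1920, 1080, 1.0, 1.0))  # 16:9 phone  -> (1904, 1064)
print(place_ui(1024, 768, 1.0, 1.0))   # 4:3 tablet  -> (1008, 752)
```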

Besides just adding features and touching up the game, I also had a midterm presentation rehearsal and set a date to have a complete demo for testing and feedback from middle school kids. My mentors have created a dialogue script and suggested another feature for the game, so I will focus on implementing these things during the next 2 weeks. I am aiming to have the working demo by July 16th.

  • I only realized after exporting this video that I didn’t add the blob shadow below the male character model in the customization scene (It’s fixed now).

 

Week 5:

This week we had a break for Independence Day on July 4th and then midterm presentations on July 5th. I met with my mentors on Friday morning and came up with an alternate way to play the game. My work for this week and next will be implementing the new gameplay.

The new idea is simply allowing the player to roam the class as they wish. There will be a clear goal that the player will have to meet, such as getting to their desk before class starts or turning in their homework. While waiting for class to start, the bullying happens in the back of the class. The player can choose to talk to one of the students or to the teacher. What they say depends on whether the bullying has started or not; for example, if it hasn't, they just say something about the class or the homework, and if it has, they mention the bullying in some way. The plan is to also implement ways to interact with the bullying situation. Other than just dialogue cues from the other students, there will also be a visual representation of where the bullying is happening, based on how far the player is from it. This can be incorporated as a mechanic for points or some other reward.

The game finishes either once the initial goal is met, or once both the initial goal is met and the player has helped the victim of the bullying. Afterwards, their score is based on several factors including their reaction time, how long it took to complete the initial goal, the order of events, and their choice to help or not. There will then be a lesson scene that will ask the player if they can identify who played what role in the bullying and teach them about the different roles bystanders can play. Here's some progress:
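As a rough illustration of how a score could be combined from the factors listed above, here is a hedged Python sketch (the weights and scales are hypothetical, not the game's actual values):

```python
# Hypothetical scoring sketch built from the factors mentioned above:
# reaction time, time to finish the initial goal, order of events, and the
# choice to help or not.
def score(reaction_time_s, goal_time_s, helped_before_goal, chose_to_help):
    s = 0.0
    s += max(0.0, 50.0 - reaction_time_s)        # faster reaction -> more points
    s += max(0.0, 100.0 - goal_time_s)           # finishing the initial goal quickly
    s += 25.0 if helped_before_goal else 0.0     # order-of-events bonus
    s += 100.0 if chose_to_help else 0.0         # choosing to help at all
    return s

print(score(reaction_time_s=8.0, goal_time_s=45.0,
            helped_before_goal=True, chose_to_help=True))
```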

 

Week 6:

This week, we toured Clemson's data center and learned some fun facts about their supercomputing cluster, Palmetto. We also toured their digital media and production arts labs. As for my project, I added in some dialogue based on what the player chooses to do. I also added in the lesson at the end, which I plan to clean up later. There are some other less obvious things that I implemented, including smooth interpolation between target points in my path maker, so that the characters walking around have more natural changes in direction. Other features include responses, a goal seat for the player to go to, waiting for the player to be in the vicinity of the bullying before the bullies do anything (so that the player can see what happens), a bar at the top left that indicates how close or far they are from the bullying, and some other fixes and modifications. Here's what it is so far:
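For reference, here is a minimal sketch of one common way to get that smooth interpolation between waypoints, a Catmull-Rom spline (shown in Python for brevity; the project's path maker is a Unity C# script and may use a different scheme):

```python
# Illustrative Catmull-Rom interpolation between waypoints, so a character's
# direction changes gradually instead of turning sharply at each target point.
import numpy as np

def catmull_rom(p0, p1, p2, p3, t):
    """Spline point between p1 and p2 for t in [0, 1]; p0 and p3 shape the curve."""
    t2, t3 = t * t, t * t * t
    return 0.5 * ((2 * p1) +
                  (-p0 + p2) * t +
                  (2 * p0 - 5 * p1 + 4 * p2 - p3) * t2 +
                  (-p0 + 3 * p1 - 3 * p2 + p3) * t3)

waypoints = [np.array(p, float) for p in [(0, 0), (2, 0), (3, 2), (5, 2)]]
p0, p1, p2, p3 = waypoints
for t in np.linspace(0, 1, 5):
    print(catmull_rom(p0, p1, p2, p3, t))   # smooth positions between p1 and p2
```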

 

Week 7:

This week was largely spent getting the game ready to demo to a small number of students. I added some features, including a rudimentary scoring system, some more assets (books on the shelf, stuff outside), and some sound, and I tweaked the character selection menu. There are some other features I worked on and tweaked as well. The largest tweak was completely reworking the dialogue system I had, so now it's much easier to manage and add to. This is also the first week I made builds for iOS. The video I made this time was recorded on an iPad.

 

Week 8:

This final week I just had some meetings focused on writing the paper for this project in the IEEEVR format and making my final presentation.

Final Report

Arjun Talpallikar: Visualization as the Interface for Machine Learning

Introduction

I am a second year undergraduate at The University of Texas at Austin.

I am working this summer as an REU student at the Clemson University Visualization Lab, on a project entitled "Visualization as the Interface for Machine Learning".

Here, I will maintain a brief weekly blog describing my research intern experience. I will document both the events we’ve attended as a group, as well as the progress I’ve made on my individual project.

Project Overview

While Virtual Reality (VR) research has a long history, VR technology has only become commonplace very recently. As a result, little work has been done to describe and classify VR users. Specifically, while some users struggle to adapt to a VR environment, others adapt with relative ease. I will use clustering algorithms to group user data into discrete groups, with the aim of improving understanding of human interaction with VR systems.

Such machine learning techniques are rapidly gaining in popularity, but their high dimensionality and black-box nature make human comprehension and understanding of machine learning outputs difficult, diminishing the usefulness of such techniques. Dimensionality reduction algorithms, such as Sammon Mapping, seek to address this problem in different ways.

In this project, I will apply machine learning algorithms to the previously described clustering and subsequent visualization problems in VR research.

I am working with several phenomenal members of the Clemson faculty: Dr. Wole Oyekoya is our REU mentor and supervisor, and Dr. Jerome McClendon and Dr. Andrew Robb are my faculty mentors for this project.

Week 1: Preparation

Events

This week, we toured the campus and attended lectures focused on preparing us for presenting and publishing research work. I became familiar with Adobe Creative Cloud products, specifically the video-editing software Premiere Pro, as well as with LaTeX typesetting software, bundled through the MiKTeX distribution.

Project

This week, my work was centered on setting up my development environment and getting the hang of the tools I’ll use for the remainder of the research project. I tried out the HTC VIVE VR system, produced a simple game using Unity, started working with C#, and explored various options in Axis Neuron, the software we will use for motion capture.

Looking Ahead

Next week, I hope to have finished the data pipeline from our raw motion capture data to the data processing environment (Perception Neuron -> Axis Neuron -> Unity -> C# -> CSV (tentatively) -> Python 3), and to begin analysis on preliminary data. I'm also looking forward to checking out the Palmetto Cluster, Clemson's high-performance computing environment, on Tuesday!
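As a sketch of what the last stage of that pipeline might look like (the file name and column layout below are assumptions, since the export format is still tentative):

```python
# Hedged sketch: read the (tentatively CSV) motion capture export into Python 3
# for analysis. File name and columns are illustrative placeholders.
import pandas as pd

frames = pd.read_csv("mocap_export.csv")   # one row per captured frame
print(frames.shape)                        # (n_frames, n_joint_channels)
X = frames.to_numpy(dtype=float)           # matrix handed to the analysis code
```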

Week 2: Development

Events

Events this week were focused on two areas: research computing and visualization tools. Early in the week, we reviewed BASH and then learned how to access the Palmetto Cluster. In the latter portion, we studied several visualization tools, including scientific visualization software, data visualization tools, information analytics tools, and GIS systems.

Project

I completed my goal of setting up a data pipeline and formatting. The project repo is also now online, so feel free to git-checkout! I also studied the theory and implementation of several machine learning algorithms for cluster analysis on our newly collected data. Finally, I’ve written most of the introduction for my project’s final report, and have become more comfortable working with LaTeX.
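As an illustration of the kind of cluster analysis described (the project's actual algorithms and feature construction aren't specified here, so k-means and the synthetic data below are only an example):

```python
# Minimal k-means sketch on synthetic per-user feature vectors, standing in for
# the motion capture features used in the project.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
user_features = rng.normal(size=(60, 12))   # e.g. 60 users x 12 summary features

labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(user_features)
print(labels[:10])                          # cluster assignment per user
```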

Looking Ahead

Next week, I hope to begin data collection with real people. I also hope to select a dimensionality reduction method for the final portion of the project.

Week 3: Development Cont.

Events

Our morning events this week were varied in nature. Early in the week, we were invited to attend Python workshops targeted at students without Python experience. On Wednesday, Joe James, a previous REU student and Clemson Mechanical Engineering graduate who is serving as our logistics coordinator, presented his past work in Unity VR programming. As the week came to a close, we learned to use IPUMS, a powerful census and survey database for social science research, and ArcGIS, a commonly used GIS software package.

Project

While I had initially intended to spend this week on data capture, I spent most of this week working on dimensionality reduction instead.

A quick primer: dimensionality reduction is the problem of reducing data from a high-dimensional space to a lower-dimensional space, and is generally accomplished through feature selection or feature extraction. These techniques serve two important purposes: (1) creating data that machine learning algorithms can consume and (2) visualizing processes in high-dimensional space.

Specifically, I’ve been implementing several dimensionality reduction algorithms, including Principal Component Analysis (PCA), Sammon Mapping, and t-Distributed Stochastic Neighbor Embedding (t-SNE). You can check out this part of the project here.
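As a hedged illustration of applying two of these techniques with scikit-learn (the data below is synthetic, and Sammon Mapping is omitted here because scikit-learn does not provide it, so it has to be implemented by hand):

```python
# Illustrative PCA and t-SNE projections of a synthetic stand-in for the
# motion capture data (e.g. 700 frames x 60 joint channels).
import numpy as np
from sklearn.decomposition import PCA
from sklearn.manifold import TSNE

rng = np.random.default_rng(1)
X = rng.normal(size=(700, 60))

X_pca = PCA(n_components=2).fit_transform(X)                               # linear projection
X_tsne = TSNE(n_components=2, perplexity=30, random_state=0).fit_transform(X)  # nonlinear embedding
print(X_pca.shape, X_tsne.shape)                                           # both (700, 2)
```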

The change in plans was necessary because of the high dimensions of my motion capture dataset. Working on dimensionality reduction earlier also keeps the scope of the project under control and increases the chance that I have useful results by the end of the summer.

I also spent a few days working on report formatting and contents in LaTeX.

Looking Forward

I’ll continue working on studying and implementing dimensionality reduction techniques into next week. By the end of next week, I hope to have implemented the three algorithms described above on motion capture data to see which technique works best for my specific use-case.

Week 4: Gearing up for Presentations

Events

Our morning workshops this week were minimal – I attended a couple of R workshops, but that was about it.

 

Project

I spent much of this week continuing my work with dimensionality reduction. I learned how to use some of the many machine learning libraries in Python, so I could stop implementing everything from scratch. I also spent some time preparing for our midterm presentation.

 

Looking Forward

My goal for next week is to have working t-SNE, PCA, and Sammon’s Mapping models on real motion capture data – at the moment, my models have only been successful on toy datasets.

 

Week 5: Midterm Presentation

Events

Because of the July 4th holiday, this was a short week – work only began on Wednesday, when we had our midterm presentations. After that, we didn’t have too many of our normal events, though we did meet as a group to discuss our projects and progress.

Project

While this was a shorter week, I was able to make a lot of progress. I implemented t-SNE on motion capture data, did a lot of data refining and reshaping, and then implemented it a few more times with different results. Here’s my most recent result, a t-distributed stochastic neighbor embedding projection of 700 or so frames (recorded at about 30 fps) of a tai chi master.

You can see how points located close together in time (points with the same color) are generally clustered together in the mapping.
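For reference, a figure like the one described could be produced along these lines (synthetic stand-in data; the real input is the ~700 motion capture frames):

```python
# Illustrative sketch of the plot described above: a 2-D t-SNE embedding of
# motion capture frames, colored by frame index so frames close in time share a color.
import numpy as np
import matplotlib.pyplot as plt
from sklearn.manifold import TSNE

rng = np.random.default_rng(2)
frames = rng.normal(size=(700, 60))          # placeholder for the mocap frames

embedding = TSNE(n_components=2, random_state=0).fit_transform(frames)
plt.scatter(embedding[:, 0], embedding[:, 1],
            c=np.arange(len(frames)), cmap="viridis", s=8)
plt.colorbar(label="frame index (~30 fps)")
plt.title("t-SNE of motion capture frames")
plt.show()
```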

Looking Forward

For next week, I need to learn and implement a few more algorithms – specifically, I'm looking at growing neural gas and one of its variants, a self-organizing neural network.

 

Week 6: The Final Push Begins!

Events

We began this week with a tour of Clemson’s data center and the Palmetto cluster. The sheer scale of the university’s cyber-infrastructure was impressive, and it was cool to see how, at that scale, even relatively simple issues can require creative and robust solutions (substituting wiring for ducts filled with copper plates, for instance).

Project

I made a significant amount of progress in two areas this week.

First, I completed most of the proof-of-concept experiments I’ve been running for analyzing and visualizing motion capture data.

Second, Dr. Robb and I completed the remaining work for the final experimental protocol.

Looking Forward

While many of the experiments of this project will be completed after my REU work, I do need to produce a final report and presentation, so next week, I’ll spend most of my time working on that. On a related note, our REU will also be attending several sessions in the coming weeks about presenting with power, etc.

In addition, now that we are ready to conduct trials with participants, the methods and software I've used for visualization and analysis need to be updated to work with the new data formats, so I'll spend a lot of time on that as well.

Finally, the phenomenal Tania Roy, a PhD student in Human Centered Computing, is going to present on her dissertation work, so I’ll be there, too. Tania and the rest of the graduate students I’ve met this summer have really helped make me feel at home here at Clemson.

 

Week 7: It’s Crunch Time!

Events

This week, our events were focused on providing us with skills to succeed in academia. We worked on elevator speeches, management styles, networking techniques, and understanding the power of presence.

These workshops, while different from the technical training we had received earlier, were still valuable, especially for our group of fairly quiet, unassuming computer science research interns.

Project

Sadly, I fell sick on Sunday, so my progress has really slowed.

However, I did conduct a pilot trial, as well as two full trials with participants! I've had to redo large parts of the system I've developed over the past summer to account for all of the modifications and updates that Dr. Robb and I have put into place while iterating on the original experiment protocol.

After making those changes, I’ve now started working on re-implementing all of my past proof-of-concept experiments on the new production data.

Finally, I’ve also continued to work on both the final report, as well as the presentation and poster for the project.

Looking Forward

Next week is our last week, so I’ll be working hard to finish up. By the end of next week, I hope to have finished:

  1. Collecting more production data
  2. Refining and processing production data
  3. Visualizing and analyzing that data
  4. Completing the final report
  5. Completing the project poster
  6. Preparing for the final presentation

Week 8: Wrapping Up

This was our last week at Clemson, and it was a rough one!

I completed several more trials for my experiments, analyzed that data, and produced a paper, a poster, and a final presentation. We presented at the Visualization Symposium, organized by Dr. Wole, and the presentations went off well. I’m looking forward to extending the work I’ve done while in Clemson this summer, and hope to produce at least one publication.

 

Final Report

Touch Interfaces for Teaching STEM

The goal of this project is to create a design (using Unity3D and the Gestureworks API) that is suitable for STEM research and may be applicable to chemistry students of all ages.

[Images: STEM touch interface screenshots]

LIDAR Scan of Clemson University

LIDAR, which stands for Light Detection and Ranging, is a remote sensing method that uses pulsed laser light to measure ranges to the Earth's surface (e.g., in aerial scans). Using KeckCAVES' Vrui toolkit, we displayed a LIDAR scan of Clemson University in the Oculus Rift.

[Image: LIDAR scan]

