VR Grid for Joiners

Interactivity
Experimental Design in Digital Interactivity
Cameras, machine learning, Python development environments, and audio and visual elements are combined in TouchDesigner, a popular interactive design application. Augmented reality (AR) experiences are developed in Unity.
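Most of the pieces below are wired together with short Python snippets inside TouchDesigner. As a flavor of that scripting style, here is a minimal sketch that points a video operator at a clip and drives a transform from a CHOP channel; the operator names and file are hypothetical:

```python
# Runs inside TouchDesigner (e.g., from a Text DAT).
# Operator names ('moviefilein1', 'geo1', 'lfo1') are hypothetical.

movie = op('moviefilein1')           # a Movie File In TOP
movie.par.file = 'bird_loop.mp4'     # point it at a looping clip

geo = op('geo1')                         # a Geometry COMP
geo.par.ry = op('lfo1')['chan1'].eval()  # set rotation from the LFO's current value
```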
Joiners VR Panorama
A 360° VR panorama inspired by David Hockney's joiners photo collages, shot at Laurel Hill Cemetery in Philadelphia, PA.
Interactive Elements in the Cemetery VR
In this assignment we add interactive audio and visual elements to the Cemetery VR. There's background audio of a crow and a slight breeze, children playing, and a weed whacker operated by a groundskeeper. In one area of the sky you can also see a looping video of a bird flying. All of these elements were present during the two-day photo shoot at the location for the original Joiners VR assignment.
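The project file isn't reproduced here, but one way to tie each sound to its region of the panorama in TouchDesigner is to fade the source's gain with the viewer's yaw. A minimal sketch using an Execute DAT; the operator names, yaw center, and falloff window are all assumptions:

```python
# Execute DAT callback inside TouchDesigner (a sketch, not the course patch).
# 'cam1' and 'crow_gain1' (a Math CHOP scaling the crow audio) are hypothetical.

def onFrameStart(frame):
    yaw = op('cam1').par.ry.eval()              # camera yaw in degrees
    # Suppose the crow sits around yaw 90; fade over a 60-degree window.
    dist = abs(((yaw - 90) + 180) % 360 - 180)  # shortest angular distance
    op('crow_gain1').par.gain = max(0.0, 1.0 - dist / 60.0)
    return
```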
Facial Recognition and Tracking Using Python and TouchDesigner
Facial tracking data is integrated into TouchDesigner to move the cube.
Augmentation using facial tracking.
In this assignment I used facial tracking to navigate the cemetery and manipulate the object. Audio is fed through to drive additional particle effects.
This is an homage to The Future Sound of London's music video for "My Kingdom". The audio is not present because it's copyrighted; however, you can view the original video here: https://www.youtube.com/watch?v=EdFKoZHzMQ0
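The patch itself isn't shown, but a common way to get face data into TouchDesigner is to run the tracker in an external Python process and stream normalized coordinates to an OSC In CHOP. A minimal sketch using OpenCV's bundled Haar cascade and python-osc; the port and OSC address are assumptions:

```python
# Sketch: detect a face with OpenCV and stream its normalized center
# to TouchDesigner over OSC (assumes an OSC In CHOP listening on port 9000).
import cv2
from pythonosc.udp_client import SimpleUDPClient

client = SimpleUDPClient('127.0.0.1', 9000)    # hypothetical port
cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + 'haarcascade_frontalface_default.xml')

cap = cv2.VideoCapture(0)
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    for (x, y, w, h) in cascade.detectMultiScale(gray, 1.3, 5):
        # Normalize the face center to 0..1 so TouchDesigner can map it
        # straight onto the cube's translate parameters.
        cx = (x + w / 2) / frame.shape[1]
        cy = (y + h / 2) / frame.shape[0]
        client.send_message('/face', [cx, cy])  # hypothetical OSC address
    if cv2.waitKey(1) == 27:                    # Esc to quit
        break
cap.release()
```

Inside TouchDesigner, the incoming /face channels can then be referenced by the cube's transform, and an Audio Spectrum CHOP can feed the particle system.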
Eye Spy Halloween Edition
In this assignment we were to make our own version of the popular children's game "Eye Spy". Since the assignment was given out around Halloween, I decided to return to the tranquil cemetery and make it spooky.
The Eye Spy locator circle changes color from green (safe) to red (danger) when a spooky element is spotted. Audio for that element plays as long as the locator hovers over it.
For this assignment we used a mix of technologies, including NDI virtual cameras, Google's Teachable Machine, Python, TensorFlow, and TouchDesigner.
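As a standalone approximation of that pipeline (the NDI and TouchDesigner stages are omitted here), a Teachable Machine image model exported in Keras format can classify webcam frames and flip the locator color. The model path, class indices, and confidence threshold below are assumptions:

```python
# Sketch: classify frames with a Teachable Machine Keras export and draw
# a green/red locator. Model path and spooky class indices are hypothetical.
import cv2
import numpy as np
from tensorflow.keras.models import load_model

model = load_model('keras_model.h5')   # Teachable Machine Keras export
SPOOKY = {1, 2}                        # hypothetical spooky class indices

cap = cv2.VideoCapture(0)
while True:
    ok, frame = cap.read()
    if not ok:
        break
    # Teachable Machine image models expect 224x224 RGB scaled to [-1, 1].
    img = cv2.cvtColor(cv2.resize(frame, (224, 224)), cv2.COLOR_BGR2RGB)
    batch = (img.astype(np.float32) / 127.5 - 1.0)[np.newaxis, ...]
    probs = model.predict(batch, verbose=0)[0]
    cls = int(np.argmax(probs))
    spooky = cls in SPOOKY and probs[cls] > 0.8
    color = (0, 0, 255) if spooky else (0, 255, 0)     # BGR: red vs green
    h, w = frame.shape[:2]
    cv2.circle(frame, (w // 2, h // 2), 60, color, 3)  # locator ring
    cv2.imshow('eye spy', frame)
    if cv2.waitKey(1) == 27:
        break
cap.release()
cv2.destroyAllWindows()
```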
Pokemon Augmented Reality
For the final class assignment we were tasked with creating machine learning recognition using Vuforia's ML features within Unity. I used Pokemon playing/trading cards as the recognition targets and free Pokemon 3D models for the 3D augmentation.
Using a webcam, Vuforia detected the cards and Unity displayed the corresponding 3D models as augmentations over them. I was able to navigate around the models and get perspectives from all angles. Music was added in post-production in Adobe Premiere, but in-engine sound elements within Unity could certainly be used for the same effect.