"Oscillate" is my thesis piece from the Computer Art MFA program at the School of Visual Arts. Finished in May 2013, it has been shown in film festivals, galleries, and events around the world, and won a 2014 Student Academy Award in the Alternative category.
While in many ways a tribute to my love of art, computers, and music, "Oscillate" is mostly inspired by the emergence of complexity in sound from the combination of fundamental sine and cosine waves, which even on their own have a beautiful, pure, and calming effect both visually and aurally. This concept of complexity from simplicity is represented abstractly here, but it is an artistic nod to Fourier analysis, which underpins countless areas that rely on signal processing, from speech recognition to computer vision to music production.
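This idea can be made concrete with a small sketch (illustrative only, not part of the piece itself): summing odd harmonics of a pure sine wave gradually builds a square wave, a classic Fourier-series example of a complex signal arising from simple ones.

```python
import math

def square_wave_partial(t, n_harmonics):
    """Fourier series partial sum for a square wave:
    (4/pi) * sum over odd k of sin(k*t)/k.
    With more harmonics, the sum approaches +1 or -1."""
    total = 0.0
    for k in range(1, 2 * n_harmonics, 2):  # odd harmonics 1, 3, 5, ...
        total += math.sin(k * t) / k
    return 4.0 / math.pi * total

# One harmonic is a pure sine; many harmonics flatten into a square wave.
print(square_wave_partial(math.pi / 2, 1))     # single sine term
print(square_wave_partial(math.pi / 2, 1000))  # close to 1.0
```

Each added harmonic is as simple as the first, yet together they sharpen the waveform's corners, visually and aurally transforming its character.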
I composed the music myself, in tandem with the visuals.
Volvox Labs: Dubfire
Audio-reactive, real-time visuals and effects developed for Volvox Labs for the Dubfire Live HYBRID tour.
I created several compositions for live playback, with real-time controls to modulate the animations and effects from the stage in sync with the music. Everything was built in TouchDesigner.
This piece was part of Art Hack Day, a 48-hour hackathon where participants work alone or in groups to create a piece centered on art and technology, culminating in a gallery showing.
The theme of this event was 'Erase'. My piece centered on the idea of one's visual memory of faces fading as time progresses.
This was done by setting up a photobooth in which visitors would take a photo of their face and watch it slowly morph into a running average of all the faces captured that night. 180 people took their photo, and the video you see here shows the evolution of that average face over the course of the evening.
This project was done in Unity, using C# and the native Unity libraries on a laptop with a webcam. Users would fit their face inside the circles for eyes and mouth, and then press a key to take a photo. Aside from a single soft box light graciously donated by a friend, nothing else was used.
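The core of the piece is an incremental (running) average: each new photo nudges every pixel toward the mean of all faces seen so far. The actual piece was built in Unity/C#; the sketch below is a hypothetical Python illustration of that update rule, with tiny flat lists standing in for grayscale images.

```python
def update_running_average(avg, new_photo, n):
    """Fold the n-th photo into a per-pixel running average.
    avg and new_photo are flat lists of pixel intensities of equal
    length; n counts all photos so far, including new_photo."""
    return [a + (p - a) / n for a, p in zip(avg, new_photo)]

# Usage: average three "photos" (tiny 2x2 grayscale images).
photos = [[0, 0, 100, 100], [50, 50, 50, 50], [100, 100, 0, 0]]
avg = photos[0]
for n, photo in enumerate(photos[1:], start=2):
    avg = update_running_average(avg, photo, n)
print(avg)  # -> [50.0, 50.0, 50.0, 50.0]
```

The incremental form avoids storing every photo: only the current average and a count are kept, which is why each new face appears to dissolve gradually into the accumulated composite.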
Future plans for this piece include experiments with running-average video and facial-feature recognition to replace the manual alignment step.