I was pleased to perform with my good friend Ryan Schoenherr as NSFW at the end of 2019, before COVID-19 put us on forced hiatus; the break gave us time to rethink what our future live performances could look like. The following is a compilation of realtime work created in TouchDesigner in partnership with Ryan, paired with an original soundtrack of my own creation.
As in years past, I was able to return to the Savannah College of Art and Design with my industry peers to share thoughts with students. And, as in years past, I was struck by how much the students shared back with us; it is wonderful to see how the education at SCAD continues to produce top talent and networking opportunities. This occasion was particularly engaging: several students I had met a year before had updates on how they had grown since we last spoke, and the growth was amazing.
I like to use the forum to speak on future technologies in the design industry, though many of the students appear to be well ahead of me, already building those technologies themselves. While COVID-19 has crippled much of the hiring market for recent graduates, I see you all, and I feel confident you will be the leaders of our industry in the not-so-distant future.
For more about SCAD Comotion, an annual student-run event for the design industry, reach out to their officers at firstname.lastname@example.org or visit their website at scadcomotion.com.
2019 proved to be one of my most difficult years as a design director. We lost the most exciting project of my career, despite my team and me giving it our all. Fortunately, so much was learned and created during that time that I am certain pieces of it will see the light of day again. By leveraging depth tracking from a wide-angle Orbbec camera and a custom software solution, we were able to extract full XYZ tracking data in real time and translate it into TouchDesigner as a simple light orb. No IR tracking necessary.
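To illustrate the idea (this is a minimal sketch, not the production code), depth-to-XYZ extraction boils down to pinhole back-projection: each depth pixel becomes a camera-space point, and a masked centroid gives a single XYZ to drive something like a light orb. The function names and intrinsics (fx, fy, cx, cy) here are placeholder assumptions; the real project used the Orbbec camera's calibration and a custom software layer.

```python
import numpy as np

def depth_to_xyz(depth_m, fx, fy, cx, cy):
    """Back-project a depth image (in meters) to camera-space XYZ per pixel
    using the pinhole model: X = (u - cx) * Z / fx, Y = (v - cy) * Z / fy."""
    h, w = depth_m.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth_m
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    return np.stack([x, y, z], axis=-1)

def tracked_point(depth_m, fx, fy, cx, cy, near=0.4, far=3.0):
    """Centroid of depth pixels within a near/far band: one XYZ per frame,
    suitable for driving a single object (e.g., an orb) in TouchDesigner."""
    xyz = depth_to_xyz(depth_m, fx, fy, cx, cy)
    mask = (depth_m > near) & (depth_m < far)
    if not mask.any():
        return None  # nothing in tracking range this frame
    return xyz[mask].mean(axis=0)
```

In practice the XYZ value would be pushed into TouchDesigner each frame (for example as CHOP channels) rather than computed standalone like this.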
Back in my days at Second Story, our retail partners (particularly a telecoms company) frequently approached us about making their shopping experience simple, or even enjoyable. While it never saw the light of day, my nearly finished Expression Mirror gave visitors a playful experience to ease their shopping anxieties, while thoughtfully placed information helped inform purchasing decisions.
For a long time, we delivered pre-rendered animations to our partners at Oblong Industries to rebuild in their codebase, but the translation between fidelities always pointed to a gap in communication. Not for lack of ability on either side, but because the fidelity afforded by pre-rendered animation was assumed to be higher than what could be achieved in real time. By leveraging the capabilities of TouchDesigner, we created a transfer method that blended the two fidelities more seamlessly, meeting in the middle between code and animation.
Alongside Facebook AR's Anthony Dodero, Leviathan's Brittany Maddock, SMT Design's John Howell, Meptik's Nick Rivero, and Gentleman Scholar's Chris Finn, I had the privilege of representing Design and Technology with some of the industry's finest. We enjoyed answering difficult questions from SCAD faculty and students alike, and formed closer bonds in the process.
I had a little time to explore recreating a common 3D workflow of essential lighting and rendering techniques in a realtime environment, and was surprised by the general lack of approachable explorations on the subject that did not involve extensive GLSL. After standing on the shoulders of a few projects I found in the wild, I compiled something closer to what a 3D artist would be familiar with, and made it easy to approach in TouchDesigner. Then I decided to share it here with you!
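At the heart of that familiar workflow is just shading math. As a minimal sketch (the actual exploration lives in a TouchDesigner network, not in a script like this), here is the classic Blinn-Phong model a 3D artist would recognize: ambient plus Lambert diffuse plus a specular highlight, evaluated for a single surface point. All parameter names and defaults are illustrative.

```python
import numpy as np

def normalize(v):
    return v / np.linalg.norm(v)

def blinn_phong(n, light_dir, view_dir, base_color,
                light_color=(1.0, 1.0, 1.0), shininess=32.0, ambient=0.1):
    """Blinn-Phong shading for one surface point:
    ambient + diffuse (N . L) + specular ((N . H) ** shininess)."""
    n = normalize(np.asarray(n, float))
    l = normalize(np.asarray(light_dir, float))
    v = normalize(np.asarray(view_dir, float))
    h = normalize(l + v)                        # half vector between light and view
    diff = max(np.dot(n, l), 0.0)               # Lambert term
    spec = max(np.dot(n, h), 0.0) ** shininess  # specular highlight
    base = np.asarray(base_color, float)
    light = np.asarray(light_color, float)
    return np.clip(base * (ambient + diff * light) + spec * light, 0.0, 1.0)
```

The same terms map one-to-one onto nodes in a realtime environment, which is what makes the workflow feel familiar to someone coming from offline rendering.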
Working with our partners at Oblong Industries to translate custom PLY files into a realtime, controllable environment built in C#, I developed a process using Agisoft PhotoScan, CloudCompare, and Cinema 4D to blend actual photogrammetric scan data with colored 3D mesh information. The question: could photogrammetric point cloud data with color information run at high FPS while maintaining 15+ million particles and custom injected 3D meshes? The answer was yes.
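For readers curious about the PLY side, the vertex layout these photogrammetry tools export is simple enough to parse by hand. This is a minimal sketch of an ASCII PLY reader for per-vertex position and color; the actual pipeline relied on the tools above plus custom C# code, and binary PLY (which very large scans typically use) is deliberately not handled here.

```python
import numpy as np

def read_ascii_ply_points(path):
    """Minimal reader for ASCII PLY point clouds with per-vertex
    x, y, z and red, green, blue properties.
    Returns (positions as Nx3 float32, colors as Nx3 uint8)."""
    with open(path) as f:
        props = []
        n = 0
        for line in f:
            tok = line.split()
            if tok[0] == "element" and tok[1] == "vertex":
                n = int(tok[2])             # vertex count from the header
            elif tok[0] == "property":
                props.append(tok[-1])       # property names, in column order
            elif tok[0] == "end_header":
                break
        ix = [props.index(p) for p in ("x", "y", "z")]
        ic = [props.index(p) for p in ("red", "green", "blue")]
        # Read exactly n vertex rows from the current file position.
        data = np.atleast_2d(np.loadtxt(f, max_rows=n))
        return data[:, ix].astype(np.float32), data[:, ic].astype(np.uint8)
```

Once positions and colors are in flat arrays like this, they can be streamed into a GPU particle buffer, which is what makes tens of millions of points viable at interactive rates.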
My colleagues Angelica Jang, Crystal Law, and I often found it hard to articulate what types of work and backgrounds qualify as motion design. While trying to answer this question for a presentation at Local Projects, I cannot say we landed on any definitive answers. We did, however, expand our own insights through the process, so we recorded them and decided to share them with others as well.