Future Creators is a Google-sponsored project at the Art Center College of Design. The project uses an ever-expanding array of Extended Reality (VR/AR/MR) technologies — spatial sensors, photogrammetry, motion capture, controllers, and displays — to interweave the digital and the physical, and to imagine new hybrid tools for the future creators of 2023. With a partner, I prototyped and designed an AR program and an online creation-and-distribution platform, working closely with an Adult Swim costume designer as our use case.
Chris Taylor & Tianlu Tang
Rachel Kinnard (Case Study)
The first week was spent exploring different methods of application and problem solving. The initial tests used iron-on patches of AR-compatible image targets and a rudimentary time-based AR treatment. The focal points and conclusions of this initial investigation were the correlation between plane relationships and tracking ability, as well as the value of time-based AR effects.
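The "time-based treatment" idea — an effect that unfolds over elapsed time once an image target on the garment is first tracked — can be sketched in a few lines. This is an illustrative sketch only, not the project's actual code; the class and method names (`TimedAREffect`, `on_target_found`, `progress`) are hypothetical.

```python
import time


class TimedAREffect:
    """Sketch of a time-based AR effect: once the garment's image target
    is detected, an effect parameter (e.g. opacity or scale of a graphic)
    ramps from 0.0 to 1.0 over a fixed duration. Names are hypothetical."""

    def __init__(self, duration_s=5.0):
        self.duration_s = duration_s
        self.start_time = None  # set on first successful track

    def on_target_found(self, now=None):
        # Start the effect clock the first time the target is tracked.
        if self.start_time is None:
            self.start_time = now if now is not None else time.monotonic()

    def progress(self, now=None):
        # 0.0 before detection, ramping linearly to 1.0 over duration_s.
        if self.start_time is None:
            return 0.0
        now = now if now is not None else time.monotonic()
        return min(1.0, (now - self.start_time) / self.duration_s)
```

In a real AR runtime (e.g. a Unity or Vuforia tracking callback), `on_target_found` would be called from the target-detected event and `progress` sampled each frame to drive the graphic.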
WEEK 2: AR/VR MOCAP
The second week explored a multi-pronged approach, mixing the first week's AR effects with VR-based motion tracking under a similar treatment. The rationale was to investigate possible solutions to the fabric-based image-tracking bottleneck. The Vive trackers proved more accurate and consistent, but shared similar "blind spot" issues with their AR-based counterpart. Ultimately, this iteration confirmed the value of large image targets with time-based AR treatments.
WEEK 3: MULTI PLANE
Expanding on the previous week's iterations, this week focused on a study of possible trackable body planes. Using a series of targets varying in size, we applied them to all the primary body planes and explored as many camera angles as possible. The goal was to test our existing belief that the chest, back, and arm targets were the most successful planes and the most likely to be used by future creators.
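A plane study like this boils down to logging, for each body plane and camera angle, whether the target was tracked or lost, then comparing tracking rates. The sketch below illustrates that bookkeeping; the function and field names are hypothetical, not taken from the project.

```python
from collections import defaultdict


def tracking_rates(observations):
    """Compute per-plane tracking rates from logged frames.

    `observations` is a list of (plane, tracked) tuples, e.g. recorded
    while circling the wearer with the camera. Returns, for each plane,
    the fraction of frames in which its target was successfully tracked.
    Illustrative only; not the project's actual instrumentation.
    """
    hits = defaultdict(int)
    totals = defaultdict(int)
    for plane, tracked in observations:
        totals[plane] += 1
        hits[plane] += int(tracked)
    return {plane: hits[plane] / totals[plane] for plane in totals}
```

Comparing these rates across planes is what would support (or refute) the belief that chest, back, and arm targets track best.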
WEEK 4: MULTI EFFECT
The fourth and final week of AR treatment exploration took place in the wild of Los Angeles. We strove to explore real-world scenarios in a non-lab environment. Using a handful of different effects, most of them time-based, we continued our investigations into trackability and effect value. We also explored different methods of applying the graphics to the fabric, using both our initial iron-on technique and, later, sewn-on printed fabric. Ultimately, we concluded that the self-replicating AR treatment was successful, and decided to continue our work with large image targets on primary planes of the body.
GARMENT DESIGN WITH REALTIME 3D SIMULATION
SWITCH TO AR CONTENT DESIGN WORKFLOW
PUBLISH TO CAMERA NETWORK OR DEVICES
CONSUMERS CAN VIEW VIRTUAL RUNWAY LIVE