People’s Choice Award
The ‘Aerobanquets RMX’ is a series of immersive, augmented sensorial experiences focused on taste and perception.
Loosely based on the Futurist Cookbook, the (in)famous Italian book of surreal dinners and recipes first published in 1932, ‘Aerobanquets’ premiered at the Chronus Art Center in Shanghai in spring 2018. The project is a collaboration with chef Flavio Ghignoni Carestia, who created an original menu based on the Futurist cooking style.
Part-manifesto, part-artistic joke, the Futurist Cookbook is a collection of recipes, experiments, declamations and allegorical tales: here are recipes for ice cream on the moon; candied atmospheric electricities; nocturnal love feasts; sculpted meats. The Futurist Cookbook is a provocative, visionary work on the future of nourishment, which well ahead of time touched on issues of post-capitalistic societies and labor that are so relevant to our times.
Aligned with the Futurist notion of a ‘total work of art’, ‘Aerobanquets RMX’ is a veritable multi-sensorial journey encompassing all the senses: sight, smell, hearing, taste, and touch.
The project deploys several technologies, software packages and custom solutions from the fields of 3D modelling, CGI, augmented reality, VR and motion tracking.
In collaboration with Flavio Ghignoni Carestia, we created thirteen original dishes inspired by the recipes of the Futurist Cookbook. The flavor profile of each dish was then categorized into parameters for shapes, colors, textures and points, which were fed into a 3D engine to generate thirteen unique digital models. These models were then used as virtual counterparts to the actual food, which was tracked by a room-scale motion-tracking system (OptiTrack) by means of reflective markers embedded in custom-designed utensils. The VR content was designed in Cinema 4D and Unity. Through a series of custom scripts and OptiTrack’s Unity assets, the relative positions of each participant’s utensils and headset were broadcast over the local network, enabling participants to interact with one another in a seemingly shared virtual environment.
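The flavor-to-geometry step described above can be sketched as follows. This is a minimal illustration in Python (the project’s actual pipeline ran through Cinema 4D and Unity); every parameter name and mapping here is hypothetical, standing in for whatever categorization the artists actually used.

```python
# Hypothetical sketch: translate a dish's flavor profile into parameters
# for procedural 3D model generation. The real project fed comparable
# parameters into a 3D engine; names and mappings here are illustrative.

def flavor_to_shape(profile):
    """Map flavor intensities (0.0-1.0) to geometry parameters."""
    sweet = profile.get("sweet", 0.0)
    sour = profile.get("sour", 0.0)
    bitter = profile.get("bitter", 0.0)
    spicy = profile.get("spicy", 0.0)
    return {
        # sweeter dishes -> smoother, rounder forms
        "roundness": sweet,
        # sour and spicy notes -> spikier surface noise
        "spike_amplitude": 0.5 * sour + 0.5 * spicy,
        # bitterness shifts the palette toward darker tones (0 = light, 1 = dark)
        "darkness": bitter,
        # overall intensity drives mesh detail
        "subdivisions": 2 + round(4 * max(sweet, sour, bitter, spicy)),
    }

params = flavor_to_shape({"sweet": 0.8, "sour": 0.2, "spicy": 0.4})
```

Each dish would yield one such parameter set, giving thirteen distinct digital models from thirteen distinct flavor profiles.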
A second level of immersion and interaction was enabled by the Leap Motion IR hand-tracking system, mounted on the front of each headset. This allowed real-time hand tracking in Unity (via the Orion SDK), adding natural gesture and haptics to the experience. Several custom-made C# scripts were deployed within Unity to adjust parallax discrepancies between the Leap Motion and OptiTrack systems, and to assess the relative positions of hands and utensils, or of a utensil and a participant’s mouth.
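Reconciling the two tracking systems amounts to expressing Leap Motion hand positions in the same coordinate frame as the OptiTrack data. The project’s actual correction lived in C# scripts inside Unity; the sketch below shows only the simplest case, a translational calibration in Python, with all function names assumed for illustration.

```python
# Hypothetical sketch of aligning two tracking coordinate frames.
# A single reference point visible to both systems yields a translation
# that maps Leap-frame positions into the shared OptiTrack world frame.
# (A full solution would also solve for rotation and scale.)

def calibrate_offset(leap_point, optitrack_point):
    """Offset mapping a Leap-frame point onto the OptiTrack frame,
    computed from one shared reference point seen by both systems."""
    return tuple(o - l for o, l in zip(optitrack_point, leap_point))

def to_world(leap_point, offset):
    """Transform a Leap-frame hand position into the shared world frame."""
    return tuple(p + d for p, d in zip(leap_point, offset))

# Calibrate once from a shared reference point, then transform live data.
offset = calibrate_offset((0.1, 0.0, 0.2), (1.1, 0.5, 0.2))
hand_world = to_world((0.3, 0.1, 0.4), offset)
```

Once hands, utensils and headsets share one frame, relative distances between them become simple vector arithmetic.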
This let us detect the moment a participant savored the food, and design a responsive, immersive system centered on the perception of taste.
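One plausible form of that “savor” trigger is a proximity test: fire a taste event when the tracked utensil enters a small sphere around the participant’s mouth, estimated from the headset pose. The threshold value and all names below are assumptions for illustration, not the project’s actual implementation.

```python
import math

# Hypothetical sketch of a savor trigger: the utensil tip entering a
# sphere around the mouth position fires a taste event. The radius is
# an assumed value, not the project's real parameter.

MOUTH_RADIUS_M = 0.08  # assumed trigger distance in meters

def distance(a, b):
    """Euclidean distance between two 3D points."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def is_savoring(utensil_pos, mouth_pos, radius=MOUTH_RADIUS_M):
    """True when the utensil tip is within the sphere around the mouth."""
    return distance(utensil_pos, mouth_pos) < radius

# A utensil about 5 cm from the mouth falls inside the trigger radius.
event = is_savoring((0.0, 1.50, 0.30), (0.0, 1.55, 0.30))
```

Driving the virtual dish’s response from this event is what makes the system feel synchronized with the act of tasting.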