To learn about a hug, one must actually experience a hug. Descriptions, visualizations, and models fall short of wrapping your arms around a body. In parallel with plastic scale models, I am rendering body-scale models for viewers to interact with physically. The initial iteration of these huggable data is a mobile AR application: a small printed adhesive image triggers a virtual model, rendered into the world captured by a smartphone camera.
This floating model can then be hugged, held, measured, etc. via pantomime. Absent haptic sensation, a participant must imagine a body between their arms.
The app is designed for two users and achieves its full value that way. One user holds the phone and aims the camera; the other wears the image and interacts. Performed in this way, the wearer cannot see the model with which they interact – they have to guess, or take direction from their partner. My absurd representation of a social interaction through a cooperative app therefore creates a second social interaction in its performance.
This double enactment-reenactment folds embodied and virtual experience into an actively social medium.