
Starfall - Interactive Exhibition

Publicly Interactive Exhibition

Projection Mapping

Soundscape Development

Artificial Intelligence

Computer Vision


Queen's University


Kingston, Ontario, Canada


This project emerged from a call for artistic installations seeking to transform the way we view our surroundings.

We used computer vision techniques and custom software to let guests interact virtually with a projection-mapped visual space. The piece was further complemented by a user-generated soundscape meant to evoke a sense of belonging and communion with the work.

A series of cameras captured each visitor's RGB + depth image and fed it into open-source AI image generation software, which output a scene description and 'interpretation' of the captured material. The result was a 'digital mirror': visitors could interact with the piece using props and body movement while watching their reflection in the digital visualization.
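The capture-and-interpret loop described above can be sketched roughly as follows. This is a minimal illustration, not the installation's actual code: the camera feed and the AI model are stubbed out with placeholders (random frames and a toy caption function), and all function names are hypothetical. Only the data flow matches the description: grab an RGB + depth frame, isolate the visitor with the depth channel, hand the frame to an interpretation step, and composite the 'digital mirror' output.

```python
import numpy as np

def capture_rgbd(width=640, height=480):
    """Stand-in for the RGB + depth camera feed (random frames here)."""
    rgb = np.random.randint(0, 256, (height, width, 3), dtype=np.uint8)
    depth = np.random.uniform(0.5, 4.0, (height, width)).astype(np.float32)
    return rgb, depth

def segment_visitor(depth, near=0.5, far=2.5):
    """Mask pixels inside the interaction zone using the depth channel."""
    return (depth >= near) & (depth <= far)

def describe_scene(rgb, mask):
    """Placeholder for the AI model's scene 'interpretation'.
    A real installation would send the masked frame to an image
    generation / captioning model; here we return a toy description."""
    coverage = mask.mean()
    pose = "close to the canvas" if coverage > 0.5 else "at a distance"
    return f"a visitor {pose}, filling {coverage:.0%} of the frame"

def digital_mirror(rgb, mask):
    """Composite the visitor's silhouette into the projected visual."""
    out = np.zeros_like(rgb)
    out[mask] = rgb[mask]       # visitor pixels pass through
    out[~mask] = (20, 10, 60)   # background replaced by a starfield hue
    return out

rgb, depth = capture_rgbd()
mask = segment_visitor(depth)
caption = describe_scene(rgb, mask)
frame = digital_mirror(rgb, mask)
```

In a live build, `capture_rgbd` would wrap a depth camera SDK and `describe_scene` a generative model call, with `frame` sent to the projection-mapping renderer each tick.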


Use cutting-edge AI tools and computer vision to create a one-of-a-kind public exhibition.

