
The Narratarium

With: K. Hayden, D. Novy, M. Bove, S. Alfaro, P. Colon, and R. Speer

Imagine what would happen if you could tell a story with a computer: as you read a book or make up a tale, the computer would illustrate it dynamically. It would build color palettes to fit your topic or mood, create an immersive graphical environment, and find images and sounds to fit the story you tell. We built several versions of a system called the Narratarium to do just that.


The Narratarium augments printed and oral stories and creative play by projecting immersive images and sounds. We use natural language processing to listen to and understand stories as they are told, and analysis tools to recognize activity among sensor-equipped objects such as toys, then thematically augment the environment with video and sound. Newer work addresses the creation and representation of audiovisual content for immersive story experiences and the association of that content with viewer context.
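
To give a sense of the kind of mapping involved, here is a minimal sketch in Python. It assumes a speech-to-text transcript is already available and uses a simple keyword-to-theme lookup to choose a color palette and a sound cue; the theme table, function names, and file names are invented for illustration and do not reflect the project's actual implementation.

```python
# Hypothetical sketch: map words in a story transcript to a projection theme.
# Theme names, keywords, palettes, and sound files are illustrative only.
THEMES = {
    "ocean":  {"keywords": {"sea", "whale", "boat", "wave"},
               "palette": ["#023e8a", "#0077b6", "#90e0ef"],
               "sound": "waves.wav"},
    "forest": {"keywords": {"tree", "bear", "bird", "leaves"},
               "palette": ["#2d6a4f", "#52b788", "#b7e4c7"],
               "sound": "birdsong.wav"},
}

def pick_theme(transcript: str):
    """Return the theme whose keywords overlap most with the transcript."""
    words = set(transcript.lower().split())
    best, best_score = None, 0
    for name, theme in THEMES.items():
        score = len(words & theme["keywords"])
        if score > best_score:
            best, best_score = name, score
    return best  # None if nothing matched

if __name__ == "__main__":
    story = "The little boat drifted past a sleeping whale on a calm sea."
    theme = pick_theme(story)
    if theme:
        print(theme, THEMES[theme]["palette"], THEMES[theme]["sound"])
```

In practice the matching would be far richer (semantic associations rather than exact keywords, plus sensor input from instrumented toys), but the overall flow is the same: listen, infer a theme, then drive the projected colors, imagery, and sound from it.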

This MIT Media Lab project was a collaboration with the Object Based Media Group and Hallmark Cards.
