FURI | Fall 2018
Rendering of Patterns on Social Mirror for Individuals who are Blind
This research seeks to assist people who are blind or visually impaired in conversations by letting them feel others' facial expressions through vibrations. The project has developed a method to project facial expressions from videos and images onto a haptic face display, a rectangular grid of round vibration motors: each frame analyzed by the OpenFace software is divided into a matching grid, and the display's motors are actuated according to the detected facial action units. Future plans for the project include designing and performing user studies to evaluate the user experience.
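The mapping described above can be sketched in a few lines of Python. Everything here beyond the general idea is an assumption for illustration: the 4x6 grid size, the specific action units, their placement on the grid, and the 0-255 duty-cycle output are all hypothetical, not the project's actual design.

```python
# Minimal sketch (hypothetical layout): map OpenFace action-unit
# intensities (0-5 scale) onto a rectangular grid of vibration motors.
ROWS, COLS = 4, 6  # assumed motor-grid dimensions

# Hypothetical placement of a few action units on the grid:
# (row, col) cells each AU drives. A real mapping would follow where
# each AU appears within the face frame's grid division.
AU_TO_CELLS = {
    "AU12_lip_corner_puller": [(3, 1), (3, 4)],  # smile
    "AU04_brow_lowerer":      [(0, 2), (0, 3)],  # furrowed brow
    "AU26_jaw_drop":          [(3, 2), (3, 3)],  # open mouth
}

def motor_pattern(au_intensities, rows=ROWS, cols=COLS):
    """Convert AU intensities (0-5) to per-motor duty cycles (0-255)."""
    grid = [[0] * cols for _ in range(rows)]
    for au, intensity in au_intensities.items():
        duty = int(min(max(intensity, 0.0), 5.0) / 5.0 * 255)
        for r, c in AU_TO_CELLS.get(au, []):
            grid[r][c] = max(grid[r][c], duty)  # strongest AU wins per motor
    return grid

# Example: a moderate smile activates the two "lip corner" motors.
pattern = motor_pattern({"AU12_lip_corner_puller": 2.5})
```

In a live system, each video frame's AU intensities from OpenFace would be fed through a mapping like this, and the resulting grid streamed to the motor driver board.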
Hometown: Tempe, Arizona
Graduation date: Spring 2021