Harvard University, Graduate School of Design
Artifacts as Media, Spring 2022
Instructor: Jose Luis Garcia del Castillo Lopez
Neil Harbisson is a well-known cyborg artist whose implant allows him to hear colors. In an interview he famously said, "When I look at someone I hear their face," and noted that a face that looks beautiful to the eye might sound terrible to him because of the particular combination of colors that make up its features. This idea of hearing a person's face inspired an exploration of whether technology could convert facial features into sound at the level of individual pixels. One potential application is a platform like Instagram, where selfies abound: sonified portraits could help people with visual impairments form an impression of what others look like.
Using p5.js and Tone.js, the following exploration sonifies the image pixel by pixel: just as each pixel can be sampled for its specific RGB color, each of its three channels is mapped to a different instrument. When the mouse hovers over a pixel, that pixel is heard.
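
Below is a minimal sketch of how such a mapping might be wired up with p5.js and Tone.js. The image file name, the choice of instruments (Tone.Synth, Tone.FMSynth, Tone.MembraneSynth), and the frequency ranges are illustrative assumptions, not the project's exact settings.

```javascript
// Minimal p5.js + Tone.js sketch: the R, G, B channels of the hovered pixel
// each drive a different instrument. Instruments and ranges are assumptions.

let img;

// One instrument per color channel (assumed mapping).
const redSynth = new Tone.Synth().toDestination();          // R -> basic oscillator
const greenSynth = new Tone.FMSynth().toDestination();      // G -> FM synthesis
const blueSynth = new Tone.MembraneSynth().toDestination(); // B -> drum-like synth

let lastTrigger = 0; // throttle so fast mouse movement doesn't pile up notes

function preload() {
  img = loadImage('portrait.png'); // hypothetical file name
}

function setup() {
  createCanvas(img.width, img.height);
}

function draw() {
  image(img, 0, 0);
}

function mousePressed() {
  // Browsers only allow audio to start after a user gesture.
  Tone.start();
}

function mouseMoved() {
  // Ignore positions outside the canvas and limit the trigger rate.
  if (mouseX < 0 || mouseX >= width || mouseY < 0 || mouseY >= height) return;
  if (millis() - lastTrigger < 100) return;
  lastTrigger = millis();

  // Sample the hovered pixel: get() returns [r, g, b, a], each 0-255.
  const [r, g, b] = img.get(mouseX, mouseY);

  // Map each channel's value to a pitch and play a short note per instrument.
  redSynth.triggerAttackRelease(map(r, 0, 255, 200, 800), '16n');
  greenSynth.triggerAttackRelease(map(g, 0, 255, 200, 800), '16n');
  blueSynth.triggerAttackRelease(map(b, 0, 255, 60, 200), '16n');
}
```

Throttling the triggers keeps rapid mouse movement from overwhelming the monophonic synths with overlapping notes.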

base image source: happycoding

Hear how different each portrait sounds. Turn sound on when watching the following videos. 
base image source: thispersondoesnotexist.com