EYEAR





"Eyear" is a human-centered device which is developed to augment human's perception of environment.

Humans are used to seeing through their eyes and hearing through their ears. This conventional model leaves plenty of room for improvement. With digital technology, we can process the sound waves around us and use them to modify our vision. Eyear overlays hearing onto vision, prioritizing layers of the visual field according to the audio levels of the various sound sources. By creating new rules that augment and restrict the sensory organs, the device lets people see what they hear, showing only the parts of their surroundings that produce sound. This digital intervention in the senses creates an entirely new way of perceiving the environment.

Eyear consists of a camera, an iPad serving as a remote display, four sound sensors oriented in four different directions, and an Arduino board with a circuit connecting all the sensors.
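
As a rough illustration of how these parts connect, the sketch below shows the display side reading the four sensor values that the Arduino streams over a serial connection. The comma-separated message format, the port index, and the baud rate are assumptions for the sake of the example, not the project's actual protocol.

    import processing.serial.*;

    Serial arduino;                  // serial link to the Arduino board
    float[] levels = new float[4];   // audio levels: front, right, back, left

    void setup() {
      size(640, 480);
      // Port index and baud rate are assumptions; adjust to the actual setup.
      arduino = new Serial(this, Serial.list()[0], 9600);
      arduino.bufferUntil('\n');
    }

    void serialEvent(Serial port) {
      // Assume the Arduino sends one comma-separated reading per line,
      // e.g. "312,87,140,65".
      String line = port.readStringUntil('\n');
      if (line == null) return;
      String[] parts = split(trim(line), ',');
      if (parts.length == 4) {
        for (int i = 0; i < 4; i++) {
          levels[i] = float(parts[i]);
        }
      }
    }

    void draw() {
      background(0);   // the visualization described below would use levels[] here
    }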

The visualization is written in Processing. The differences between the values of the four sound sensors are used to estimate the location of a sound source, and that position changes the transparency of the grid that covers the view. The size of each highlighted area is determined by the corresponding audio level, while its position on the display indicates the direction the sound comes from, with the upper and lower halves of the display representing front and back.
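
A minimal Processing sketch of this mapping might look like the following. The sensor ordering (front, right, back, left), the grid resolution, and the scaling constants are assumptions made for illustration; in the actual device the grid is drawn over the live camera image rather than over a flat background.

    float[] levels = new float[4];   // front, right, back, left (updated by the serial code above)
    int cols = 16;                   // grid resolution: assumed values
    int rows = 12;

    void setup() {
      size(640, 480);
      noStroke();
    }

    void draw() {
      // Stand-in for the camera image that the grid normally covers.
      background(80, 120, 160);

      // Estimate the source position from the sensor differences:
      // right minus left drives x, front minus back drives y,
      // so the upper half of the display reads as front and the lower half as back.
      float total = levels[0] + levels[1] + levels[2] + levels[3] + 1;
      float x = width / 2 + (levels[1] - levels[3]) / total * width / 2;
      float y = height / 2 - (levels[0] - levels[2]) / total * height / 2;

      // Louder sounds open a larger transparent region in the grid.
      // 4 * 1023 assumes 10-bit Arduino analog readings.
      float radius = map(total, 0, 4 * 1023, 50, width / 2);

      float cw = width / float(cols);
      float ch = height / float(rows);
      for (int i = 0; i < cols; i++) {
        for (int j = 0; j < rows; j++) {
          float cx = (i + 0.5) * cw;
          float cy = (j + 0.5) * ch;
          // Cells near the estimated source fade out and reveal the view;
          // cells far from it stay opaque.
          float a = constrain(map(dist(cx, cy, x, y), 0, radius, 0, 255), 0, 255);
          fill(0, a);
          rect(i * cw, j * ch, cw, ch);
        }
      }
    }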