This project investigates the use of micro-controllers to simulate the function of one section of the human hearing system, by creating a visual representation of the hair cells inside the ear reacting to a sound stimulus. An Arduino board executed code to detect the individual frequencies present in an audio input, and this data was used to switch each of a series of light bulbs on or off.
The installation is a live, reactive visual representation of the section of the ear that detects different frequencies. It consists of a sequence of light bulbs suspended on a frame. An Arduino reads an audio input, which can be a piece of music, a human voice, or any other kind of sound. Code running on the board analyses the sound to determine the individual frequencies present, including the loudest, or most dominant. Each bulb is associated with a particular frequency range: when that range is detected as the most dominant, its bulb switches on. Each time a new dominant frequency is detected, a new bulb switches on and the previous one switches off.
Human hearing involves an extremely complex series of functions. The aim of this project was to show one aspect of this by turning an audio input into a visual output with the use of a micro-controller. The project succeeded in providing a visual representation of one aspect of the hearing system, and it could be expanded in the future into a larger, more responsive installation.