Robot Senses with Sound
Robots that perceive their environment through video cameras are common nowadays, but SonicSense, developed in Boyuan Chen's General Robotics Lab at Duke University, perceives and evaluates objects through sound instead: it measures their acoustic vibrations, converts that data into a spectrogram, and applies state-of-the-art machine learning to interpret it.
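To make the pipeline concrete, here is a minimal sketch (not the SonicSense codebase) of the first step: turning a contact-microphone recording of a tap into a log-scaled spectrogram that a learned model could consume. The file name, window sizes, and shapes are illustrative assumptions.

```python
# Sketch: contact-microphone recording -> log spectrogram (assumed parameters).
import numpy as np
from scipy.io import wavfile
from scipy.signal import spectrogram

sample_rate, vibration = wavfile.read("tap_recording.wav")  # hypothetical recording
vibration = vibration.astype(np.float32)
if vibration.ndim > 1:                       # mix down if the file is multi-channel
    vibration = vibration.mean(axis=1)

# Short-time Fourier analysis: frequency bins x time frames
freqs, times, power = spectrogram(vibration, fs=sample_rate,
                                  nperseg=1024, noverlap=512)
log_spec = 10 * np.log10(power + 1e-10)      # decibel scale as a model-friendly input
print(log_spec.shape)                        # e.g. (513, n_frames)
```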

Through a holistic design of hardware and software, contact microphones inside the “fingers” of the robotic hand tap an object, listening to the sounds and feeling the vibrations it produces. An integrated AI system then interprets those signals to infer the object's shape, size, weight, and even the material it is made of.
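As a rough illustration of that interpretation step, the hedged sketch below (not the authors' model) shows a tiny convolutional network that maps a finger's log spectrogram to a material class. The layer sizes and the number of material classes are assumptions for illustration only.

```python
# Sketch: spectrogram -> material-class logits with a small CNN (assumed architecture).
import torch
import torch.nn as nn

class MaterialNet(nn.Module):
    def __init__(self, n_materials: int = 4):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),          # pool to a fixed-size embedding
        )
        self.classifier = nn.Linear(32, n_materials)

    def forward(self, spec: torch.Tensor) -> torch.Tensor:
        # spec: (batch, 1, freq_bins, time_frames) log spectrogram from a finger mic
        return self.classifier(self.features(spec).flatten(1))

model = MaterialNet()
dummy = torch.randn(1, 1, 513, 128)           # fake spectrogram for a shape check
print(model(dummy).shape)                      # torch.Size([1, 4]) material logits
```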

Almost everything is open source: apart from the base body-frame mechanism, the hand parts are 3D printed, and the microphones and other electronic components are all consumer grade, which keeps costs low (just under $200 for the hand and microphones).

In-hand acoustic vibration sensing is a significant step toward richer robot tactile perception. Eventually, more dexterous robotic hands could combine it with other sensory modalities, such as pressure and temperature, to handle tasks that demand a nuanced sense of touch. And in the medical domain, where certain tissue types are difficult to see, acoustic sensing like this could make the unseeable tangible and help doctors plan treatment.