Researchers at MIT have worked out a way of making invisible details in video visible to the naked eye.
Software from MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL) amplifies variations between successive frames of video that are imperceptible to the naked eye.
It makes it possible to “see” someone’s pulse, as the skin reddens and pales with the flow of blood.
It can also exaggerate tiny motions, making visible the vibrations of individual guitar strings or the breathing of a baby in a neonatal intensive care unit.
It works a bit like an equaliser in a stereo, boosting some frequencies while cutting others. Instead of sound, though, it works on colour changes across a sequence of video frames.
The prototype of the software allows the user to specify the frequency range of interest and the degree of amplification.
It works better with phenomena that recur at regular intervals – like the beating of a heart – because it can only amplify changes that happen more than once. It also works best if the motions are extremely small.
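The idea of picking out a frequency band of interest and amplifying it can be sketched in a few lines. The code below is a hypothetical illustration, not CSAIL's actual implementation: it treats one pixel's brightness over time as a signal, keeps only the temporal frequencies in a user-chosen band, scales them up, and adds them back.

```python
import numpy as np

def amplify_band(trace, fps, lo_hz, hi_hz, alpha):
    """Boost temporal variation between lo_hz and hi_hz in a per-pixel
    intensity trace by a factor alpha (illustrative sketch only)."""
    freqs = np.fft.rfftfreq(len(trace), d=1.0 / fps)
    spectrum = np.fft.rfft(trace)
    in_band = (freqs >= lo_hz) & (freqs <= hi_hz)
    spectrum[~in_band] = 0.0          # keep only the band of interest
    boosted = np.fft.irfft(spectrum, n=len(trace))
    return trace + alpha * boosted    # add the amplified variation back

# Example: a faint 1 Hz "pulse" riding on constant brightness,
# far too small to see, amplified 50x within a 0.8-1.2 Hz band.
fps = 30
t = np.arange(0, 10, 1.0 / fps)
trace = 100.0 + 0.1 * np.sin(2 * np.pi * 1.0 * t)
out = amplify_band(trace, fps, 0.8, 1.2, alpha=50)
```

In a real system this filtering would run on every pixel (or on a spatially smoothed pyramid of the frames), but the principle is the same: variations in the chosen frequency band are boosted, everything else is left alone.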
The software could compare different images of the same scene, allowing the user to easily pick out changes that might otherwise go unnoticed.
Michael Rubinstein, Eugene Shih and professors William Freeman, Fredo Durand and John Guttag intended the system to amplify colour changes, but found that it amplified motion as well.
Rubinstein thinks the idea could be used for “contactless monitoring” of hospital patients’ vital signs.