The ability to see through walls would rank high on most people’s superpower wishlist, particularly for a soldier in a warzone.
Luckily, engineers at MIT’s Lincoln Laboratory have been working on real-time ‘x-ray vision’ that can detect any movement – even on the opposite side of an eight-inch-thick concrete wall.
The scientists use radio waves that pass through solid walls, reflect off whatever is moving on the other side and return, producing a rudimentary picture of what is happening there.
More impressively, they can do so from a distance.
The radar is made up of two rows of antennas, eight receiving at the top and thirteen transmitting at the bottom. These are all mounted on a movable cart, along with the equipment needed to process the sensor’s readings.
The system was tested on four- and eight-inch-thick concrete walls, with radio waves sent at certain frequencies. Most of the radar’s energy is reflected straight back by the wall: only about 0.0025 percent of the transmitted signal makes it through the wall and back.
However, this loss is relatively easy to overcome with signal amplifiers that boost what little signal survives. The harder problem has been processing the images at the speed, resolution and range required for them to be useful in real time.
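To get a feel for the scale of that loss, the quoted figure can be converted into the decibel form engineers use for link budgets (the 0.0025 percent figure is from the article; the conversion below is just standard arithmetic):

```python
import math

# Round-trip transmission quoted in the article: only 0.0025 percent
# of the signal survives the trip through the wall and back.
round_trip_fraction = 0.0025 / 100  # 2.5e-5

# Express that loss on the decibel scale.
loss_db = 10 * math.log10(round_trip_fraction)
print(f"Round-trip loss: {loss_db:.1f} dB")  # about -46 dB
```

A loss of roughly 46 dB is large but well within what cascaded amplifier stages can recover, which is why amplification alone solves this part of the problem.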
The team was able to reach a real-time frame rate of 10.8 frames per second, even at a range of 60 feet from the wall.
This is possible using the same S-band frequencies used by wi-fi, as longer wavelengths would require much larger apparatus. S-band’s fairly short wavelength loses more signal passing through concrete, but the amplifiers compensate for that, meaning the device can be kept to roughly eight feet long.
Even with amplification, the wall still produces by far the brightest return in the computed image. The team solves this by filtering the data so that only signals at frequencies corresponding to the range of the moving target are kept, essentially deleting the wall from the video.
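The effect of that filtering can be shown with a toy sketch. All numbers here are invented for illustration, and the frame-to-frame subtraction below is a deliberately simplified stand-in for the frequency-domain filter the article describes – both cancel static returns like the wall while keeping the mover:

```python
import numpy as np

# Toy range profiles for two consecutive radar frames: the wall is a
# strong stationary return, the person a faint return that moves.
ranges_m = np.arange(0, 20, 0.5)

def frame(person_range):
    profile = np.zeros_like(ranges_m)
    profile[ranges_m == 5.0] = 100.0         # bright, stationary wall at 5 m
    profile[ranges_m == person_range] = 1.0  # faint moving target
    return profile

# Subtracting consecutive frames cancels every static return, deleting
# the wall from the image while the moving target survives.
diff = frame(10.5) - frame(10.0)
print("wall residue:", diff[ranges_m == 5.0][0])  # 0.0 -- wall removed
```

The bright-but-static wall vanishes entirely, while the weak moving return remains, which is exactly the behaviour the article attributes to the real filter.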
One limitation is that stationary objects are not picked up at all. While this matters little for detecting enemy combatants on the move, it presumably makes the device somewhat less useful for search and rescue. The researchers claim, however, that the system’s sensors can pick up even small movements.
At the moment, any movement that shows up is digitised into a ‘blob’ that moves around the screen in a bird’s-eye view, but the researchers are working on algorithms to make the interface more user friendly.