Boffins trying to get computers to see are being helped out by the latest computer-on-a-chip technology.
Apparently real-time object recognition requires huge wodges of processing power and a whole room of servers to pull off.
But if you could manage it on a mobile phone or hand-held device the world would be your oyster.
A small group of boffins at Yale University claim to have developed a “supercomputer” able to do real-time object recognition.
Dubbed NeuFlow, it is a neural network mapped onto an FPGA, and can easily fit on the top of a desk, in a car, and eventually into a smartphone.
Top boffin Eugenio Culurciello of Yale’s School of Engineering & Applied Science said NeuFlow works by simulating the mammalian visual system.
It uses vision algorithms based on convolutional neural networks (ConvNets), mapped onto a Xilinx Virtex 6 FPGA clocked at 200MHz.
Custom hardware matters because it speeds up the ConvNet systems, enabling larger networks with better object discrimination.
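For the curious, the workhorse operation a ConvNet hammers on is the 2D convolution, which is exactly the sort of repetitive number-crunching an FPGA can parallelise. A minimal sketch in Python (NumPy assumed; this is an illustration of the operation, not Yale's actual code):

```python
import numpy as np

def conv2d_valid(image, kernel):
    """Slide a kernel over an image ('valid' mode, no padding) --
    the basic building block a ConvNet accelerator parallelises."""
    kh, kw = kernel.shape
    ih, iw = image.shape
    out = np.zeros((ih - kh + 1, iw - kw + 1))
    for y in range(out.shape[0]):
        for x in range(out.shape[1]):
            # Multiply the window under the kernel element-wise and sum
            out[y, x] = np.sum(image[y:y + kh, x:x + kw] * kernel)
    return out

# A toy 5x5 "image" and a 3x3 vertical-edge kernel
image = np.arange(25, dtype=float).reshape(5, 5)
kernel = np.array([[-1, 0, 1]] * 3, dtype=float)
print(conv2d_valid(image, kernel).shape)  # (3, 3)
```

A full ConvNet stacks many such filters with non-linearities and pooling in between; on custom hardware the inner multiply-accumulate loop is unrolled across the FPGA fabric rather than executed serially as above.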
The system delivers a modest 100 gigaflops of computing power, yet is 60 times faster than an Intel i7 CPU running the equivalent software and twice as fast as a high-end (512-core) Nvidia GPU.
The boffins want to jack the technology under the bonnet of a car so that it can navigate properly, and maybe drive itself.