A new algorithm for intelligent video surveillance promises greater accuracy by doing a bit of detective work, prompting a privacy and civil liberties watchdog to raise concerns over the future of automated surveillance.
Researchers at MIT have developed a way to allow for instant analysis of surveillance footage that can help pick out specific individuals without the need for a human eye.
Surveillance networks often involve constant monitoring by humans to watch for individuals or events that could pose a risk.
It is difficult for humans to monitor multiple screens for hours at a time, and traditional software aimed at analysing footage is unable to make intelligent use of the analytical tools available to it.
Existing computer vision systems are slow and prone to mistakes, according to the researchers, and a method is needed for immediately alerting staff to a genuine risk rather than a cat out on an evening stroll.
Camera-based surveillance systems can already run a number of algorithms on a video feed, such as skin detection or background subtraction, to sense when something is moving through a scene.
The algorithm-based system the researchers are working on uses contextual knowledge to decide which piece of analytical software is relevant in deciphering the importance of a moving object.
This could mean, for example, that in an airport setting, skin detection software could trigger a cross-reference with a database to identify a specific individual, before alerting staff to their presence.
In other settings, an alarm could be triggered by an object moving in an unusual way, or by too many objects or people appearing in one scene.
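In rough terms, the idea described above resembles a rules engine that picks which detectors matter in a given setting. The following is a minimal illustrative sketch of that concept; every name, threshold, and data structure here is a hypothetical stand-in, not taken from the MIT researchers' actual system.

```python
# Hypothetical sketch of context-driven video analysis: the deployment
# context decides which checks run, and detector hits trigger follow-ups.
# All names and thresholds are illustrative assumptions.

def cross_reference_watchlist(face_id):
    # Placeholder lookup; a real system would query an image database.
    watchlist = {"face-42": "person of interest"}
    return watchlist.get(face_id)

def analyse_frame(context, frame_events):
    """Return alerts for a frame, choosing checks based on context."""
    alerts = []
    if context == "airport":
        # In an airport, a skin-detection hit triggers a database
        # cross-reference before staff are alerted.
        for event in frame_events:
            if event["type"] == "skin_detected":
                match = cross_reference_watchlist(event["face_id"])
                if match:
                    alerts.append(f"watchlist match: {match}")
    else:
        # Elsewhere, unusual motion or crowding raises an alarm.
        moving = [e for e in frame_events if e["type"] == "motion"]
        if any(e.get("unusual") for e in moving):
            alerts.append("unusual movement detected")
        if len(moving) > 10:  # illustrative crowding threshold
            alerts.append("too many objects in scene")
    return alerts
```

The design point is that the same motion event can mean different things in different places, so the context, not the detector, decides which analysis is relevant.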
Of course, there are concerns that the technology could lead to increased surveillance of innocent civilians.
Nick Pickles, director of civil liberties and privacy campaign group Big Brother Watch, believes ‘passive surveillance’ through automation could be used on a larger scale.
“The main area of concern is how artificial intelligence can enable passive-surveillance such as CCTV to become a directed way of tracking individuals or certain characteristics irrelevant of whether someone is suspected of wrongdoing,” Pickles said, speaking with TechEye.
“Existing surveillance infrastructure will take on an entirely new role as the front line of population screening, looking for pre-determined characteristics and behaviour.”
According to Pickles, while there could be efficiencies measured by certain metrics, handing power over to automated tools will have a knock-on effect on civil liberties.
“It’s far from clear that delegating more decisions to computers is going to improve security,” he said, “rather than swamping staff with false positives and infringing on the civil liberties of those who, through no fault of their own, trigger algorithms in some distant control room.”