Should CCTV systems make use of facial recognition software?

More companies are implementing facial recognition software, despite its potential for exploitation


This week, a property company came under fire for using controversial facial recognition software at Kings Cross in London. As a result, the Information Commissioner’s Office (ICO) is reportedly investigating businesses that use the surveillance technology.

Regulator investigates use of facial recognition software

According to reports, the UK’s privacy regulator has launched an investigation into the use of facial recognition technology by the private sector. Indeed, the body has warned that it would “consider taking action where we find non-compliance with the law.”

On Monday, the owners of the Kings Cross site confirmed the use of the technology across the 67-acre area. They also defended its implementation, insisting that it was “in the interest of public safety and to ensure that everyone who visits has the best possible experience.”

A spokesperson also said that the tool was one of “a number of detection and tracking methods” used at the site. However, the local council said that it was unaware the system was in place.

Unethical use of tech

Last year, a civil liberties group launched a legal challenge against the UK police over their use of facial recognition. The system sparked widespread controversy following its pilots in London, Humberside, South Wales and Leicestershire.

While the police described it as “an extremely valuable tool,” the director of the civil liberties group disagreed. “When the police use facial recognition surveillance they subject thousands of people in the area to highly sensitive identity checks without consent,” Silkie Carlo said.

The Orlando Police Department also dropped its pilot of Amazon’s facial recognition program, Rekognition, amid public outcry. The software has drawn widespread public criticism since its launch two years ago, but the department could reinstate it in the future.

Indeed, facial recognition software lends itself to exploitation and could pave the way for mass surveillance. Given its potential for abuse, it is imperative that governments introduce legislation to curb potentially disastrous ramifications.

What are the ethical implications of adopting AI? We spoke to Daniel Hulme, CEO of Satalia, to find out