Glance up while strolling through parts of downtown Hong Kong and, chances are, you’ll notice the glassy black lens of a surveillance camera trained on the city’s crowded streets.
And that sight will become more common in the coming years, as the city’s police pursue an ambitious campaign to install thousands of cameras to elevate their surveillance capabilities.
Though Hong Kong consistently ranks among the world’s safest big cities, police in the Asian financial hub say the new cameras are needed to fight crime – and have raised the possibility of equipping them with powerful facial recognition and artificial intelligence tools.
That’s sparked alarm among some experts who see it as taking Hong Kong one step closer to the pervasive surveillance systems of mainland China, warning of the technology’s repressive potential.
Hong Kong police had previously set a target of installing 2,000 new surveillance cameras this year, and potentially more than that each subsequent year. The force plans to eventually introduce facial recognition to these cameras, security chief Chris Tang told local media in July – adding that police could use AI in the future to track down suspects.
In a statement to CNN, the Hong Kong Police Force said it was studying how police in other countries use surveillance cameras, including how they use AI. But it’s not clear how many of the new cameras may have facial recognition capabilities, or whether there’s a timeline for when the tech will be introduced.
Tang and the Hong Kong police have repeatedly pointed to other jurisdictions, including Western democracies, that also make wide use of surveillance cameras for law enforcement. For instance, Singapore has 90,000 cameras and the United Kingdom has more than seven million, Tang told local newspaper Sing Tao Daily in June.
While some of those places, like the UK, have started using facial recognition cameras, experts say these early experiments have highlighted the need for careful regulation and privacy protections. Hong Kong police told CNN they would “comply with relevant laws” and follow strong internal guidelines – but haven’t elaborated in depth on what that would look like.
And, some critics say, what sets Hong Kong apart from other places is its political environment – which has seen an ongoing crackdown on political dissent, as it draws closer to authoritarian mainland China.
Following unprecedented and often violent anti-government protests that rocked the city in 2019, local and central authorities imposed sweeping national security laws that have been used to jail activists, journalists and political opponents, and target civil society groups and outspoken media outlets.
Hong Kong’s leaders have said the laws are needed to restore stability after the protests in the nominally semi-autonomous city, and argue their legislation is similar to other national security laws around the world.
“The difference is how the technology is being used,” said Samantha Hoffman, a nonresident fellow at the National Bureau of Asian Research who has studied China’s use of technology for security and propaganda.
Places like the United States and the UK may have problems with how they implement that technology, too – but “this is fundamentally different… It has to do specifically with the system of government, as well as the way that the party state… uses the law to maintain its own power,” said Hoffman.
What this means for Hong Kong
Hong Kong has more than 54,500 public CCTV cameras used by government bodies – about seven cameras per 1,000 people, according to an estimate by Comparitech, a UK-based technology research firm.
That puts it about on par with New York City and still far behind London (13 per 1,000 people), but nowhere near mainland Chinese cities, which average about 440 cameras per 1,000 people.
Fears of mainland-style surveillance and policing caused notable angst during the 2019 protests, which broadened to encompass many Hong Kongers’ fears that the central Chinese government would encroach on the city’s limited autonomy.
Protesters on the streets covered their faces with masks and goggles to prevent identification, at times smashing or covering security cameras. At one point, they tore down a “smart” lamp post, even though Hong Kong authorities said it was only meant to collect data on traffic, weather and pollution.
At the time, activist and student leader Joshua Wong – who is now in prison on charges related to his activism and national security – said, “Can the Hong Kong government ensure that they will never install facial recognition tactics into the smart lamp post? … They can’t promise it and they won’t because of the pressure from Beijing.”
Across the border, the model of surveillance that protesters feared is ubiquitous – with China often celebrating the various achievements of its real-time facial recognition algorithms, and exporting surveillance technology to countries around the world.
According to an analysis by Comparitech, eight of the top 10 most surveilled cities in the world per capita are in China, where facial recognition is an inescapable part of daily life – from the facial scans required to register a new phone number, to facial recognition gates in some subway stations.
During the Covid-19 pandemic, the government mandated a QR “health code” to track people’s health status, which in some places required facial scans.
But the technology has also been used in more repressive ways.
In the far-western region of Xinjiang, Beijing has used cameras to monitor members of the Muslim-majority Uyghur population. And when unprecedented nationwide protests broke out in late 2022 against the government’s strict Covid policies, police used facial recognition along with other sophisticated surveillance tools to track down protesters, The New York Times found.
“(China’s) public security surveillance systems … tend to track lists of particular people, maybe people with a history of mental illness or participation in protests, and make a note of people who are marked as being troublesome in some way,” Hoffman said.
The systems then “track those specific people across the city and across its surveillance network.”
“I think it’s fair to anticipate that the use of CCTV and facial recognition technology in Hong Kong will begin to look a lot like those in mainland China over time,” she said.
Hong Kong police have argued the cameras help fight crime, pointing to a pilot program earlier this year in which 15 cameras were installed in one district. Already, those cameras have provided evidence and clues for at least six crimes, Tang told Sing Tao Daily – and police will prioritize high-risk or high-crime areas for the remaining cameras.
The first five months of this year saw 3% more crimes than the same period last year, Sing Tao reported.
In their statement, police told CNN the new cameras would only monitor public places and delete footage after 31 days. They will follow existing personal data privacy laws, as well as “comprehensive and robust internal guidelines,” police said, without elaborating on what those guidelines entailed.
When considering AI-equipped cameras, “the police will definitely comply with relevant laws,” the force added.
But several experts interviewed by CNN cast doubt on whether those existing laws, written decades ago with broad exemptions for police, will be enough.
Steve Tsang, director of the SOAS China Institute at the University of London, warned that the new cameras could be “used for political repression” if they are employed under the “draconian” national security law.
Unless authorities assure the public that the cameras won’t be used for that purpose, “this is likely to be a further step in making Hong Kong law enforcement closer to how it is done on the Chinese mainland,” he said.
How to regulate facial recognition
Other experts argued it’s far too soon to say what the impact will be in Hong Kong, since authorities have not laid out in detail how they would use the technology.
“Hong Kong law doesn’t, in all measures, mirror what happens in mainland China,” said Normann Witzleb, an associate professor in data protection and privacy at the Chinese University of Hong Kong.
But that’s why it’s all the more important for authorities to address a raft of yet-unanswered questions, he said.
For instance, it remains unclear whether Hong Kong will deploy live facial recognition that constantly scans the environment, or whether the tech will only be applied to past footage when certain crimes occur or when legal authorization is granted.
Witzleb also raised the question of who would have the power to authorize the use of facial recognition, and what situations may warrant it. Would it be used to prosecute crime and locate suspects, for example – or for other public safety measures like identifying missing people?
And, Witzleb added, will police run the technology through their existing image databases, or use it more broadly with images held by other public authorities, or even publicly available imagery of anyone?
“It’s important to design guidelines for those systems that take proper recognition of the potential benefits that they have, but that also acknowledge they’re not foolproof, and that they have the potential to interfere with (people’s) rights in serious ways,” Witzleb said.
Regardless of how facial recognition might be used, both Hoffman and Witzleb said the presence of that technology and the increased number of security cameras may make Hong Kongers feel less free under the ever-watchful eye of the police.
“When you feel like you’re being monitored, that affects your behavior and your feelings of freedom as well,” Hoffman said. “I think that there’s an element of state coercion that doesn’t need to have to do with the effectiveness of the technology itself.”