King’s Cross train station in London is a busy public space in the heart of a bustling city. Within a short walk of the station, you’ll find a Harry Potter store, the Royal National Institute for Blind People, schools and colleges, and Google’s UK headquarters. Recently, the Financial Times also discovered something else: the private developers of a 67-acre site around the station were reportedly using facial recognition technology to track people as they passed over their land.

The Toronto Region Board of Trade took an unusual position in January of this year: for the sake of business and for the sake of the general public, Canada’s largest urban chamber of commerce called on all three levels of government to accelerate and strengthen regulation of data collection and use in the public realm. The King’s Cross situation is an excellent example of why this call is urgent for Canadian citizens and industry.

Facial recognition (FR) technology uses biometric features to detect and identify faces in images or videos. Using algorithms, FR matches these biometrics against a database of facial features to identify individuals or to infer demographic characteristics such as gender or age. Though still developing, the technology already underpins a range of services we take for granted. Some people have used FR to unlock a phone or laptop. Others have gone through an e-passport gate at an airport that uses FR to match a traveller to their passport. But, as the example of King’s Cross shows, the challenge of so-called public realm data is that technologies like FR can be deployed without members of the public even knowing they’re on camera.
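
To make that matching step concrete, here is a minimal, hypothetical sketch of how an FR system might compare a captured face against a stored gallery. The vectors, the names (`identify`, `gallery`) and the 0.8 threshold are illustrative assumptions, not taken from any particular product; real systems use trained neural networks to produce the face embeddings, but the final step is typically a similarity comparison along these lines.

```python
import numpy as np

def cosine_similarity(a, b):
    """Cosine similarity between two face-embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def identify(probe, gallery, threshold=0.8):
    """Compare a captured face embedding against a gallery of known embeddings.

    Returns the best-matching identity if its similarity clears the threshold,
    otherwise None. The threshold is what trades false matches against misses.
    """
    best_name, best_score = None, -1.0
    for name, embedding in gallery.items():
        score = cosine_similarity(probe, embedding)
        if score > best_score:
            best_name, best_score = name, score
    return best_name if best_score >= threshold else None

# Toy data: in a real system these vectors would come from a trained
# face-embedding model applied to camera footage.
rng = np.random.default_rng(0)
gallery = {"person_a": rng.normal(size=128), "person_b": rng.normal(size=128)}
probe = gallery["person_a"] + rng.normal(scale=0.05, size=128)  # noisy re-capture
print(identify(probe, gallery))  # prints "person_a"
```

The policy questions in this article arise less from this comparison step than from how and where faces are captured, who ends up in the gallery, and how accurate the underlying model is across different groups of people.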

The technology offers the prospect of greater security and public safety: for example, in helping identify criminals. FR can also be used to improve user experiences, such as allowing travellers to bypass passport queues at airports. However, studies have found that FR is most accurate for white men and markedly less accurate for women and for individuals with darker skin tones. There have also been notable cases of false positives, where FR linked the wrong individual to a crime. Given its use by police and security services, misidentification could have dire consequences.

Regulation hasn’t kept pace with the deployment of FR technology, and it should be no surprise that the result has been public unease about even its most benign uses. Some US cities have already responded by limiting the technology in their jurisdictions. San Francisco has banned the city’s 53 departments, including the police, from using FR, and other cities have followed suit.

In Europe, the UK House of Commons Science and Technology Committee called on the UK government to issue a moratorium on the use of FR until there is an updated legislative framework and further guidance on trials. Some broader legislation touches on FR, like the EU’s General Data Protection Regulation (GDPR), and these laws already make its use unlawful in some contexts. The UK Information Commissioner has announced an investigation to see if the FR use around King’s Cross contravened the GDPR. The European Commission has gone further and is planning regulation to give EU citizens explicit rights over the use of their biometric data. And for those who think regulatory uncertainty on this issue is just a business problem, note that the Toronto Police Service recently trialled the use of FR to match still images from surveillance videos to its database of mug shots, bringing complaints from critics like former Ontario Information and Privacy Commissioner Ann Cavoukian, who insisted such a pilot needed more transparency.

For these innovations to deliver a net positive outcome for both the economy and society, the public must be an active part of the conversation about their use. This is especially true when it comes to private data, an area where 92 percent of Canadians express some level of concern about privacy protection. Without a regulatory regime that sustains public support, backlash will be inevitable.

Getting this right also matters for the Canadian economy. We suffer from weak rates of adoption and commercialization of advanced technologies. Without clear laws and principles that are compatible with rules in like-minded international markets, we risk nudging Canada into a scenario where we have an ethical, regulated industry but no markets to export those solutions to.

The UK Information Commissioner’s Office is charting one way to close the gap between privacy regulation and emerging technology: a regulatory “sandbox” in which businesses and public bodies work together as they trial new products and services, helping to ensure data protection is built in from the start. Among the first ten participants is London’s Heathrow Airport, which is experimenting with FR to smooth passenger journeys. This is a model that Ontario should look to replicate in its Data Strategy, now under development, and that the next federal government could adopt as a follow-up to the Digital Charter announced in May 2019.

Ethical considerations must be a focus of these efforts. It is well established that humans exhibit bias, both implicit and explicit, in our actions and decisions. As new technologies are introduced, it is essential that these biases are not amplified or exacerbated. In the UK, the independent London Policing Ethics Panel, set up by the city’s mayor, reviewed the use of live FR, considering the impact on civil liberties and the risks of injustices as well as its benefits for security and safety. It recommended that live FR should be deployed only when certain conditions can be met. The onus is on the tech’s proponents to demonstrate that the technology will not import gender and racial bias into police operations. Canada’s Directive on Automated Decision-Making, announced this year, is a positive step in applying ethical principles to government procurement of AI, though its effectiveness remains unproven.

None of these issues are isolated to any one country. Canada needs to work with international partners to create effective common approaches to the regulation of emerging technology. Work has been done at the OECD to develop Principles on AI, and the G20 this year, under the Japanese presidency, included big pushes on a “Data Free Flow with Trust” system and on helping to create ethical, trustworthy data privacy frameworks that don’t restrict cross-border data flows. These are important efforts, and Canada should be a leader in their development and implementation.

But the lesson from King’s Cross is that boundary-breaking new technologies will be deployed one way or another, and if they are not deployed with Canadian participation and Canadian concerns at heart, they will be deployed by others who may not wait for our rules to catch up. In the Toronto Region Board of Trade’s view, the best response isn’t to worry. It’s to act fast, and to specify through enforced regulations what responsible data governance and protection looks like in new situations. And when we do, we must be consistent with both Canadian values and our trade objectives.

The technological reality is that, unless we act, a Canadian King’s Cross moment is sure to pop up at some point — so we had better get started soon.

Photo: Shutterstock, by MONOPOLY919.


Thomas Goldsmith
Thomas Goldsmith is the Toronto Region Board of Trade’s policy director for innovation and technology. Previously, he worked as a policy adviser to techUK and the Royal Society.
