The clay pipes in my front yard are as old as the house I live in. While society developed new materials, plastics, and production techniques, this older technology lived on under the earth, doing its job. At some point, however, tree roots grew through cracks and blocked the flow. It was inevitable – clay isn’t strong, certainly not as strong as the new high-density polyethylene plastic pipes we put in after digging up the front yard.
Sidewalk Labs, a project of Google’s parent-company Alphabet, has pulled out of the futuristic smart city project it spent two years pitching to Toronto. The company was proposing to build a city-of-the-future on acres of brownfield land near Toronto’s waterfront. It was looking for all sorts of public/government concessions as it created a living lab of technology woven into city infrastructure. The company cites a changed real-estate market (thanks to COVID-19) as the reason for cancellation, but the project also faced significant backlash from the citizens about its sensors-and-AI vision of life in the modern city.
I’ve been thinking about our concepts of privacy as something similar to my clay pipes: old and worn, quietly doing their job beneath society without my sparing them a thought … right up until they don’t anymore.
The right to privacy is something we take so completely for granted that it’s hard to even pinpoint what we mean by it: it isn’t explicitly mentioned in the Canadian Charter of Rights and Freedoms or the US Constitution. While there are protections against personal effects being unreasonably searched or seized, and explicit notions of our rights to life and liberty, the implications of a right to privacy in our modern digital life are more subtle. The Smart City pulls surveillance tech from our smartphone, walks it out the front door of our smart home, and embeds it into the built environment. That tension was at the heart of the controversy between Sidewalk Labs and Waterfront Toronto.
Announced in 2017, the proposed partnership would have developed Quayside, a 12-acre plot of land near the waterfront of Toronto, using new techniques, technologies, and interventions in urban design. Sidewalk Labs was formed when the behemoth that was Google diversified its various efforts under Alphabet, a holding company for standalone tech plays ranging from wi-fi-dispersing balloons to self-driving cars. Google remains the search and advertising arm, which generates up to 86 percent of Alphabet’s revenue. Sidewalk Labs is its smart-city play. And the biggest playground to date was the Quayside proposal.
The city is a palimpsest, built up, layer by layer, over generations, with new infrastructure piled onto half-erased technology from years ago. We edit and overwrite it piece by piece, typically when it breaks. Like my new plastic sewer-pipe living next to my neighbour’s clay ones, a city is a patchwork of these various technologies from various eras co-existing in an uneasy peace.
City staff have to deal with legacy systems at the same time as they are trialing and installing shiny new upgrades. LED streetlights are on one block, with older incandescent lights on the next. The overwhelmed Unemployment Insurance system in the US – where modern web systems needed COBOL programmers (a language introduced in 1959) for repairs – is a contemporary example of this. The cost of replacing everything at once, in labour, lost productivity, capital costs, and political will, makes wholesale replacement untenable.
To view it in the most generous light, Sidewalk asked a question: If we didn’t have all of the legacy infrastructure of a modern city, what would we build? What if we developed something using today’s best practices without the layers of legacy technology – to use its catchphrase, designing a city “from the internet up”? Ideas on the table included “tall-timber” sustainably produced apartment towers, replaceable glowing hexagonal street tiles, underground garbage tunnels inhabited by robots, and computer-tracked crosswalks. That all sounded like sci-fi – or, at the very least, a tech-fantasy for the niche enthusiast, beneath the notice of most city-dwellers.
Imagine, as Sidewalk Labs did, an intersection camera capable of identifying a senior using a walker, able to increase the time given to cross before the lights change. Question: Who has access to that camera? Is the video feed stored? Can law enforcement access it? Or immigration officials? Truancy officers? Civil tort lawyers? Sidewalk is a New York-based entity; would the data be stored in Canada, subject to Canadian laws around search, seizure, and monitoring, or in the US, subject to American law and interests?
If these sound like the niche concerns of a tinfoil-hat conspiracy theorist, you can see why it’s hard to get the average citizen riled up about data privacy. Most of us still subscribe to the “I have nothing to hide, so I don’t mind if I’m monitored” school of thought. There are compelling reasons that shouldn’t be the case: for one thing, minority groups and people of colour face more threats from surveillance than majority groups, and a digital stop-and-frisk program could subject some people to more oversight than others. For another, we all benefit from and need to guard those who speak truth to power. If the only people who try to protect their privacy are watchdogs, they stand out from the crowd. If we all preserve privacy by default, then flagging those who use encryption or avoid street-level cameras isn’t itself a way for those in power to find their critics. It’s the principle of dazzle camouflage – individual zebras blend in with the herd, making it harder for a predator to single them out. Increased surveillance makes it harder for those who are targets – journalists, activists – to do their jobs.
There is also a chilling effect from surveillance. To understand this, ask yourself: do you behave differently if you know you’re being watched? Studies have found that people are less likely to share political opinions in an environment of surveillance. And the availability of more information about us leads directly to more abuses by those with access: police officers have abused databases to track down exes; some Uber drivers have used home addresses and phone numbers shared by the rideshare app to stalk former passengers; Clearview AI allegedly tracked the faces of known journalists in its software; and the UK’s Big Brother Watch has documented thousands of cases of breaches of data security by local officials.
Lastly: who extracts benefit from all that data? If we think about it at all, we probably know that we trade our behaviour data for a service: Facebook can track my likes in exchange for, well, Facebook. It’s a bargain we’ve struck – or at least seem to be comfortable ignoring. But the data about our real-world activities, our interactions with public health services, our movements in the community, our engagement with municipal authorities – this also has actual financial value. Even if we’re comfortable having it warehoused and analyzed, shouldn’t there be a negotiation (among people who know more about it than most average citizens) on our behalf? What’s the exact value of the data? What do we get in return? How long do we agree to share the information? What terms govern the agreement – who can end the relationship, how can we move to a different vendor, how can we set up a bidding war or a request for proposal to make sure we are getting fair value? Sidewalk said it wouldn’t commercially exploit the data it gathered – but was this promise guaranteed by contract or by law?
Sidewalk Toronto promised to make Quayside “the most measurable community in the world.” But technology encodes beliefs about the way society works. It makes concrete and functional the mores and beliefs of the day. One person’s “measurement” is another person’s surveillance.
Let’s make it clear: the focus of critique is not any particular monitoring technology in play. Rather, it’s that the rules of engagement and the protections of our rights aren’t explicitly outlined. It’s a simple fact that the companies themselves, the business entities, respond to different rules of engagement than the bodies that govern or are governed. We can hope for change, but for the moment, companies maximize shareholder value instead of social value. This shouldn’t surprise us – we should act in full knowledge of this truth. And perhaps the way the Sidewalk Toronto story unfolded is how it needs to work. We need the tech giants to be the force for creative disruption. They think differently. They move fast and break things. But it’s time for governments and citizens to be deliberate about which of their inventions we want, and which we reject.
I believe a public discussion about civic privacy was never going to arise on its own. It seems we needed something to threaten our privacy, our underlying rights, before we could be bothered to dig them up, examine them in the light of day, and replace them with something modern, robust, and thoughtful. We can and should use the scandals of the day as a forcing function for change. Sidewalk Labs was a stress test for democracy. How strong are our bulwarks? What do we need to patch and repair? Just because a company revealed the opportunity doesn’t mean we have to let it take it. We can inspect its work, we can partner with it if appropriate – but we can turn its ideas into our own version of change.
When the ever-searching tree roots cracked the clay pipes in front of my house, I called the city. It sent workers to find out what would be my responsibility, and what the city would need to maintain and repair. The city measured where the sidewalk ends. I think this is the moment we’re now in. In some sense, I wanted the ever-searching self-interest of Sidewalk, spidering along below ground, looking for opportunities. I wanted the promise, the experimentation, the funding, and the benefits. But it’s entirely within our purview to declare where Sidewalk should end, and where our rights (and the duties of our elected and appointed officials) should begin. We only fight for the rights that break.
We needed the company to make us care about the opportunity. And now that its search has revealed the cracks in our privacy infrastructure, it’s time to shore up, define, and declare a new set of tools using modern techniques and technologies. By rejecting Sidewalk, we’re not rejecting technology – that tech simply revealed something deep underground in need of repair. Thanks for pointing out the cracks, Sidewalk Labs. We’ll get to work on them.