The past year has brought a sea change in users’ and governments’ attitudes toward social media platforms. Just over a year ago, the Cambridge Analytica / AggregateIQ scandal over massive personal information breaches, and the revelations about how that information was used in efforts to manipulate elections around the world, changed the way many see these platforms.
The recent attacks in New Zealand and Sri Lanka, linked to online hate, have led to new international efforts to regulate platform content. Here in Canada, the Liberal government recently announced that it is putting “all options on the table” for social media regulation. The year is now bookended by the release of findings by the Privacy Commissioner of Canada and the Information and Privacy Commissioner for British Columbia in their joint investigation sparked by the Facebook scandal of a year ago.
What stands out in the commissioners’ report is not the privacy breach or the potential user manipulation that the report outlines; these were already known. The most important conclusion is about Canada’s failure to regulate Facebook, primarily due to inadequate privacy legislation.
Facebook and Canadians’ privacy
Some of the privacy commissioners’ findings have been covered extensively in the media: for two years, from November 2013 to December 2015, Facebook users’ names, genders, Facebook IDs, profile pictures, birthdates, cities and “likes” were collected by a personality quiz app called This is Your Digital Life. The app was installed by Facebook users, including 272 Canadians. It also collected the information of those 272 Canadian users’ 622,000 friends; in all, the data of some 87 million Facebook users worldwide was affected.
This information was used to build psychographic profiles of Facebook users’ political leanings, personality types and a range of other sensitive characteristics. These profiles were used by Cambridge Analytica to target Facebook advertising based on individuals’ private information: both the information Facebook and apps collected directly from individuals, and the inferences that could be drawn about those individuals by correlating their data with the characteristics of people in larger data sets.
The commissioners’ report documents their recommendations, the product of a five-month dialogue with Facebook aimed at bringing the company into compliance with Canadian privacy law in the wake of the Cambridge Analytica scandal.
Facebook has taken various steps to improve its privacy practices since the scandal broke a year ago. It has only now banned personality quizzes like the one at the centre of the scandal. But none of these steps amounts to the level of accountability the commissioners now recommend.
One of the commissioners’ recommendations is that Facebook should expand its review of apps that have had access to personal information and notify users of all apps that had access. In its recent discussions with privacy commissioners, Facebook has refused to do this. Instead, the company has committed to reviewing only a selection of its third-party apps’ access to personal information. Facebook also rejected the commissioners’ recommendation to inform users about friends’ apps that might have accessed their personal information, and to submit to privacy audits.
Facebook’s most recent refusal to comply with the commissioners’ recommendations is not surprising given its history of privacy problems. Ten years ago, the Privacy Commissioner addressed a Facebook privacy complaint concerning the very issues that led to the Cambridge Analytica scandal: Facebook’s practice of allowing apps to access not only the personal information of users who installed them, but also that of those users’ friends. Back then, Facebook agreed to prevent apps from accessing personal information without consent. The commissioners now note that it ultimately did not do so; instead, it relied on the third-party apps to obtain consent, a process it did not monitor to ensure consent was in fact obtained.
The need for stronger privacy laws
Setting aside the question of whether the commissioner was wise not to press Facebook further in 2009 (the reasons for that decision are outlined in paragraphs 21-22 of the current report), the commissioners now take a harder line, declaring that they will take the company to court.
However, federal Privacy Commissioner Daniel Therrien notes that penalties might only amount to “tens of thousands of dollars.” He therefore calls on the federal government (as he has repeatedly) to strengthen privacy law to permit the imposition of meaningful fines.
The important question is why Canada still does not have privacy laws strong enough to regulate social media companies: laws that would allow the imposition of meaningful fines and permit compliance to be monitored through audits.
When Canada’s Personal Information Protection and Electronic Documents Act was enacted in 2000, Facebook did not yet exist. A penalty in the tens of thousands of dollars might have meant something to a startup. In 2009, when the Privacy Commissioner concluded its original Facebook investigation, Facebook’s revenue was $777 million. Tens of thousands of dollars might not have meant much even then, but it would have had more impact than it would today, when Facebook’s annual revenue has grown to over $55 billion.
Many are asking whether Facebook is too big. Should it be broken up? Would breaking up Facebook make the company easier to regulate?
While there may be strong arguments for breaking up Facebook, doing so would not solve a bigger problem: the federal government’s failure, to date, to sufficiently strengthen privacy legislation.
There is a problem even bigger than Facebook’s size, and it lies in the relationship between political parties, governments and Facebook. The BC Information and Privacy Commissioner recently found that political parties in BC populate their voter databases with social media information from Facebook and Twitter, and that they upload voter information to Facebook as they use the platform to target ads. They use the data they collect to facilitate contact with voters, track voters’ intentions and predict support. Parties can also put that data to other uses, such as vetting judicial appointments.
Globally, members of Facebook’s government and politics team have worked as de facto campaign staff, sometimes embedding themselves in parties’ political campaigns. That kind of cozy relationship can translate into political and regulatory influence. It’s the kind of relationship that should be broken up. Governments are hamstringing themselves by ceding regulatory power to social media platforms in exchange for Facebook tech support.
While governments should be cautious about regulating online platforms, privacy regulation is one area where the public interest is clear and enforceable law is necessary. If all options are on the table, action number one should be legislating strong privacy enforcement.