Over the last decade, predictive statistical models have emerged that can uncover private traits about individuals without their consent. These traits, such as personality or mood, are predicted through various machine learning methods, using digital records of online activity such as social media data. Predictive models have allegedly been used by “propaganda machines” that target individuals with ideas or advertising.

The use of predicted private traits has been shown to be an effective means of mass persuasion that can significantly increase product sales. Now we are seeing firms like Cambridge Analytica and AggregateIQ employing these tools for political causes like Brexit and candidates such as Donald Trump. Psychological profiling using social media data was reportedly used for voter suppression — discouraging people from casting their ballots — in the 2016 US presidential election. Cambridge Analytica claimed it used 5,000 data points per adult voter in the United States to create targeted ads for the Trump campaign.

In Canada, it is unclear to what extent this technology is being used. In 2015 the Liberal Party of Canada hired Sean Wiltshire, a microbiologist, to overhaul Liberalist, its in-house analytics platform. The overhaul was modelled after analytics tools used in the 2008 and 2012 Obama campaigns, which predicted the behaviour of individual voters with unprecedented accuracy.

A 2018 study involving 3.5 million individuals revealed that using predicted private traits to design advertising increased the number of purchases from such advertisements by 50 percent compared with control advertisements. If the use of these techniques in political advertising can influence voters’ choices to the same degree, they could pose a risk to fair elections and the democratic process.

What your digital footprint reveals about you

A predicted private trait is a piece of information that you have not explicitly disclosed but that is predicted by using other data collected about you. For example, your personality can be predicted by using data collected about the pages you “like” on Facebook.

There are many reasons to predict private traits. It is more cost-effective to predict the private traits of a population than to measure them through surveys. Businesses predict private traits to make their advertisements more personalized and more attractive to consumers.

In theory, anything that can be measured about you can be predicted. Psychologists use surveys to measure private traits such as personality, narcissism, psychological stress, depression and risk of addiction. Your digital footprint, including your Facebook likes, how often you comment or your profile picture, can be combined with survey data measuring private traits to train a machine learning model.
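
For readers curious about the mechanics, here is a minimal, hypothetical sketch of that kind of training pipeline, written in Python with scikit-learn and synthetic data. It illustrates the general approach only; the feature layout, labels and model choice are assumptions for illustration and do not reflect any particular firm's actual system.

```python
# Hypothetical sketch: predict a survey-measured trait from "like" data.
# Rows are users, columns are binary indicators for liking a given page,
# and the label is a (synthetic) survey-derived trait such as extraversion.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)

n_users, n_pages = 1000, 500
likes = rng.integers(0, 2, size=(n_users, n_pages))         # 1 = user liked the page
extraverted = (likes[:, :50].sum(axis=1) > 25).astype(int)   # synthetic survey label

X_train, X_test, y_train, y_test = train_test_split(
    likes, extraverted, test_size=0.2, random_state=0
)

model = LogisticRegression(max_iter=1000)
model.fit(X_train, y_train)          # learn the mapping from likes to the trait

probs = model.predict_proba(X_test)[:, 1]
print("Held-out AUC:", round(roc_auc_score(y_test, probs), 3))
```

In a real study, the like matrix would come from actual platform data and the labels from psychological questionnaires completed by a subset of users; once trained, such a model can be applied to anyone whose likes are available, whether or not they ever took the survey.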

University researchers have successfully predicted at least 25 private traits using social media data, according to a review conducted at the University of Ottawa (not yet published). They include sexual orientation, political beliefs, whether a person’s parents were divorced before the person was 21 years old, drug and alcohol use, ethnicity, age group, personality, interests and mood. Many methods of predicting private traits are just as accurate as psychological surveys and more accurate than even the assessments of a person’s closest friends.

Keeping private traits private

When you disclose data to a platform by liking pages, you are consenting, via a user agreement, to the collection of your data. User agreements are generally clear in telling you that by consenting, you are allowing your information to be used for marketing purposes. But they are not explicit in stating that your data may be used to predict private traits.

The Personal Information Protection and Electronic Documents Act (known as PIPEDA), the Canadian privacy law for private-sector organizations, states, “Personal information shall not be used or disclosed for purposes other than those for which it was collected.” It also says “the knowledge and consent of the individual are required.” These data cannot be used, legally or ethically, for undisclosed purposes such as predicting private traits. So when Facebook data are used without informed consent to predict private traits that are used to develop targeted ads, it’s a contravention of Canadian privacy laws.

The same principles apply whether the ads are for cars or candidates. Obtaining the informed consent of citizens when information is collected, or predicted, about them is a standard to which our leaders, candidates and the political parties that support their campaigns should be held.

A risk to democracy

Cambridge Analytica and AggregateIQ, which were active on behalf of the Leave side during the Brexit referendum and worked for the Trump campaign in the 2016 US presidential election, were able to predict personality and voting behaviour, allowing them to identify who was likely to vote for which campaigns or candidates. They could then send intimidating ads to people likely to vote for the opposition. AggregateIQ confirmed that Cambridge Analytica gave it violent ads to run in the 2015 Nigerian election to suppress turnout, though AggregateIQ executives say they never actually ran the ads.

Political parties should appeal to voters on the basis of policy, not on the voters’ personalities. Targeted messages based on personality can be more persuasive, but their personal, private and emotional appeals may discourage voters from thinking rationally about their own best interests, and such rational deliberation is the ideal of debate in a healthy democracy. Communication tactics that use private traits undermine our ability to talk reasonably about political issues by inciting visceral reactions of fear and resentment and by demonizing political rivals.

Not all uses of predicted private traits need to be regulated. However, intervention is warranted if campaigning parties engage in tactics that violate our privacy and act in opposition to democratic ideals. The way political advertisers have used unrelated data in the US and the UK to predict private traits is especially invasive, and in the context of influencing voting choices that should be free and informed, it is particularly exploitative. This use of predicted private traits is damaging to the project of creating and maintaining a free and functioning democracy and should be restricted. We have launched a petition to the Minister of Democratic Institutions on this topic.

Before the 2019 federal campaigns get rolling, political parties should agree to a code of conduct that prohibits their use of predicted private traits in any way during elections, including by having an outside firm do the prediction. In the long term, legislation should be introduced that prohibits political parties and third parties registered with Elections Canada from predicting private traits for use during elections.

Illustration: Shutterstock, by MNBB Studio.



Trevor Deley
Trevor Deley is a PhD candidate in e-business at the University of Ottawa. He has a BSc in neuroscience and an MSc in biology and data science and was a software developer at IBM.
Julia Szwarc
Julia Szwarc is a master’s student in the Department of Communication at the University of Ottawa, where she focuses on broadcasting policy, environmental communication and digital media.

You are welcome to republish this Policy Options article online or in print periodicals, under a Creative Commons/No Derivatives licence.

