As someone who lives and works in Calgary and tends to identify with political views that lean left of centre, I’m often labelled as an anti-industry, anti-government thinker. I confess that some of my nonacademic writing has been critical of both industry and government. I have written about similarities I see between the Calgary of today and Detroit decades ago and the statistical certainty that there will be oil spills from pipelines. I have also written from a perspective on energy strategy that deviates sharply from the views held by Prime Minister Stephen Harper’s Conservative government.

In spite of my views on specific issues, I have little interest in taking an ideological stand against industry or government. I see work in government as a noble and underappreciated calling (maybe I’ve watched too many episodes of The West Wing) and admire the emphasis on creativity and outside-the-box thinking of the best and brightest in business and industry.

So, if I’m not anti-government and anti-industry, what am I? The answer is simple: I’m anti-lousy-decision-making, especially in public policy, where the outcomes affect us all. And if being against something requires that one must also stand for something, I stand for a decision-making process that meets a high, but eminently achievable, standard of quality.

A high-quality decision-making process is one that accounts both for how rational decision-makers should behave and for how people actually behave. High-quality decision-making strives toward the idealized benchmarks for rationality set forth by the economic sciences, while recognizing that people —working individually or in groups — face significant psychological hurdles trying to get there.

Because we make decisions so often, the process seems straightforward and intuitive. The commonly accepted narrative around decision-making goes something like this: When people are faced with a problem or opportunity, they gather information about it, identify a series of options for solving or addressing it, weigh the pros and cons of the options and then make a choice that results in maximum benefit for minimum cost. Good decision-makers are optimizers.

In reality, however, research and practice show that high-quality decision-making is neither straightforward nor intuitive. Research in the decision sciences, which can best be described as an uneasy — yet rarely boring — marriage between economists and psychologists, reveals several obstacles.

The largest and least surprising of these is the robust finding that people are not strict optimizers. Rather than thoughtfully weighing the pros and cons of their options, people take shortcuts. And, while taking shortcuts is commonplace, most decision-makers fail to understand or recognize the practice and — importantly — the systematic biases that accompany it.

The German psychologist Gerd Gigerenzer views these “fast and frugal” shortcuts as an essential and effective aspect of human decision-making, in that they may lead to quasi-optimal decisions. Without them, the vast majority of the decisions people face in their daily lives — what to wear, what to eat, which movie to watch, etc. — would be too time-consuming and, hence, overwhelming. It’s impossible to argue against this position.

On the other hand, it’s equally worth noting that complex policy decisions, whether on national affairs such as energy and pipelines or on international affairs such as humanitarian interventions and climate change policy, require more than the application of fast and frugal shortcuts.

These decisions require more accuracy, for the costs associated with getting them wrong may be significant and felt for generations to come. And because these decisions require greater accuracy, they demand more effort from decision-makers.

What does more effort in decision-making look like? In decision science parlance, “effort” isn’t a matter of trying harder or, in some cases, even working harder. It is a matter of adding much-needed structure to processes that are too often unstructured and diffuse in the minds of decision-makers.

At its most basic level, decision-making is about two things: solving problems and addressing opportunities. To do a good job of either means taking the time to understand what the real problem or opportunity is. Will it require a single decision at a single time with a discrete choice? Or will it require a series of overlapping and interrelated decisions made over an extended period of time?


Take the Canadian government’s position on oil sands. The problems and opportunities in the oil sector, and in the energy sector as a whole, are not limited to whether to develop the oil sands. They require choices about investments in a much broader national energy landscape: whether a one-size-fits-all approach to energy works for all of the provinces (it doesn’t), whether to introduce carbon taxes, and how to structure incentives for the development and deployment of renewables. These decisions will necessitate a series of interconnected choices over time about a very wide range of investment options.

Nothing in the federal government’s policy agenda suggests it is prepared to make these kinds of interrelated choices. Indeed, the real leadership in energy decision-making is coming from those who understand the importance of looking past investing in a single energy source: the “adapt or die” industry players who embrace the innovative thinking that today’s energy decisions require.

Making high-quality decisions also means choosing options that best address decision-makers’ values and objectives. Straightforward, right? Nope. History is replete with examples of people doing a poor job of aligning choices and policies with their objectives. Think of the Donner Party’s fatal decision to take a “shortcut” to California, which left them in the snowbound Sierra Nevada; prohibition in the US to curb crime, which instead fuelled the gangster class; and George W. Bush’s obsession with overthrowing Saddam Hussein instead of finding Osama bin Laden (or instead of dealing with the decay of domestic infrastructure and financial regulations, for that matter), which resulted in the bloody quagmire of the Iraq War.

Why does this happen? On the one hand, it’s because making trade-offs among competing objectives is hard intellectual work. It’s much easier to make a decision to go ahead with an oil and gas pipeline because we’ve convinced ourselves it makes short-term economic sense than it is to make a decision to go ahead — or, perhaps, not — based on thoughtful trade-offs between long-term economic costs and benefits, environmental risks, social welfare and the concerns of First Nations.

People routinely fail to make decisions that are in line with their objectives for an even simpler and more troubling reason: They don’t take the time to think deeply about what it is they want to achieve — what their objectives are — in the first place.

This is not just a matter of coming up with some vague notion of a desirable outcome. It’s about being clear about the relationships among different objectives; for example, the relationship between growing the economy and protecting the environment. Decision scientists refer to these as “ends” objectives. It’s also about understanding the relationships within objectives; for example, how different kinds of investments — the Canadian government’s recent focus on digital media and the tech sector, or its historic interest in natural resources — are “means” to the “end” of growing the economy. The bottom line here is that values and objectives are central to all of our decisions, big and small. We neglect them at our peril.



The rise of what is broadly called Big Data has given new impetus to those who believe that more and better information will solve our decision-making troubles. The term Big Data is generally taken to refer to the digital exhaust from our time spent online: our searches, commercial transactions, social media interactions, movements tracked by GPS devices and so on. The sheer amount of data we generate has spawned hopes that we can take the guesswork and estimation out of decision-making. The number of data points, the thinking goes, will eliminate the need for statistical sampling; Big Data will simply tell us what we need to know and what to do. And it is cheap.

But Big Data can be misleading. Viewed out of context, or out of order, data can give us wrong answers, or be used to reinforce our prejudices. They can show correlations among events (think of the success of Google Flu Trends in predicting the rise in illness in places where flu-related searches had spiked). But uncovering the causes behind the data is still susceptible to our biases (did the search spikes occur because healthy people were prompted to search by media reports of a possible flu epidemic?).

Big Data and the algorithms that seek to analyze it are certainly valuable, and they are only in the early stages of development. But we are far from the point where we can hand decision-making over to an algorithm. The human component — asking the right questions of the data, deriving the right answers from them, and making sure they are used to serve our objectives and not the other way around — remains crucial to good decisions.

The struggle to craft climate change policies shows how difficult it is to outsource decisions to data sets. No one can say we lack data on the issue. The Intergovernmental Panel on Climate Change has now produced five reports, based on comprehensive information gathering and modelling produced by thousands of scientists. Yet all those data do not provide us with iron-clad projections about the precise impact of different variables on the future climate. They can’t, and they never will.

Because many policy decisions about climate lack a structure that would help policy-makers design strategies that are sensitive to the trade-offs between competing objectives, Big Data doesn’t help as much as it should to improve the decision process. The physical science does not resolve the social science challenges.

Is the boom in natural gas fracking a route to a lower-carbon energy substitute, or does it have unintended consequences for the health and environment of local communities? Is reviving nuclear power a viable option, or are the perceived risks too high for a world spooked by Three Mile Island, Chernobyl and Fukushima? Is the massive capital expenditure required by adaptation technologies more acceptable to citizens than the considerable slowdown in economic growth from the shift away from carbon-based fuels?

The decisions embedded in these questions require trade-offs that Big Data alone can’t make. If used wisely, Big Data can help us make better decisions. But it does not remove the need for human insight.

So what is a policy-maker to do in light of these judgmental challenges? The obvious answer is to slow down, to take the time needed to bring structure to complex decisions. In my research lab at the University of Calgary, we have been working extensively with decision-makers in industry, government and communities on how to improve the quality of decision-making, from daily decisions to high-level ones where the impact will be felt by many for years to come.

We frequently hear the argument that the speed at which decisions must be made is ever-increasing. Social media, short news cycles and an impatient desire for quick fixes to all kinds of problems are ramping up the pressure on decision-makers. Not only is there no time to slow down, but the amount of time that seems to be available is shrinking.

This is a real concern. Making decisions under time pressure can lead to a greater reliance on the emotional side of our decision-making capabilities rather than the analytical side. The psychologists Robert Zajonc, Paul Slovic and Daniel Kahneman have written at length about two systems of decision-making: one based on rapid emotional responses, gut instincts and intuition; the other on analysis and calculation. The best decisions come when both systems are working in unison.

The perceived need for speed in decision-making and the demand for quick responses to complex problems play to the strengths and weaknesses of our emotional rather than our deliberative side. Emotions bring much-needed feeling to data, which are otherwise sterile and without meaning. But too much emotion acts as a blinder. It can fix our attention on fear or anger, distracting us from more mundane but vital factors, such as the data on risk and the probability of different outcomes.

Given the dual role of emotion in decision-making, the best advice remains to pause and reflect. Does the decision really require speed? If it does — as in the need to flee from a threat — you’ll know. So, by all means, let emotions rule. If the decision doesn’t require speed, the best advice is to take a slower, more deliberate tack.

But it’s not just a slower pace that matters. It’s also structure: the need to think about multiple objectives, a diversity of alternatives and informed trade-offs. Take, for example, the recent deliberations led by the National Energy Board (NEB) on the proposed Enbridge Northern Gateway pipeline, which would carry bitumen from the oil sands in Alberta to tidewater in northwestern British Columbia before it is loaded onto ships bound for Asia. In spite of the “no time to lose” rhetoric from the federal government on oil sands development, coupled with aggressive lobbying and public relations efforts by industry, it has been over four years since the NEB issued its terms of reference for the regulatory review of the pipeline.


Yet for all the time taken, the decision-making process about Northern Gateway has barely progressed. Even though the NEB recently recommended federal approval of the pipeline, its construction — if it ever happens at all — is, at best, years away, with no sign of achieving broad acceptance.

The decision-making process for Northern Gateway was marred by chaos and frustration. Many stakeholders wonder why other alternatives aren’t on the table. Many analysts view the hearings and their outcome as illegitimate. Many experts still squabble about the meaning of the data characterizing pipeline risks. And many people have vowed to continue the fight against the pipeline. If ever a recent Canadian policy decision screamed out for a better structure, it is this one.

Fixing the process to get good decisions in a timely way requires fresh thinking from governments. Here, the behavioural sciences can help. The Democrats in the White House and the Conservatives on Downing Street have begun to formally embrace insights from the behavioural sciences as a way to improve the quality of their decision-making, strengthening their in-house and science-based decision-making capabilities. Behavioural science has moved from theory to action, in government and, increasingly, in corporations.

Are Canada’s civic and corporate leaders ready to commit to a path of capacity building around better decision-making? To do so would require understanding that the stakes are high in the choices we face, from getting the energy and environment mix right to how we respond to the changing shape of our economy in an age of technological transformation. These are questions with implications for humanity and the planet. We can’t afford to fail.


Joseph Árvai
Dr. Joseph Árvai is the Max McGraw Professor of Global Sustainable Enterprise in the School of Natural Resources & Environment, and the Ross School of Business at the University of Michigan. He is also the Director of the Erb Institute for Global Sustainable Enterprise at the University of Michigan. He can be reached on Twitter at @DecisionLab.
