Now budget-strapped Statistics Canada is abandoning some of its longitudinal studies, raising the question of whether it is time to hand responsibility for managing these surveys to institutions with a longer time horizon than governments.

Notice was given quietly by Statistics Canada last June: “This is the last release of longitudinal data from the Survey of Labour and Income Dynamics (SLID). Effective with next year’s release of 2011 data, only cross-sectional estimates will be available.”

A short, simple, slightly oblique statement that nonetheless dealt a blow to researchers and to any Canadian interested not just in statistics, but in the shades of data that help us better understand ourselves.

The SLID’s demise was a casualty of federal budget-cutting and it raises important questions about how Canada manages long-term data projects.

There is no doubt that Statistics Canada recognizes the value of this survey, and others like it. The SLID is an example of what is called a “longitudinal survey,” a series of questionnaires posed to the same individuals over successive periods of time. These types of surveys allow analysts to examine social and economic dynamics over weeks, months, years and even over generations depending on their design and purpose.

As Statistics Canada’s Web site notes, the SLID “complements traditional survey data on labour market activity and income with an additional dimension: the changes experienced by individuals over time.”

By “traditional survey data” the agency is referring to what is commonly called “cross-sectional data,” information on individuals at a particular point in time. This is how we acquire what we know as “time series data,” a consistent series of statistics that map the evolution of an aspect of the economy or society. The monthly unemployment rate is an example, and there are many others: the inflation rate, retail sales, and even estimates of the Gross Domestic Product are all based on regularly repeated cross-sectional surveys of individuals or establishments, charting the macroeconomics of a country.

Time series data are the bread and butter of national statistical agencies, and the principal reason they undertake individual level surveys. But over the years it became clear to outside researchers that cross-sectional surveys are valuable not just as a way of calculating these macroeconomic data. They have value in their own right.

The responses individuals give to a series of questions about their background characteristics give researchers statistical clues about why economic outcomes occur.

How much of the observed difference in male-female wage rates is due to underlying differences in education or work experience, as opposed to differences in how the labour market rewards those characteristics? Knowing the answer gives us a sense of the degree to which there is wage discrimination.

Or why do recent immigrants have lower wages than their counterparts decades ago? Is it language skills, education levels or intangibles associated with the changing mix in countries of origin?

Statistics Canada slowly accommodated this need to know. In the mid-1980s it started giving researchers access to the individual responses underlying its cross-sectional surveys, always in a way that respected confidentiality.

But as valuable as this information proved to be, it was immediately apparent that it could not fully explain how Canadians live their lives.

Observing someone at a single point in time cannot tell us how long they will be unemployed, poor or rich, what caused the spell to start, or what caused it to end.

Knowing that the average monthly unemployment rate during a particular year is 10% does not tell us if the same individuals are unemployed each and every month, experiencing very long spells of unemployment, or if a new group of unemployed start a very short spell each month. It lacks the detail needed to better understand poverty or, for that matter, high income.
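The point can be made concrete with a small sketch. The numbers below are hypothetical, chosen only to illustrate how two labour markets with identical average monthly unemployment rates can conceal very different experiences of unemployment:

```python
# Illustrative sketch (hypothetical numbers): two labour markets of 100
# workers can share the same 10% average monthly unemployment rate while
# hiding very different spell dynamics.

def monthly_rates(spells, population=100, months=12):
    """spells: list of (person_id, start_month, length) unemployment spells.
    Returns the unemployment rate for each month."""
    rates = []
    for month in range(months):
        unemployed = {p for p, start, length in spells
                      if start <= month < start + length}
        rates.append(len(unemployed) / population)
    return rates

# Market A: the same 10 people are unemployed all year (12-month spells).
market_a = [(p, 0, 12) for p in range(10)]

# Market B: each month a fresh group of 10 begins a one-month spell,
# so 120 different people are touched by unemployment over the year.
market_b = [(m * 10 + p, m, 1) for m in range(12) for p in range(10)]

for name, spells in [("A", market_a), ("B", market_b)]:
    rates = monthly_rates(spells)
    people_affected = len({p for p, _, _ in spells})
    print(f"Market {name}: average rate = {sum(rates) / len(rates):.0%}, "
          f"people affected = {people_affected}")
```

Both markets report a 10 percent average monthly rate, yet in one 10 people endure year-long spells while in the other 120 people each experience a single month of unemployment. Only by following the same individuals over time can the two cases be told apart.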

This is the “additional dimension” that Statistics Canada is suggesting is valuable, and it comes from following the same individuals over time. The National Longitudinal Survey of Children and Youth (NLSCY for those keeping score of acronyms), which followed successive cohorts of children beginning in 1994, is another example.

But Statistics Canada has decided that it, too, must go. The last year of available information from this survey will be for 2009. When I recently described the loss of the survey to a coauthor over the telephone, she paused and said “Ahhh…” with the sort of sadness you’d hear if a friend had died.

Statistics Canada is proposing to stop the longitudinal dimension of the SLID, but to continue using parts of the questionnaire as a cross-sectional survey. As such it will retain the capacity to calculate poverty rates, but have a much more limited capacity to explain how long spells of poverty last, what causes them to start or end, and how many people experience income losses or gains.

The secrets of crowds: researchers get clues from studying individuals over long periods of time. Photo: CP Photo

Innovative as longitudinal surveys sound, Statistics Canada was actually late to this party — and always early to leave. It was late because it is a very cautious and careful statistical agency. Indeed, when the SLID began in the mid-1990s, it was only after more than a decade of experience with successively more complex surveys. A series of surveys called the Survey of Annual Work Patterns in the late 1970s and early 1980s followed the labour market activities of individuals over the course of a particular year; then the Labour Market Activity Survey followed individuals for periods of two or three years, finally giving rise to the SLID. But even this survey had a limited horizon, following individuals for a maximum of six years.

By comparison, consider the United Kingdom.

About 17,500 babies were born during a particular week in March of 1958 in England, Scotland and Wales. They were part of a health survey designed to examine the factors associated with stillbirth and death in early infancy. But we know a good deal more about them.

They were surveyed again at the age of seven in 1965, and again in 1969, 1974 and 1981, and yet again in 1991, 2000, 2004 and 2008; there are even plans to survey them in 2013, when they turn 55 years old. These surveys are known as the National Child Development Study, or simply the NCDS.

They have inspired the popular British TV documentaries in the “Up” series, but even more importantly they have inspired successive waves of surveys on other cohorts of children: a group born in a particular week in 1970 called the British Cohort Study and another group born in 2000-01 called the Millennium Cohort Study.

Consider Germany.

The German Socio-Economic Panel began in 1984 with a representative sample of about 12,000 individuals. They have been surveyed every year, for more than 25 consecutive years now.

Or take the United States.

The grand-daddy of all longitudinal surveys is the Panel Study of Income Dynamics, only one of many, many US longitudinal surveys, but one whose Web site proudly proclaims it "the longest running longitudinal household survey in the world." The PSID, as it is affectionately called, began in 1968 as an outgrowth of Lyndon Johnson's "War on Poverty." It started with about 18,000 individuals, who have been followed on an annual or biennial basis ever since. So have the children, and now even the grandchildren, of the original cohort, once they became old enough to leave home and form their own households.

Alas, here in Canada the NLSCY is dead, and now the longitudinal part of the SLID has joined it. Statistics Canada hopes to fill some of the gaps with another new longitudinal labour market survey, which has already been tested and put into the field. The agency claims that this Longitudinal and International Study of Adults (why "international"? Who knows, but the acronym, LISA, sounds great) will be truly longitudinal, extending beyond the limited six-year horizon of the SLID.

It is a major feat of management and organization to put a survey of this sort into the field and sustain it over decades. The capacity to run longitudinal surveys is now part of Statistics Canada's skill set. It can start these surveys. But can it keep them going?

The managers and mathematicians at Statistics Canada are surely as capable and energetic as their British, German and American counterparts, but what do these countries have in common that has led to such longevity, and that Canada may lack?

Many of the foreign surveys are housed and managed by independent research institutes like the German Institute for Economic Research (DIW), or by research institutes affiliated with universities like the University of London or the University of Michigan. The financing comes not from particular government departments, but from established agencies responsible for the social sciences, the equivalent of Canada’s Social Sciences and Humanities Research Council.

The research community has independent control over not just developing content, but managing the surveys and disseminating the data. Further, the data are disseminated easily and widely, creating a broad constituency of users across disciplines, and between public and private sectors.

These agencies, institutes and the user community all have an interest in the long term. At Statistics Canada funding is annual, subject to the trade-offs in managing a whole portfolio of statistical products. It is also dependent on financial support and direction from particular government departments whose interests and priorities ebb and flow, and are tied to broader government objectives.

In a recent interview, the current Chief Statistician of Australia, Brian Pink, made a revealing and important comment: “Neither the Treasurer nor Prime Minister can tell me how to go about my business. They can tell me what information to collect, but they can’t tell me how to do it, when to do it or how often to do it.”

It is telling that the Australian longitudinal labour market survey — the Household, Income and Labour Dynamics in Australia Survey — which was started in 2001 and has funding for a multiyear period, is not being run by the Australian Bureau of Statistics but rather by an institute at the University of Melbourne.

The current Chief Statistician of Canada is in a more challenging position. He must manage surveys whose value is in the long term, running much longer than a fiscal year, and even longer than an electoral cycle.
