The Fraser Institute Wait Time Reports: Madness in the Method, but Method in the Madness
Suppose you want to know how well road traffic moves in your community. The method: once a year, ask 100 road builders to stand by various roads and eyeball the traffic. Their job is to estimate the speed at which the next car will make it down the street. Do not give them radar guns or other speed-measuring devices. Ask them only about that one day, not a sample of days throughout the year. Don’t survey drivers, who have speedometers and clocks. Publish the results with great precision: traffic moves at 36.5 km/hour this year compared to 38.2 km/hour last year.
Along the way, you run into a few problems. Most of the people you ask to take part decline – 84% of them in 2011. A 16% response rate would alarm any competent survey researcher. Pussies! There’s an easy solution: assume the 16% who sign on are representative of the 84% who don’t. Presto! A perfectly valid survey.
And while we’re at it, ignore the possibility that these road builders’ responses may be coloured by, say, their interests in building more roads. Only a conspiracy theorist would imagine that a road builder might – even subconsciously – estimate ever-slower speeds to generate more spending on public roads (or better, pay-as-you-go toll roads). And only a prissy statistician would even raise the notion that the 84% who don’t participate think traffic moves along just fine.
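The selection problem described above can be sketched with a toy simulation. Everything here is hypothetical (the population size, the distribution of estimates, and the response model are all invented for illustration), but it shows how a self-selected minority of respondents can report longer waits than the group as a whole:

```python
import random

random.seed(42)

# Hypothetical population: 1,000 physicians, each with a perceived
# average wait time in weeks. These numbers are invented, not drawn
# from any real survey.
population = [random.gauss(10, 3) for _ in range(1000)]

# Assumed response model: a physician who perceives longer waits is
# more motivated to fill in the questionnaire. This is an assumption
# for illustration, not anything the actual survey measures.
def responds(wait_weeks):
    return random.random() < max(0.0, min(1.0, (wait_weeks - 7) / 20))

responders = [w for w in population if responds(w)]

pop_mean = sum(population) / len(population)
resp_mean = sum(responders) / len(responders)
rate = len(responders) / len(population)

print(f"response rate: {rate:.0%}")
print(f"population mean wait: {pop_mean:.1f} weeks")
print(f"responders' mean wait: {resp_mean:.1f} weeks")
```

If responding correlates with perceiving long waits, the responders' average exceeds the true population average, and simply assuming the 16% who answer are representative of the 84% who don't does nothing to remove the bias.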
But why let science and rigour get in the way of a good story? Each year you release the survey data with great fanfare, the results touted as factual, accurate, and proof that public roadways have gone to hell. Some data by dumb luck occasionally approximate reality, just as I might correctly guess that it will snow in Quebec City next February 6. My predictive feat does not make me a meteorologist any more than the traffic survey method qualifies as science, let alone common sense.
I have just described to you the method the Fraser Institute uses to generate its annual Christmas gift to Canadian health care: the report on wait times. Knowing that its method is essentially absurd, it goes to great lengths to reconcile its “data” with provincially reported figures. Never mind the 16% response rate in 2011, which alone cashiers validity. Even more fundamentally, the questionnaire asks respondents for neither the sources of their estimates nor whether they consult any real data to support their responses. The report spends 14 pages on methodology but, unlike scientific publications, contains not a word about limitations. By its own reckoning it is perfect.
Such is the “wisdom of crowds” approach, except the crowd is getting thinner every year, the crowd is not disinterested, and the survey solicits hunches rather than facts.
There is of course an option. Instead of persistently repeating a fatally flawed survey of physicians to estimate wait times, the Fraser Institute could a) examine and interpret the real data that exists, and b) call for or produce more comprehensive data where it doesn’t. Why doesn’t it do so?
The answer lies in its mission statement, helpfully reproduced at the end of the wait times report: “Our vision is a free and prosperous world where individuals benefit from greater choice, competitive markets, and personal responsibility.” Public health care sucks because it must suck, because it’s public. Therefore, let’s gather skewed estimates on a hot-button issue, retail them as hard data, and lure Canadians toward the promised land of private medicine.
But you know that – the Fraser Institute hardly conceals its raison d’être. The more important question is why the media so willingly and uncritically reports its findings. Some share the FI’s ideology and are happy to retail its sortafacts and promote its prescriptions. Others neither read nor understand the methods underlying the report. Regardless, the Boys on Burrard don’t really care about the carping from the geeks. Their report supplies an annual fix for their donors’ ideological addiction.
There are wait time problems in Canada, and many people suffer because the system is badly organized and wait lists have been poorly managed, if they have been managed at all. Some, like the Fraser Institute, welcome each failure as a gift to the profiteers’ agenda. It’s one thing to draw your own conclusions from accurate, meaningful data; quite another to let your ideology trump science. “The Fraser Institute maintains a rigorous peer review process for its research,” its reports trumpet. Just like Soviet science in the 1930s and Chinese food production reports in the 1970s. Comrades, a grateful nation salutes you!
About the Author
Steven Lewis is a health policy consultant based in Saskatoon and Adjunct Professor of Health Policy at Simon Fraser University.
Catherine Richards wrote:
Posted 2012/01/24 at 09:52 AM EST
Brilliant! I agree with your post in its entirety and I appreciate your creative and engaging writing style which certainly helps to make a very dry topic entertaining as it enlightens.
As one person among ordinary Canadians affected by the healthcare system, I think I have the right to comment that all the time and money spent on collecting and surveying data is a huge waste of resources. Survey results mislead the public, and surveys are often conducted by those with vested interests in the outcome. How can we trust the data, the methodologies, or the motives behind the associated survey reports and recommendations that our healthcare leaders then interpret to their own advantage, hoping to convince us that they are healing rather than killing public healthcare in Canada? I for one am a skeptic, and for good reason. Keep writing. I'll be paying attention.