Umar Farouk Abdulmutallab (UFA), the thwarted Christmas Day Northwest bomber, has an almost comically perfect résumé as a potential terrorist. His own dad outed him to the authorities. He spent an unusual amount of time in Yemen. The computer screen virtually screamed DON'T LET THIS GUY ON THE PLANE. Nabbing the culprit should have been no more difficult than whipping Sarah Palin on Jeopardy. Yet the man figuratively wearing an orange suit with an arrow pointing to his pant leg labelled "explosives" managed to board the plane and begin assembling his bomb before other passengers subdued him.

The solution is obvious - at least it is to the Americans and copycat Canadians. Make everyone come to the airport earlier and wait in long lines for screening. Make 85-year-old grandmothers take off their shoes, abandon their shampoo, walk through whole-body scanners, and don't let them pee during the last hour of the flight. More activity, more cost, more delay will mean better security. Everyone, don't you know, is a terrorism threat.

There is not a citizen with an ounce of common sense unaware of how bone-stupid this is. The Americans are simply too rich for their own good. In spite of the hundreds of billions they spend on intelligence, homeland security, and the military, the Yanks seem always to end up with a Rube Goldberg contraption that never works. They are the Masters of Muda - wasted time and effort, useless work.

What if it's not in spite of, but because of? Only those with huge resources can afford to pursue inelegant solutions. When looking for a needle, they begin with the biggest haystack; often they heap more hay on the hill before they start the search. When looking for a terrorist, suspect everyone - or at least pretend to, even if you don't, just so you look like you're not profiling. Above all, look like you care, and run around like you're committed. Call a code orange. Examine everything and you'll miss nothing. Achieve perfect sensitivity, and the specificity will take care of itself.

But it doesn't. Casting the net too wide creates two fundamental problems. The first is cynicism and ennui among the personnel who need to be motivated and alert to the very small number of passengers who pose a real threat. The second is too many false positives and too much data: even a massive intelligence system can't process it all and prioritize vigilance effectively. The result is truly terrifying: even when the intelligence worked - UFA was a known, fingered threat - the screening still failed. And where anything short of 100% success is potentially fatal, surely it's time to rethink the approach, or outsource the whole enterprise to Israel, which seems to have learned a thing or two about how to safeguard El Al.

Healthcare faces this dilemma every day: the more we seek sensitivity (picking up every conceivable case), the more we forego specificity (identifying only the real cases). How many false positives (healthy people labelled as sick) is one less false negative (sick people labelled as healthy) worth? Which risk is worse - overtreating thousands after a false alarm (a common result of overscreening for prostate cancer) or undertreating some for an undetected remediable condition (a common result of colon cancer underscreening)?
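The arithmetic behind this tradeoff is worth making explicit. The following sketch uses purely hypothetical numbers (the prevalence, sensitivity, and specificity are illustrative assumptions, not figures from any real screening program) to show why even a very sensitive test, applied to a population where the condition is rare, produces mostly false positives:

```python
# Hypothetical illustration: a condition affecting 1 in 1,000 people,
# screened with a test that is 99% sensitive and 95% specific.
population = 1_000_000
prevalence = 0.001      # 1 in 1,000 actually has the condition
sensitivity = 0.99      # true-positive rate: sick people correctly flagged
specificity = 0.95      # true-negative rate: healthy people correctly cleared

sick = population * prevalence
healthy = population - sick

true_positives = sick * sensitivity             # sick and flagged
false_positives = healthy * (1 - specificity)   # healthy but flagged anyway

# Positive predictive value: of everyone flagged, what fraction is really sick?
ppv = true_positives / (true_positives + false_positives)

print(f"Total flagged: {true_positives + false_positives:,.0f}")
print(f"Truly sick among the flagged: {true_positives:,.0f}")
print(f"Positive predictive value: {ppv:.1%}")
```

Under these assumed numbers, about 50,940 people are flagged but only 990 of them are actually sick; the positive predictive value is roughly 2%, meaning about 98% of positives are false alarms. That is the "overtreating thousands" the paragraph describes, and it is driven by the low base rate, not by any defect in the test itself.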

Here's where we need to find a way to trust calculation over psychology. If you or your loved ones die because a bomber used the airplane washroom as a lethal weapons lab or a battery of physicians missed a diagnosis, there is cold comfort in numbers and probabilities. The natural reaction is to rail and demand more, damn the cost and the collateral damage. We must not stop until tiny risk becomes zero risk. Every failure is a system failure; a stone unturned is an opportunity lost.

But there is another option. Acknowledge the reality of our times and accept that the quality of individual and civic life goes hand in hand with a certain amount of risk. Terrorists are going to kill innocent civilians. The pool of potential terrorists remains very small, and it is sloppy thinking to infer that if you don't put 85-year-old women through the security wringer, Al Qaeda wizards will recruit one to blow up the flight to Dubuque. It is folly to respond to every failure with a massive assault on common sense; better to learn its root causes and respond more strategically, not least by recognizing the consequences of too blunt an approach.

So it is in healthcare. People are going to die because of missed diagnoses, just as people will die because of overdiagnoses. If you spend your life worrying about every tic and blemish morphing into a lethal tumour, you will climb onto a diagnostic treadmill and lead a life that is literally pathological. Hypersensitivity is the spirit of the times; it drives up utilization and costs and defines normal as an exceptional condition.

Put another way, there is danger in investing too much confidence, money, and psychic energy in increased sensitivity while paying too little attention to specificity. And given the way humans work, it's a self-defeating strategy. When too much is important, nothing is important; an obsession with sensitivity will become an analgesic that numbs us to the real deal. Somewhere, sometime, a bleary-eyed, conscientious person will miss something important amid the expanding clutter of supposedly relevant phenomena. The key to great detective work is to narrow the list of suspects. In airports and in healthcare, the challenge is to know whom to leave alone.

About the Author

Steven Lewis is a Saskatoon-based health policy consultant and part-time academic who thinks the healthcare system needs to get a lot better a lot faster. Contact him at: Steven.Lewis@shaw.ca