Jewish World Review August 10, 2005 / 5 Av, 5765

Walter Williams

Making intelligent errors

We're not omniscient. That means making errors is unavoidable. Understanding the nature of errors is vital to our well-being. Let's look at it.

There are two types of errors, nicely named the type I error and the type II error. A type I error is rejecting a hypothesis that is in fact true; a type II error is accepting a hypothesis that is in fact false. In decision-making, there's always a non-zero probability of making one error or the other. That means we're confronted with the question: Which error is less costly? Let's apply this concept to a couple of issues.
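The two error types trade off against each other: raising the bar for rejecting a hypothesis produces fewer type I errors but more type II errors, and vice versa. A minimal simulation of that tradeoff (the distributions and thresholds below are invented purely for illustration):

```python
import random

random.seed(0)

def error_rates(threshold, trials=100_000):
    """Estimate both error rates for a simple threshold test.

    World A: the hypothesis is true;  evidence ~ Uniform(0, 1).
    World B: the hypothesis is false; evidence ~ Uniform(0.3, 1.3).
    We reject the hypothesis whenever the evidence exceeds the threshold.
    (Both distributions are hypothetical, chosen only so they overlap.)
    """
    # Type I error: hypothesis is true, but evidence crosses the bar, so we reject.
    type1 = sum(random.uniform(0.0, 1.0) > threshold
                for _ in range(trials)) / trials
    # Type II error: hypothesis is false, but evidence stays under, so we accept.
    type2 = sum(random.uniform(0.3, 1.3) <= threshold
                for _ in range(trials)) / trials
    return type1, type2

strict = error_rates(0.95)   # high bar: few type I errors, many type II
lenient = error_rates(0.55)  # low bar: many type I errors, few type II
```

Raising the threshold shrinks the type I rate while inflating the type II rate; no threshold drives both to zero, which is why the only real question is which error costs more.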

The stated reason for going to war with Iraq is that our intelligence agencies surmised Saddam Hussein had, or was near having, nuclear, biological and chemical weapons of mass destruction. Intelligence is never perfect. During World War II, our intelligence agencies thought that Germany was close to having an atomic bomb. That intelligence was later found to be flawed, but it played an important role in the conduct of the war.

Since intelligence is always less than perfect, we're forced to decide which error is least costly. Leading up to our war with Iraq, the potential errors confronting us were: Saddam Hussein had weapons of mass destruction and we incorrectly assumed he didn't. Or, he didn't have weapons of mass destruction and we incorrectly assumed he did. Both errors are costly, but which is more costly? It's my guess that it would have been more costly for us to make the first error: Saddam Hussein had weapons of mass destruction and we incorrectly assumed he didn't.

Another example of type I and type II errors hits closer to home. Food and Drug Administration (FDA) officials, in their drug approval process, can essentially make two errors. They can approve a drug that has unanticipated dangerous side effects (type II). Or, they can disapprove, or hold up approval of, a drug that's perfectly safe and effective (type I). In other words, they can err on the side of under-caution or err on the side of over-caution. Which error do FDA officials have the greater incentive to make?

If an FDA official errs by approving a drug that has unanticipated, dangerous side effects, he risks congressional hearings, disgrace and termination. Erring on the side of under-caution produces visible, sick victims who are represented by counsel and whose plight is hyped by the media.

Erring on the side of over-caution is another matter. A classic example was beta-blockers, which an American Heart Association study said would "lengthen the lives of people at risk of sudden death due to irregular heartbeats." The beta-blockers in question were available in Europe in 1967, yet the FDA didn't approve them for use in the U.S. until 1976. In 1979, Dr. William Wardell, a professor of pharmacology, toxicology and medicine at the University of Rochester, estimated that a single beta-blocker, alprenolol, which had already been sold for three years in Europe but was still unapproved in the U.S., could have saved more than 10,000 lives a year.

The type I error, erring on the side of over-caution, has little or no cost to FDA officials. Grieving survivors of those 10,000 people who unnecessarily died each year don't know why their loved one died, and surely they don't connect the death to FDA over-caution. For FDA officials, these are the best kind of victims — invisible ones.
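The incentive argument can be stated as an expected-cost comparison: the same approve-or-delay decision looks different depending on whose costs are counted. Every probability and cost below is hypothetical, chosen only to illustrate the asymmetry:

```python
def expected_cost(p_bad, cost_if_bad, cost_if_fine):
    """Expected cost of a decision, given the chance the drug is harmful."""
    return p_bad * cost_if_bad + (1.0 - p_bad) * cost_if_fine

# Hypothetical numbers, in arbitrary units of harm.
p_harmful = 0.05  # assumed chance the drug has dangerous side effects

# Society's view: delaying a safe, effective drug kills the patients
# who needed it, so delay carries a large cost too.
society_approve = expected_cost(p_harmful, cost_if_bad=50, cost_if_fine=0)
society_delay = expected_cost(p_harmful, cost_if_bad=0, cost_if_fine=40)

# The official's view: approving a bad drug means hearings and disgrace
# (visible victims); delaying a good drug costs him almost nothing,
# because its victims are invisible.
official_approve = expected_cost(p_harmful, cost_if_bad=100, cost_if_fine=0)
official_delay = expected_cost(p_harmful, cost_if_bad=0, cost_if_fine=1)
```

With these invented numbers society is better off approving (expected harm 2.5 versus 38), while the official is personally better off delaying (0.95 versus 5). The over-cautious type I error is rational for the official even when it is the costlier error for everyone else.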

When an FDA official holds a press conference to announce the agency's approval of a new life-saving drug, I'd like to see just one reporter ask: How many lives would have been saved had the FDA not delayed the drug's approval?

The bottom line is, we humans are not perfect. We will make errors. Rationality requires that we recognize and weigh the cost of one error against the other.


© 2004, Creators Syndicate