Thursday, November 11, 2010

Moneyball and health care quality statistics - Part 1 of 2

At a recent conference, Harvard surgeon and best-selling author Atul Gawande told the audience of health professionals and policy makers that he always assigns his medical students a book about baseball called Moneyball: The Art of Winning an Unfair Game. It's the story of how the 2002 Oakland Athletics, who had one of the lowest payrolls in professional baseball, consistently outcompeted better-financed teams thanks to their general manager's unrivaled ability to evaluate and appropriately value players. Oakland took advantage of other teams' tendency to overvalue players based on word-of-mouth assessments of talent, or on commonly measured statistics, such as batting averages and stolen bases, that had little relationship to winning games.

Gawande's point was that many of Moneyball's lessons apply to medical care, particularly to evaluating the performance of doctors and hospitals. Take my own referral habits. Since I have a policy of not accepting close friends and family members as patients, I refer them to doctors I've worked with in the past or met at medical conferences. My loved ones may be reassured by my recommendation, and I assume I'm pointing them to a good doctor.

But I also have a nagging worry that my gut instincts about a doctor I'm acquainted with may not correlate with the quality of care that doctor provides. I have no way of knowing whether my friends and family members will get better or worse care from my referral than if they had picked a name at random from a list. In fact, a recent study published in the Archives of Internal Medicine found that publicly available data on physicians (medical school attended, malpractice lawsuit history, specialty board certification) are poor predictors of whether they adhere to accepted standards of care, such as checking cholesterol levels in patients with diabetes and performing Pap smears on adult women at least every three years.

Hospital ratings can suffer from the same problem. Take the well-known U.S. News & World Report "Best Hospitals" rankings. Whether a hospital is ranked in a particular specialty depends on its score, and almost a third of that score comes from the hospital's reputation among specialists. There may be a good argument for weighting reputation that heavily, but to me it feels too much like my own physician recommendations.

More objective data are available. The website of the federal Centers for Medicare and Medicaid Services features a Hospital Compare tool that lets patients search for and compare up to three hospitals at a time on statistics such as the percentage of patients with heart failure who receive appropriate discharge instructions, medications, and smoking cessation counseling. It tells you whether readmission and death rates for patients with heart attacks, heart failure, or pneumonia are better than, the same as, or worse than the national average. (Higher readmission rates could indicate that the hospital didn't do a good job of treating the patient in the first place or provided inadequate instructions for follow-up care.) It also reports subjective measures, such as patient survey results on the responsiveness of nurses and doctors, the cleanliness of rooms and bathrooms, and how well pain was controlled.
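For readers who like to see the arithmetic, here's a minimal sketch in Python of the two ideas above. The weights, rates, and margin are invented for illustration; they are not the actual U.S. News scoring formula or the CMS risk-adjustment method. It shows how a reputation-heavy weighting can let a famous hospital outrank one with better measured outcomes, and how a readmission rate gets labeled better than, the same as, or worse than a national average.

```python
# Illustration only: the weights, rates, and margin below are made up for this
# post and do not reflect the actual U.S. News or CMS Hospital Compare methods.

NATIONAL_READMISSION_RATE = 0.245  # hypothetical national average (heart failure)


def composite_score(reputation: float, outcomes: float, experience: float) -> float:
    """Blend component scores (each 0-100) with illustrative weights.

    Roughly a third of the weight goes to reputation, echoing the point
    above that reputation drives a large share of a hospital's ranking.
    """
    return 0.33 * reputation + 0.47 * outcomes + 0.20 * experience


def compare_to_national(rate: float,
                        national: float = NATIONAL_READMISSION_RATE,
                        margin: float = 0.02) -> str:
    """Label a hospital's readmission rate against the national average.

    A real comparison would use risk adjustment and confidence intervals;
    a fixed margin stands in for that uncertainty band here.
    """
    if rate < national - margin:
        return "better than the national average"
    if rate > national + margin:
        return "worse than the national average"
    return "no different from the national average"


# A reputation-heavy weighting lets a well-known hospital (strong reputation,
# weaker measured outcomes) outscore a lesser-known one with stronger outcomes.
print(composite_score(reputation=90, outcomes=60, experience=70))  # ~71.9
print(composite_score(reputation=50, outcomes=85, experience=70))  # ~70.5
print(compare_to_national(0.21))  # better than the national average
print(compare_to_national(0.25))  # no different from the national average
```

Shrink that made-up reputation weight and the two hospitals trade places, which is exactly the Moneyball worry about valuing word of mouth over results.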

***

The above post (to be continued in a few days) originally appeared on my Healthcare Headaches blog at USNews.com.