Maybe health care consumers aren't as dumb as they look

It’s been clear for some time that consumers don’t tend to make use of health care quality reports even when they’re available. If anything, consumer reliance on such reporting is dropping. According to the Kaiser Family Foundation:

Fewer U.S. residents use Web sites that rate health services when selecting insurance plans, hospitals or physicians, according to state and national studies…

A survey released in October by the Kaiser Family Foundation found that fewer than 15% of U.S. residents used quality ratings services to help them make decisions about health insurance plans, hospitals or physicians, compared with about 20% of people who said they had used comparative quality ratings in 2004 and 2006. Most people said they never have seen or used comparative quality information services, the survey found…

Bryan Liang, executive director of the Institute of Health Law Studies at California Western School of Law, said, “The basic problem of these kinds of ranking systems is that patients do not choose on the basis of scores,” adding, “They choose on the basis of personal familiarity and experience with the health care entity or provider.”

It sure sounds like consumers need more education to act on the information that’s out there. The CMS website Hospital Compare has even been running an advertising campaign to encourage consumers to make use of such data.

But maybe consumers are rational to ignore quality ratings, at least for hospitals. An article in the current Health Affairs (Choosing The Best Hospital: The Limitations of Public Quality Reporting) reveals why this may be so.

The authors identified five public reporting services (Hospital Compare, HealthGrades, Leapfrog Group, US News and World Report, and Massachusetts Quality and Cost Council) and used them to compare various Boston-area hospitals on four non-emergent conditions: community-acquired pneumonia, total hip replacement, percutaneous coronary intervention (PCI), and coronary artery bypass graft (CABG).
The services provided wildly inconsistent rankings, even on the same measures. For example:

…the two hospitals ranked first for CABG by at least one service were also ranked fourth and last by another… Conversely, the two hospitals that were ranked last for CABG by at least one service were ranked first or second by another.

But even more damning is that the ratings don’t reveal any meaningful differences.

Most rating systems did not perform statistical tests, but when they did, all nine hospitals were indistinguishable. In fact, only one hospital (out of 71) in the state had cardiac mortality that was statistically better than the mean.

It’s actually even worse than that. The scores for each hospital represent an average across physicians and cases, so who’s really able to say what the quality will be for a given patient with a given doctor?

Considering the state of the art, no wonder people rely on personal experience, relationships, and anecdotes when choosing a hospital.

By the way, my preferred way to choose a hospital or physician is to speak to the fellows, or, better yet, have a family member who’s a doctor do so. Fellows are in the best position to see and understand what really goes on, and are still young and idealistic enough to level with you about it. I admit this is not a practical route for most people, but that doesn’t stop me from recommending it.

December 3, 2008

6 thoughts on “Maybe health care consumers aren't as dumb as they look”

  1. The idea we should rely on the anecdotal opinions of medical fellows because public quality report cards on hospitals don’t all agree with one another is like saying we should rely on our mechanics to tell us which car to buy and ignore all the varying consumer ratings guides on the market. Whether you are choosing a car or a health care provider, you ought to have access to reliable comparison data, not just anecdotes and opinions. Not all quality reporting is alike, and the idea of public reporting is so new consumers are still learning how to use which data. I won’t speak for all the report cards, but employers and large purchasers created Leapfrog to help pioneer this new strategy of selecting hospitals based on reliable information–which means evidence-based, not anecdotal. These purchasers ensure that Leapfrog’s data and measures are evidence based, NQF-endorsed, and vetted by the nation’s top experts in hospital quality. The data in Leapfrog’s survey shows variation in quality among hospitals, which is why purchasers, health plans, and hospitals track survey trends so closely.

  2. Yes,

    The answer is TO PAY patients to go to the best selected hospitals and doctors. Sounds stupid, right? Of course, but there is really no incentive for people unless they get paid.

    How are these hospitals, physicians and health institutions ranked? In terms of cost to insurers, or outcomes reported? That seems to mean more to the insurers than to the patients, because patients are not directly benefiting. Yes, they may have a chance to get better results, but patients have a hard enough time getting motivated enough to take responsibility FOR THEIR HEALTH.

    It’s a shame that people don’t take more advantage of these systems… but what’s the real motivation for them?


  3. Leah, I don’t think your analogy is quite right. It would be more like expecting consumers to rely on Consumer Reports even if there were no real difference between the full red dots and full black dots.

    Having said that, ratings from Leapfrog and others are useful for purchasers and providers, but I’m not sure they’re really so useful for patients yet. Unfortunately in Eastern Massachusetts at least, purchasers are unwilling to exclude Partners from their networks or even to charge patients differential rates for using different hospitals, as Steve alludes to.
