NCQA Puts Pressure on Plans That Don’t Report HEDIS Scores

Overall, HEDIS performance didn’t change much from 1997 to 1998. More troubling is an apparent move away from accountability.

Now hear this: If you’re a health plan that doesn’t participate in HEDIS, you’re one of the lower forms of HMO life. If you do participate but aren’t willing to share your scores with the public, one would have to wonder if you’re much better.

OK, nobody at the National Committee for Quality Assurance actually said this, but that was NCQA’s drift when it released Quality Compass ’99, a compilation of its Health Plan Employer Data and Information Set scores for calendar year 1998.

In previous years, NCQA tiptoed around the question of what to make of health plans that don’t play along with HEDIS, the reigning standard of HMO-quality measurement. We know little or nothing about plans that did not participate, NCQA stressed last year when it released Quality Compass ’98.

Note this year’s different tone: “Given the mountain of evidence we now have, it’s difficult to understand why anyone would want to do business with a health plan that doesn’t participate in accountability efforts,” says NCQA President Margaret E. O’Kane. “They just don’t measure up and, worse, they don’t appear to be trying.”

Three assumptions

Three years into the modern HEDIS era, which began with version 3.0 in 1996, NCQA has enough apples-to-apples data to make some generalizations:

  • Plans that publicly report scores consistently show steady performance improvement.
  • Publicly reporting plans outperform their peers, whether those other plans participate in HEDIS but keep scores private — or don’t bother with HEDIS at all.
  • NCQA-accredited plans demonstrate better HEDIS performance than those without the group’s approval.

Accountability is the fire under health plans’ feet that forces improvement, NCQA contends. When this year’s results from plans that always have been willing to bare their HEDIS scores are compared against those for whom the reporting process is new, old hands outperform new by 6 to 25 percentage points, depending on what’s measured.

Some of that is due to the learning curve. First-time plans have to master nuances of data collection and reporting, and they often blame those hard-learned lessons for relatively poor first-time scores.

Perhaps that’s one reason NCQA asks the media not to be too harsh on plans whose scores fall near the bottom of Quality Compass ’99 measures. “Evidence suggests that nonreporting plans have lower aggregate HEDIS results than plans that report,” NCQA writes in its Quality Compass press kit. “It would be wrong to conclude that plans appearing at the bottom of the performance spectrum are the ‘worst’ plans. Plans that truly deserve that title likely do not appear in Quality Compass at all.”

The secrecy aspect troubles NCQA, which has seen the number of publicly reporting plans drop 19 percent since 1997, despite an overall increase in health plan participation. Market pressures apparently have not encouraged plans to come clean, a development that may stem partly from numerous surveys suggesting low employer interest in health plan quality. It will be interesting to see how NCQA accreditation affects HEDIS participation and plans’ willingness to report their data publicly; last month, HEDIS performance began to count for one fourth of an HMO’s accreditation score.

Of New England and beta blockers

Of those that do report, NCQA says, the results are meaningful. Eighty-five percent of the HMO data submitted to NCQA was audited. General findings:

Regionally, New England leads the nation — for the third straight year — in performance on eight core measures. And again, the South Central U.S. has some catching up to do, averaging 10 to 30 percentage points behind New England on key measures.

There is a strong link between HEDIS rankings and member satisfaction. Better HEDIS performers tended to earn higher marks from consumers.

The gaps between lowest- and highest-scoring plans in a given measure are still wide. In the most stunning example of disparate performances, rates of eye exams given to diabetic members range from 80 percent all the way down to 10 percent.

Worse, aggregate health plan scores for many other core measures showed little or no improvement, or even a slight decline, from the previous year’s survey.

But for a few high-profile issues, average scores increased substantially. One shining example is use of beta blockers after a heart attack. Since 1996, plans’ combined performance on this measure has jumped from 62.5 percent to almost 80 percent. Given the attention this measure has received over the past two years, it could be argued that HEDIS has done more to raise awareness of beta blockers as a proven, relatively inexpensive way for physicians and health plans to save lives than all of the clinical guidelines that preceded it.

Further information about Quality Compass ’99 and HEDIS is available on NCQA’s web site.