Will Integrity of HEDIS Data Improve with ’98 Version?

At first glance, the ’98 HEDIS looks a lot like the ’97 model. But lift the hood and you’ll find that NCQA re-engineered HEDIS to give better performance, thanks to improved data-collection techniques.

Sometimes, it’s what you don’t see that matters. Eight months from now, the 1998 Health Plan Employer Data and Information Set will give purchasers virtually the same kinds of information it provided last year. But the difference, says the National Committee for Quality Assurance–which developed HEDIS–will be in its reliability.

After changing the recipe for measuring health plan performance every few years, NCQA thinks it now has it right. HEDIS 1.0, released in 1991, was a starting point for purchasers to gauge the value they got for their health care dollars. Two years later, HEDIS 2.0 added performance measures intended to hold plans accountable for meeting enrollees’ needs. The maturity of HEDIS blossomed last year with version 3.0’s inclusion of yardsticks relevant to Medicare and Medicaid populations, as well as measures meant to nudge plans into addressing public health issues.

From here on, NCQA will merely fine-tune its HEDIS recipe, adding a pinch of this and a bit of that annually. With minor revisions, HEDIS 3.0/1998 contains the same measurements as its predecessor. But by tightening plans’ sampling and reporting procedures, and by unleashing independent auditors to check plan-reported data, NCQA hopes to reassure purchasers who are wary that some plans might take advantage of procedural loopholes to try to inflate their scores.

Subtle differences

Unlike previous remakes, the ’98 edition introduces no new performance measures. One existing measure, treatment of children’s ear infections, was moved from the reporting set (measures that participants report to NCQA) to the testing set (those under development) because some plans misinterpreted it.

“The intent was to show percentage of children with first-time infections on inappropriate antibiotics,” says Joseph Thompson, M.D., M.P.H., NCQA’s vice president for collaborative research. “But the way it was specified, it looked like appropriate antibiotics.” NCQA hopes to clarify reporting specifications in time for HEDIS ’99.

HEDIS ’98 drops 14 narratives that had been required in ’97. These are written descriptions of operational aspects such as physician compensation, utilization management processes and rate trends. NCQA felt the information wasn’t comparable across plans and therefore wasn’t useful to purchasers.

Consumer advocates may argue that disclosing information about physician compensation, for instance, would indeed be useful. But Philadelphia lawyer Alice Gosfield, who lectures frequently on issues of accountability and who is chairman of NCQA’s board of directors, says market, regulatory and political forces already under way address those issues. HEDIS, she says, does not have to be the sole source of information about plans.

“When a plan presents narrative information, it’s naturally self-serving,” she suggests. “Whether removing this step impedes accountability depends on how a plan provides that information. HEDIS is about quantification of performance, and narrative information is not about quantification of performance.”

For plans, the biggest change governs how they report HEDIS data. NCQA says the days when they could, as Gosfield wrote in Health Affairs last year, “turn themselves inside out to report good HEDIS data” are gone. NCQA now specifies the methodology by which plans must establish samples for collecting data.

“If a plan pulled a random sample on a specific measurement–such as breast cancer screenings–and scored poorly, they could discard the sample and pull another,” says Thompson. “Now we have forced plans to pull only one sample.”

Similarly, NCQA strengthened exclusion criteria so that for each measure, groups with like characteristics are compared across health plans.

Honor system fading?

While plans are not required to submit audited data under HEDIS ’98, NCQA expects more of them to do so–thanks in part to competitive pressures. Last year, some HMOs that transmitted audited data complained that NCQA’s Quality Compass ’97 report listing health plan HEDIS scores didn’t provide apples-to-apples comparisons, because they were ranked against plans that submitted scores on a take-our-word-for-it basis.

NCQA has licensed six organizations to audit HEDIS data. One is Ipro, a Long Island, N.Y.-based health care quality evaluation company whose clients include managed care companies and the Health Care Financing Administration.

Ipro will examine plans’ information systems and data collection reporting processes. Herman Jenich, Ipro’s senior director of managed care, says several variables can affect the validity of data reported to NCQA–among them, plans’ technical capabilities and their grasp of HEDIS. “HEDIS is hard,” says Jenich. “If one line of programming is off, that can be a significant error.”

There are two components to an audit: a general review of health plan data systems and processes, followed by examination of samples for specific measures. “You take lessons you learn from the general systems review to identify measures that may be risks for a plan,” Jenich explains. Ipro also will interview HMO personnel onsite. “The documentation tells you so much, but you really want to talk with people who did the work to find out what they did and why.”

HEDIS ’98 may not satisfy critics who felt that earlier versions paid plenty of attention to process but not enough to outcomes. But should it? While outcomes are one way to define health plan quality, Gosfield thinks employers and consumers are likely to have different perceptions of what quality means. “Outcomes speak significantly to the overall health of populations, but they are not how individuals experience their health care interactions,” she says.

Gosfield says there are intermediate outcomes and there are ultimate outcomes, the latter requiring longitudinal perspective. “HEDIS measures intermediate outcomes–beta blocker use after a heart attack, prenatal screenings, immunizations–which are not ultimate outcomes. They are process measures.”

Whether NCQA will move in that direction after collecting several years of data is unknown. But for the short term, the work of NCQA’s internal measurement advisory panels gives a glimpse of how HEDIS may continue to evolve.

The panels develop and refine measures for future versions of HEDIS. “Their highest priorities are measures of care for those with chronic diseases,” Thompson points out. In March, NCQA will announce which measures, based on the panels’ recommendations, will be in HEDIS ’99.

Those recommendations will focus on asthma, cardiovascular disease, behavioral health and women’s health. Several asthma measures under consideration–including use of controllers and school and work days lost to asthma–may be refined in time to appear in HEDIS ’99.

As HEDIS evolves, health plan input is crucial. Several hundred plans have representatives on NCQA’s HEDIS users group, which gives the agency feedback about performance measures that are either in place or proposed. Thompson says that input helped to shape HEDIS ’98, and will continue to do so as the program grows.

“We are cognizant of the burden adding measures places on plans. That is counterbalanced by purchasers’ needs,” he says. “As we add measures, we reassess the burden, so that the information we collect is appropriate for everyone.”