In the world of health care quality measurement, perhaps nothing these days is more unloved than process measures. Providers are frustrated by the documentation these measures require, while payment models increasingly drive accountability for outcomes over process.

So you might be surprised to learn that process measures have their defenders in large health systems. But they’re not necessarily the same measures CMS wraps into payment programs. Take joint replacement procedures, which tether providers to Surgical Care Improvement Project (SCIP) process measures like timeliness of antibiotic administration or removal of urinary catheters.

Robert Pendleton, MD, chief medical quality officer at University of Utah Health Care, an integrated health system serving patients in six Western states, says the evidence correlating several SCIP measures with outcomes is relatively weak. “We challenged our providers and said, ‘We may want you to include one or two SCIP measures because we have to, but let’s walk through care delivery. What, in your mind, has the biggest impact on outcomes?’ And they came up with a different list than SCIP. They came up with things like patient engagement and getting the patient out of bed immediately after surgery.”

Robert Pendleton, MD

Giving physicians real-time data can be quite effective in improving care processes, says Robert Pendleton, MD, of University of Utah Health Care.

Those are process measures, but they are ones that made a difference, and they proved the treatment team right: their implementation led to lower rates of readmission and surgical site infections.

Choosing the right blend of quality metrics for each DRG is one of the many intricacies of Value-Driven Outcomes (VDO), Utah’s answer to the challenge of how to “do” value-based care. An initiative five years in the making, VDO matches indicators of quality to DRGs with substantial variation in cost within Utah’s own system. Those indicators are a combination of process and outcomes measures—some defined by CMS and some developed internally by care teams.

The results, which Vivian Lee, MD, and her colleagues at Utah published in JAMA last fall, have been impressive: an 11% reduction in direct costs for total joint replacement after two years—one of three VDO pilots reported in JAMA.

(Approaching) perfect takes practice, saves money

Results for University of Utah show that as scores on the perfect care index went up, costs for total joint replacement went down.


Perfect care was achieved if:

  • Patient was admitted to the Orthopedic Trauma & Surgical Specialty unit
  • No patient safety indicators occurred
  • Patient was discharged to home with Home Health
  • Patient did not come to the ED within 90 days of discharge
  • No hospital-acquired conditions occurred

Source: University of Utah
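The perfect care index is all-or-nothing: an admission counts as "perfect" only if every criterion above is met. A minimal sketch of how such a bundle score might be computed over a cohort (field names are hypothetical, not Utah's actual data model):

```python
# Hypothetical sketch of an all-or-nothing "perfect care" bundle check.
# Field names are illustrative; Utah's actual schema is not described in the article.

def perfect_care(admission):
    """An admission is 'perfect' only if every criterion is met."""
    return all([
        admission["unit"] == "Ortho Trauma & Surgical Specialty",
        admission["patient_safety_indicators"] == 0,
        admission["discharged_home_with_home_health"],
        not admission["ed_visit_within_90_days"],
        admission["hospital_acquired_conditions"] == 0,
    ])

cohort = [
    {"unit": "Ortho Trauma & Surgical Specialty",
     "patient_safety_indicators": 0,
     "discharged_home_with_home_health": True,
     "ed_visit_within_90_days": False,
     "hospital_acquired_conditions": 0},
    {"unit": "Ortho Trauma & Surgical Specialty",
     "patient_safety_indicators": 0,
     "discharged_home_with_home_health": True,
     "ed_visit_within_90_days": True,   # one miss fails the whole bundle
     "hospital_acquired_conditions": 0},
]

rate = 100 * sum(perfect_care(a) for a in cohort) / len(cohort)
print(f"Perfect care: {rate:.0f}%")  # -> Perfect care: 50%
```

The all-or-nothing design is what makes the index a stretch goal: a single missed criterion, such as an ED visit within 90 days, fails the entire admission.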

“This is not incremental. It’s a big deal,” Michael Porter, the Harvard Business School professor who is credited with developing the idea of value-based care, said in a JAMA podcast after publication of the study. “I hope this will be the shot heard ’round the world. Now that we have these results, there are no more excuses. Everyone needs to get on the bus.”

The other two pilots were also encouraging. One was an effort to reduce unnecessary hospitalist use of lab tests; average lab test costs per person per day fell 12% in 4,276 patient encounters. The other was aimed at reducing mean time to anti-infective delivery after signs of sepsis; time was cut almost in half over 76 encounters. Results from all three pilots were statistically significant.

Pinning responsibility

VDO is complex, with multiple operational variables. Its “opportunity index,” which measures cost variability among DRGs, helps to prioritize areas for improvement. For each DRG, six to eight national and locally developed process and outcomes metrics compose a “perfect care index,” adherence to which is measured to gauge effects on outcomes. VDO also tracks costs per patient and care team, all the way down to each bandage and minutes of nursing time. Data generated from these variables feed an analytics system that creates tools for providers to see their performance in real time.
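The article does not give the opportunity index formula, but one plausible sketch ranks DRGs by cost variability relative to the mean (coefficient of variation), weighted by case volume, so high-variation, high-volume DRGs surface first:

```python
import statistics

def opportunity_index(costs):
    """Hypothetical ranking score: spread relative to the mean
    (coefficient of variation), scaled by case volume."""
    cv = statistics.stdev(costs) / statistics.mean(costs)
    return cv * len(costs)

# Illustrative per-case direct costs (dollars) by DRG -- invented numbers.
drg_costs = {
    "470 Major joint replacement": [18000, 22000, 35000, 19000, 52000],
    "871 Sepsis": [30000, 31000, 29500, 30500, 31500],
}

ranked = sorted(drg_costs, key=lambda d: opportunity_index(drg_costs[d]),
                reverse=True)
print(ranked[0])  # -> 470 Major joint replacement
```

Under this toy scoring, joint replacement outranks sepsis despite similar mean costs, because its case-to-case spread is far wider; that spread is the improvement "opportunity."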

One of those tools is a scorecard for each admission, showing component costs of care (e.g., pharmacy, labs, or facility use) and whether each measure in the perfect care index was met. Utah has developed scorecard templates for about 30 DRGs and intends to create 50, but none is cast in stone.

Yoshimi Anzai, MD

The right processes matter, says Yoshimi Anzai, MD, the associate chief medical quality officer at University of Utah Health Care. It could be something simple like saying “your mother is going home today after surgery.”

That’s because the measures in the perfect care index for each DRG change periodically. “Once we reach close to 100% [performance], we take that metric out and add something new,” says Yoshimi Anzai, MD, MPH, associate chief medical quality officer at Utah. “It’s always continuous quality improvement for every scorecard.”

Often, the focus of perfect care index measure development and refinement comes back to processes. When care teams were not consistently hitting the “early mobility” measure for joint replacement patients, the root cause turned out to involve in-house physical therapist scheduling. Adjusting physical therapists’ work hours to get more patients out of bed on Day 1 led to shorter lengths of stay and lower costs. Engaging the care team in troubleshooting and refining the measure created champions for it.

A second VDO engagement tool is an online “value explorer,” which allows each provider to see their own costs of care, lengths of stay, and other metrics compared with peers in real time.

“The immediacy of VDO allows us to tap into behavioral economics principles,” says Pendleton. “If we can show within my peer group of 20 hospitalists that two take better care of patients with pneumonia than the rest of us, that is a powerful way to get the other 18 to question [why] and learn from the best.”
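A toy illustration of that peer-group comparison, with invented cost figures: identify the lowest-cost providers and show each of the others their gap to that benchmark.

```python
# Invented per-case pneumonia costs for a hospitalist peer group (dollars).
costs = {"h01": 6200, "h02": 4100, "h03": 5900, "h04": 3900, "h05": 6800}

# The two lowest-cost providers set the peer benchmark.
best = sorted(costs, key=costs.get)[:2]
benchmark = max(costs[p] for p in best)

# Everyone else sees their gap to the benchmark in real time.
gaps = {p: costs[p] - benchmark for p in costs if p not in best}
print(best, gaps)  # -> ['h04', 'h02'] {'h01': 2100, 'h03': 1800, 'h05': 2700}
```

The point is not the arithmetic but the framing: a named gap to identifiable peers, delivered immediately, is what makes the behavioral-economics lever work.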

Yet a third analytic tool within VDO calculates supply costs at the team and provider levels. This tool untangles the complexity of accurately assigning costs to, say, a specific surgeon. For some diagnoses, like joint replacement, costs are easy enough to isolate; for a complicated condition like heart failure, cost granularity is harder to achieve but essential for driving change.

“The data have to be actionable, or you get into this notion of ‘I didn’t order that. That’s not my responsibility,’” says Anzai.

Utah applies the same thinking to lab and imaging tests to curb waste and help physicians understand what they can control. “If you attribute all the lab tests for the entire admission to one discharge provider, they’ll say ‘Well, I wasn’t caring for that patient on those days,’” says Anzai. “So we took one more step to say ‘when you are the attending for these service dates, that order is attributed to you. You have to own the data.’”
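A minimal sketch of that date-based attribution (names and dates invented): each order is assigned to whichever attending covered its service date, rather than to the discharging provider.

```python
from datetime import date

# Invented coverage schedule: (attending, first day, last day), inclusive.
coverage = [
    ("Dr. A", date(2017, 3, 1), date(2017, 3, 3)),
    ("Dr. B", date(2017, 3, 4), date(2017, 3, 7)),
]

def attending_on(service_date):
    """Return the attending covering a given service date."""
    for name, start, end in coverage:
        if start <= service_date <= end:
            return name
    return None

# Orders are attributed by service date, not by discharge provider.
orders = [("CBC", date(2017, 3, 2)), ("Blood culture", date(2017, 3, 5))]
attributed = {test: attending_on(d) for test, d in orders}
print(attributed)  # -> {'CBC': 'Dr. A', 'Blood culture': 'Dr. B'}
```

Attribution by service date is what closes the “I wasn’t caring for that patient on those days” loophole: each order lands with the physician who could actually have ordered differently.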

The intent, she says, is not to chasten physicians; rather, Utah depends on them to standardize care practices and drive culture change. Physicians often know which process, imaging study, or instrument will lead to better outcomes. “Go to the people who provide patient care every day and ask,” says Anzai. “We have to be open to listening to them.”

Small changes, big improvements

The execution of VDO has yielded a few lessons. One pleasant surprise has been the degree to which physicians have been willing to explore new ways of doing things when presented with real-time data. “That has exceeded our expectations,” Lee told JAMA after publication of her team’s article.

Pendleton believes the real-time approach is more effective than many CMS value-based programs, however well intentioned they may be. The two- and sometimes three-year-old data those programs use to spur behavior change, he says, “is distant enough for providers that it’s not nearly as motivating.”

Another lesson is that cost variations within a DRG aren’t always what they seem. The challenge lies in identifying patient subpopulations. For patients receiving hip and knee replacements—a seemingly homogeneous group—the outliers turned out to be joint replacements done on patients with bone cancer in Utah’s cancer hospital. “Those patients are a very different group,” says Pendleton. “The supplies they need are completely different. The complexity of the surgery is different.” Once they were carved out of the index, there was far less variation.

A final lesson is that care redesign does not have to be a major breakthrough; processes that may seem trivial could improve outcomes. A clinical pathway team at Johns Hopkins demonstrated this last year when it reported that setting the bed angle in the ICU to 30 degrees reduced the risk of ventilator-associated pneumonia in patients undergoing CABG.

The same is true at University of Utah: The right processes matter, like adjusting PT schedules or how surgeons talk with family members. Saying “your mother is going home today after surgery” gets the family thinking about what needs to be done to improve post-op outcomes, says Anzai.

And therein lies the difference between how CMS imposes process and how Utah uses it to its advantage.

“The history of process measures has been a somewhat disappointing story,” says Pendleton, who thinks that by the time process measures from on high are scaled up, they have less impact than organically grown measures that encourage providers to think about how to deliver better care. “When you have 2,000 payer-mandated process measures, then health care systems are really stymied. All they can do is keep up rather than innovate.”

The perfect care index, by contrast, “becomes an intrinsically motivating tool to engage our providers in defining the processes in our own culture that we believe will deliver on health outcomes,” he says. “The data don’t have to be perfect because you’re not worried about reporting it to some public entity. You’re using it for the sole purpose of improvement and learning and driving toward better outcomes for patients.”

A string of little experiments

One of the underlying principles of the Value-Driven Outcomes program at Utah is continuous quality improvement (CQI), which has its origins in the postwar Japanese manufacturing boom. The CQI process of measure, analyze, engage, improve, and measure again is reflected in the use of real-time cost data to motivate providers to improve performance and in adjustments to the perfect care index when performance on measures tops out.

It requires a shift in mindset for many physicians, says Yoshimi Anzai, Utah’s associate chief medical quality officer. “In research, you have a definable, controlled environment, and you study it and get results. In between, you won’t change parameters, because you want clean data. But value improvement in the real world requires iterative change.”

To help physicians bridge that gap, Utah brought in professors from the David Eccles School of Business at the University of Utah. They trained physician-led provider teams in how to understand root causes of variability in care and how to apply principles of process improvement to reduce it.

“Physicians have this inquisitive, scientific methodology mind—we’re comfortable with things like hypothesis testing and trying a little experiment and seeing if it works. At the core, CQI is not fundamentally different. It’s just stringing together little experiment after little experiment to see what works,” says Robert Pendleton, Utah’s chief medical quality officer. But when providers are trained to think about quality assurance, it challenges what he calls “some long-held false assumptions—and that is, ‘Well, of course everything I do as a doctor is perfect.’”