JCAHO’s Oryx Initiative Links Outcomes with Accreditation

By March 2, hospitals and long-term care organizations must inform the Joint Commission which clinical outcomes they will measure. The plan telegraphs a change in accreditation standards that few care to predict.


At first blush, it seems so simple. Post-surgical infections, high Cesarean-section rates, deaths following cardiovascular surgery: These are negatives for health care organizations. So why not accredit only organizations demonstrating good results?

Each year, the Joint Commission on Accreditation of Healthcare Organizations evaluates 18,000 such entities and programs in the United States, including hospitals, home health agencies, organizations that provide long-term care, behavioral care, lab services and ambulatory care, not to mention health plans and integrated delivery networks. Accreditation is based on many criteria, such as staffing, plant and internal procedures, but until now, how all of these factors actually affect patient care has not been the focus.

Several years ago, JCAHO proposed requiring hospitals to submit outcomes data. The outcry was immediate: too much paperwork, numbers not comparable or meaningful, and so on.

“The way this is playing out,” admits Jerod M. Loeb, Ph.D., JCAHO’s vice president for research and evaluation, “is not the way we planned. We thought everyone would use the measurement system we developed. We figured the C-section rate in Hospital A could be compared to that of Hospital B. But we learned that the Joint Commission was not the only voice in the wilderness, that other entities had developed systems of measurement.”

Loeb says the Joint Commission decided to make its program more flexible by using those other systems of measurement. “We are all moving toward the same goal,” he says, which is standardization.

Perhaps to set the new approach apart, JCAHO selected an unusual name, Oryx, conjuring up a misleading, but no doubt comforting, image of a carefree antelope leaping across a plain. Oryx lurches into being March 2, the deadline for hospitals and long-term care facilities to report which standards of measurement they will embrace as a first step toward outcomes reporting. Networks, health plans and preferred-provider organizations seeking JCAHO accreditation are governed by different standards.

Choosing measurements

An earlier deadline of Dec. 31, 1997, was extended by 60 days, but the moment of truth is arriving. The requirements and schedule are a complicated stew of compromise and anxiety. Basically, hospitals and long-term care organizations select an outcomes measurement program from one or more JCAHO-accepted vendors. The program or programs must track outcomes for at least two clinical areas that, together, cover at least 20 percent of a hospital’s patient base. Hospitals may choose up to five performance measures — but if they do choose five, they do not have to cover the 20 percent.

It helps to know what a measure is. JCAHO defines a performance measure as a quantitative tool (e.g., a rate, ratio or index percentage) that indicates an organization’s performance in relation to a specified process or outcome. In general, this means measuring an organization’s delivery of clinical services against an external benchmark. Example: a hospital’s C-section rate as a percent of all deliveries, measured against statistics for a comparable group.

The measures chosen, moreover, have to be relevant to the hospital’s internal performance improvement activities. If the hospital is already engaged in a vigorous program to prevent postoperative infection, the rate of postoperative infection in some segment of its surgical populations would be a good measure to choose.

Why only a fifth of a hospital’s census? In an attempt to make the numbers specific and meaningful, JCAHO nixed global measurements, such as overall mortality figures. “It is difficult to look at hospitalwide data,” says Loeb. “Within an organization, though, it is possible to drill down from the global to the specific. For instance, if we see eight-percent hospitalwide mortality, we don’t know if it’s from a bad doctor, poor intensive care techniques or bad determination of surgical risks, but the hospital can determine that.”

There are stepped increases in the reporting requirements, and these, too, have been scaled back from earlier plans. Hospitals start collecting data by the third quarter of this year and report to JCAHO in the first quarter of 1999. This fall, each institution will have to choose at least two more measures, and two more again in 1999.

Will these data affect a hospital’s chances for accreditation? Not directly — for now. According to Loeb, when a hospital’s accreditation comes up for review, JCAHO will look at its measures and if a problem is detected, will require the hospital to show what steps it has taken to deal with it. Implied, but not stated, is the possibility that lack of attention to the measure might reflect negatively on the institution, but no one is sure how. At present, JCAHO will hold results confidential. They will not be released to purchasers to assist them with quality comparisons, but that, too, could change.

Differing requirements

Because many health plans, integrated delivery networks and PPOs are subject to reporting under the National Committee for Quality Assurance’s Health Plan Employer Data and Information Set, as well as other existing programs, these groups follow a different pattern. Instead of contracting with a commercial vendor, these JCAHO-accredited entities had until the end of last year to choose 10 measures from one or more of the consensus-based measurement sets already in existence, such as HEDIS. Other acceptable sets include those developed by JCAHO; the Foundation for Accountability; the University of Colorado Health Sciences Center (for home care); and the University of Wisconsin (for long-term care).

Networks and health plans will collect data on clinical performance, health status, satisfaction and administrative/financial aspects. This, too, is something of a liberalization — formerly, allowable measures were limited to specific performance improvement standards from the Joint Commission’s network and PPO accreditation manuals.

Loeb says such entities do not have to report performance data to JCAHO: “This is complicated by the fact that the plans have hospitals associated with them that function under other requirements.”

George S. Conklin is chief information officer and vice president of information systems for Integris Health in Oklahoma City, part owner of WellCor America, which operates a hospital and several HMOs and PPOs. For the network side, Integris will select HEDIS measures, although Conklin is not focusing on that yet. He also won’t reveal what measures are being selected for the hospital requirements. “Not until the deadline,” he says.

Theory vs. practice

His problem is, how do you determine which measures are integral to your operations? Among his group’s primary areas of business, the largest volume is in cardiology and several postoperative areas. Integris contracted with MedStat to measure hospital outcomes.

But Conklin says his group would have done this, Oryx or no. At his hospital, a performance council identifies problems and establishes corrective action, but as Conklin points out, “Now the accreditation process is involved.” Conklin is worried about being compared to other hospitals. “We have a transplant program here,” he says. “Our patients are of higher acuity.” Although data will be adjusted for severity, Conklin has doubts.

“We think this is an excellent initiative,” says Rick Siegrist, president and CEO of HealthShare Technology Inc. in Acton, Mass., one of several hundred JCAHO-accepted vendors of software and measurement products for hospitals and long-term care facilities. “Our clients start with data they have access to, and over time they will be able to report more sophisticated data.”

HealthShare’s hospital clients will submit data quarterly to Taryn Vian, director of operations. Using the company’s software, she will make sure the data are good, then calculate a comparison to other group indicators, such as a statewide average for the previous year or other customized measurement. She will then feed that back to the client as well as sending it directly to JCAHO. “Over time,” she says, “the client can see trends in its own indicators. It can say, ‘We’re not within three standard deviations. What should we do?’”

Understandably, teaching hospitals are a preoccupation of the Clinical Administrative Data Service, a JCAHO-accepted vendor that also happens to be an arm of the Association of American Medical Colleges in Washington. “All institutions using CADS have a heavy commitment to quality improvement,” says David M. Witter Jr., director of CADS. “The value of Oryx is that it will push other institutions that have not thought so much about it to get involved.

“The system affects teaching hospitals in a different way,” Witter stresses. “Typically, the kind of measures they are talking about provide no value to a research hospital that does few surgeries or has no Ob/Gyn department. They have to get creative to put together measures that will cover 20 percent of patients. Comparing teaching hospitals to community hospitals makes no sense.”

One vendor, Baltimore-based HCIA Inc., spent the 18 months between the time it applied to JCAHO and its approval wisely. “During that time, we adapted our existing products to meet Oryx requirements,” says Senior Vice President Jean Chenowith. When Oryx jelled, HCIA had eight products, including data collection systems.

“A key requirement of the Oryx initiative is sound comparative data,” she continues. “Our experience has been that by providing clinical, financial and productivity benchmarks — instead of averages for the past several years — hospitals have made very rapid changes. They prefer to compare against best performances rather than averages, which point them toward mediocrity.”

The rules to follow

Chenowith says JCAHO gave HCIA general guidelines for developing measurement systems. “Their target was to identify problems that are not narrowed to a single disease or physician, but are systemic across services, diseases or procedures.”

What has been the reaction of hospitals to this requirement? “A lot of smaller hospitals are having difficulty with the process,” she notes, “while some teaching hospitals have such sophisticated programs that they have difficulty making it relevant to their operations.”

Bill Berman, Ph.D., is president of BHOS Inc., a White Plains, N.Y.-based company that offers JCAHO-accepted behavioral health measures. He points out that a JCAHO measure can be as simple as the proportion of psychiatric patients showing improvement in health status upon discharge. “The problem is comparison,” he says. “Hopefully, hospitals will pick measures that say something about their populations.”

Ted P. Patras, M.D., is vice president and director of health care for JBS & Associates, a consulting firm in Chicago. As a physician board-certified in emergency medicine, he has watched the process with interest. “I have been through Joint Commission site visits. The month before the visit, things become more polished. When they leave, what is the result?” he asks rhetorically. “The same concept applies with Oryx.

“Now that hospitals are going to focus on outcomes data,” Patras continues, “should we expect managed care organizations to spend money on clinical practices and provide better health care delivery if outcomes are unfavorable, or will they focus on creating data with better outcomes? You can conduct hundreds of performance studies but it does not guarantee good practice of medicine. Are we really going to improve health care delivery with Oryx, or should we concentrate on standardizing clinical delivery methods?”

Some vendors will help clients orchestrate change, in addition to collecting and reporting numbers for them. Deerfield, Ill.-based MMI Companies Inc. is not only a JCAHO-accepted vendor, but as a risk management company, it can show clients how to repair performance flaws revealed through data, including Oryx data.

As an example, MMI targeted poor Apgar scores for a group of its hospital clients, helping them cut their average of 1.30 low-scoring births per 100 to 0.93. “The outcome was the Apgar scores,” explains Dorothy Berry, R.N., assistant vice president for health care information services, “but the ways to change it were using fetal monitors more effectively and addressing the human factor in interpreting and responding to that monitor information. Most organizations want to go beyond the numbers to create change.”

The bottom line

Because Oryx results, for the most part, will not be made public, and because they only indirectly affect accreditation, what will be the bottom line on improving outcomes? “The responsibility lies with the institution,” says JCAHO’s Loeb. “But as we begin to identify trends, we can serve as a means of stimulating change. And we will also be able to identify the high performers.”

Berman disagrees. “I don’t think the Oryx process itself will improve performance. It’s not designed to do that. It’s designed to set minimum standards for quality. The real question is, can organizations take outcomes measurements and combine them with process measurements to see what should be done?”

Hospitals that have concentrated on clinical outcomes for a while and that are willing to be held accountable for them may choose to be included in JCAHO’s Oryx Plus. Under this program, hospitals voluntarily release their information to the public. In some markets, hospitals may feel competitive pressure to participate in Oryx Plus, which kicks off July 1. There are no restrictions on how hospitals can use comparative information.

“Hospitals participating in Oryx Plus,” promises Loeb, “will be recognized by consumers, employers, payers and government for their commitment to self-evaluation and accountability.

“This is our vision for the future.”

Jean Lawrence is Managed Care’s “Employer Update” columnist.