A number of years ago, a friend, also a physician, experienced chest pain. His symptoms were serious enough that a reputable cardiologist at a nationally recognized facility performed an invasive coronary angiogram. Afterward he was rushed to the cardiac surgery suite, but not to get a revascularization procedure. Instead, the surgeons worked to repair damage to the heart from the catheterization procedure.
For decades, invasive coronary angiograms have been the gold standard for determining whether a patient has blockage in the coronary arteries and the extent of the blockage. Angiograms involve inserting a long flexible catheter into the bloodstream, usually by way of the femoral artery in the thigh, to deliver contrast agent so the arteries can be seen on an X-ray. Coronary arteries are quite small and constantly in motion because the heart is beating, so getting a clear image is technically difficult. Still, about 1 million patients have invasive coronary angiography in the United States each year.
The diagnostic angiogram is a fork in the road. It allows the cardiologist to determine if a patient has nonobstructive coronary artery disease (no visible blockages present) or obstructive disease. The former typically leads to medical therapy while the latter may lead to angioplasty and stents or to a coronary bypass graft.
If you perform a Google search on recognized complications of coronary angiography, you will see issues such as renal damage, tears in the heart, stroke, injury to the arteries being visualized, heart attack — the list goes on and on. The risk of serious complications is small (between 1% and 2% of patients), but they do occur.
So patients who present with chest pain pose a dilemma for cardiologists. Missing obstructive coronary artery disease could result in myocardial infarction, or worse. Finding and successfully treating obstructive disease does prevent serious heart problems and saves lives, yet an all-comers approach exposes many people to unnecessary risks. It's also expensive.
When dealing with patients with chest pain, physicians usually start by taking a detailed history concerning the nature of the pain. How severe is it? Does it come on with exercise? Is it shooting or dull? Where does it hurt exactly? Information about family history and risk factors, such as smoking, will be gathered. A physical exam is part of the workup. All of this helps physicians decide whether the patient presenting with chest pain should be categorized as having unstable chest pain, which should lead to an immediate hospitalization, or stable chest pain, which is less critical and buys some time so tests can be performed outside the hospital.
Guidelines for the evaluation of stable chest pain recommend progressive noninvasive testing, such as ECGs, a variety of different stress tests, and CT angiography that involves using a CT scanner to get an image of the coronary arteries. Results of noninvasive tests can be used to stratify patients by risk and to identify those who may need invasive coronary angiography.
The hope had been that the gantlet of noninvasive tests would winnow down the number of invasive tests. But several years ago, Manesh Patel, MD, and colleagues reported the results of a huge study (663 hospitals, nearly 400,000 patients) in the New England Journal of Medicine that suggested that noninvasive tests didn’t do a good job of identifying patients for whom an invasive test was warranted. Of patients undergoing an invasive test, only about one in three turned out to have obstructive coronary artery disease. In a follow-up study published two years ago in the American Heart Journal, Patel and colleagues reported that only 9% of patients referred for noninvasive tests, like an exercise ECG, had high-risk coronary artery disease. Noninvasive test findings have little value beyond clinical factors for predicting obstructive coronary artery disease, they concluded.
So there’s a need for noninvasive tests that will do a better job of identifying serious blockages in the coronary arteries.
A California company says it has developed just such a test. HeartFlow recently launched the second generation of its HeartFlow FFRCT Analysis, which determines the hemodynamic significance of coronary stenoses by computing fractional flow reserve values from CT angiography data, using computational fluid dynamics to simulate maximal coronary hyperemia. The technology is based on 15 years of scientific research conducted by Charles Taylor and Christopher Zarins at Stanford University. FFR in FFRCT stands for fractional flow reserve, and the CT refers to the fact that it is derived from a CT scan. Fractional flow reserve traditionally has been measured during the invasive angiogram. It's a measurement of the functional consequences of a partial obstruction: basically, the impact the blockage has on the blood flow to the heart. If a partially blocked coronary artery is not functionally impairing the heart's supply of oxygen, it can be safely treated with medication and lifestyle modification.
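At its core, the invasive measurement described above reduces to a simple pressure ratio. The sketch below is purely illustrative: the pressure values are hypothetical, and the 0.80 cutoff is a commonly cited clinical convention, not a figure from this article.

```python
def fractional_flow_reserve(p_distal: float, p_aortic: float) -> float:
    """FFR: mean pressure distal to the stenosis (Pd) divided by mean
    aortic pressure (Pa), both measured during maximal hyperemia."""
    if p_aortic <= 0:
        raise ValueError("aortic pressure must be positive")
    return p_distal / p_aortic

# Hypothetical pressures in mmHg; FFR <= 0.80 is a commonly used
# cutoff for a hemodynamically significant (flow-limiting) stenosis.
ffr = fractional_flow_reserve(p_distal=64, p_aortic=92)
print(f"FFR = {ffr:.2f}")  # FFR = 0.70
```

A healthy artery with no pressure drop would give a ratio near 1.0; the lower the ratio, the more the blockage is starving the heart muscle of flow.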
HeartFlow's approach aims to rebalance the risks and benefits of the various coronary tests. By combining the enormous amount of data that can be obtained from a CT angiogram, certified analysts, and a well-honed proprietary algorithm that performs millions of mathematical calculations based on the physics of fluid dynamics, HeartFlow can compute an accurate and clinically relevant FFRCT value.
Long-term results from HeartFlow's fourth clinical trial, PLATFORM, were presented at the American College of Cardiology annual meeting in April. PLATFORM was a European, multicenter, controlled, prospective, pragmatic, comparative effectiveness trial utilizing a consecutive cohort design. It compared standard diagnostic strategies to an FFRCT-guided strategy in 584 patients with stable chest pain.
FFRCT was computed only when needed; if a CT scan showed the coronary arteries to be free of blockages, the additional FFRCT calculations weren't necessary. Patients were first divided into two groups, those with a planned invasive angiogram and those with a planned noninvasive test. Each group was then split between patients who followed the usual diagnostic path and those whose diagnosis was guided by the FFRCT test.
The headline result was that the use of FFRCT led to the cancellation of a planned invasive angiogram in over 60% of patients. What's more, during a year of follow-up, only four of the 117 patients whose planned invasive angiograms were canceled wound up needing the procedure, and none of them suffered an adverse event. The study showed that it was safe to follow a diagnostic strategy guided by FFRCT results. FFRCT also provides information that helps doctors distinguish between coronary lesions that require revascularization and those that don't.
One of the beauties of HeartFlow's FFRCT is that no additional equipment is needed at the thousands of hospitals in the United States that already perform CT angiography. In the past, medical interventions came in two basic flavors: medications and devices. FFRCT adds a third: software. The software solves millions of complex equations that simulate blood flow in the coronary arteries, yielding mathematically computed fractional flow reserve values. FFRCT marries imaging technology and big data.
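To give a flavor of the physics those equations encode, a drastically simplified model can estimate the pressure lost across a narrowed segment with the Hagen-Poiseuille law for laminar tube flow and convert it to an FFR-like ratio. This is a toy illustration with assumed numbers, not HeartFlow's method, which solves the full three-dimensional flow equations on a patient-specific model of the arteries.

```python
import math

def poiseuille_pressure_drop(mu: float, length: float,
                             flow: float, radius: float) -> float:
    """Hagen-Poiseuille pressure drop (Pa) for steady laminar flow
    through a straight tube: dP = 8 * mu * L * Q / (pi * r^4)."""
    return 8 * mu * length * flow / (math.pi * radius ** 4)

# Illustrative (assumed) values, in SI units:
mu = 3.5e-3        # blood viscosity, Pa*s
length = 0.01      # 1 cm narrowed segment
flow = 3e-6        # 3 mL/s of hyperemic coronary flow
radius = 5e-4      # 0.5 mm lumen radius in the stenosis

p_aortic = 12_400  # ~93 mmHg mean aortic pressure, in Pa
dp = poiseuille_pressure_drop(mu, length, flow, radius)
ffr_estimate = (p_aortic - dp) / p_aortic
print(f"pressure drop ~ {dp:.0f} Pa, FFR estimate ~ {ffr_estimate:.2f}")
```

Note the fourth-power dependence on radius: halving the lumen multiplies the pressure drop sixteenfold, which is why modest-looking narrowings can be functionally severe. A real simulation must also capture turbulence, pulsatile flow, and branching geometry, which is what makes the full computation so demanding.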
Based on PLATFORM data, Pamela Douglas, a researcher at Duke, calculated that an FFRCT-guided strategy saves roughly $4,000 per patient, not accounting for the cost of the FFRCT test itself, which is priced at about $1,500. Because PLATFORM was conducted in Europe, where health care costs are generally lower, U.S. savings would likely be greater.
New technology tends to add to our already staggeringly high health care costs. But if these numbers hold up in the real world (a big if, given everything that can happen) then HeartFlow’s software could be ringing in a new era of software advances that make for smarter and less expensive diagnosis and treatment.