Closing the 17-year Gap Between Scientific Evidence and Patient Care

By Daniel Niven

Not many patients would be happy to hear that there’s a lag of about 17 years between when health scientists learn something significant from rigorous research and when health practitioners change their patient care as a result, but that’s what a now-famous report from the Institute of Medicine highlighted in 2001.

That report points to a major problem that has plagued healthcare for decades: the failure to integrate high-quality scientific evidence into daily patient care in a timely way.

If you knew there was research available to guide the healthcare you required, wouldn’t you want your healthcare provider and the healthcare system to use that research to inform decisions pertaining to your care? Wouldn’t you want to receive care that is scientifically proven to be of benefit and not receive care that is scientifically proven to be of no benefit?

Although it has been clear for centuries that science contributes to advancing the practice of medicine and improving disease-specific survival rates (for example, the discovery of penicillin and its effect on infection-related mortality), this concept was popularized within the medical community only in the last quarter of the twentieth century, through the “evidence-based medicine” movement.

More recently, those who work in the field of ‘Knowledge Translation’ have been working hard to close the gap between research and practice. For the most part, they’ve done this successfully by making research findings more accessible to policy makers, professional societies and practitioners, and by ‘nudging’ them toward more timely adoption of evidence-based practices.

Their methods have largely focused on the adoption of new, beneficial practices – new drugs, tests or interventions with substantial evidence behind them. But a pattern has lately emerged from the scientific literature: new is not always better, and too much healthcare can sometimes be bad for your health.

Because such care may negatively affect patient outcomes and contributes to burgeoning costs within healthcare systems, there is now a movement afoot to promote the de-adoption (discontinuation) of practices currently used in daily patient care that research finds to be of no benefit or potentially harmful, collectively referred to as unnecessary practices. Initiatives such as Choosing Wisely, Less is More and Reducing Research Waste have sprung from medical professional societies and high-ranking medical journals to help curb the delivery of ‘too much’ healthcare.

For example, it turns out that cervical cancer screening in women under age 30 is not beneficial and may lead to unnecessary follow-up testing; the use of bone cement to treat painful spine fractures in patients with osteoporosis does not improve pain any more than usual care; and placement of stents in the coronary arteries of patients with narrowed arteries but minimal symptoms is no better than treatment with medications alone.

Other examples include the use of a sophisticated monitoring device (the pulmonary artery catheter) to obtain frequent measures of heart function in patients with heart failure, and the tight control of blood sugar with intravenous insulin in patients admitted to intensive care units.

In each of these examples, new research demonstrates that the practice does not improve patient outcomes, yet each persists to some degree in current clinical practice.

The 17-year gap between research and practice traditionally refers to the time required to adopt new practices. Unfortunately, new research shows that it may take even longer to de-adopt unnecessary practices. Regardless of the direction of practice change, shortening the gap between research and practice has been a long time coming and can only help improve outcomes for patients and control healthcare spending.

How do we get there?

Shortening the time gap between research and practice will require a better understanding of what it takes to implement new research, and a reduction in the time it takes for new research to be reflected in professional guidelines.

Guidelines also need to be less cumbersome and directed more toward use at the point of care rather than serving simply as reference documents. Healthcare systems also need to be engineered so that frontline providers have a greater likelihood of providing care that is congruent with current science. This is likely best facilitated through the use of comprehensive electronic medical records. Given that many healthcare systems still rely on traditional paper-based charting and ordering systems, this will require considerable financial commitment.

Moving from research to improved practice more rapidly will also take an engaged group of stakeholders (professional societies, healthcare providers, patients and their family members, medical administrators and governments) who appreciate the long-term benefit that may be derived from such a considerable initial investment of time and money.

A healthcare system that enables providers to consistently deliver care that aligns with recommended best practice should arguably be one of our national priorities.

Daniel Niven is an expert advisor with EvidenceNetwork.ca, an Intensive Care Physician and Assistant Professor in the Departments of Critical Care Medicine and Community Health Sciences in the Cumming School of Medicine at the University of Calgary.