Editor’s note: In another Health Affairs Blog post also published today, Joe Selby, the executive director of the Patient-Centered Outcomes Research Institute, responds to David Introcaso’s post below. For more on the concept of patient centeredness, comparative effectiveness research, and the Patient-Centered Outcomes Research Institute, see Health Affairs’ October issue, “Current Challenges In Comparative Effectiveness Research.”
This past May, the Patient-Centered Outcomes Research Institute (PCORI) approved five research priority areas: “assessment of prevention, diagnosis and treatment options”; “improving health care systems”; “communication and dissemination”; “disparities”; and “patient-centered outcomes research and methodological research.” In June PCORI announced 50 pilot project research awards totaling $30 million. PCORI is expected to spend $3 billion between now and 2019.
What does it mean to be “patient centered,” and what does that then imply for ways of “improving health care systems” and “communication and dissemination”? PCORI belies its “patient centered” mandate because it has not placed a primacy on understanding and improving the interaction between the patient and the provider — the only place where the quality of health care delivery is ultimately improved. None of the fifty PCORI pilot projects examines the quality of these interactions. Yet understanding and improving patient-provider interactions also explains how clinical evidence is produced or becomes meaningful.
What makes examining these interactions essential is that knowing and doing are not the same. Consider hand hygiene, likely the most basic health care practice. Though the benefits of hand washing have been well known since Semmelweis, studies show hand washing adherence rates are as low as 30 percent and frequently no better than 50 percent.
Why is it, then, that even where evidence is well known, practice (or doing) does not follow, and what does this suggest about PCORI’s chances of closing the quality gap by producing and then disseminating ever more evidence?
The Accepted Paradigm
PCORI’s problem, as hand hygiene compliance rates sadly demonstrate, is that reasons cannot be causes. Because the two cannot be conflated — because knowing things and doing things do not fold together — research evidence never simply disseminates and transfers.
The accepted paradigm we use for improving health care delivery is one where we assume evidence or knowledge is first produced and then communicated and disseminated. This hypothetical linear process is what the World Health Organization terms the “know-do” gap. Similarly, the National Institutes of Health “Roadmap” program is designed to accelerate “from bench to bedside” the transfer of evidence to practice, and the Agency for Healthcare Research and Quality’s “Office of Communication and Knowledge Transfer” is charged per the Affordable Care Act with creating tools to disseminate PCORI’s evidence. These and other similar efforts assume an ability to disseminate discrete evidence or knowledge in a linear and mechanistic fashion.
This method for improving health care quality means the use of “sound science” to produce “right” knowledge, or the use of explicit rules by which scientific propositions to improve health care can be obtained. This in turn means the generation of new evidence, preferably by randomized controlled trials or at least by systematic reviews, e.g., the Cochrane Collaboration and now PCORI. The next step is to disseminate or transfer new knowledge, or have it travel along a “translational highway” to be brought to ever larger national scale. The effort becomes, to use the journal title, an “implementation science” challenge.
Understandably then, improving health care delivery takes on engineering or technical properties. Improvement means re-engineering or wedding engineering sciences with health care. This means largely driving out variation or bringing to ever larger “scale” uniform care delivery via processes such as Six Sigma or Toyota Lean Production. In the examination room it makes sense to talk about a clinician meeting a standard or a guideline and to then talk about a patient’s compliance with that guideline.
The paradigm thus provides the rationale for numerous other activities intended to improve the quality of health care delivery. It underpins continuing medical education and related professional development activities, as well as many knowledge management and research dissemination programs and other quality improvement and organizational structure or system re-design efforts. It serves as the underlying rationale for evidence-based medicine, performance measures, quality indicators, report cards, checklists, pay-for-performance, and other approaches or programs all endeavoring to transfer evidence to practice.
But does this paradigm usefully explain how we know, and the know-how of improving health care quality? Does evidence creation actually precede the delivery of care to be then simply communicated and disseminated?
An Alternative Paradigm
Unfortunately, progress made in improving care delivery using the accepted paradigm has proven, to put it politely, slow. Despite all the evidence on ways to improve quality and reduce medical errors and medication non-compliance, improving quality and outcomes remains largely a puzzle. Paradigm testing, Thomas Kuhn said, occurs when “persistent failure to solve a noteworthy puzzle has given rise to crisis” — or times like now. Rather than tinkering with the present paradigm, let’s consider an alternative.
Let’s assume knowledge does not arise singularly first in the mind of one individual, to be then transferred to the mind of another individual. Instead, let’s assume the creation of knowledge or evidence actually begins with one’s response to another’s gesture and continues to build in the ongoing back and forth of the gesture and response of people communicating. Evidence or knowledge creation is then inherently a social act, the product of mutual adaptation. It is not independent of meaning. No one individual owns knowledge. It cannot be stored or managed or simply be disseminated or transferred. In sum, people make sense of the world together. Reality, brute facts aside, is a negotiated interpretation. As John Searle explains, it is a brute fact that the heart pumps blood, but defining its healthy functioning is determined not by evidence but by a value or goal to which we agree.
Reasons, then, are not causes. Evidence for an intervention is not by itself a reason for that intervention. Pronovost’s checklist worked because clinicians in Michigan and elsewhere agreed the evidence “commodity” held value or was meaningful. Science determines only the strength of the evidence that exists for any particular hypothesis. It does not presuppose a purpose or end. That’s teleology.
Under this paradigm, evidence creation and delivery improvement are not consecutive — they are more coherently understood as entwined, commonly occurring together in real time. Health care system improvement via systems thinking is actually a fantasy since systems are an abstraction of human interaction — and human interaction is all there is. Physicians and patients, being people, unlike objects in nature, are always self-interpreting (reflective and reflexive) entities. People are not planes. They cannot be engineered. There is ultimately no evidence or quality improvement, and moreover no meaning, absent one person’s response to another person’s gesture.
Consider this example. Dr. Warren Warwick, profiled in an Atul Gawande essay, is highly effective in treating his cystic fibrosis patients because his practice is primarily relational. Though Gawande recounts Warwick’s success for other purposes, he does finely detail Warwick’s ongoing back and forth interactions with his patients. He focuses on Warwick’s interaction with a young woman, during which Warwick tries to make sense of the patient’s reduced lung capacity by persisting in asking her about coughs, colds, treatment frequency, and the like.
Eventually Warwick learns she has a new boyfriend and job and for these reasons she had been skipping her treatments. Learning this, Warwick is now able to work out an agreed-upon, meaningful treatment plan with his patient to reverse her functional decline. Not surprisingly we learn Warwick is disdainful of clinical guidelines, telling Gawande they are “a record of the past and little more.”
Advancing evidence and improving practice are not separate activities but intertwined processes inherently social and occurring in real time. Practice, as Thomas Schwandt argues, is not assumed to stand “in subsidiary relationship to scientific knowledge.” (See Schwandt’s writings on this topic here and here.) The practice setting is a site for an unfolding of events, not a context for applying evidence.
This suggests the fundamental way to improve health care quality is by designing delivery that improves the interaction between the provider and the patient. Since health care (along with biological functioning) emerges from relationships, Paul Uhlig argues the key to transformation is in optimizing the patterns of patient-provider interaction. “Patterns of organization should reflect patterns of interaction,” Uhlig says, “and patterns of organization should match patterns of patient need.”
As the Warwick example illustrates, attention should be paid to what is actually going on in conversation between actors. Anthony Suchman terms this simply “relationship-centered care.” In writing about improving health care effectiveness, Yaneer Bar-Yam makes an analogous point when he argues the fine-scale task of providing individualized care should not be corrupted by the health care industry’s drive toward large-scale, undifferentiated functions.
The importance of the quality of these interactions cannot be overstated. Focusing on what is “actually going on” is all that ultimately matters, since again improvement emerges only between people interacting in real time. We produce nothing of consequence outside these interactions. At present, the focus is instead on what should be. Attention becomes fixed on how the evidence commodity is managed. This is reflected in the bulk of studies on improving health care quality, which typically attempt to enumerate the properties or attributes of good care. Instead of learning what people are actually doing with one another, the literature is largely about practice or business improvement schemes or systems, measurement, or reimbursement. These activities are “as if” exercises, mere conceits, since nothing is ultimately produced outside of the interaction between clinician and patient.
“As if” exercises constitute the sum of PCORI’s fifty pilot projects. Not one among the fifty endeavors to provide evidence of the quality of interaction between the patient and provider. The projects instead propose developing aids, conceptual frameworks or models, databases, decision support, evaluation and gaming tools, frames, guidelines, instruments, lexicons, measures, mobile technologies, portals, scales, surveys, quality dimensions, toolkits and other techniques. Similarly, in PCORI Executive Director Joe Selby’s lengthy interview published in Health Affairs last winter, no mention was made of improving how well a provider interacts with his or her patient (nor is there mention in PCORI’s Draft Methodology Report).
One must ask, then: how is PCORI “patient centered” when patient-provider interactions go ignored? This is all the more remarkable when you consider that our relation to the other, as Levinas noted, is the foundation of our knowing, not the reverse.