
The ultimate purpose of the evidence-based health care movement is to improve patient care through the appropriate, timely, effective and personalised use of evidence by practitioners from every healthcare discipline. Indeed, Sackett et al.'s definition1 of evidence-based medicine (EBM) reminds us that the consideration of evidence from research is one dimension of a triad of intellectual analyses. The other two dimensions - often perceived as forgotten and awkward companions to the emergent giant of research evidence - are clinical experience and patient characteristics. The ultimate success of EBM as described by Sackett et al. is the internalisation of the evidence by the practitioner and its subsequent use in forming a clinical judgement that leads to appropriate, safe, timely and effective clinical action.


So why, if this is the genesis of the EBM movement, have we encountered such polarisation in debates, most recently epitomised in the exchange between Holmes et al.3 and his namesake Holmes,4 and in hotly contested views around the veracity of one interpretation of evidence against another?5 These debates, for me, reflect a number of deeper epistemological issues that need to be considered, particularly as we move into the complexities of understanding how we actually get evidence into practice in a more consistent, rigorous way.


There seem to be three dimensions that need to be considered: first is the Popperian notion that all science is in the business of disproving rather than proving that certain theories work.2 Thus, the classic 'theory' that all swans are white stands only until such time as a black swan is discovered (or created). This is an interesting perspective in that medical research has traditionally grown up within a largely atheoretical tradition, priding itself on its successful pragmatic empiricism.6 Yet as Allen et al.'s paper in this edition (pp. 78-110) argues, it is very important to understand the theoretical perspective before undertaking complex intervention studies.7 What is developing in the implementation science field in particular is a recognition of the need to base complex interventions on clearly articulated theories. Such a reminder of the conditionality of science is important, particularly when we want to rid ourselves of that most uncomfortable of human conditions - dealing with uncertainty and incongruity.


Indeed, the challenges we face in managing uncertainty or incongruity in our academic and personal lives bring me to my second observation: how we manage doubt and uncertainty in our clinical practice, and how the evidence-based healthcare movement can help us manage that tension more creatively and effectively. In clinical practice, as in science, there is a danger in following a line of argumentation or thinking that leads us to the wrong conclusion. Without realising it, and almost unable to stop ourselves, we follow a line of thought, an algorithm that addresses the issues in a logical way but leads us to the wrong answer. Consider the patient who presents with one problem: because of where they are, how they are presenting and the assumptions we make, we set off on a journey (armed with the research evidence) that does not lead us to the right place.


This can happen in science too. As Hudson noted:8


The possibility that incongruity or inconsistency can act as a source of vitality does not spring naturally to the empirical mind. Yet it is some such model that students of the human order now seem to me to need. And they need it not merely in coming to terms with their subject matter but in making sense of the body of thought they themselves produce. (p. 110)


Minds trained in the convergent art of refining and synthesising knowledge are less likely to appreciate the creativity and spontaneity of divergent thinking. And yet most scientific breakthroughs are creations - ways of thinking about and looking at the world that are different, that diverge from the norm. Interestingly, some of the best-known research on the management of innovations in systems9 identifies this continuous balancing between convergent and divergent thinking and problem solving within teams as a feature of the successful introduction of new ideas, new innovations and new evidence into organisations. Equally, the literature on action science10 and action learning11 identifies the importance of balancing these two perspectives.


This ability to move between assessing and assimilating the new, and focusing and synthesising it into the current, is a crucial skill to master if we want to implement evidence into practice successfully. Our problem is that we still tend to see 'evidence' as a product, a commodity, a thing that can be 'put into' a system. Instead, 'evidence' is a complex construction of facts, propositions, experiences, biographies and histories and ultimately an exercise of judgement bounded by time and context.12 Long et al.'s paper in this edition (pp. 119-134) on how one hospital responded to adverse events illustrates how patient perspectives and local data can be used in a systematic way to begin to create another perspective on evidence.


Van de Ven and colleagues13 have documented the journey that innovations go through within organisations - a very different starting point from the convergent (empirical) approach reflected in many intervention studies in health care, and one more reminiscent of Allen et al.'s reflections on assessing complex interventions (pp. 78-110). Acknowledging the need to hold divergent and convergent thinking together, and suspending judgement on what works and what does not, Van de Ven and colleagues have told stories that could enrich the evidence-based healthcare community.


How do we get evidence into practice? I would suggest that the solution lies somewhere in our ability to integrate our understanding and use of research evidence with our clinical experiences (systematically and transparently documented), taking account of patient preferences and using information from routine data in a much more consistent way.14 How we balance this latter source of evidence with other more accepted sources is an issue for ongoing debate.


The third and final question I want to pose is: how and when do we start on the integration journey (and by integration, I mean the integration of the various types of evidence I have described, as well as the integration of that evidence into our daily practice)? Should we, as Ian Graham, Vice President of Knowledge Translation at the Canadian Institutes of Health Research, advocates, be shaping all research proposals in a way that makes them responsible not just for generating new knowledge but also for translating that science into practice?15 The Canadian Institutes of Health Research has taken a bold stand in leading the advancement of this agenda. Hopefully, other funding agencies will follow suit.


In the meantime, what we need is a continuous dialogue16 around these issues (dialogue, from the Greek, meaning a free flowing of meaning through a group), not a discussion (from the Latin, meaning the heaving of ideas back and forth in a winner-take-all competition). What was heartening at the recent Joanna Briggs Convention in Adelaide (November 2007) was the manifestation of that dialogue: researchers, scholars, information scientists, practitioners, service users, policy-makers and funders from very different philosophical perspectives and disciplines came together in respectful engagement and, through that engagement, created a new synergy.


The work of the Joanna Briggs Institute has reached a new level. It has taken a hard road and has not been deflected from pursuing the discipline of trying to synthesise the best available evidence from a variety of sources, using transparent and replicable methodologies. However, it also needs to caution its followers against becoming too certain or too prescriptive about the world. Methods and products are important, but we must remember that they are (merely) tools to do a job. And the job is the transformation of knowledge or evidence into sound clinical judgements - made by individuals, by teams, by whole systems. And perhaps the next phase of creativity and divergent thinking will be around knowledge translation or implementation science - disciplining ourselves not to jump to conclusions about what works and what does not, but to observe patiently and painstakingly the multiple conversations and actions that take place.


So we would do well to reflect upon the uncertainty and conditionality of all theories, the limits of our knowledge and the importance of our everyday experiences of dialogue rather than discussion. Perhaps then we will become more effective and creative users of evidence in our own practice.


Alison L Kitson, RN, BSc (Hons), PhD, FRCN1,2,3,4


1Nursing Education Fund Inaugural Fellow, Royal Adelaide Hospital, and Adjunct Professor of Nursing, The University of Adelaide, South Australia, Australia; 2Supernumerary Fellow, Green College, University of Oxford, Oxford, UK; 3Honorary Professor, University of Ulster, Leicester and City University; 4Director, AKP Associates, Oxford, UK


References


1. Sackett DL, Rosenberg WMC, Muir Gray JA, Haynes RB, Scott Richardson W. Evidence based medicine: what it is and what it isn't. BMJ 1996; 312: 71-2.


2. Popper K. Conjectures and Refutations: The Growth of Scientific Knowledge. London: Routledge, 1963.


3. Holmes D, Murray ST, Perron A, Rail G. Deconstructing the evidence-based discourse in health sciences: truth, power and fascism. Int J Evid Based Healthc 2006; 4: 180-6.


4. Holmes CA. Never mind the evidence, feel the width: a response to Holmes, Murray, Perron and Rail. Int J Evid Based Healthc 2006; 4: 187-8.


5. Tierney A. Critique is crucial [editorial]. Int J Evid Based Healthc 2007; 5: 267-8.


6. ICEBeRG Group. Designing theoretically-informed implementation interventions. Implement Sci 2006; 1: 4. Accessed 1 Dec 2007. Available from: http://www.implementationscience.com/content/1/1/4


7. Medical Research Council. A Framework for Development and Evaluation of RCTs for Complex Interventions to Improve Health. 2000 [Online]. Accessed 10 September 2006. Available from: http://www.mrc.ac.uk/pdf-mrccpr.pdf


8. Hudson L. The Cult of the Fact. London: Jonathan Cape, 1976; 110.


9. Van de Ven AH, Polley DE, Garud R, Venkataraman S. The Innovation Journey. Oxford: Oxford University Press, 1999.


10. Schon DA. The Reflective Practitioner: How Professionals Think in Action. Aldershot: Avebury, 1991.


11. Ellstrom P-E. Understanding the Use of Knowledge in Practical Action: A Learning Perspective. Paper presented at the Knowledge Utilization Colloquium (KU07), Stockholm, Sweden, 15-17 August 2007.


12. Rycroft-Malone J, Seers K, Titchen A, Harvey G, Kitson A, McCormack B. What counts as evidence in evidence-based practice? J Adv Nurs 2004; 47: 81-90.


13. Van de Ven AH, Angle HL, Poole MS. Research on the Management of Innovation. Oxford: Oxford University Press, 2000.


14. Kitson A, Rycroft-Malone J, Harvey G, McCormack B, Seers K, Titchen A. Evaluating the successful implementation of evidence into practice using the PARiHS framework: theoretical and practical challenges. Implement Sci 2008 (in press).


15. Graham ID, Tetroe J. How to translate health research knowledge into effective healthcare action. Healthc Q 2007; 10: 20-2.


16. Senge P. The Fifth Discipline. London: Century Business, 1990; 10.