Authors

  1. Masso, Malcolm RN, BSc(Econ), MNA, MPH

The aim was to undertake a review of the literature on change management, quality improvement, evidence-based practice and diffusion of innovations to identify key factors that might influence the uptake and continued use of evidence in residential aged care. The key factors will be used to shape and inform the evaluation of the Encouraging Best Practice in Residential Aged Care Program which commenced in Australia in 2007. MEDLINE, CINAHL and the Cochrane Database of Systematic Reviews were searched using combinations of search terms. Searching focused on existing literature reviews, discussions of relevant conceptual and theoretical frameworks and primary studies that have examined the implementation of evidence-based practice in residential aged care. Keyword searching was supplemented with snowball searching (following up on the references cited in the papers identified by the search), searching by key authors in the field and hand searching of a small number of journals. In general, the period covered by the searches was from 2002 to 2008. The findings from the literature are often equivocal. Analysis and consolidation of factors derived from the literature that might influence the implementation of evidence-based practice resulted in the identification of eight factors: (i) a receptive context for change; (ii) having a model of change to guide implementation; (iii) adequate resources; (iv) staff with the necessary skills; (v) stakeholder engagement, participation and commitment; (vi) the nature of the change in practice; (vii) systems in place to support the use of evidence; and (viii) demonstrable benefits of the change. Most of the literature included in the review is from studies in healthcare and hence the generalisability to residential aged care is largely unknown. However, the focus of this research is on clinical care, within the context of residential aged care, hence the healthcare literature is relevant. 
The factors are relatively broad and cover the evidence itself, the process of implementation, the context within which evidence will be implemented and the systems and resources to support implementation. It is likely that the factors are not independent of each other. The set of factors will be refined over the course of the evaluation.

 

Article Content

Introduction

The Encouraging Best Practice in Residential Aged Care (EBPRAC) Program is funded by the Australian Government to implement evidence-based practice in residential aged care facilities. There are 170 000 people, mostly elderly, living in 2872 residential aged care facilities.1 A key aim of the EBPRAC Program is to improve evidence-based clinical care for residents of Australian Government subsidised aged care homes.

 

Residential aged care facilities provide care for people who can no longer remain in their own homes. Low-level care includes the provision of suitable accommodation and services, such as laundry, meals, cleaning and assistance with activities of daily living. High-level care includes accommodation, personal care services, nursing care and the provision of equipment.1

 

The EBPRAC Program involves 108 residential aged care facilities and consists of 13 projects, each with a lead organisation implementing evidence in one of nine areas of clinical practice - pain management, nutrition and hydration, medication management, falls prevention, oral health, infection control, wound management, behaviour management and palliative care. The program consists of two rounds of projects, with Round 1 starting in late 2007 and Round 2 starting in late 2008. Each project is funded for 2 years, with total funding of $A12m across the 13 projects, which includes a separate evaluation of each project. The program will conclude in December 2010.

 

The evaluation of the program, as distinct from the evaluation of individual projects, is based on a framework to examine the delivery and impact of the program on residents, providers and the residential aged care system. To inform the evaluation, a literature review was undertaken to identify key factors that might influence the uptake and continued use of evidence. This paper reports on the identification of those factors.

 

Implementation of evidence-based practice can be considered from a number of perspectives. At one level, it is about changing the practice of individual clinicians which might involve a variety of interventions, such as education. At another level, it is about an organisation improving the quality of care. Improvements might involve the use of novel practices and hence the literature on diffusion of innovations might have something useful to contribute. From a management perspective the concept of 'change management' is typically to the fore, suggesting that the literature on organisational change might be of value. The underlying assumption we have used in this paper is that evidence-based practice is about taking something new (the evidence) from one domain (research) and implementing it in another domain (practice) to change the way individuals and organisations operate, in order to improve quality.

 

The intention is that the review of the literature will continue over the course of the evaluation (a period of 3 years). What is reported here is effectively the start of the journey of discovery.

 

Methods

The literature that might usefully inform the review is diverse, involving many different research methodologies and paradigms, and extends well beyond healthcare or residential aged care. Hence, it was considered important to place some boundaries around the literature review, which primarily consisted of restricting it to healthcare and residential aged care. Some support for this approach can be taken from the observation by Iles and Sutherland that findings from the broader (non-health) change management literature might not necessarily be applicable to the clinical area.2

 

The search of the literature sought to answer the question 'what is known about implementing evidence into practice?' Papers included in the review were:

 

* Literature reviews, including systematic reviews and reviews of systematic reviews, focused on evidence-based practice, diffusion of innovations, quality improvement and change management.

 

* Primary studies of the implementation of evidence-based practice in residential aged care, primarily in Australia.

 

* Primary studies in healthcare identified as seminal pieces of work because of the high number of citations.

 

 

To make 'sense' of this literature and provide some context for considering the findings, searching also included theoretical approaches and conceptual frameworks that might be relevant to the implementation of evidence-based practice.

 

MEDLINE, CINAHL and the Cochrane Database of Systematic Reviews were searched using combinations of the search terms guideline$, evidence$, implement$, theory, research, context or culture, evidence-based medicine, nursing, resident$ and aged$. Inclusion of 'nursing' as a search term reflects the predominance of nurses among health professionals working in residential aged care.

 

Database searching was supplemented with snowball searching (pursuing references of references with some tracking of citations forward in time), searching by key authors in the field and hand searching of a small number of journals - Quality and Safety in Health Care, Implementation Science, Journal of Evaluation in Clinical Practice, Journal of Advanced Nursing, Australasian Journal on Ageing, Worldviews on Evidence-Based Nursing and International Journal of Evidence-Based Healthcare. Citations were culled, first by title and then by abstract. In general, the period covered by the searches was from 2002 to 2008. This period was chosen partly for pragmatic reasons to limit the number of citations that would have to be reviewed but also based on an assumption that important references earlier than 2002 would be identified by snowball searching. We were also aware that several important works published in the period 2004-2005 already provided a good basis for review of the pre-2003 literature in areas, such as diffusion of innovations,3 quality improvement4 and implementation science.5

 

The net result was a database of 294 citations, of which the full texts of 190 articles were retrieved. The papers used to inform this review included 21 systematic reviews, 17 literature reviews, 36 primary studies, 23 papers on theoretical and conceptual issues and eight papers classified as expert opinion. Of the primary studies, 30 were in residential aged care.

 

A total of 17 papers provided findings, supported by evidence, about factors that might influence the implementation of evidence-based practice, consisting of six systematic reviews3,6-10; six literature reviews5,11-15; four reviews of systematic reviews16-19 and one review of a series of case studies.20 The findings from these papers were entered into an Excel spreadsheet and sorted using the concept of matrix displays to arrive at a set of broad factors.21

 

Results

General findings

Despite the large volume of material, the findings are often equivocal, even at what is generally accepted to be the highest level of evidence - systematic reviews and reviews of systematic reviews. For example, the evidence for using the quality improvement collaborative method, one of the most commonly used improvement methodologies, is described as 'positive but limited and the effects cannot be predicted with great certainty'.22 Clinical guidelines are often used as a mechanism for changing the practice of clinicians but the evidence about how best to implement clinical guidelines has been described as imperfect10 and 'still thin'.16 There is no evidence to support any particular guideline implementation strategy in the allied health professions.7 There are many studies of guideline implementation strategies, which are 'of variable methodological quality, and hence of questionable value' (p. 895).19 Little is known about how to influence the use of research by nurses.23,24 The review by Foxcroft and Cole of organisational infrastructures to promote evidence-based nursing practice found no studies that were of sufficient quality to include.25

 

Findings such as those outlined above have led to commentary that quality improvement is largely based on intuition and anecdote,26 with no particular quality improvement strategy standing out as being worthy of recommendation based on 'evidence of effectiveness, ease of implementation or costs' (p. 4).27 A literature review of the effectiveness of quality systems in nursing homes came to a similar conclusion.14 This has given rise to the argument that there are no 'magic bullets' that can be used in all circumstances28 and that future research is unlikely to find any.29

 

However, the lack of 'magic bullets' should not be taken as a rationale for doing nothing, that is, don't implement anything because nothing has been shown to work. Rather, the task becomes one of deciding which interventions to use and how to use those interventions in the most effective way. The question 'what works' is reframed to one of understanding 'how' and 'why' interventions work by identifying what have been described as the 'determinants of effectiveness'.29

 

The 17 papers that included evidence about factors that might influence the implementation of evidence-based practice referred to a total of 113 factors, some of which were the same or similar. Sorting these factors using an Excel spreadsheet resulted in the eight categories (key factors) that form the basis of this paper. Examples of the links between the literature and the structure of this paper are shown in Table 1. A summary of the levels of evidence supporting the identified factors is shown in Table 2.

  
Table 1 - Identification of factors to support implementation of evidence-based practice

Table 2 - Number of reviews with supporting evidence for each factor

The main gaps in the literature are the lack of understanding about which factors are important in which circumstances and how the various factors interact with each other.

 

Conceptual frameworks

The general approach and techniques used to implement evidence-based practice are similar to the various methodologies for improving quality of care. Quality improvement tends to focus on improving processes and systems whereas much of the work on evidence-based practice has been about changing the behaviour of individual clinicians. The 'quality' literature has spawned a variety of frameworks and organisational requirements for implementation, with the factors most often identified (within the context of clinical practice) being systematic problem solving based on data and statistical analysis; participation by doctors; employee involvement and empowerment; explicit focus on internal and external customers and value chain integration.30

 

There are many references in the literature to the work of a team of researchers who have been working to develop a framework for implementing evidence-based practice, starting with a paper in 1998 which suggested that:

 

'Successful implementation is a function of the relation between the nature of the evidence, the context in which the proposed change is to be implemented, and the mechanisms by which the change is facilitated.' (p. 150)31

 

Further development of the framework has identified the importance of appropriate monitoring and feedback mechanisms and the use of internal and external facilitators for practice changes, with successful implementation associated with strong evidence and a receptive context.31 Later work refined the detail of the framework while leaving the basic structure unchanged32 and has resulted in a series of papers exploring different aspects of the framework, summarised most recently by Kitson et al.33

 

The framework has come to be known as the Promoting Action on Research Implementation in Health Services framework. It is worth noting that there is no inclusion of individual factors in the framework,34 its practical application for designing interventions to promote evidence-based practice is limited to the facilitation role and the framework is largely untested.35 The authors acknowledge that further work is needed on the relationships between the various elements and sub-elements of the framework and the applicability of the framework to different levels of analysis, for example, individual, team, unit and organisation. They have also suggested that the framework can be used in two ways: (i) as a diagnostic and evaluative tool when setting out to implement evidence-based practice; and (ii) as a tool to evaluate implementation.33

 

Other researchers have identified conceptual frameworks with the potential to inform the implementation of innovations. Helfrich et al. developed a framework by taking a model from manufacturing and adapting it for the health sector by inclusion of two additional constructs: 'the presence of an innovation champion and the fit between the innovation and the values of innovation users' (p. 280).36 Greenhalgh et al. developed a conceptual model, described as a 'memory aide' rather than a prescriptive framework, that includes what are described as 'system antecedents for innovation'; the innovation itself; the process of implementation and various other factors, such as communication and linkage. The model is based on the findings of their extensive systematic review that successful implementation of innovations is associated with decision-making devolved to teams on the ground; support, commitment and involvement of senior management; widespread involvement of staff at all levels; few job changes; availability of timely, high-quality education; dedicated funding; effective communication and networking across organisational boundaries; timely and accurate feedback about the impact of implementation and adaptation of the innovation to the local context. The authors suggest that the 'fit' between an innovation and the context within which that innovation will be implemented might be a more useful unit of analysis than just considering the attributes of the innovation itself.3

 

One of the most surprising aspects of the literature is the general lack of clear conceptualisations and definitions of what is meant by the term 'implementation'. Even when the term is defined, the ensuing analysis and discussion often take place with little insight into how well an intervention has been implemented.

 

Fixsen et al. set out to describe the current state of the science of implementation by reviewing literature on human services, agriculture, business, engineering, manufacturing and marketing and as a result of their work draw the distinction between different kinds of implementation - paper implementation, process implementation and performance implementation - and what appear to be discernible stages of implementation: exploration and adoption, installation, initial implementation, full operation, innovation and sustainability. They then proceed to make the point that 'it appears that most of what is known about implementation of evidence-based practices and programs is known at the exploration and initial implementation stages' (p. 18)5 and that 'the most effective intervention will not produce positive effects if it is not implemented' (p. 55).5 An alternative way of thinking about implementation is the concept of 'implementation fidelity' which is about the degree to which something has been implemented rather than the stage of implementation. The main issue is whether implementation of an intervention adheres to what was intended, that is, is the content, frequency, coverage and duration of the intervention consistent with the evidence on which it is based?37

 

These are important issues when reviewing the literature. In the absence of good information about the stage of implementation or the degree to which an evidence-based practice has been implemented, judgements about reported outcomes are problematic. Data on implementation fidelity is not commonly reported, making it difficult to gauge the degree of heterogeneity across primary research studies.37

 

This gives rise to the issue of having some means of designing and planning implementation, an 'organising framework' for implementation.38 One way of doing this is the theory of change approach which seeks to understand and construct the theory underpinning an intervention.39 It is specifically designed to provide a link between action and outcomes in complex situations and shares some features with realistic evaluation by emphasising the key role of context in understanding the links between what is carried out and what is achieved.40,41 Theories of change might be most useful and effective when they are integrated within a comprehensive planning framework.17 Understanding and making explicit the theory of change not only facilitates description and reporting but also evaluation, by targeting data collection at what are stressed to be the key points in the change process.

 

The potential complexity of implementation can be seen in the review of the literature on the clinical application of continuous quality improvement where Shortell et al. suggested that four inter-related dimensions are necessary for the success of continuous quality improvement - strategic, cultural, technical and structural - and that all four must be present to achieve organisation-wide improvement. This indicates, for example, that if there is not a supportive culture only small, temporary effects with no lasting impact will be achieved.9 The need to consider change at various levels, for example, the individual, the team, the organisation, the broader context, is emphasised elsewhere in the literature.12,42,43 Such a multi-level approach, if it is not to be haphazard and unproductive, implies the need for some degree of organisation and planning. A recent large-scale empirical study of organisational change in healthcare has identified 'a coherent change strategy' as playing a key role in progressing service improvement.44

 

The use of a model of change is sometimes referred to in the literature as having a planned approach to change or designing specific interventions based on some form of needs assessment or diagnosis. One approach is to tailor interventions to overcome what are perceived as the barriers to change, with the implication that if the barriers can be removed, change will follow. Despite its intuitive appeal, the evidence to support the effectiveness of tailored interventions is uncertain.45,46 Grol has recently described 15 'lessons' on achieving sustained change based on the results from studies, projects and programs in many countries and emphasises the need for a planned approach to change, including optimal preparation, marketing the proposal for change and clear goals for the change.47

 

Two frameworks have been identified specifically for residential aged care. The first, from a team of researchers in Canada, is described as a 'contingency model of innovation adoption' and sets forth 14 propositions, based on findings in the literature, which are considered to increase the likelihood of guideline adoption. The propositions cover context (e.g. structural capabilities that support guideline replication, chain-owned facilities rather than non-chain owned facilities and reinforcement by regulation), staff capabilities (e.g. increased proportion of trained staff, greater self-efficacy among clinical leaders), benefits of the change (e.g. less ambiguity regarding the link between guideline implementation and positive outcomes for residents, the extent to which the guideline serves the interests of the staff who use it), engagement of stakeholders (familiarity and agreement with the guideline among clinical leaders) and the nature of the change in practice (e.g. less reliance on tacit knowledge for translation, the 'fit' between the guideline and existing culture and practices).11

 

The second, from the USA, employs a conceptual framework of factors that are considered to influence the successful adoption of innovations by organisations.48 The framework is based on the results of a study that identified common challenges faced by hospitals when implementing a change in practice49 which was then adapted to evaluate a falls management program for nursing home residents. It comprises four groups of factors: (i) external environment or context; (ii) dissemination infrastructure; (iii) innovation (e.g. degree of culture change required, importance of benefits of the innovation); and (iv) what is referred to as an 'adopting organisation', which includes senior management support, presence of clinical leaders and champions and data to support adequate resources for start-up, implementation and evaluation.48,49

 

It would appear from the literature that understanding implementation, diffusion of innovations, quality management and change management are important to the successful implementation of evidence-based practice. However, research in implementation is at an early stage compared with research in the other three areas.

 

The frameworks identified above show some commonalities. The evidence (or innovation) features in the conceptual frameworks, the context in which the evidence is implemented also features, as does the process of implementation. This is reminiscent of work carried out in the UK in the 1980s which proposed that research on change should encompass the content of change, the context of change and the process of change, along with the interactions between the three.50

 

Context

Reference to the influence of 'context' occurs repeatedly in the literature. This includes the idea that what works in one context might not work in another;20,31 that implementation might be more context-dependent for some interventions than for others;51 or that some contexts might be more receptive to change than others.3,52

 

The concept of 'context' appears frequently in the knowledge translation literature but this is often limited to a description of its importance.24 Despite the ubiquity of the term it is not well understood, with various conceptualisations and definitions being proposed. In the absence of a definition, context is often described in terms of constituent elements, which are also typically not defined, using terms such as 'forces', 'conditions', 'factors' and 'characteristics'. For example, Meijers and colleagues found that variables with a statistically significant relationship with research use by nurses clustered into six contextual factors: the role of the nurse, organisational climate, time for research activities, provision of education, multi-faceted support and access to various types of resources.24 The term is generally used inconsistently, in a way that lacks clarity, is only partially developed and requires further delineation.53

 

It has been argued that context is 'an important but poorly understood mediator of change and innovation' and that 'a more sophisticated active notion of context is needed' (p. S72).54 Context is not simply a backdrop to practitioners and what they do but interacts with individuals and the systems in which they work.44

 

A useful starting point for considering what is meant by context is a seminal piece of work on organisational change within healthcare which introduced the idea that some contexts might be receptive and some might be non-receptive to change. The key factors that influence 'receptivity', derived inductively from 16 case studies, include a supportive organisational culture; simplicity and clarity of goals and priorities; effective relations between clinicians and managers; and key people leading change and quality.52 Although focused on large scale organisational change rather than the more modest scale of change required to implement evidence-based practice in residential aged care, the study provides a good insight into the dimensions of change, particularly the idea that the key factors cannot be considered in isolation but interact with each other, 'providing a linked set of conditions which provide high energy around change' (p. 275).52 The importance of leadership, in particular, is a recurring theme in the literature.12,15,24,55,56

 

Ross et al. in their study of implementing evidence-based guidelines for the multidisciplinary assessment of older people found the receptive context for change framework to be useful but cautioned that 'there are limitations to presenting complex process factors of change within a schematic model, such as Pettigrew's in that it oversimplifies its dynamic, chaotic, unpredictable nature and the complex emotional and personal characteristics of relationships' (p. 527).57

 

Dopson et al. identified a similar set of factors for a receptive context, while using slightly different language, with their reference to favourable relationships between local stakeholders, managerial support, pressure for change, supportive organisational culture, sharing information and clear goals for change. The major difference is their finding that appropriate infrastructure and resources are critical.20 Gustafson et al. have developed a model to predict the outcomes of organisational change which also includes various aspects of receptivity, such as creating a mandate for change, what they refer to as a 'tension for change' and the need to adapt changes to fit existing culture and practices.13

 

Evidence and its use

The term 'evidence' is used in many and varied ways. One useful approach is that there are two distinct orientations for considering the question of what constitutes evidence: (i) evidence that is not dependent on context, an approach which has spawned the various hierarchies of evidence to achieve the highest quality of evidence; and (ii) evidence that is dependent on context in which evidence is judged less by quality and more by its relevance, applicability or generalisability.58

 

Kitson and colleagues have identified three dimensions of evidence - research, clinical experience and patient preferences.31 The Joanna Briggs Institute has taken a slightly different, but consistent, tack by developing a model that 'conceptualizes evidence-based practice as clinical decision-making that considers the best available evidence, the context in which care is delivered, client preference, and the professional judgment of the health professional' (p. 85).59 The model also suggests that the nature of the evidence will vary according to the clinical question being asked and the proposed intervention. The corollary is that to aid decision-making clinicians require answers to four questions:

 

1. Is the evidence-based intervention feasible in the local context?

 

2. Is the intervention appropriate (i.e. will it 'fit' the context of care delivery)?

 

3. Is the intervention meaningful (the extent to which the intervention is experienced positively by the patient)?

 

4. Is the intervention effective?59

 

 

From this has arisen the notion that research needs to be 'translated' into practice, that is, adapted for local use.15

 

Two teams of researchers in the UK conducted research about how innovations are diffused and adopted in clinical practice and then reflected on what they had learnt about what makes information credible (and hence used) and why people decide to use new knowledge. They identified some common themes: having robust evidence is not sufficient for getting the evidence adopted; the interpretation of evidence is socially constructed; evidence is differentially available for different professions and there are different views on what constitutes credible evidence.20 Evidence is not fixed and certain but rather 'the evidence base for particular technologies and practices is often ambiguous and contested and must be continually interpreted and reframed in accordance with the local context and priorities' (p. 591).3

 

Some attributes of evidence, in the form of clinical guidelines, that improve compliance have been identified:

 

* type of health problem (compliance better for acute rather than chronic care)

 

* the quality of the evidence

 

* compatibility with existing values

 

* complexity of decision-making required (less complex decision-making equates with better compliance)

 

* more concrete description of desired performance and fewer new skills and organisational change required to follow the guidelines60

 

 

Earlier work identified various factors that influence guideline adoption by doctors - qualities of the guidelines (e.g. complexity), characteristics of healthcare professionals using the guidelines and characteristics of practice settings - together with incentives, regulation and patient factors.61

 

Effectiveness of specific interventions

From about the mid-1990s various reviews, including reviews of systematic reviews, have appeared in the literature examining the impact of specific interventions to improve quality, improve the performance of health professionals, implement clinical guidelines or implement some form of evidence-based practice. Later reviews are, to a certain extent, a re-statement or refinement of early findings. Far more research has been undertaken on interventions to get individual health professionals to use evidence than on interventions directed at organisations or patients. From an organisational perspective the interventions with the most evidence are:

 

* Revision of professional roles - 'revision of professional roles can improve professional performance, while positive effects on patient outcomes remain uncertain'

 

* Multidisciplinary teams - 'overall, it seems that multidisciplinary teams can improve patient outcomes. They have primarily been tested in highly prevalent chronic diseases'

 

* Integrated care services - 'integrated care systems can improve patient outcomes and save costs. They have been extensively tested in highly prevalent chronic conditions'

 

* Knowledge management (e.g. reminders, clinical decision support, computer-assisted interactive education) - most types of interventions showed positive effects18

 

 

One of the most frequently cited papers is by Grimshaw et al., who reported on a systematic review of the development, dissemination and implementation of guidelines for doctors, primarily in the USA. The most commonly evaluated interventions were reminders and educational outreach. In contrast to other reviews, for example that by Davis and Taylor-Vaisey,61 the results indicated that effectiveness did not increase as the number of interventions for a particular guideline increased. More recent evidence, however, has supported the more generally held view that multiple interventions are more effective than single interventions.16

 

Facilitation is frequently used as a mechanism to change practice, often in the form of facilitation by individuals. Harvey et al. reviewed the research literature and seminal texts to conduct a concept analysis of facilitation as part of their work in developing the Promoting Action on Research Implementation in Health Services framework, of which facilitation is one of the three key components. For the purposes of their review they defined facilitation as 'the process of enabling (making easier) the implementation of evidence into practice' (p. 579).62 They suggested that the concept of facilitation can be represented as a continuum, ranging from a set of discrete tasks to a more holistic approach to changing individuals, teams and organisations and concluded that despite the large body of literature 'there are few explicit descriptions or rigorous evaluations of the concept' (p. 585).62

 

More recently, the nursing, medical, managerial and educational literature has been reviewed to clarify the concepts of opinion leaders, facilitators, champions, linking agents and change agents. The review identified two 'defining features' of the literature: (i) an underlying assumption that increasing the availability of knowledge will lead to behaviour change; and (ii) each of the five roles is a form of change agent. It was concluded that the concepts are not clearly defined and are used inconsistently, inhibiting the ability to compare results across studies that have investigated these roles.63 These findings are consistent with those of an earlier review of the nursing, medical and allied health literature.64

 

The use of local opinion leaders is of variable effectiveness65 and it is difficult to identify the attributes of opinion leaders or the context in which they work that makes them more effective.6 A review of the literature from the education, healthcare, social care and criminal justice sectors identified various practices that might increase the likelihood of research being used, including: individualised education; supportive opinion leaders, both expert and peer; reminders; adequately resourced facilitative strategies and multifaceted interventions, 'particularly where attention is paid to the contexts and mechanisms of implementation' (p. 29).15

 

Action research represents another approach to changing the practice of healthcare professionals and has been shown to have a role in 'crossing the boundaries' between research and practice (or action).66 It has primarily been used by nurses, including in residential aged care in Australia,67 the USA68 and the UK.69 Action research has been used successfully within healthcare in various change programs and has also been applied widely in the broader field of management research, with success 'largely dependent on organisational context'.2 Action research incorporates ideas of facilitation, engagement and feedback that are evident elsewhere in this review.

 

Similarly, the concepts of process improvement, person-centred care, facilitation, managing change and acting on the context within which care takes place are found within the practice development literature. Practice development has the potential to provide a coherent 'model of change' but at present the methodologies used remain diverse and wide-ranging, with little consensus about the best approach and weak evidence regarding effectiveness in many areas.70

 

In addition to the interventions referred to above, various factors have been identified in the literature that might influence the use of evidence or the uptake of new ideas:

 

* Adequate resources3,5,13,15-17,20

 

* Staff with the necessary skills5,10-11,13-15,17

 

* Stakeholder engagement, participation and commitment3,13,15

 

* Systems in place to support the use of evidence, for example, monitoring, feedback and reminder systems3,5,8-15,17-19

 

* Demonstrable benefits of the change3,13

 

 

The importance of staff skills is underlined by the frequency with which education, of one form or another, is used to implement evidence, either alone or in combination with other strategies. Didactic education has been shown to have little or no effect,71,72 educational outreach has a small to modest effect73 and interactive workshops can result in moderately large changes in professional practice.71,72 There has been little work on the effectiveness of inter-professional collaboration and education.74

 

Implementation of evidence in residential aged care

Studies that have involved taking existing evidence and implementing that evidence in Australian residential aged care facilities are generally limited to one area of clinical practice at a time. For example, use of physical restraints,75,76 oral health for those with dementia,77-79 constipation,80 hydration,81 nutrition and physical activity,82-84 advance care planning,85,86 falls prevention,87 falls prevention and stroke,88 pressure ulcer management89 and nursing interventions for hydration, bowel management, falls prevention and skin care.90

 

The results are mixed and difficult to interpret, primarily because of the small scale and short timeframes of most of the studies. There are some references to the broader organisational and cultural issues that might have influenced the uptake of evidence but these usually rely on observations by the researchers. The influence of a receptive context is reported in several studies, usually framed in terms of the important (positive) influence of local leadership77,85,86 or the support of management.80,83 Other positive factors are familiarity of staff with the concept of quality cycles,77 links with the existing quality improvement program in the facility,80 staff involvement in identifying issues and actions,84 the presence of an external facilitator83 and stakeholder support.90 The most frequently reported negative influence is lack of resources, primarily staffing.83-86

 

Some qualitative research in Australia has identified factors that might influence the use of evidence in residential aged care, including heavy workload of staff, scarcity of resources, limited skills and knowledge of care workers, maintaining knowledge, 'boundary' issues between different workers, lack of management support, contextual, structural and environmental issues and beliefs or expectations of staff, residents and families.91-93 An added complexity within residential aged care is the differing skill and educational levels of staff involved in delivering care, ranging from tertiary-educated nurses to personal care assistants and volunteers. Untrained carers, rather than nurses, do most of the day-to-day monitoring of residents' behaviours.88

 

In 1997, a package of reforms including a new funding system for residential aged care and the establishment of an Aged Care Standards and Accreditation Agency to monitor standards resulted in significant changes within the industry. The aim has been to move towards more of a 'resident-centred care' perspective. However, studies that have been carried out in Australian facilities cast some doubt on the extent to which this has been achieved. One observational study found that communication between residents and staff was 'infrequent, of short duration and oriented to physical care' with the authors concluding that 'the poverty of interaction is of serious concern' (p. 35). Staff tended to ignore independent behaviour by residents but supported dependent behaviour.94 In a similar vein, Tuckett draws a distinction between 'doing-for (instrumental care) over being-with (empathic engagement)' and found that the former took priority, based on his own study and a review of the literature (p. 222).95

 

This situation is not unique to Australia. The Commonwealth Fund has recently published findings from a study in the USA which found that 20 years after legislation to enact a person-centred approach, only a small number of facilities have made the transition to resident-centred care.96 The EBPRAC program is focused on using evidence to improve clinical care. It might well be that residents, and the social environment in which they live, will have a key role in the way evidence is used. This can be seen, for example, in the results of a systematic review of the literature about the views and experiences of older people concerning falls prevention strategies which found that 'it is important to find out what characteristics a person is willing to modify, and what changes they are prepared to make, to reduce their risk of falling' (p. 35).97 This issue will be explored as part of the evaluation of the program.

 

Some research on implementing evidence-based practice in residential aged care has been reported from the USA,48,68,98,99 Canada100 and the UK.101,102 As with the Australian work, the results are somewhat patchy and difficult to interpret. For the purposes of this review, the international literature has not been explored in any great depth, given the potential influence of different funding, staffing and organisational arrangements in these countries compared with the situation in Australia.

 

Discussion

The literature review indicates that implementing any change to clinical practice can be complex and the outcomes uncertain. There is a large volume of literature but much of it is not as useful as it might first seem, primarily because of methodological issues and a lack of detail about, for example, how changes were implemented.

 

Selection of factors for use in the evaluation of the EBPRAC program has been influenced by the need to design the evaluation for an audience that includes those working in and associated with residential aged care, especially decision makers and policy makers. To be useful in practice any framework needs to include a limited number of factors that are amenable to managerial intervention.36

 

Drawing on the findings of the literature review across the four domains of knowledge reviewed, viz. evidence-based practice, diffusion of innovations, quality improvement and change management, various 'key factors' have been identified to frame the evaluation of the EBPRAC program:

 

* Receptive context for change

 

* Model for change/implementation (including the role of specific change agents or facilitators)

 

* Adequate resources

 

* Staff with the necessary skills

 

* Stakeholder engagement, participation and commitment

 

* The nature of the change in practice, including local adaptation, local interpretation of evidence and 'fit' with current practice

 

* Systems in place to support the use of evidence, for example, monitoring, feedback and reminder systems

 

* Demonstrable benefits of the change

 

 

The number of papers containing empirical evidence to support these factors is relatively small but does include wide-ranging systematic reviews and literature reviews. There is some overlap between these papers, for example, between systematic reviews and reviews of systematic reviews, or between one systematic review and a later systematic review, but we have tried to minimise this by concentrating on the latest reviews. The definition of each factor requires further work, which will take place over the next 2 years.

 

Given that the research on which these factors are based has largely taken place in healthcare (usually hospitals), the factors represent the starting point for developing a model or set of principles that might be used by those wishing to implement evidence-based practice in residential aged care. The nature of the literature is such that a different group of researchers might identify different sources and interpret those sources in a different way. Some support for the selection of factors comes from recently reported work in the Veterans Health Administration system in the USA, which used what the authors describe as an 'evidence-based organizational framework' - relying on three elements: cultural norms and values, capacity and supportive infrastructure - to implement evidence into practice.103

 

It is likely that these eight factors are not independent of each other and might interact. For example, the level of resources might be a consequence of the presence or absence of organisational commitment to the implementation of evidence-based practice, that is, a receptive context for change. The definition of a receptive context for change, which will be refined as part of the evaluation, includes factors such as leadership (including informal leaders), the existing relationships between staff, a climate that is conducive to new ideas and the presence of a recognised need for change. One element that is missing from the set of factors is the potential influence of the residents themselves. The literature provides no clear guidance about how this might play out, other than some clues in the general healthcare literature. The focus of the EBPRAC program is on clinical care but the influence of psychosocial issues within residential aged care might well turn out to be important. This will be examined as part of the evaluation.

 

In this situation, the two main options are to use either a deductive, 'theory testing', approach or an inductive, 'theory building', approach.34,104 The evaluation of the EBPRAC program will include elements of both. The 'key factors' listed above will be refined and tested as the evaluation proceeds. An extensive series of stakeholder interviews will be conducted to derive inductively what are seen as the important factors by those involved in the program. In effect, there will be what can be described as 'inductive-deductive interplay throughout the research process' (p. 341).105 The literature will continue to be reviewed throughout the evaluation to confirm findings from interviews or illustrate where the findings from the interviews might be at variance with what has been reported in the literature.

 

Conclusions

This literature review was undertaken to inform an evaluation of a series of projects to implement evidence into practice within residential aged care. It sought to identify factors that might explain why it was more or less difficult to implement and sustain the use of evidence, and to assist in understanding what is going on at both an individual and an organisational level. In effect, the aim was to develop a model based on what is known from the literature and then to test and refine that model over the course of the evaluation.

 

The role of theories or models is to increase the ability to predict what will result from a series of actions in particular circumstances. Theories can either describe or explain but it has been argued that the latter are more likely to be useful than the former for those seeking to implement research findings into practice. Such theories should have the ability to explain individual or group behaviour in terms of factors that are modifiable.106 Knowledge translation researchers have only recently begun to use theory to guide their work and this has largely been limited to single clinical issues and individual practitioners. It has been argued that there is a need for 'studies that can be linked with ongoing natural experiments to understand organization, system and context factors' that impact the use of research (p.S56).107

 

Most of the literature reviewed here reports studies in healthcare and hence the generalisability to residential aged care is largely unknown. Although both sectors are staffed by health professionals, the context is very different. Many of the findings have relied on randomised controlled trials and systematic reviews. The systematic reviews often refer to the lack of methodological rigour of included studies and use inclusion criteria that exclude many studies. Empirical studies and literature reviews typically report lists of indicators or 'success factors' operating at different levels to either help or hinder what is being implemented, whether that be a piece of evidence, a new idea or an intervention to improve quality. However, there is little reporting of the relationships between those factors. Evaluation of the EBPRAC program will attempt to tease out any relationships that might exist, as well as identify a range of approaches to improve adoption of evidence-based practice in residential aged care. This work will also outline the contexts within which these approaches might be appropriate, and specify the factors that should be present, for these approaches to succeed and evidence-based practice to be implemented effectively.

 

Acknowledgements

The evaluation is funded by the Australian Government Department of Health and Ageing under the EBPRAC Program.

 

References

 

1. Australian Institute of Health and Welfare. Residential Aged Care in Australia 2006-07: A Statistical Overview. Canberra: Australian Institute of Health and Welfare, 2008.

2. Iles V, Sutherland K. Managing change in the NHS: organisational change: a review for all health care managers, professionals and researchers. 2001. Accessed 8 April 2009. Available from: http://www.sdo.nihr.ac.uk/adhoc/change-management-review.pdf.

3. Greenhalgh T, Robert G, Macfarlane F, Bate P, Kyriakidou O. Diffusion of innovations in service organizations: systematic review and recommendations. Milbank Quarterly 2004; 82: 581-629.

4. Shojania KG, McDonald KM, Wachter RM, Owens DK. Closing the Quality Gap: A Critical Analysis of Quality Improvement Strategies, vol. 1-Series Overview and Methodology. Rockville, MD: Agency for Healthcare Research and Quality, 2004. Report No.: Technical Review 9 (Contract No. 290-02-0017 to the Stanford University-UCSF Evidence-based Practices Center). AHRQ Publication No. 04-0051-1.

5. Fixsen DL, Naoom SF, Blase KA, Friedman RM, Wallace F. Implementation Research: A Synthesis of the Literature. Tampa, Florida: University of South Florida, Louis de la Parte Florida Mental Health Institute, The National Implementation Research Network, 2005. Report No.: FMHI Publication #231.

6. Doumit G, Gattellari M, Grimshaw J, O'Brien MA. Local opinion leaders: effects on professional practice and health care outcomes. Cochrane Database of Systematic Reviews 2007; 1: 28 pp.

7. Hakkennes S, Dodd K. Guideline implementation in allied health professions: a systematic review of the literature. Quality and Safety in Health Care 2008; 17: 296-300.

8. Jamtvedt G, Young JM, Kristoffersen DT, O'Brien MA, Oxman AD. Audit and feedback: effects on professional practice and health care outcomes. Cochrane Database of Systematic Reviews 2006; 2: 83 pp.

9. Shortell SM, Bennett CL, Byck GR. Assessing the impact of continuous quality improvement on clinical practice: what it will take to accelerate progress. Milbank Quarterly 1998; 76: 593-624.

10. Grimshaw J, Thomas RE, MacLennan G et al. Effectiveness and efficiency of guideline dissemination and implementation strategies. Health Technology Assessment 2004; 8: 352 pp.

11. Berta W, Teare GF, Gilbart E et al. The contingencies of organizational learning in long-term care: factors that affect innovation adoption. Health Care Management Review 2005; 30: 282-92.

12. Ferlie EB, Shortell SM. Improving the quality of health care in the United Kingdom and the United States: a framework for change. Milbank Quarterly 2001; 79: 281-315.

13. Gustafson DH, Sainfort F, Eichler M, Adams L, Bisognano M, Steudel H. Developing and testing a model to predict outcomes of organizational change. Health Services Research 2003; 38: 751-76.

14. Wagner C, van der Wal G, Groenwegen PP, de Bakker DH. The effectiveness of quality systems in nursing homes: a review. Quality in Health Care 2001; 10: 211-7.

15. Walter I, Nutley S, Davies H. Research impact: a cross sector review. Literature review. Research Unit for Research Utilisation, Department of Management, University of St. Andrews, 2003.

 

16. Francke AL, Smit MC, de Veer AJE, Mistiaen P. Factors influencing the implementation of clinical guidelines for health care professionals: a systematic meta-review. BMC Medical Informatics and Decision Making 2008; 8: 38.

17. NHS Centre for Reviews and Dissemination. Getting evidence into practice. Effective Health Care 1999; 5: 1-16.

18. Wensing M, Wollersheim H, Grol R. Organizational interventions to implement improvements in patient care: a structured review of reviews. Implementation Science 2006; 1: 2.

19. Prior M, Guerin M, Grimmer-Somers K. The effectiveness of clinical guideline implementation strategies - a synthesis of systematic review findings. Journal of Evaluation in Clinical Practice 2008; 14: 888-97.

20. Dopson S, FitzGerald L, Ferlie E, Gabbay J, Locock L. No magic targets! Changing clinical practice to become more evidence based. Health Care Management Review 2002; 27: 35-47.

21. Miles MB, Huberman AM. Qualitative Data Analysis: An Expanded Sourcebook, 2nd edn. Thousand Oaks, California: Sage Publications, 1994.

22. Schouten LMT, Hulscher MEJL, van Everdingen JJE, Huijsman R, Grol RPTM. Evidence for the impact of quality improvement collaboratives: systematic review. BMJ 2008; 336: 1491-4.

23. Thompson DS, Estabrooks CA, Scott-Findlay S, Moore K, Wallin L. Interventions aimed at increasing research use in nursing: a systematic review. Implementation Science 2007; 2: 15.

24. Meijers JM, Janssen MA, Cummings GG, Wallin L, Estabrooks CA, Halfens RYG. Assessing the relationships between contextual factors and research utilization in nursing: systematic literature review. Journal of Advanced Nursing 2006; 55: 622-35.

25. Foxcroft DR, Cole N. Organisational infrastructures to promote evidence based nursing practice. Cochrane Database of Systematic Reviews 2003; 3: 18 pp.

26. Shojania KG, Grimshaw JM. Evidence-based quality improvement: the state of the science. Health Affairs 2005; 24: 138-50.

27. Ovretveit J. What Are the Best Strategies for Ensuring Quality in Hospitals? Copenhagen, Denmark: World Health Organisation Regional Office for Europe, 2003.

28. Oxman AD, Thomson MA, Davis DA, Haynes RB. No magic bullets: a systematic review of 102 trials of interventions to improve professional practice. Canadian Medical Association Journal 1995; 153: 1423-31.

29. Walshe K, Freeman T. Effectiveness of quality improvement: learning from evaluations. Quality and Safety in Health Care 2002; 11: 85-7.

30. National Institute of Clinical Studies. Factors Supporting High Performance in Health Care Organisations. Melbourne: Health Management Group, La Trobe University, 2003.

 

31. Kitson A, Harvey G, McCormack B. Enabling the implementation of evidence based practice: a conceptual framework. Quality in Health Care 1998; 7: 149-58.

32. Rycroft-Malone J, Kitson A, Harvey G et al. Ingredients for change: revisiting a conceptual framework. Quality and Safety in Health Care 2002; 11: 174-80.

33. Kitson A, Rycroft-Malone J, Harvey G, McCormack B, Seers K, Titchen A. Evaluating the successful implementation of evidence into practice using the PARiHS framework: theoretical and practical challenges. Implementation Science 2008; 3: 1.

34. Rycroft-Malone J. Theory and knowledge translation: setting some coordinates. Nursing Research 2007; 56: S78-85.

35. Estabrooks CA, Thompson DS, Lovely JJE, Hofmeyer A. A guide to knowledge translation theory. Journal of Continuing Education in the Health Professions 2006; 26: 25-36.

36. Helfrich CD, Weiner BJ, McKinney MM, Minasian L. Determinants of implementation effectiveness: adapting a framework for complex innovations. Medical Care Research and Review 2007; 64: 279-303.

37. Carroll C, Patterson M, Wood S, Booth A, Rick J, Balain S. A conceptual framework for implementation fidelity. Implementation Science 2007; 2: 40.

38. Institute of Health Economics. Effective Dissemination of Findings from Research. Alberta, Canada: Institute of Health Economics, 2008.

39. Mason P, Barnes M. Constructing theories of change: methods and sources. Evaluation 2007; 13: 151-70.

40. Barnes M, Matka E, Sullivan H. Evidence, understanding and complexity: evaluation in non-linear systems. Evaluation 2003; 9: 265-84.

41. Pawson R, Tilley N. Realistic Evaluation. London: Sage Publications, 1997.

42. Estabrooks CA, Midodzi WK, Cummings GG, Wallin L. Predicting research use in nursing organizations: a multilevel analysis. Nursing Research 2007; 56 (Suppl.): S7-23.

43. Grol R, Wensing M. What drives change? Barriers to and incentives for achieving evidence-based practice. Medical Journal of Australia 2004; 180: S57-60.

44. Fitzgerald L, Ferlie E, Addicott R, Baeza J, Buchanan D, McGivern G. Service improvement in healthcare: understanding change capacity and change context. Clinician in Management 2007; 15: 61-74.

45. Shaw B, Cheater F, Baker R et al. Tailored interventions to overcome identified barriers to change: effects on professional practice and health care outcomes. Cochrane Database of Systematic Reviews 2005; 3: 24 pp.

46. Bosch M, van der Weijden T, Wensing M, Grol R. Tailoring quality improvement interventions to identified barriers: a multiple case analysis. Journal of Evaluation in Clinical Practice 2007; 13: 161-8.

47. Grol R. Hopes, hypes and myths in improving patient care: how to implement sustainable change in health care? National Institute for Clinical Studies, 2007. Accessed 8 April 2009. Available from: http://www.safetyandquality.sa.gov.au/Default.aspx?tabid=109.

 

48. Capezuti E, Taylor J, Brown H, Strothers HS 3rd, Ouslander JG. Challenges to implementing an APN-facilitated falls management program in long-term care. Applied Nursing Research 2007; 20: 2-9.

49. Bradley EH, Schlesinger M, Webster TR, Baker D, Inouye SK. Translating research into clinical practice: making change happen. Journal of the American Geriatrics Society 2004; 52: 1875-82.

50. Pettigrew A, McKee L, Ferlie E. Understanding change in the NHS. Public Administration 1988; 66: 297-317.

51. Ovretveit J. A framework for quality improvement translation: understanding the conditionality of interventions. Joint Commission on Quality and Safety Journal 2004; Global Supplement: 15-24.

52. Pettigrew AM, Ferlie E, McKee L. Shaping Strategic Change. London: Sage, 1992.

53. McCormack B, Kitson A, Harvey G, Rycroft-Malone J, Titchen A, Seers K. Getting evidence into practice: the meaning of context. Journal of Advanced Nursing 2002; 38: 94-104.

54. Dopson S. A view from organizational studies. Nursing Research 2007; 56 (Suppl.): S72-7.

55. Aarons GA. Transformational and transactional leadership: association with attitudes toward evidence-based practice. Psychiatric Services 2006; 57: 1162-9.

56. Cummings GG, Estabrooks CA, Midodzi WK, Wallin L, Hayduk L. Influence of organizational characteristics and context on research utilization. Nursing Research 2007; 56: S24-39.

57. Ross F, O'Tuathail C, Stubberfield D. Towards multidisciplinary assessment of older people: exploring the change process. Journal of Clinical Nursing 2005; 14: 518-29.

58. Dobrow MJ, Goel V, Upshur RE. Evidence-based health policy: context and utilisation. Social Science and Medicine 2004; 58: 207-17.

59. Pearson A, Wiechula R, Court A, Lockwood C. A re-consideration of what constitutes 'evidence' in the healthcare professions. Nursing Science Quarterly 2007; 20: 85-8.

60. Grol R, Grimshaw J. From best evidence to best practice: effective implementation of change in patients' care. Lancet 2003; 362: 1225-30.

61. Davis DA, Taylor-Vaisey A. Translating guidelines into practice. A systematic review of theoretic concepts, practical experience and research evidence in the adoption of clinical practice guidelines. Canadian Medical Association Journal 1997; 157: 408-16.

62. Harvey G, Loftus-Hills A, Rycroft-Malone J et al. Getting evidence into practice: the role and function of facilitation. Journal of Advanced Nursing 2002; 37: 577-88.

63. Thompson GN, Estabrooks CA, Degner LF. Clarifying the concepts in knowledge transfer: a literature review. Journal of Advanced Nursing 2006; 53: 691-701.

64. Richens Y, Rycroft-Malone J, Morrell C. Getting guidelines into practice: a literature review. Nursing Standard 2004; 18: 33-40.

65. Grimshaw JM, Shirran L, Thomas R et al. Changing provider behavior: an overview of systematic reviews of interventions. Medical Care 2001; 39: II2-45.

 

66. Waterman H, Marshall M, Noble J et al. The role of action research in the investigation and diffusion of innovations in health care: the PRIDE project. Qualitative Health Research 2007; 17: 373-81.

67. Lindeman M, Smith R, Vrantsidis F, Gough J. Action research in aged care: a model for practice change and development. Geriaction 2002; 20: 10-4.

68. Beck C, Heacock P, Mercer SO et al. Sustaining a best-care practice in a nursing home. Journal for Healthcare Quality 2005; 27: 5-16.

69. Ashburner C, Meyer J, Johnson B, Smith C. Using action research to address loss of personhood in a continuing care setting. Illness, Crisis and Loss 2004; 12: 23-37.

70. McCormack B, Dewar B, Wright J, Garbett R, Harvey G, Ballantine K. A Realistic Synthesis of Evidence Relating to Practice Development: Final Report to NHS Education for Scotland and NHS Quality Improvement. Edinburgh, Scotland: NHS Education for Scotland and NHS Quality Improvement Scotland, 2006.

71. O'Brien MA, Freemantle N, Oxman AD, Wolf F, Davis DA, Herrin J. Continuing education meetings and workshops: effects on professional practice and health care outcomes. Cochrane Database of Systematic Reviews 2001; 1: 33 pp.

72. Bero LA, Grilli R, Grimshaw JM, Harvey E, Oxman AD, Thomson MA. Closing the gap between research and practice: an overview of systematic reviews of interventions to promote the implementation of research findings. British Medical Journal 1998; 317: 465-8.

73. O'Brien MA, Rogers S, Jamtvedt G et al. Educational outreach visits: effects on professional practice and health care outcomes. Cochrane Database of Systematic Reviews 2007; 4: 26 pp.

74. Reeves S, Zwarenstein M, Goldman J et al. Interprofessional education: effects on professional practice and health care outcomes. Cochrane Database of Systematic Reviews 2008; 1: 17 pp.

75. Darcy L. Reducing and/or minimising physical restraint in a high care, rural aged care facility. International Journal of Evidence-Based Healthcare 2007; 5: 458-67.

76. Knox J. Reducing physical restraint use in residential aged care: implementation of an evidence-based approach to improve practice. International Journal of Evidence-Based Healthcare 2007; 5: 102-7.

77. Fallon T, Buikstra E, Cameron M et al. Implementation of oral health recommendations into two residential aged care facilities in a regional Australian city. International Journal of Evidence-Based Healthcare 2006; 4: 162-79.

78. Georg D. Improving the oral health of older adults with dementia/cognitive impairment living in a residential aged care facility. International Journal of Evidence-Based Healthcare 2006; 4: 54-61.

79. Rivett D. Compliance with best practice in oral health: implementing evidence in residential aged care. International Journal of Evidence-Based Healthcare 2006; 4: 62-7.

80. Grieve J. The prevention and management of constipation in older adults in a residential aged care facility. International Journal of Evidence-Based Healthcare 2006; 4: 46-53.

81. Keller M. Maintaining oral hydration in older adults living in residential aged care facilities. International Journal of Evidence-Based Healthcare 2006; 4: 68-73.

82. McKenzie R, Naccarella L, Thompson C. Well for Life: evaluation and policy implications of a health promotion initiative for frail older people in aged care settings. Australasian Journal on Ageing 2007; 26: 135-40.

 

83. Well for Life Project Working Group. 'Well for Life': improving nutrition and physical activity for residents of aged care facilities (summary report). 2000. Accessed March 2008. Available from: http://www.mednwh.unimelb.edu.au/research/pdf_docs/well_for_life_summary_report. [Context Link]

 

84. Lindeman MA, Black K, Smith R et al. Changing practice in residential aged care using participatory methods. Education for Health 2003; 16: 22-31. [Context Link]

 

85. Austin H. Respecting patient choices: final evaluation of the community implementation of the Respecting Patient Choices Program. 2006. Accessed March 2008. Available from: http://70.87.111.98/~rpccom/images/stories/pdfs/evaluation/executive_summary_rpc [Context Link]

 

86. Lyon C. Advance care planning for residents in aged care facilities: what is best practice and how can evidence-based guidelines be implemented? International Journal of Evidence-Based Healthcare 2007; 5: 450-7. [Context Link]

 

87. Poulos L. Falls prevention in residential aged care. 2006. Accessed March 2008. Available from: http://www.powmri.edu.au/fallsnetwork/SESIAHS%20Poulos%20presentation.pdf [Context Link]

 

88. Crotty M, Whitehead C, Rowett D et al. An outreach intervention to implement evidence based practice in residential care: a randomized controlled trial. BMC Health Services Research 2004; 4: 6. [Context Link]

 

89. Ellis I, Santamaria N, Carville K et al. Improving pressure ulcer management in Australian nursing homes: results of the PRIME trial organisational study. Primary Intention 2006; 14: 106-11. [Context Link]

 

90. Walpole R. The Implementation of Best Evidence Concepts in a High Care Aged Care Facility: An Experience in Success and Failure. Australian Centre for Evidence-based Practice in Aged Care Conference, 'Evidence in Practice: Leading the Way in Aged Care'. Melbourne, Victoria: La Trobe University, 2007. [Context Link]

 

91. Cheek J, Gilbert A, Ballantyne A, Penhall R. Factors influencing the implementation of quality use of medicines in residential aged care. Drugs and Aging 2004; 21: 813-24. [Context Link]

 

92. McConigley R, Toye C, Goucke R, Kristjanson L. Developing recommendations for implementing the Australian Pain Society's pain management strategies in residential aged care. Australasian Journal on Ageing 2008; 27: 45-9. [Context Link]

 

93. Moore K, Haralambous B. Barriers to reducing the use of restraints in residential elder care facilities. Journal of Advanced Nursing 2007; 58: 532-40. [Context Link]

 

94. Edwards H, Gaskill D, Sanders F et al. Resident-staff interactions: a challenge for quality residential aged care. Australasian Journal on Ageing 2003; 22: 31-7. [Context Link]

 

95. Tuckett AG. Residents' rights and nurses' ethics in the Australian nursing home. International Nursing Review 2005; 52: 219-24. [Context Link]

 

96. Doty MM, Koren MJ, Sturla EL. Culture change in nursing homes: how far have we come? Findings from the Commonwealth Fund 2007 national survey of nursing homes. Commonwealth Fund, 2008. [Context Link]

 

97. McInnes E, Askie L. Evidence review on older people's views and experiences of falls prevention strategies. Worldviews on Evidence-Based Nursing 2004; 1: 20-37. [Context Link]

 

98. Jones KR, Fink R, Vojir C et al. Translation research in long-term care: improving pain management in nursing homes. Worldviews on Evidence-Based Nursing 2004; 1 (S1) (3rd Quarter): S13-20. [Context Link]

 

99. Resnick B, Quinn C, Baxter S. Testing the feasibility of implementation of clinical practice guidelines in long-term care facilities. Journal of the American Medical Directors Association 2004; 5: 1-8. [Context Link]

 

100. Timmerman T, Teare G, Walling E, Delaney C, Gander L. Evaluating the implementation and outcomes of the Saskatchewan pressure ulcer guidelines in long-term care facilities. Ostomy/Wound Management 2007; 53: 28-32. [Context Link]

 

101. Watson J, Hockley J, Dewar J. Barriers to implementing an integrated care pathway for the last days of life in nursing homes. International Journal of Palliative Nursing 2006; 12: 234-40. [Context Link]

 

102. O'Halloran PD, Cran GW, Beringer TRO et al. Factors affecting adherence to use of hip protectors among residents of nursing homes - a correlation study. International Journal of Nursing Studies 2007; 44: 672-86. [Context Link]

 

103. Stetler CB, McQueen L, Demakis JG, Mittman BS. An organizational framework and strategic implementation for system-level change to enhance research-based practice: QUERI Series. Implementation Science 2008; 3: 30. [Context Link]

 

104. Ovretveit J, Gustafson D. Evaluation of quality improvement programmes. In: Grol R, Baker R, Moss F, eds. Quality Improvement Research: Understanding the Science of Change in Health Care. London: BMJ Books, 2004; 115-32. [Context Link]

 

105. McGhee G, Marland GR, Atkinson J. Grounded theory research: literature reviewing and reflexivity. Journal of Advanced Nursing 2007; 60: 334-42. [Context Link]

 

106. Eccles M, Grimshaw J, Walker A, Johnston M, Pitts N. Changing the behavior of healthcare professionals: the use of theory in promoting the uptake of research findings. Journal of Clinical Epidemiology 2005; 58: 107-12. [Context Link]

 

107. Titler MG, Everett LQ, Adams S. Implications for implementation science. Nursing Research 2007; 56: S53-9. [Context Link]

 

Key words: aged care; evaluation; evidence-based practice; implementation