Keywords

Competency-based education, Delphi, nurse practitioner, nurse practitioner competencies

 

Authors

  1. Chan, Tracey Elizabeth PhD, ANP-BC (PhD Student, Assistant Professor, Director)

ABSTRACT

Background: Competency-based education (CBE) has been recommended for nurse practitioner (NP) education. To implement CBE, existing NP core competencies need to be reduced in number and refined.

 

Purpose: This study refined and reduced redundancy in the National Organization of Nurse Practitioner Faculties (NONPF) and the American Association of Colleges of Nursing (AACN) NP core competencies through the consensus of experts in NP practice. The study used the current NP Core Competencies (NONPF, 2017), the Essentials of Doctoral Education for Advanced Nursing Practice (AACN, 2006), and the Common Advanced Practice Registered Nurse Doctoral-Level Competencies (AACN, 2017a) because these documents contain the competencies that accredited NP programs commonly use in curriculum development. The primary aim of this study was to refine and reduce redundancy in these competencies; a secondary aim was to ensure that the final competencies were clear and measurable.

 

Methods: A Delphi approach was used to reach consensus among an expert panel who reviewed the core competencies via an online questionnaire. Descriptive statistics were used to calculate median and interquartile ranges; content analysis was conducted with qualitative data.

 

Results: Consensus was reached after three rounds and resulted in 49 final core competencies.

 

Implications for practice: This study provides the NP community with a manageable list of relevant, clear, and measurable competencies that faculty members can use to implement CBE in their programs.

 


Nurse practitioners (NPs) are currently prepared at both the master's and doctoral levels in one of six population foci. Since the early 2000s, both the National Organization of Nurse Practitioner Faculties (NONPF) and the American Association of Colleges of Nursing (AACN) have endorsed the Doctor of Nursing Practice (DNP) degree as entry to NP practice (AACN, 2004; NONPF, 2015), and NONPF recently reinforced this stance with a statement "to move all entry-level NP education to the DNP degree by 2025" (NONPF, 2018b, para. 1). The Institute of Medicine (IOM) has recommended that nursing and NP education move to a competency-based education (CBE) framework (IOM, 2011). It is imperative that NP programs continue to prepare competent students to provide safe, quality, and independent patient care for the population foci in which they have been trained. Research has consistently demonstrated that the quality of care patients receive from NPs is similar to or better than care provided by medical doctors (Stanik-Hutt et al., 2013). To continue graduating quality NPs and to move NP education to CBE, the NP competencies need to be refined to reflect the current state of health care.

 

Background and significance

Nurse practitioners complete graduate education and training at either a master's or doctoral (DNP) level within one of six identified population foci (family/individual across the lifespan, adult-gerontology, pediatrics, neonatal, women's health/gender-related, or psychiatric/mental health), which qualifies them to sit for national certification (AANP, 2013). Since 2002, the NONPF has endorsed the DNP degree as entry to NP practice and has recently called for this to occur by 2025 (NONPF, 2015, 2018b). In 2004, the AACN released a statement supporting the move to the DNP as the educational degree needed for entry into practice as an NP (AACN, 2004). According to AANP (2013), the majority of currently accredited NP programs are at the master's level. However, DNP programs have been steadily increasing in number: in 2017, 303 DNP programs were available nationwide, 187 of which were BSN-DNP, with at least an additional 124 DNP programs in the planning stages (AACN, 2017b). According to the American Academy of Nurse Practitioners Certifying Board, the current requirements for national certification as an NP entail that graduates complete an accredited NP program at the master's or doctoral level with a minimum of 500 hours of supervised clinical practice and pass a written certification examination before transitioning into their roles as independent providers (American Academy of Nurse Practitioners, 2015). Although these requirements are expected to assure that the applicant is competent, past research does not support that earning certification equates to clinical competency (Hallas, Biesecker, Brennan, Newland, & Haber, 2012; Whittaker, Carson, & Smolenski, 2000). Numerous NP competencies have been published since the 1990s, but most NP programs incorporate them into traditional time-based knowledge acquisition higher education models rather than solely assuring achievement of the competencies through a CBE approach (NONPF, 2013).

 

Competency-based education

Competency-based education is an educational framework that has been recommended by various leaders within nursing and health care (Giddens et al., 2014; IOM, 2011; Lucey, 2017; Sroczynski & Dunphy, 2012). Competency-based education has been defined as "a data-based, adaptive, performance-oriented set of integrated processes that facilitate, measure, record and certify within the context of flexible time parameters the demonstration of known, explicitly stated, and agreed on learning outcomes that reflect successful functioning in life roles" (Spady, 1977, p. 10). In addition, CBE focuses on assuring that students attain specific skills before advancing to new material, rather than on a predetermined period of study.

 

Implementation of CBE requires an agreed-on definition of competency. Although "competency" has been defined in a variety of ways within the nursing profession, all of the definitions incorporate learners' abilities to perform or apply their knowledge (Benner, 1982; Chapman, 1999; Fan, Wang, Chao, Jane, & Hsu, 2015; Nolan, 1998). The AACN recently adopted definitions of "competency" and "competence" based on work by Frank et al. (2010). Competency is defined as "an observable ability of a health professional, integrating multiple components such as knowledge, skills, and attitudes. Since competencies are observable, they can be measured and assessed to ensure acquisition" (AACN, 2017a, p. 2). Competence is defined as "The array of abilities (knowledge, skills and attitudes) across multiple domains or aspects of performance in a certain context. Competence is multi-dimensional and dynamic. It changes with time, experience, and settings" (AACN, 2017a, p. 2).

 

Compared with nursing, physical therapy (PT), pharmacy, and medicine have more routinely implemented CBE in their programs. Physical therapy was one of the first health care professions to implement CBE, introducing the Clinical Performance Instrument in 1992 (Roach et al., 2012). This validated instrument measures students' attainment of necessary competencies and is used by a majority of PT programs (Roach et al., 2012). In addition, the American College of Clinical Pharmacy (ACCP) has well-defined and accepted competencies for its graduates that assure they are ready to enter pharmacy practice (Saseen et al., 2017). Finally, medical education research on CBE is ongoing within the United States, and a defined set of competencies has been developed and accepted for general physicians (Englander et al., 2013). At least two US medical schools, the University of Minnesota Medical School and Brown University School of Medicine, have successfully implemented CBE (Andrews et al., 2018; Carraccio, Wolfsthal, Englander, Ferentz, & Martin, 2002; Lucey, 2017).

 

For these health professions to implement CBE, they had to develop a well-defined set of measurable and attainable competencies. The Association of American Medical Colleges has 58 competencies in 8 domains for general physicians (Englander et al., 2013). The ACCP has 6 essential domains encompassing 31 competencies that clinical pharmacists need to attain (Saseen et al., 2017). Each of these professional organizations has evaluated the literature and the practice of its discipline to reach well-defined, appropriate, and measurable competencies. It is time for the discipline of nursing to fully implement CBE for NPs.

 

Nurse practitioner competencies

Health-related organizations, including NONPF, the AACN, the Interprofessional Education Collaborative, the American Nurses Association, and the International Society of Nurses in Genetics, have collectively defined 354 specific competencies for all advanced practice registered nurses (APRNs), which includes NPs, and refer to them as core competencies. Core competencies reflect the knowledge and skills that all NPs should have and are considered the gold standard (Crabtree, Stanley, Werner, & Schmid, 2002).

 

Recently, the AACN convened a work group representing the four APRN roles (NP, clinical nurse specialist, certified nurse midwife, and certified registered nurse anesthetist) to develop "a common taxonomy for competencies for the doctoral-prepared APRN" (AACN, 2017a, p. 1). As previously noted, AACN supports the movement of APRN education to the doctoral level via the DNP degree. Ultimately, the group adopted the Common Taxonomy for Competency Domains in the Health Professions described by Englander et al. (2013) as a framework for competency development (AACN, 2017a). The eight domains are as follows: patient care; knowledge for practice; practice-based learning and improvement; interpersonal and communication skills; professionalism; systems-based practice; interprofessional collaboration; and personal and professional development (Englander et al., 2013). This AACN group of APRNs developed a list of 31 competencies within these 8 domains that are applicable to all four APRN roles (AACN, 2017a). The AACN recognizes that each of the APRN roles needs to further this work to move toward CBE.

 

Based on this AACN work, NPs first need to refine their core competencies. Although no defined number of competencies exists for a profession, the National Task Force on Quality Nurse Practitioner Education (2016) states that the NP curriculum needs to reflect nationally recognized core competencies, including the NONPF NP Core Competencies (NONPF, 2017) and the AACN Essentials of Doctoral Education for Advanced Nursing Practice (AACN, 2006). Because overlaps exist among the different competencies, redundancies need to be lessened. It is also imperative to determine whether the core NP competencies are relevant, the extent to which they are necessary for newly graduated NPs, and how well they reflect the current state of health care. An integrative review evaluating the current core competencies in relation to NP practice activities revealed weak alignment between the competencies and NP practice (Chan, Lockhart, Thomas, Kronk, & Schreiber, 2019). This review revealed that, although NPs spend a majority of their time in direct patient care, 86% of the core competencies reflect indirect care activities (Chan et al., 2019). Competencies should reflect the needs of the workforce (Hallas et al., 2012; Voorhees, 2001). The IOM "supports the development of a unified set of core competencies across [each level of] the nursing profession and believes it would help provide direction for standards across nursing education" (IOM, 2011, p. 201).

 

Therefore, the purpose of this study was to refine and reduce redundancy in the NONPF and AACN core APRN competencies through the consensus of US experts in NP practice. The study used the current NP Core Competencies (NONPF, 2017), the Essentials of Doctoral Education for Advanced Nursing Practice (AACN, 2006), and the Common Advanced Practice Registered Nurse Doctoral-Level Competencies (AACN, 2017a) as a basis because these are the competencies accredited BSN-DNP programs use in curriculum development. The primary aim was to refine and reduce redundancy in the NP core competencies, with a secondary aim of assuring that the competencies were clear and measurable.

 

Method

Design

A Delphi approach was used to study BSN-DNP competencies. The Delphi method allows discussion of and judgment on a topic without the interpersonal interaction that can create bias and conflict (Goodman, 1987; Grisham, 2008). This approach was chosen because of the desire to collect a group of experts' opinions to reach consensus. The Delphi technique was therefore used to reach consensus on BSN-DNP competencies, the main aim of the study, through a series of questionnaires that build on one another (Goodman, 1987; Hasson, Keeney, & McKenna, 2000).

 

Selection of expert panel

In a Delphi study, the sample is purposively chosen because an expert panel of individuals is needed rather than randomly selected participants. In the current study, a panel of experts on NP practice throughout the United States was recruited with the assistance of NONPF, the "leading organization for NP faculty," representing more than 90% of US NP programs (NONPF, 2018a). Inclusion criteria for panel participants were as follows: (1) employed in the United States; (2) able to read and write in English; and (3) either (a) a faculty member with a minimum of 3 years of experience in a BSN-DNP program; (b) an actively practicing NP clinician educated as a DNP with a minimum of 5 years of experience; or (c) a recent BSN-DNP program graduate who has been employed as an NP full time for 6-18 months. Although using a panel with a variety of viewpoints can increase study validity and credibility (Day & Bobeva, 2005; Habibi, Sarafrazi, & Izadyar, 2014), it can also make it more difficult to achieve consensus (Skulmoski, Hartman, & Krahn, 2007).

 

Through e-mail communication, the lead researcher asked members of the NONPF Curricular Committee and the Program Directors' Special Interest Group to nominate one to two people who fit into each of the three panel groups and met other inclusion criteria; group members were asked to provide their nominees' names with credentials, geographical location, and contact information (phone number and e-mail). Members could also self-nominate. Next, the researcher eliminated duplicates from the list of nominees. A Delphi study does not have criteria regarding the number of experts that should be on the panel, and although ideal, each category does not need to have equal representation (Habibi et al., 2014; Keeney, Hasson, & McKenna, 2001).

 

The researcher contacted the nominated experts using an e-mail letter that explained the study and invited them to participate. It was important for panelists to understand the study and remain engaged throughout to increase its validity (Hasson et al., 2000). According to Keeney, Hasson, and McKenna (2006), assuring that panelists "realize and feel that they are partners in the study and are interested in the topic" (p. 207) can enhance response rates.

 

Sixty nominees were sent invitations to participate: 37 were BSN-DNP faculty, 13 were actively practicing NPs with 5 years of experience as a DNP, and 7 were new BSN-DNP graduates employed as NPs. Nominees were asked to respond electronically regarding their willingness to participate, confirm that they met the inclusion criteria, and note into which of the three groups they fit. Of the 60 nominees, 37 consented to participate in the study, providing a 61.7% response rate. Sixteen individuals never responded, and 7 either declined or did not meet full criteria for participation.

 

Study measures and instruments

To begin, 139 different NP core competencies were retrieved from three key documents that are necessary components of curriculum development for accredited BSN-DNP programs: the NONPF Core Competencies (NONPF, 2017), The Essentials of Doctoral Education for Advanced Nursing Practice (AACN, 2006), and the Common APRN Doctoral-Level Competencies (AACN, 2017a). These core competencies comprised the variables that the panel evaluated over three rounds of review for their relevance, clarity, and measurability.

 

A researcher-devised questionnaire based on these 139 NP core competencies was developed to collect responses and gain consensus from the panel. The focus of the questionnaire was on evaluation of the competencies. This questionnaire changed after each round based on the panelists' feedback. The first round's questionnaire presented the competencies in random order, rather than by the organization that created them, to reduce bias (Hasson et al., 2000). Pilot testing of the first questionnaire was conducted with three NPs who were familiar with the competencies. They were asked to provide feedback on the questionnaire's usability and content as well as the time it took them to complete the questionnaire. The questionnaire did not require any revisions based on pilot study feedback.

 

For the first round, panelists were asked to rate each of the 139 competencies for its relevancy on a Likert scale ranging from 1 to 4 (1 = strongly disagree and 4 = strongly agree) with no neutral point to force experts to take a stance of either agreement or disagreement. "Relevancy" was defined to panelists as the degree to which the competency is necessary for a new NP obtaining the DNP degree. Panelists also had an option to add comments to each item and/or recommend additional competencies.

 

After analyzing the data obtained from the first round (see Results section), the lead researcher used the feedback to revise the questionnaire for use in Round 2. Changes included reducing or rewording the competencies based on feedback and grouping the remaining competencies by concept. In the second round, the panel was asked to determine whether redundancy still existed and to rate each competency as critical (rather than merely relevant) on a 1-4 scale (1 = strongly disagree and 4 = strongly agree), as well as measurable (yes/no) and clear (yes/no). "Critical" was defined as a competency necessary for a new BSN-DNP graduate to possess. "Measurability" was defined as being able to objectively evaluate the competency. "Clear" was defined as the competency being free from ambiguity. The option for panelists to add comments remained, and panelists were asked to offer suggestions for changing any competency marked as "critical" but not "measurable" or "clear." At the end of the questionnaire, panelists were given the opportunity to comment on concepts they believed were missing from the competencies. In Round 2 and beyond, panelists received personalized results from the previous round, termed "iterative controlled feedback," that included their individual rating as well as the overall median rating for each item. This feature allowed the panel to see its collective opinion (Hasson et al., 2000).

 

The Round 3 questionnaire incorporated the results of the Round 2 questionnaire and reduced or reworded the competencies based on the feedback. In the third round, the competencies were grouped according to the eight domains of the Common Taxonomy for Competency Domains in the Health Professions described by Englander et al. (2013). Panelists were now asked to indicate whether they agreed with each competency using the 1-4 Likert scale and to determine whether the competency was placed in the appropriate domain (by answering yes/no). As in Rounds 1 and 2, the opportunity to provide comments or suggested changes was provided. At the end of the questionnaire, panelists were again given a chance to comment and/or note whether any concepts were missing from the competencies.

 

Procedure

The Duquesne University Institutional Review Board approved the study. The questionnaires were administered electronically using Qualtrics, a secure online program with International Organization for Standardization 27001 certification (Qualtrics, 2018). The panel of experts was e-mailed a secure link to complete each questionnaire electronically. Each round's questionnaire was available to respondents for approximately 2 weeks, and panel members must have participated in the previous round to continue.

 

Confidentiality of panelists' responses was maintained by summarizing comments and by not sharing the identity of expert panel members with other panel members. Protecting the anonymity of panel members is a key characteristic of Delphi research (Keeney et al., 2006).

 

Analysis

Analysis of the quantitative data was performed using the Statistical Package for the Social Sciences (SPSS), version 23. Data from completed questionnaires were exported from Qualtrics in SPSS format for analysis. Descriptive statistics, the median and interquartile range, were calculated. The median was used because a Likert scale produces ordinal data (von der Gracht, 2012), and the interquartile range was used as an indicator of consensus (De Vet, Brug, De Nooijer, Dijkstra, & De Vries, 2005). Competencies on the first questionnaire that received a median score of three or above for relevancy with an interquartile range of one or less were included in the next round. Those rated with a median less than three and an interquartile range of one or less were considered not relevant and were eliminated. Competencies with an interquartile range greater than one were also included in the next round regardless of their median rating. Competencies in Rounds 2 and 3 were additionally rated on measurability and clarity. Items that received a median of three or above on relevancy but less than 80% agreement on clarity or measurability were rewritten for the next round based on content analysis of the comments received. Competencies that reached consensus (interquartile range less than or equal to one) with a median score less than three for relevancy were eliminated. Items with a median of three or above on relevancy and 80% agreement on clarity and measurability were considered core NP competencies.
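The Round 1 decision rules above can be sketched as a small function. This is an illustrative reconstruction only, not the authors' actual analysis (which was performed in SPSS); the function name, thresholds as coded, and the example ratings are assumptions introduced for demonstration.

```python
import statistics

def iqr(ratings):
    """Interquartile range (Q3 - Q1) of a list of 1-4 Likert ratings."""
    q = statistics.quantiles(sorted(ratings), n=4)
    return q[2] - q[0]

def classify_round1(ratings):
    """Apply the Round 1 rules: IQR <= 1 indicates consensus; a median of
    three or above means the competency is relevant and retained, below
    three means it is eliminated; IQR > 1 means no consensus, so the item
    is carried to the next round regardless of its median."""
    median = statistics.median(ratings)
    if iqr(ratings) > 1:
        return "no consensus"   # carried forward to the next round
    if median >= 3:
        return "retain"         # relevant, consensus reached
    return "eliminate"          # not relevant, consensus reached

# Hypothetical panel ratings for one competency
print(classify_round1([4, 4, 3, 4, 3, 4]))  # retain
```

In Rounds 2 and 3 this quantitative rule was supplemented by the 80% agreement threshold on the yes/no clarity and measurability items, which a similar percent-agreement check could express.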

 

Qualitative comments on the questionnaires were analyzed through content analysis, "a research method for the subjective interpretation of the content of text data through the systematic classification process of coding and identifying themes or patterns" (Hsieh & Shannon, 2005, p. 1278). An inductive approach was used in each round. The researcher initially read through all the comments in the selected round, then reread them carefully, noting key words and determining themes at the literal level (Hsieh & Shannon, 2005; Kondracki, Wellman, & Amundson, 2002). Categories were developed based on the themes; data were then placed into the categories, and the relationships between categories were analyzed. Competencies were revised as appropriate. Throughout the study, a manual approach was used. Journal entries captured the researcher's thought processes and decisions to support the credibility and dependability of the study, similar to an audit trail (McPherson, Reese, & Wendler, 2018; Skulmoski et al., 2007). Another researcher with expertise in nursing education independently analyzed the data via content analysis using the same procedure to assure confirmability (McPherson et al., 2018), with interrater agreement discussed until 100% consensus was reached.

 

Results

Sociodemographic data collected from the expert panel over three rounds are displayed in Table 1. Panelists were located throughout the United States, were certified as NPs in various foci, and had many years of experience as registered nurses. Initially, 37 experts consented to participate in the study. Of those, 27 (73%) responded to the Round 1 questionnaire. In Round 2, 21 of the 27 Round 1 panelists (78%) responded; 17 of those 21 (81%) then participated in the final Round 3. Participants had to complete the previous round to continue to the next.

  
Table 1. Expert Panel Members' Demographics

Round 1

Initial quantitative results of the Round 1 questionnaire did not eliminate any of the competencies (full Round 1 results are presented in Supplemental Digital Content 1, http://links.lww.com/JAANP/A39). Of the 139 competencies, 131 (94%) were rated as relevant, with a median score of 3-4 for relevancy and an interquartile range of 0-1. The remaining eight competencies received a median of three or above for relevancy, but their interquartile range was above one, indicating a lack of consensus. Because the quantitative data did not result in competency reduction, qualitative analysis of the comments became an important component of data analysis. Content analysis of the comments indicated concern over redundancy among the competencies and over the measurability of some of them. To address redundancy, the researcher clustered the competencies by the main concept within each competency and then combined or eliminated those with similar intent. The main concepts found were leadership, policy, information technology/data, ethics, communication, patient care/clinical practice, and outcomes/quality improvement. An additional two doctoral-prepared researchers with expertise in nursing education and methodology independently reviewed the work to assure interrater reliability. This process eliminated 51 competencies, leaving 88 to be evaluated in the second-round questionnaire.

 

Round 2

In Round 2, the resulting 88 competencies were presented by concept as described in the Round 1 results (full Round 2 results are presented in Supplemental Digital Content 2, http://links.lww.com/JAANP/A40). The wording for rating the competencies was changed from "relevant" to "critical" because all the competencies were viewed as relevant in Round 1. The panelists were also asked to indicate whether each competency was clear and measurable and whether there was redundancy among the competencies; if redundancies were found, the panelists were to identify the redundant competencies.

 

The quantitative analysis of the Round 2 questionnaire revealed that 47 competencies did not reach consensus, because either the interquartile range was above one (42 of the 47) or agreement on clarity or measurability fell below 80%. With regard to redundancy, only the competencies under the concept of communication were found to have none; the remaining concepts and competencies contained redundancy. Content analysis of the comments received resulted in the reduction of competencies based on redundancies and in competencies being rewritten to make them clearer or measurable. Finally, four additional competencies were written based on comments about missing concepts, including ethics, social determinants of health, and role differentiation. This analytical process eliminated 39 competencies, leaving 49 to be evaluated in the third round.

 

Round 3

The 49 competencies in the third round were presented according to the domains of the Common Taxonomy for Competency Domains in the Health Professions described by Englander et al. (2013) and adopted by the AACN (full Round 3 results are presented in Table 2). In the third round, the panelists were asked to rate their agreement with the newly written or reworded competencies on the 1-4 Likert scale and to decide whether each competency was placed in the correct domain.

  
Table 2. Round 3 Results

The quantitative analysis revealed that 48 of the 49 competencies reached consensus regarding both agreement that the item was a competency and correct domain placement. These 48 competencies all had a median of four, resulting in a list of 48 competencies agreed upon by the expert panel. The one competency that did not reach consensus was related to health policy; panelists suggested placing it in a different domain and increasing the level required for its achievement. Based on content analysis, the competency was reworded, moved to a different domain, and included on the final competency list. Comments were also received on other competencies that had reached consensus, but based on content analysis and the high level of consensus (all medians of four and many interquartile ranges of zero), no further competencies were changed. The final list of 49 NP core competencies is displayed in Table 3.

  
 
Table 3. Final List of Competencies

Discussion

The purpose of this study was to refine and reduce redundancy in the NONPF and AACN NP core competencies through the consensus of experts on NP practice. This goal was achieved by reaching a final list of 49 competencies for BSN-DNP graduates.

 

Initial findings confirmed much redundancy in the NP core competencies. Decreasing the redundancy gives BSN-DNP programs a clearer understanding of the competencies that their students need to provide safe, quality care to patients. Despite the noted redundancies, it was surprising that almost all the competencies presented in Round 1 were considered relevant. It was not possible to significantly reduce the competencies using quantitative analysis during the first two rounds. Instead, the qualitative method of content analysis became the main strategy for reducing and revising the list. The large number of comments and suggestions made clear that panelists were engaged in the study process. The content analysis of the competencies and the panelists' comments resulted in the reduced final number of competencies, and comments received in Round 1 directed how the competencies were presented by concept in Round 2.

 

After Round 1, the instructional wording was changed from "if the competency was relevant" to "if the competency was critical" to have panelists think about the competencies from a different perspective: a competency that is relevant to NPs may not be critical for practice as an NP. This modification, however, did not produce a difference in the quantitative data. In Round 2, panelists continued to provide a large amount of qualitative data in the form of suggestions for rewording competencies and for combining those with similar intent to further reduce redundancy.

 

Round 3's quantitative data revealed consensus on 48 of the 49 competencies. Although comments and suggestions continued in Round 3, content analysis revealed the need to reword only one competency and change the domain in which it belonged. This competency was related to health care policy and had received diverse comments in all three rounds.

 

According to the panelists, a few concepts were missing from the competencies. For example, comments received in Round 2 included the need for an ethics competency that reflected "holding oneself to the highest of ethical standards," as well as a competency expanding on social determinants of health and the impact a DNP-prepared NP can have on improving them. Finally, it was noted that a competency differentiating the NP role from that of other health care providers was necessary. A total of four new competencies were written and presented in Round 3, and all reached consensus as being applicable to NPs graduating from a BSN-DNP program. In Round 3, no missing concepts were noted, and one panelist commented that the "competencies are comprehensive and capable of finding activities and assignments to support the demonstration of the objective."

 

Incorporating an expert panel with a variety of perspectives is necessary to gain a complete picture of the competencies required for day-to-day core NP practice. This study included perspectives from both NP educators and practicing NPs. As entry-level NP education changes to the DNP and curricula move to CBE, BSN-DNP programs will need a manageable list of core competencies that reflects both doctoral-level education and workforce needs. The study results provide evidence for NONPF and AACN to consider when revising the BSN-DNP core NP competencies.

 

Limitations

The limitations of this study are similar to those of other Delphi studies, as the Delphi technique is not a rigidly defined research method. The first limitation is the determination of consensus. Median, interquartile range, and percent agreement were used as the consensus criteria because these are accepted methods (De Vet et al., 2005). Establishing the consensus criteria before data analysis contributed to the credibility of the study (Hasson et al., 2000; Keeney et al., 2006). Second, some researchers believe that using a predeveloped list of items can make panelists feel restricted (Powell, 2003). To overcome this issue, panelists were given (and used) the opportunity to write in comments or additional competencies. Third, a general limitation of the Delphi technique relates to reliability and validity. According to Hasson et al. (2000), "there is no evidence of the reliability of the Delphi method" (p. 1012), and validity can be affected by response rates; thus, it was important to retain panelists throughout each round. Retention was supported through follow-up and engaging panelists in the importance of the research, resulting in attrition rates of 22% for Round 2 and 19% for Round 3, for a total attrition of 37%, an acceptable level based on previous Delphi research (Keeney et al., 2006). Validity can also be affected by iterative controlled feedback, in that panelists can be persuaded toward conformity rather than true agreement (Goodman, 1987; Keeney et al., 2006). A fourth concern with the Delphi technique is that anonymity "can lead to lack of accountability" (McKenna, 1994, p. 1224), implying that because panelists are anonymous, they may not feel ownership of their responses. Fifth, results can also be biased by expert panel composition, as panelists are not a "representative sample" (Powell, 2003, p. 378).
A random sample is typically used in research to ensure that results are generalizable to the population, but with the Delphi technique, the sample is a group selected for its expertise, which can introduce bias. Therefore, the results may not be generalizable. In this study, most of the panelists were NP educators, and only a few were newly graduated practicing NPs, despite an effort to seek a diverse panel. Many of the NP educators may also have been practicing NPs, although this is unknown because these demographic data were not collected. It should also be noted that NP education occurs at both the master's and doctoral levels, and this study reflects education at the doctoral level. A final concern particular to this study is that NP practice differs across the United States because of state regulations, which could affect panelists' responses. Therefore, an effort was made to recruit panelists from a variety of regions within the United States; however, the sample was skewed toward the Northeast and Midwest. In addition, each questionnaire stated that the competencies were to reflect general NP practice across the entire country.
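The consensus statistics described above (median, interquartile range, and percent agreement) are straightforward to compute. The sketch below is illustrative only: the 4-point rating scale, the agreement threshold, and the cutoff values are assumed for demonstration and are not the study's exact criteria. It also includes a check of the cumulative attrition arithmetic reported above.

```python
from statistics import median

def consensus_stats(ratings, agree_threshold=3, pct_cutoff=0.75, iqr_cutoff=1.0):
    """Summarize one competency's panel ratings on a Likert-type scale.

    Returns the median, interquartile range (IQR), and percent agreement,
    plus whether illustrative (assumed) consensus criteria are met.
    """
    ordered = sorted(ratings)
    n = len(ordered)
    med = median(ordered)
    # Simple quartile estimate: medians of the lower and upper halves.
    q1 = median(ordered[: n // 2])
    q3 = median(ordered[(n + 1) // 2 :])
    iqr = q3 - q1
    pct_agree = sum(r >= agree_threshold for r in ordered) / n
    reached = iqr <= iqr_cutoff and pct_agree >= pct_cutoff
    return {"median": med, "iqr": iqr, "pct_agree": pct_agree, "consensus": reached}

# Example: 10 hypothetical panelists rate a competency on a 1-4 relevance scale.
result = consensus_stats([4, 4, 3, 4, 3, 4, 2, 4, 3, 4])
print(result)

# Cumulative attrition check: losing 22% of panelists in Round 2 and then
# 19% of the remainder in Round 3 gives roughly the 37% total reported.
total_attrition = 1 - (1 - 0.22) * (1 - 0.19)
print(round(total_attrition, 3))
```

In the example, the median is 4, the IQR is 1, and 90% of panelists rated the competency at or above the agreement threshold, so the illustrative consensus criteria are met.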

 

Conclusion

For NP education to move to a CBE framework, the NP core competencies needed revision. This study produced a refined list of 49 NP core competencies that are relevant, clear, and measurable. Use of this list by national NP organizations and educational programs is a first step in moving NP education toward CBE, as other health professions have done. NPs must continue to provide safe, quality patient care, and a change to the CBE educational model without competency revision could present challenges in meeting this goal.

 

Acknowledgments: The authors would like to acknowledge Dr. Anne Thomas, who was the inspiration for this study. She passed away unexpectedly before publication of the final study results, but she was an integral part of the study's development.

 

References

 

AACN. (2004). AACN position statement on the practice doctorate in nursing. Retrieved from http://www.aacnnursing.org/DNP/Position-Statement. [Context Link]

 

AACN. (2006). The essentials of doctoral education for advanced nursing practice. Retrieved from http://www.aacn.nche.edu. [Context Link]

 

AACN. (2017a). Common Advanced Practice Registered Nurse Doctoral-Level Competencies. Retrieved from http://www.aacnnursing.org/Portals/42/AcademicNursing/pdf/Common-APRN-Doctoral-C. [Context Link]

 

AACN. (2017b). DNP factsheet. Retrieved from http://www.aacn.org. [Context Link]

 

AANP. (2013). Nurse practitioner curriculum. Retrieved from https://www.aanp.org/images/documents/publications/curriculum.pdf. [Context Link]

 

American Academy of Nurse Practitioners. (2015). Candidate handbook and renewal of certification handbook. Retrieved from https://www.aanpcert.org/ptistore/resource/documents/2013 CandidateRenewalHandbook -Rev 11 25 2013 forNCCA(FINAL).pdf. [Context Link]

 

Andrews J. S., Bale J. F. Jr., Soep J. B., Long M., Carraccio C., Englander R., … EPAC Study Group. (2018). Education in pediatrics across the continuum (EPAC): First steps toward realizing the dream of competency-based education. Academic Medicine, 93, 414-420. [Context Link]

 

Benner P. (1982). From novice to expert. The American Journal of Nursing, 82, 402-407. [Context Link]

 

Carraccio C., Wolfsthal S. D., Englander R., Ferentz K., Martin C. (2002). Shifting paradigms: From Flexner to competencies. Academic Medicine, 77, 361-367. [Context Link]

 

Chan T., Lockhart J., Thomas A., Kronk R., Schreiber J. (2019). Nurse practitioner practice and core competencies: Do they relate? An integrative review. Journal of Professional Nursing. doi:10.1016/j.profnurs.2019.11.003 [Context Link]

 

Chapman H. (1999). Some important limitations of competency-based education with respect to nurse education: An Australian perspective. Nurse Education Today, 19, 129-135. [Context Link]

 

Crabtree M. K., Stanley J., Werner K. E., Schmid E. (2002). Nurse practitioner primary care competencies in specialty areas: Adult, family, gerontological, pediatric, and women's health. Retrieved from https://eric.ed.gov/?id=ED471273. [Context Link]

 

Day J., Bobeva M. (2005). A generic toolkit for the successful management of Delphi studies. The Electronic Journal of Business Research Methodology, 3, 103-116. [Context Link]

 

De Vet E., Brug J., De Nooijer J., Dijkstra A., De Vries N. K. (2005). Determinants of forward stage transitions: A Delphi study. Health Education Research, 20, 195-205. [Context Link]

 

Englander R., Cameron T., Ballard A. J., Dodge J., Bull J., Aschenbrener C. A. (2013). Toward a common taxonomy of competency domains for the health professions and competencies for physicians. Academic Medicine, 88, 1088-1094. [Context Link]

 

Fan J. Y., Wang Y. H., Chao L. F., Jane S. W., Hsu L. L. (2015). Performance evaluation of nursing students following competency-based education. Nurse Education Today, 35, 97-103. [Context Link]

 

Frank J. R., Snell L. S., Cate O. T., Holmboe E. S., Carraccio C., Swing S. R., Harris K. A. (2010). Competency-based medical education: Theory to practice. Medical Teacher, 32, 638-645. [Context Link]

 

Giddens J. F., Lauzon-Clabo L., Morton P. G., Jeffries P., McQuade-Jones B., Ryan S. (2014). Re-envisioning clinical education for nurse practitioner programs: Themes from a national leaders' dialogue. Journal of Professional Nursing, 30, 273-278. [Context Link]

 

Goodman C. (1987). The Delphi technique: A critique. Journal of Advanced Nursing, 12, 729-734. [Context Link]

 

Grisham T. (2008). The Delphi technique: A method for testing complex and multifaceted topics. International Journal of Managing Projects in Business, 2, 112-130. [Context Link]

 

Habibi A., Sarafrazi A., Izadyar S. (2014). Delphi technique theoretical framework in qualitative research. The International Journal of Engineering and Science, 3, 8-13. [Context Link]

 

Hallas D., Biesecker B., Brennan M., Newland J. A., Haber J. (2012). Evaluation of the clinical hour requirement and attainment of core clinical competencies by nurse practitioner students. Journal of the American Academy of Nurse Practitioners, 24, 544-553. [Context Link]

 

Hasson F., Keeney S., McKenna H. (2000). Research guidelines for the Delphi survey technique. Journal of Advanced Nursing, 32, 1008-1015. [Context Link]

 

Hsieh H. F., Shannon S. E. (2005). Three approaches to qualitative content analysis. Qualitative Health Research, 15, 1277-1288. [Context Link]

 

IOM. (2011). The future of nursing: Leading change, advancing health. Washington, DC: The National Academies Press. [Context Link]

 

Keeney S., Hasson F., McKenna H. (2001). A critical review of the Delphi technique as a research methodology for nursing. International Journal of Nursing Studies, 38, 195-200. [Context Link]

 

Keeney S., Hasson F., McKenna H. (2006). Consulting the oracle: Ten lessons from using the Delphi technique in nursing research. Journal of Advanced Nursing, 53, 205-212. [Context Link]

 

Kondracki N. L., Wellman N. S., Amundson D. R. (2002). Content analysis: Review of methods and their applications in nutrition education. Journal of Nutrition Education and Behavior, 34, 224-230. [Context Link]

 

Lucey C. R. (2017). Achieving Competency-Based, Time-Variable Health Professions Education. New York, NY: Josiah Macy Jr. Foundation. Retrieved from http://macyfoundation.org/publications/publication/achieving-competency-based-ti. [Context Link]

 

McKenna H. (1994). The Delphi technique: A worthwhile research approach for nursing? Journal of Advanced Nursing, 19, 1221-1225. [Context Link]

 

McPherson S., Reese C., Wendler M. C. (2018). Methodology update: Delphi studies. Nursing Research, 67, 404-410. [Context Link]

 

National Task Force on Quality Nurse Practitioner Education. (2016). Criteria for evaluation of nurse practitioner programs (5th ed.). Retrieved from https://www.nonpf.org/page/15?. [Context Link]

 

Nolan P. (1998). Competencies drive decision making. Nursing Management, 29, 27-29. [Context Link]

 

NONPF. (2013). NONPF special meeting: NP education today, NP education tomorrow: Executive summary. Retrieved from http://c.ymcdn.com/sites/nonpf.site-ym.com/resource/resmgr/Docs/Executive_Summar. [Context Link]

 

NONPF. (2015). The doctorate of nursing practice NP preparation: NONPF perspective. Retrieved from http://www.nonpf.org/?page=83. [Context Link]

 

NONPF. (2017). Nurse practitioner core competencies content. Retrieved from http://www.nonpf.org. [Context Link]

 

NONPF. (2018a). About us. Retrieved from https://nonpf.site-ym.com/?page=1. [Context Link]

 

NONPF. (2018b). The doctor of nursing practice degree: Entry to nurse practitioner degree by 2025. Retrieved from http://www.nonpf.org. [Context Link]

 

Powell C. (2003). The Delphi technique: Myths and realities. Journal of Advanced Nursing, 41, 376-382. [Context Link]

 

Qualtrics. (2018). Security Statement. Retrieved from https://www.qualtrics.com/security-statement/. [Context Link]

 

Roach K. E., Frost J. S., Francis N. J., Giles S., Nordrum J. T., Delitto A. (2012). Validation of the revised physical therapist clinical performance instrument (PT CPI): Version 2006. Physical Therapy, 92, 416-426. [Context Link]

 

Saseen J. J., Ripley T. L., Bondi D., Burke J. M., Cohen L. J., McBane S., Vande Griend J. P. (2017). ACCP clinical pharmacist competencies. Pharmacotherapy, 37, 630-636. [Context Link]

 

Skulmoski G. J., Hartman F. T., Krahn J. (2007). The Delphi method for graduate research. Journal of Information Technology Education, 6. doi:10.28945/199. [Context Link]

 

Spady W. G. (1977). Competency based education: A bandwagon in search of a definition. Educational Researcher, 6, 9-14.

 

Sroczynski M., Dunphy L. M. (2012). Primary care nurse practitioner clinical education: Challenges and opportunities. Nursing Clinics of North America, 47, 463-479. [Context Link]

 

Stanik-Hutt J., Newhouse R. P., White K. M., Johantgen M., Bass E. B., Zangaro G., Weiner J. P. (2013). The quality and effectiveness of care provided by nurse practitioners. The Journal for Nurse Practitioners, 9, 492-500.e413. [Context Link]

 

von der Gracht H. A. (2012). Consensus measurement in Delphi studies. Technological Forecasting and Social Change, 79, 1525-1536. [Context Link]

 

Voorhees R. (2001). Competency-based learning models: A necessary future. New Directions for Institutional Research, 110, 5-13. [Context Link]

 

Whittaker S., Carson W., Smolenski M. C. (2000). Assuring continued competence-Policy questions and approaches: How should the profession respond? The Online Journal of Issues in Nursing, 5. [Context Link]

 


Instructions for Earning CE Credit:

 

* Read the article.

 

* The test for this CE activity can be taken online at http://www.NursingCenter.com/CE/JAANP. Find the test under the article title.

 

* You will need to create a username and password and log in to your personal CE Planner account (it's free!) before taking online tests. Your planner will keep track of all your Lippincott Professional Development online CE activities for you.

 

* There is only one correct answer for each question. A passing score for this test is 13 correct answers. If you pass, you can print your certificate of earned contact hours and access the answer key. If you fail, you have the option of taking the test again at no additional cost.

 

* For questions, contact Lippincott Professional Development: 1-800-787-8985.

 

 

Registration Deadline: March 1, 2021.

 

Disclosure Statement:

 

The authors and planners have disclosed that they have no financial relationships related to this article.

 

Provider Accreditation:

 

This activity is approved for 1.0 contact hour of continuing education by the American Association of Nurse Practitioners. Activity ID 20024159. This activity was planned in accordance with AANP CE Standards and Policies. This activity is also provider approved by the California Board of Registered Nursing, Provider Number CEP 11749 for 1.0 contact hour. Lippincott Professional Development is also an approved provider of continuing nursing education by the District of Columbia Board of Nursing, Georgia Board of Nursing, and Florida Board of Nursing, CE Broker #50-1223.

 

Payment:

 

* The registration fee for this test is $12.95.

 

* AANP members are eligible for a 50% discount. Visit the member-benefit section of the AANP website (https://aanp.org/membership/memberbenefits) to obtain the discount code. Use the code when asked for payment during checkout.