Authors

  1. Jones, Lisa PhD, RN, CCRN
  2. New, Keri DNP, RN

Article Content

Examinations naturally create some level of anxiety for students. Flawed items, including those with grammatical or spelling errors, may cause confusion and increase anxiety for some students.1 Students may worry that a question was written to trick them, which may adversely affect their score. Researchers have found item writing flaws in 46% to 85% of multiple-choice items.1,2


Flawed items may perform poorly on review of the statistical analysis, requiring faculty to determine the best method of scoring them.3 Common options are accepting more than 1 answer as correct or excluding the flawed item from the total score.4 Either alternative results in a score that may not accurately reflect students' knowledge or may artificially inflate their grades. Higher-achieving students are more often adversely affected by flawed items than borderline students.1 One researcher found that as many as 10% to 15% of students who failed would have passed if items with writing flaws had been eliminated.5
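The arithmetic behind these 2 adjustment options can be sketched in a few lines. The item counts and score below are hypothetical examples, not data from any program:

```python
# Rescoring a hypothetical 50-item examination containing 1 flawed item.
# Policy A: accept more than 1 answer as correct for the flawed item.
# Policy B: exclude the flawed item from the total score.

def percent_score(correct: int, total: int) -> float:
    """Return a percentage score rounded to 1 decimal place."""
    return round(100 * correct / total, 1)

total_items = 50
correct_answers = 37        # student's count under the original answer key
missed_flawed_item = True   # student chose the alternate defensible answer

original = percent_score(correct_answers, total_items)

# Policy A: credit the alternate answer, so the correct count rises by 1.
policy_a = percent_score(
    correct_answers + (1 if missed_flawed_item else 0), total_items
)

# Policy B: remove the flawed item from both numerator and denominator
# (the student had missed it, so the correct count is unchanged).
policy_b = percent_score(correct_answers, total_items - 1)

print(original, policy_a, policy_b)  # 74.0 76.0 75.5
```

Either adjustment raises this student's score, which illustrates why neither alternative may accurately reflect knowledge of the content.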


Studies have demonstrated that a peer review process improves the psychometric quality of questions.3,6 Higher-quality items require fewer grading adjustments, allowing the assessment to more accurately measure student performance.


Where We Started

Our program used testing software that required students to take paper examinations and record their answers on a fill-in-the-bubble form. More than 20 years of test items from every course were archived in the software. Although the examination bank contained thousands of questions, we knew many were problematic: there were often multiple versions of the same item, items with grammatical errors, poorly written items, and items that were no longer relevant.


Despite thorough faculty review of examinations, flawed items often appeared on an examination. In addition, 42% of our faculty reported having to interrupt class at least once during every examination to clarify an item because of a misspelled word, confusing stem, or confusing response. Although these interruptions were unavoidable, faculty recognized that they could adversely affect students' examination performance. More than 57% of our faculty reported making multiple-item grading adjustments on each examination because of item flaws.


In fall 2015 and spring 2016, we piloted an electronic examination platform in 2 senior-level nursing courses and adopted it for use in all courses beginning in 2016. Although the new platform could transfer the item bank from the previous system, we chose not to upload the entire test bank. We wanted to ensure that only high-quality items were loaded into the new system and that each item underwent peer review before being uploaded. This was the impetus for our nursing department to develop an item writing committee composed of all faculty who write examination questions.


Faculty Development

During the 1-year pilot of electronic testing, extensive faculty development was initiated. Our journey began with a commitment from the department chair to ensure faculty received sufficient support and education to write quality items. All faculty responsible for writing examination questions completed the online National Council of State Boards of Nursing Test Development and Item Writing course. This self-paced course provides an overview of the National Council Licensure Examination recommendations for writing multiple-choice items, along with guidance on writing alternate-format items, specific item writing guidelines for multiple-choice items, and preventing question bias. The course provides a valuable foundation for item writing, regardless of the faculty member's previous item writing experience.


Our nursing program integrates Quality and Safety Education for Nurses (QSEN) throughout the curriculum. Each examination item is linked to a QSEN competency. An additional faculty development presentation on the QSEN competencies was provided by an experienced faculty member.


A faculty expert from our university presented an interactive workshop on Bloom's taxonomy and writing items at the application and analysis levels. Educational sessions were offered prior to faculty meetings to provide additional opportunities to address any other faculty questions regarding item writing. These sessions were facilitated by faculty from our department.


Review Process

In fall 2015, we established an Item Review Committee consisting of all 14 faculty responsible for writing examination questions; their teaching experience ranged from 3 to more than 20 years. Our review process begins when a faculty member writes a new item. At least 2 weeks before the item first appears on an examination, the writer requests review by 2 peer reviewers. The first reviewer is a faculty member at the same curricular level; in team-taught courses, this is usually another faculty member teaching in the same course. The second reviewer is a faculty member from the next curricular level.


Both reviewers respond to the item writer within 3 business days. Reviewers evaluate items against recommended item writing guidelines3 and also provide feedback on punctuation, grammar, spelling, and question clarity. The item writer remains the content expert and decides whether to make the recommended edits or use the item as written. Once the item is finalized, the faculty member enters it into the electronic testing platform for use on an examination.


Our Results

Since the implementation of the Item Review Committee, we have seen an improvement in the quality of our examination items and testing process. More than 57% of faculty report no longer having to interrupt examinations to clarify questions. Nearly 63% of faculty report they have made no grading adjustments since implementation of the Item Review Committee. Faculty indicate increased satisfaction and confidence with item writing.


Lessons Learned

The implementation of an Item Review Committee has proven successful for our program. Including all faculty who write examination questions on the committee distributes the workload that would otherwise fall on a few designated members. We found the review process beneficial for both the item writer and the item reviewers: critiquing other faculty members' examinations increases individual awareness of common item writing flaws and helps reviewers when writing items for their own content areas. Faculty development provided the foundation for faculty to write quality examination items, and ongoing development and support are important to ensure consistency over time. Through this process, we also identified faculty members with a strong interest in item writing, Bloom's taxonomy, QSEN, or item analysis. Using their expertise has been a cost-effective way to provide additional requested faculty development options.


References


1. Tarrant M, Ware J. Impact of item-writing flaws in multiple-choice questions on student achievement in high-stakes nursing assessments. Med Educ. 2008;42(2):198-206.


2. Nedeau-Cayo R, Laughlin D, Rus L, Hall J. Assessment of item-writing flaws in multiple-choice questions. J Nurses Prof Dev. 2013;29(2):52-57.


3. Tarrant M, Ware J. A framework for improving the quality of multiple-choice assessments. Nurse Educ. 2012;37(3):98-104.


4. Magaldi M, Kinneary P, Colillo G, Sutton E. A guide to postexamination analysis: utilizing data to increase reliability and ensure objectivity. Nurse Educ. 2018;4(2):61-63.


5. Downing SM. The effects of violating standard item writing principles on tests and students: the consequences of using flawed test items on achievement examinations in medical education. Adv Health Sci Educ Theory Pract. 2005;10:133-143.


6. Abozaid H, Park YS, Tekian A. Peer review improves psychometric characteristics of multiple choice questions. Med Teach. 2017;39(S1):S50-S54.