The Research Digest, November 2014, Issue Number 42

The Research Digest (previously known as the Monthly Digest) highlights new research in learning and teaching. This month, we are highlighting recent work on the use of final exams. This issue was developed in conjunction with November’s issue of Best Practices on Designing Final Exams. If you’re interested in learning more about this topic, or have any suggestions for us, please send an email to lto@ryerson.ca. To access past issues of the Research Digest, visit the LTO website.

New research on the use of final exams:

Journal of Economic Education, 2013
The Impact of Assessment Policy on Learning: Replacement Exams or Grade Dropping

Abstract: “Instructors often debate the merits of alternate grading policies such as dropping the lowest exam or offering an additional exam to replace the lowest score. To date, there has been little research conducted on the impact of these policies on performance. In this study, the author investigates student performance in intermediate macroeconomics over three semesters at a small Eastern college. In each semester, a different assessment policy was applied: Count all three in-term exams; drop the lowest of three exams; offer an additional exam to replace the lowest in-term exam score. Contrary to previous research and conventional wisdom, the author finds that allowing students to drop their lowest grade improved performance on a cumulative final exam, while offering a replacement test had no significant effects.”

CBE – Life Sciences Education, 2013
Verbal Final Exam in Introductory Biology Yields Gains in Student Content Knowledge and Longitudinal Performance

Abstract: “We studied gains in student learning over eight semesters in which an introductory biology course curriculum was changed to include optional verbal final exams (VFs). Students could opt to demonstrate their mastery of course material via structured oral exams with the professor. In a quantitative assessment of cell biology content knowledge, students who passed the VF outscored their peers on the medical assessment test (MAT), an exam built with 40 Medical College Admissions Test (MCAT) questions (66.4% [n = 160] and 62% [n = 285], respectively; p < 0.001). The higher-achieving students performed better on MCAT questions in all topic categories tested; the greatest gain occurred on the topic of cellular respiration. Because the VF focused on a conceptually parallel topic, photosynthesis, there may have been authentic knowledge transfer. In longitudinal tracking studies, passing the VF also correlated with higher performance in a range of upper-level science courses, with greatest significance in physiology, biochemistry, and organic chemistry. Participation had a wide range but not equal representation in academic standing, gender, and ethnicity. Yet students nearly unanimously (92%) valued the option. Our findings suggest oral exams at the introductory level may allow instructors to assess and aid students striving to achieve higher-level learning.”

Educational Research and Evaluation, 2012
Learning during a Collaborative Final Exam

Abstract: “Collaborative testing has been suggested to serve as a good learning activity, for example, compared to individual testing. The aim of the present study was to measure learning at different levels of knowledge during a collaborative final exam in a course in basic methods and statistical procedures. Results on pre- and post-tests taken individually (N = 30) before and after the collaborative part of the final exam confirmed learning effects at the uni- and multi-structural levels, as well as on the relational level of Biggs’ structure of the observed learning outcome (SOLO) taxonomy (Biggs & Collis, 1982). Low performers at pre-test raised their test scores more than high performers. No differences could be generalized at the extended level of knowledge. Results also suggest that it might be preferable to collaborate without first deciding on questions individually. The experimental design can be applied when evaluating learning, at different levels of knowledge, during a variety of learning activities.”

Advances in Health Sciences Education, 2011
Students Generating Questions for Their Own Written Examinations

Abstract: “Assessment partnerships between staff and students are considered a vital component of the student-centred educational process. To enhance the development of this partnership in a problem-based learning curriculum, all first-year students were involved in generating a bank of formative assessment questions with answers, some of which were included in their final written examination. Important principles to guide development of a sound methodology for such an assessment partnership have been described. These include organisational issues as well as matters pertaining to participation, education and motivation of students and teaching staff.”

British Journal of Educational Technology, 2009
The Efficacy of Final Examinations: A Comparative Study of Closed-Book, Invigilated Exams and Open-Book, Open-Web Exams

Abstract: “Educators have long debated the usefulness (or otherwise) of final examinations; a debate that has typically revolved around the relative merits of closed-book exams, open-book exams, take-home exams or their substitution by some other assessment format (e.g., project work). This paper adds a new dimension to the debate by considering how the final examination assessment instrument might be enhanced through harnessing the power of technology, more specifically, how the learner experience of the final examination might be made more authentic and, in the process, more constructively aligned with stated learning outcomes. The authors report on the latest findings of an ongoing research project evaluating the effectiveness of “open-book, open-web” (OBOW) examinations delivered by an online university, vis-à-vis a closed-book, invigilated alternative. Earlier research had indicated that the OBOW model receives the strong endorsement of students in a number of respects, most particularly the quality of the learning outcomes.”
