Document Type

Article

Journal/Book Title/Conference

Behavioral Sciences

Author ORCID Identifier

Michelle L. Rivers https://orcid.org/0000-0002-4931-2895

Acacia L. Overono https://orcid.org/0000-0003-4023-4648

Volume

15

Issue

4

Publisher

MDPI AG

Publication Date

April 6, 2025

Journal Article Version

Version of Record

First Page

1

Last Page

18

Creative Commons License

Creative Commons Attribution 4.0 License
This work is licensed under a Creative Commons Attribution 4.0 License.

Abstract

Multiple-choice (MC) tests are widely used in educational settings but have been criticized for promoting passive recognition rather than active retrieval. Our research explores how adding a simple component to MC tests—answer justification—influences test performance and metacognitive accuracy. Across two experiments, university students studied a textbook chapter and completed either a standard MC test (MC-only group) or an MC test requiring them to justify their answers (answer justification group). Participants also provided predictive and postdictive metacognitive judgments. The results showed that the answer justification group significantly outperformed the MC-only group on an immediate test (Experiments 1 and 2) and scored numerically higher on a delayed test two days later (Experiment 2). Further, some initial evidence suggested that metacognitive accuracy was influenced by test type, though further research is needed to confirm this pattern. These findings support a retrieval-based explanation: generating answer justifications increases test performance by strengthening memory through elaborative retrieval. This study demonstrates that incorporating answer justification into MC tests may improve learning and metacognitive accuracy. We also offer practical suggestions for classroom implementation, considering that answer justification boosts test performance but also imposes a time cost compared to standard MC tests.

Included in

Psychology Commons