Date of Award:

12-2011

Document Type:

Dissertation

Degree Name:

Doctor of Philosophy (PhD)

Department:

Instructional Technology and Learning Sciences

Committee Chair(s)

Andrew E. Walker

Committee

Byron R. Burnham

J. Nicholls Eastmond

Sheri Haderlie

Karl R. White

Abstract

To reduce the risk of repeating prior research efforts or choosing incorrect research methods, a sound literature review should be performed before undertaking a new study. As such, the literature review occupies a well-defined role in the research process. It is natural to assume that much research has been done on how these skills are taught to future scholars. However, this is not the case: research in this area is limited and varied. This dissertation builds on existing efforts and fills in a portion of the missing research. It examines some of the textbooks used to teach doctoral students literature review skills, and it looks at the current state of dissertation literature reviews in a specific field of education, Instructional Technology.

The Boote and Beile Literature Review Scoring Rubric is a widely used source of information about important criteria for a dissertation literature review. A scoring rubric is a list of critical features for a piece of work. Rubrics help students know how their work will be evaluated. In this dissertation, researchers use the Literature Review Scoring Rubric as a framework to examine textbooks used to teach doctoral students literature review skills. They then assess the quality of dissertation literature reviews using the rubric.

In the first study, researchers analyzed seven top-selling education research textbooks using content analysis techniques. They wanted to determine how well the textbooks covered the items on the Literature Review Scoring Rubric. Each textbook received a final letter grade, much like a student in a classroom. Three of the textbooks received a failing grade of F, one received a C-, another received a B, and one received an A-. This study supports the claim that textbooks used to teach doctoral students tend to focus on search strategies rather than on the broader requirements of a dissertation review.

The second study replicates Boote and Beile’s study. Using the Literature Review Scoring Rubric, researchers evaluated 27 randomly chosen dissertations from Instructional Technology. They wanted to know if the literature reviews from Instructional Technology scored differently than ones from the general field of education. They also wanted to know if the dissertation study design (i.e., qualitative, quantitative, or mixed methods) affected the quality of the review. The researchers also examined the rubric’s ability to consistently measure the quality of the reviews.

The study showed that the literature reviews from Instructional Technology had a lower average score (19.96 out of 37 possible points) than those from education as a whole (24.08 out of 37 possible points). The lower average scores may be due to the field itself. They may also stem in part from the fact that researchers did not select dissertations based on the quality of the program. Finally, the use of different raters than in the Boote and Beile study may have contributed to the differences. Study design had little effect on the overall score of the dissertation literature review, although quantitative dissertations scored better.

From a practical viewpoint, faculty can use the findings from the first study to guide the selection of teaching materials. They can also examine the curriculum to determine how it can be strengthened or supplemented. From a scholarly view, these two studies add to the developing discussion about the dissertation literature review. The first study addresses the oft-neglected research surrounding materials used to teach literature review skills. The second study extends Boote and Beile’s research into a specific field of study.


Comments

Publication made available electronically December 21, 2011.
