Assessing Embedded Geospatial Student Learning Outcomes

Location

Cheatham 212

Event Website

http://www.cpe.vt.edu/cuenr/index.html

Start Date

3-27-2010 2:00 PM

End Date

3-27-2010 2:30 PM

Description

Geographic Information Science and Technology (GIST) plays an increasingly pronounced role in providing natural resource professionals with information and analysis tools. GIST is an integral component of resource planning, management, and assessment; therefore, professors in the College of Natural Resources at North Carolina State University have designed and embedded geospatial exercises across undergraduate curricula in Forest Management and Natural Resources. We developed a flexible framework for assessing how well geospatial learning objectives are being met. Developing such a framework requires identifying geospatial learning objectives, establishing criteria for success, and creating assessment tools. Structured interviews were used to identify geospatial learning objectives and criteria for success. Interview subjects included faculty with geospatial instruction embedded within their courses, as well as program directors throughout the college. We used a grounded theory approach to code subjects’ responses, identify emergent commonalities among them, and prepare geospatial learning objectives from the resulting categories. Assessment criteria and outcome indicators, based on key properties, were written for each objective. We crosswalked our objectives with the University Consortium for Geographic Information Science (UCGIS) Body of Knowledge1 to determine how well core topical areas are covered by the embedded instructional interventions. Anderson and Krathwohl’s framework2 was used to develop criteria for success by sorting objectives into factual, conceptual, and procedural knowledge categories and identifying the intended performance level of each objective. Based on these findings, we designed a pre‐post instructional intervention questionnaire addressing a range of foundational geospatial material.
We pilot‐tested the questionnaire during the 2009 academic year to determine whether geospatial objectives are communicated in observable or measurable ways, and whether a pre‐post instruction questionnaire is an effective tool for assessing student learning. We will discuss findings from the pre‐post intervention pilot study, additional assessment approaches such as tracking questions embedded within course tests, the evaluation of students’ research reports from an assessor’s perspective, and implications for scaling up GIST assessment efforts within the college.

Comments

Citation: Carr, J., H. Cheshire, G. Hess, H. Devine, D. Bailey. 2010. Assessing embedded geospatial student learning outcomes. UENR Biennial Conference, Session Curricula and Assessment, Paper Number 8. http://digitalcommons.usu.edu/cuenr/Sessions/Cirricula/8/
