Professional Development in Assessment was developed at JMU as an extension of the Center for Assessment and Research Studies' mission to improve higher education by inspiring and empowering faculty and staff to make evidence-based decisions to enhance student learning and development.

Navigate this page using the index to the left or the dropdowns below to explore carefully selected resources mapped to key skill areas in assessment, as specified by the Assessment Skills Framework.

Skill Area 3: Select & Design Instruments

Select and Design Instruments
Spotlight Resources
Quick Access: Review Educational Standards

Novice

Video - Is My Survey Good?

Keston Fulcher quickly breaks down the key question that always needs answering before a survey, test, or rubric can be meaningfully evaluated.

SASS - Looking for Instruments

In the design-versus-select debate surrounding SLO measurement, it can be a struggle to locate existing instruments that are viable for your current SLOs. This guide by Student Affairs Assessment Support Services (SASS) consultants provides a good starting point for exploring existing instrument resources.

UF - A Practical Guide to Assessment - Basic Concepts and Assessment Categories

A resource designed for faculty at the University of Florida and all who teach in higher education, this guide presents a series of short videos on assessment topics, delivered by Dr. Brophy as part of his Passport to Great Teaching-Creative Assessment Faculty Development Learning Community. Resources for this specific domain can be found in Module 2c: “Assessment Classifications & Scoring” and Module 3a: “Item Types for Quizzes and Tests.”

NILOA - New to Assessment - Gathering Data

NILOA presents a collection of open-access resources introducing the basics of assessing student learning, with resources for assessment terminology, assessment modules, and activities. Resources for this domain can be found in the modules at the bottom of the page under the content heading “Gathering Data”: Ch. 1, which focuses on the role of planning in effective data usage.

Intermediate

Hunter College Handout - Matching Assessments to Learning Outcomes

This handout provides a quick yet detailed outline of the considerations involved in building or selecting an assessment that aligns with student learning outcomes.

Assessment Quickies - Matching Assessment to Teaching & Learning

This resource is a continuation of the 10-step Assessment Quickies series, which dives deeply into the levels of student learning. This video discusses Step 6 in the series, matching assessment to teaching and learning, and details the types of measures associated with various levels of learning.

 

Novice

SASS - Instrumentation Selection & Design

Student Affairs Assessment Support Services (SASS) presents several considerations to weigh when deciding on instrumentation for assessment needs. This page emphasizes the importance of measure-to-SLO alignment, outlines types of measures, and goes into depth on selecting vs. designing, with several resources available for both.

1. SASS Webpage – A great resource for conceptualizing key considerations of instrumentation. For this particular skill area, the section entitled "Selecting vs Designing Instruments" provides a great breakdown of various context and resource considerations.

2. Video Option – SASS personnel walk through several of the considerations for selecting vs. designing an instrument in under six minutes.

NILOA - New to Assessment & LARC Modules

NILOA presents a collection of open-access resources introducing the basics of assessing student learning, with resources for assessment terminology, assessment modules, and activities. Resources for this domain can be found in the modules at the bottom of the page under the content heading “Gathering Data”: Ch. 2-3.

Intermediate

Assessment Quickies - Choosing Assessment Measures

This resource is a continuation of the 10-step Assessment Quickies series, which dives deeply into the levels of student learning. This video discusses Step 5 in the series and provides a brief explanation of direct vs. indirect measures and the process of choosing an assessment measure.

SASS - Instrumentation Selection & Design

For further development of skills in this domain, revisit this focused review of instrumentation considerations by Student Affairs Assessment Support Services (SASS) at JMU. Beyond the general information it provides, pay special attention to the table entitled “When to Select/Design” if you’re looking to weigh the pros and cons of each approach to make a decision. This is good to use in conjunction with the Hathcoat article if you want to dive deeper.

Practical Examples:

The following talks illustrate large assessment shifts made necessary by the real-world impact of COVID-19. These discussions include instrumentation considerations as well as considerations across the entire assessment cycle.

1. Large-Scale Assessment during a Pandemic: Results from James Madison University's Assessment Day

2. Challenging our Assumptions: Lessons that COVID-19 Measurement has Highlighted

1. SASS - Instrumentation Considerations

Student Affairs Assessment Support Services (SASS) at JMU presents an in-depth navigation of the numerous considerations involved in selecting or designing a measure. Resources for those seeking to learn more about the role of validity and reliability in this decision appear under the heading “Psychometric Properties.”

2. Hathcoat, Sanders, & Gregg: Item Development, Reliability, & Validity Article

The authors clearly lay out several considerations, best practices, and how-to scenarios that might arise in the search for instrumentation. With regard to validity and reliability, the article goes into depth about the different types and the different methods of assessing each, and presents SPSS walkthroughs for gathering this information from your own data.

UF - A Practical Guide to Assessment – Basic Concepts & Assessment Categories

A resource designed for faculty at the University of Florida and all who teach in higher education, this guide presents a series of short videos on assessment topics, delivered by Dr. Brophy as part of his Passport to Great Teaching-Creative Assessment Faculty Development Learning Community. Resources for this specific domain can be found in Module 2a: Technical Components of Assessment, with reliability and validity considerations for classroom assessment.

Coefficient Alpha Video (by Kate Schaefer)

In a 40-minute video, Kate Schaefer guides you through a more in-depth explanation of one estimate of reliability, Cronbach’s alpha, with various theoretical examples and empirical examples in SPSS.
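If you want to see the arithmetic behind the video first, the standard formula is alpha = k/(k-1) * (1 - sum of item variances / variance of total scores), where k is the number of items. Below is a minimal Python sketch of that computation using hypothetical Likert-type responses; it is not part of the video, which works through its empirical examples in SPSS.

    import numpy as np

    def cronbach_alpha(item_scores):
        # Estimate Cronbach's alpha from an (n_respondents x n_items) score matrix:
        # alpha = k / (k - 1) * (1 - sum of item variances / variance of total scores)
        scores = np.asarray(item_scores, dtype=float)
        k = scores.shape[1]                              # number of items
        item_variances = scores.var(axis=0, ddof=1)      # sample variance of each item
        total_variance = scores.sum(axis=1).var(ddof=1)  # variance of summed scale scores
        return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

    # Hypothetical data: five respondents answering a four-item Likert-type scale
    responses = [
        [4, 5, 4, 4],
        [3, 3, 2, 3],
        [5, 5, 5, 4],
        [2, 2, 3, 2],
        [4, 4, 4, 5],
    ]
    print(round(cronbach_alpha(responses), 3))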

CARS - Overview of Writing Instrument Items Presentation

This resource provides access to the slide deck from a previous CARS presentation on item-writing practices. The presentation offers several opportunities to check your knowledge through activities. For a real test of your abilities, view the slides in presenter mode to get the full effect of the examples.

UF - A Practical Guide to Assessment – Basic Concepts & Assessment Categories

A resource designed for faculty at the University of Florida and all who teach in higher education, this guide presents a series of short videos on assessment topics, delivered by Dr. Brophy as part of his Passport to Great Teaching-Creative Assessment Faculty Development Learning Community. Resources for this specific domain can be found in Module 3b: Writing Item Stems, with lists of examples to guide practice, as well as in the Module 3 resources, which present a checklist for reviewing selected-response items, and in Module 3c: Writing Answer Choices.

1. SASS - Instrumentation Considerations

JMU's Student Affairs Assessment team presents an in-depth navigation of the numerous considerations involved in selecting or designing a measure. Resources for this specific area can be found under the heading “How to Develop an Instrument”.

2. Hathcoat, Sanders, & Gregg: Item Development, Reliability, & Validity Article

The authors clearly lay out several considerations, best practices, and how-to scenarios that might arise in the search for instrumentation. For resources in this domain, the article has an entire section devoted to item writing.

Economic Policy Institute - Addressing Non-cognitive Skills

This briefing paper argues for the inclusion of more non-cognitive assessment in academic settings and provides context for why such assessment matters in education.

Oxford Handbook - Assessing Non-cognitive Skills

This review article surveys non-cognitive factors and outlines both the traditional approach to non-cognitive assessment and some emerging techniques and future directions, with numerous examples scattered throughout.

1. SASS: Instrumentation Considerations

JMU's Student Affairs Assessment Team presents an in-depth navigation of the numerous considerations involved in selecting or designing a measure. Resources for this specific area can be found under the heading “How to Develop an Instrument”.

2. Hathcoat, Sanders, & Gregg: Item Development, Reliability, & Validity Article

The authors clearly lay out several considerations, best practices, and how-to scenarios that might arise in the search for instrumentation. For resources in this domain, the article has an entire section devoted to item writing.

Novice

SASS Video - Rubric Design

Student Affairs Assessment Support Services (SASS) at JMU breaks down the role of rubrics in performance assessment and the different types of rubrics and their applications, and details the characteristics of “good” rubrics.

Texas A&M Video - Rubrics

Texas A&M Student Life Studies presents a video describing rubrics and how to apply them to SLO assessment.

Intermediate 

UW-Stout: List of Rubric Resources

This is a comprehensive guide to rubrics with links to examples, best practices, performance assessment information, scoring guides, and more!

Tools for Grading – Sample Rubrics

Vanderbilt University provides numerous sample rubrics and Excel spreadsheets to better conceptualize rubric application.

UF: A Practical Guide to Assessment – Basic Concepts & Assessment Categories

A resource designed for faculty at the University of Florida and all who teach in higher education, this guide presents a series of short videos on assessment topics, delivered by Dr. Brophy as part of his Passport to Great Teaching-Creative Assessment Faculty Development Learning Community. Resources for structuring performance assessment items can be found in Module 2d: Rubric Design and Module 3d: Writing Short Answer, Sentence Completion, and Extended Response Items, with additional resources for structuring portfolio assessment and developing rubrics in Module 5a: Portfolios and Module 6: Consensus-Based Assessment.

Samples:

1. Texas A&M Student Leader Learning Outcomes: Sample rubrics presented with corresponding Student Learning Outcomes for various topics.

2. Association of American Colleges & Universities - The VALUE Rubrics: The Association of American Colleges and Universities (AAC&U) developed and made available 16 rubrics that address essential learning outcomes.

Skill Area 4: Implementation Fidelity

Implementation Fidelity
Spotlight Resources
Best Place to Start: Importance of Implementation Fidelity

Novice

SASS Resources

1. Video: Evaluating Implementation Fidelity

SASS continues its assessment cycle video series, now introducing the concept of implementation fidelity, its five components, and how to evaluate fidelity data.

2. Webpage Overview: Implementation Fidelity in Assessment

This resource reinforces ideas presented in the SASS video for understanding the goals, components, and applications of fidelity data, and presents additional resources.

Webinar - Using Implementation Fidelity Data to Evaluate and Improve Program Effectiveness

In this webinar, Dr. Finney introduces implementation fidelity assessment and demonstrates how it can be coupled with outcomes assessment data to inform more valid inferences about program effectiveness. The webinar is an introduction appropriate for professionals hoping to connect implementation fidelity data with outcomes assessment data in order to communicate accurately about program quality and to guide program and learning improvement.

Intermediate

Key Articles to Review & Revisit:

1. Finney & Smith: Ignorance is Not Bliss

This article, brought to us by NILOA, emphasizes implementation fidelity as a critical component of assessing student learning, introducing and explaining the five components of implementation fidelity.

2. Smith & Finney: Elevating Program Theory & Implementation Fidelity

This article examines both program theory and implementation fidelity as essential components of higher ed assessment. Specific descriptions of implementation fidelity can be found in Step 4 of the article and its figures.

3. Smith, Finney, & Fulcher: Actionable Steps

This article outlines the process for engaging practitioners and faculty in fidelity research, emphasizing what is gained from fidelity data and how to begin examining it within your own institution.

Workshop on designing IF studies

Attached is the slide deck from an Implementation Fidelity Workshop led by Sara Finney, which includes a review of several key IF concepts, a practical example (Slide 45), a meta-cognitive check of key concepts (Slide 52), an activity prompt to check your knowledge (Slide 53), and a great debrief (Slide 56).

Novice

SASS Website - Collecting Implementation Fidelity Data

SASS presents an overview of Implementation Fidelity in this continuation of the Assessment Cycle series. For skills in this area, home in on the section entitled "Collecting Implementation Fidelity Data".

First Year Orientation: Connecting the Dots Through Implementation Fidelity Improvement

This resource introduces some basic terms and ways of thinking about implementation fidelity, while also illustrating the importance of IF data and the role it plays in assessment with this first-year student orientation example.

Evaluation Brief - Measuring Implementation Fidelity

This document further discusses the pros and cons of various implementation fidelity data collection methods.

Implementation Fidelity in Community-Based Interventions

The purpose of this paper by Breitenstein et al. (2010) is to define implementation fidelity and describe its importance for the larger science of implementation, discuss data collection methods and current efforts in measuring implementation fidelity in community-based prevention interventions, and present future research directions for measuring implementation fidelity that will advance implementation science.

Intermediate

Practical Example: Helping Students Learn Better - Evaluating Program Theory & Implementation Fidelity in a University-level Context

The slide deck from an ASHE presentation by Smith & Finney presents a practical example of articulating the theory behind a program and then examining implementation fidelity data (the IF example begins on Slide 21).

Key Articles to Review & Revisit:

1. Finney & Smith: Ignorance is Not Bliss 

This article, brought to us by NILOA, emphasizes implementation fidelity as a critical component of assessing student learning, introducing and explaining the five components of implementation fidelity.

2. Smith & Finney: Elevating Program Theory & Implementation Fidelity 

This article examines both program theory and implementation fidelity as essential components of higher ed assessment. Specific descriptions of implementation fidelity can be found in Step 4 of the article and its figures.

3. Smith, Finney, & Fulcher: Actionable Steps 

This article outlines the process for engaging practitioners and faculty in fidelity research, emphasizing what is gained from fidelity data and how to begin examining it within your own institution. Step 5 discusses the collection of implementation fidelity data.

SASS Resources:

1. SASS: Implementation Fidelity Data 

SASS explains how to use implementation fidelity data to evaluate SLOs, focusing specifically on how to collect IF data, the five components, and the importance of IF in interpreting results.

2. Implementation Fidelity Video

For more explanation in video format, SASS also presents a video discussing implementation fidelity data and its components. It also includes a general fidelity checklist example to illustrate how one might go about collecting and quantifying IF data.
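As a rough illustration of what quantifying IF data can look like in practice, here is a minimal Python sketch with a hypothetical checklist and session records that are not drawn from the SASS materials; adherence is summarized simply as the proportion of planned program components actually delivered.

    # Hypothetical fidelity checklist: planned program components for each session
    planned_components = ["state the SLOs", "guided practice", "instructor feedback", "reflection prompt"]

    # Observer records for three sessions: 1 = component delivered as designed, 0 = not delivered
    session_records = {
        "Session 1": [1, 1, 1, 0],
        "Session 2": [1, 0, 1, 0],
        "Session 3": [1, 1, 1, 1],
    }

    # Adherence per session: proportion of planned components delivered
    for session, delivered in session_records.items():
        adherence = sum(delivered) / len(planned_components)
        print(f"{session}: {adherence:.0%} adherence")

    # One simple overall summary across all sessions
    total_delivered = sum(sum(record) for record in session_records.values())
    total_planned = len(planned_components) * len(session_records)
    print(f"Overall adherence: {total_delivered / total_planned:.0%}")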

 

Implementation Fidelity & Decision Making – Webinar 

John Hathcoat presents a webinar detailing implementation fidelity, with emphasis placed on the errors we can make when interpreting learning outcomes data and how the inclusion of implementation fidelity helps us make more accurate, empirically based decisions.

Articles:

1. James Bell and Associates' Brief on Implementation Fidelity

This article briefly discusses the analysis of implementation fidelity data.

2. Finney & Smith: Ignorance is Not Bliss 

This article, brought to us by NILOA, emphasizes implementation fidelity as a critical component of assessing student learning, introducing and explaining the five components of implementation fidelity.

3. Smith & Finney: Elevating Program Theory & Implementation Fidelity

This article examines both program theory and implementation fidelity as essential components of higher ed assessment. The illustration on page 3 portrays how implementation fidelity data are interpreted in conjunction with student learning outcomes assessment findings.

4. Smith, Finney, & Fulcher: Actionable Steps

This article outlines the process for engaging practitioners and faculty in fidelity research, emphasizing what is gained from fidelity data and how to begin examining it within your own institution. Step 5 discusses the collection of implementation fidelity data.

5. Gerstner & Finney: Measuring Implementation Fidelity 

Student affairs-based examples are presented in this article, demonstrating how the inclusion of implementation fidelity in the outcomes assessment process increases the validity of inferences about program effectiveness and, ultimately, student learning. It is useful for both student affairs professionals and academic departments.

Skill Area 5: Outcomes Information

Collecting Outcomes Information

The Research Knowledge Base - Design 

This excellent resource takes the form of a web-based textbook focused on introductory concepts in social research methodology. The linked “Design” section specifically offers in-depth descriptions of multiple designs and revisits the concept of validity.

SASS Resources:

1. Data Collection & Analysis 

Student Affairs Assessment Support Services (SASS) at JMU introduces common data collection designs, presenting key considerations for choosing a design, such as how to align the data collection with program SLOs and take potential validity threats into account.

2. Collecting Data on SLOs 

SASS reaches Step 5 of its Assessment Cycle video series, outlining the data collection stage: what questions one might ask when making the decision, including practical considerations and SLO alignment, as well as potential formats for data collection.

Threats to Validity Handout

A quick, easy handout for reviewing potential validity threats and visualizing which data collection designs are most susceptible to them.

UF: A Practical Guide to Assessment – Basic Concepts & Assessment Categories  

A resource designed for faculty at the University of Florida and all who teach in higher education, this guide presents a series of short videos on assessment topics, delivered by Dr. Brophy as part of his Passport to Great Teaching-Creative Assessment Faculty Development Learning Community. Resources for this specific domain can be found in Module 5b: Pre & Post Testing.

NILOA - New to Assessment Modules 

NILOA presents a collection of open-access resources introducing the basics of assessing student learning, with resources for assessment terminology, assessment modules, and activities. Resources for this domain can be found in the modules at the bottom of the page under the content heading “Gathering Data”: Ch. 1-2.

SASS Resources:

Revisit the following resources from the “Planning a Data Collection” section above to help make a concrete decision about which method you will use.

1. Assessment Data Collection & Analysis

This page reinforces the importance of collecting meaningful and credible data and of continuing to take potential validity threats into account.

2. Video: Collecting Information

This video again highlights what questions one might ask when making data design decisions, including practical considerations and SLO alignment, as well as potential formats for data collection.

Videos:

1. Assessment Quickies: Collecting Assessment Evidence 

This resource is a continuation of the 10-step Assessment Quickies series, which dives deeply into the levels of student learning. This video discusses Step 7 in the series, collecting assessment evidence, and details strategies for collecting assessment data.

2. North Dakota State University outlines how to assess student learning through four key principles in this quick video, packed full of tips for your own assessment efforts and examples for clarity.

3. Survey Sampling for the Penn State Pulse Program – Webinar  

A detailed example, using a Penn State University program, of how survey sampling techniques fit into data collection considerations, including discussion of why we sample, overviews of sampling techniques, and representativeness checks.
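To make the idea of a representativeness check concrete, here is a minimal Python sketch using an entirely hypothetical sampling frame (not Penn State data): it draws a simple random sample and compares sample proportions to known population proportions on one demographic variable.

    import random
    from collections import Counter

    random.seed(42)  # reproducible draw for the example

    # Hypothetical sampling frame of 10,000 students with class standing recorded
    population = (["First-year"] * 3000 + ["Sophomore"] * 2600 +
                  ["Junior"] * 2300 + ["Senior"] * 2100)

    sample = random.sample(population, 500)  # simple random sample of 500 students

    pop_counts = Counter(population)
    sample_counts = Counter(sample)

    # Representativeness check: compare sample proportions to population proportions
    print(f"{'Class standing':<15} {'Population %':>13} {'Sample %':>10}")
    for group in ["First-year", "Sophomore", "Junior", "Senior"]:
        pop_pct = pop_counts[group] / len(population)
        samp_pct = sample_counts[group] / len(sample)
        print(f"{group:<15} {pop_pct:>13.1%} {samp_pct:>10.1%}")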

UF: A Practical Guide to Assessment – Basic Concepts & Assessment Categories  

A resource designed for faculty at the University of Florida and all who teach in higher education, this guide presents a series of short videos on assessment topics, delivered by Dr. Brophy as part of his Passport to Great Teaching-Creative Assessment Faculty Development Learning Community. Resources for this specific domain can be found in Module 7b: Building an Interpretation and Use Argument.

NILOA - New to Assessment Modules 

NILOA presents a collection of open-access resources introducing the basics of assessing student learning, with resources for assessment terminology, assessment modules, and activities. Resources for this domain can be found in the modules at the bottom of the page under the content headings “Demystifying Assessment”: Ch. 4 and “Gathering Data”: Ch. 2-3.
