Professional Development in Assessment was developed at JMU as an extension of the Center for Assessment and Research Studies' mission to improve higher education by inspiring and empowering faculty and staff to make evidence-based decisions to enhance student learning and development.

Navigate this page using the index to the left or the dropdowns below to explore carefully selected resources mapped to key skill areas in assessment, as specified by the Assessment Skills Framework.

Skill Area 6: Analyze, Report, Maintain

Analyze Data, Interpret & Report Results, & Maintain Information
Spotlight Resources
Quick Access: Review Educational Standards

Novice

UF - Module 4c - Analyzing Item Difficulty & Discrimination

Use the following resources from the University of Florida's Practical Guide to Assessment:

ExamSoft - Video Series Introduction to Item Analysis

1. An Introduction to Item Analysis – Numbers Everywhere! (4:51 min)
2. The Definition of Item Difficulty (5:56 min)
3. Twenty-Seven Percent - The Index of Discrimination (7:09 min)
4. Putting It All Together - Using Distractor Analysis (6:19 min)
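The two statistics these videos introduce – item difficulty (the proportion of examinees answering correctly) and the upper-lower 27% index of discrimination – can be sketched in a few lines of code. The response matrix and group-size choice below are illustrative assumptions, not data from the series.

```python
# Sketch of classical item analysis: difficulty (p-value) and the
# upper-lower 27% index of discrimination.
# Hypothetical data: one row per examinee, one 0/1 column per item.
scores = [
    [1, 1, 0, 1], [1, 0, 0, 1], [0, 1, 1, 1], [1, 1, 1, 1],
    [0, 0, 0, 1], [1, 1, 0, 0], [0, 0, 1, 1], [1, 1, 1, 0],
    [0, 0, 0, 0], [1, 1, 1, 1],
]

def item_analysis(scores, tail=0.27):
    n = len(scores)
    k = max(1, round(n * tail))  # size of the upper and lower groups
    # Rank examinees by total score, then take the top and bottom 27%.
    ranked = sorted(scores, key=sum, reverse=True)
    upper, lower = ranked[:k], ranked[-k:]
    results = []
    for i in range(len(scores[0])):
        difficulty = sum(row[i] for row in scores) / n  # proportion correct
        # Discrimination: proportion correct in the upper group minus
        # proportion correct in the lower group.
        disc = (sum(r[i] for r in upper) - sum(r[i] for r in lower)) / k
        results.append((difficulty, disc))
    return results

for i, (p, d) in enumerate(item_analysis(scores), start=1):
    print(f"Item {i}: difficulty p = {p:.2f}, discrimination D = {d:+.2f}")
```

With real data, items with very low or negative discrimination are the ones an analysis like this would flag for review.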

Introduction to Classical Test Theory with CITAS

This white paper is intended for anyone interested in learning how to make tests and assessments better; it helps you apply international best practices for evaluating the performance of your assessments. CITAS provides the basic analytics necessary for this evaluation, and it does so without requiring advanced knowledge of psychometrics or software programming.

This paper begins by defining the concepts and statistics used in classical item and test analysis, and then shows how the CITAS spreadsheet provides the relevant information. CITAS was designed to make quantitative analysis of testing data as straightforward as possible – no command code, no data formatting, no complex interface.

Intermediate

CITAS Excel Worksheet

CITAS is a simple spreadsheet for non-psychometricians to evaluate the quality of assessments.

CITAS is an easy-to-use tool for implementing classical test theory on small data sets, designed to provide a straightforward and no-cost way for non-psychometricians to evaluate the quality of assessments. If you are using assessments but not evaluating their statistics, you have no way of knowing whether your test is reliable.

CITAS is intended for anyone who is new to psychometrics and wants a simple tool to help learn the analytics. It is limited to 100 examinees and 100 items.

Iteman

Iteman evaluates reliability, overall test performance, item performance, and distractor analysis, and automatically flags problematic items.

The free version of Iteman provides an account with full functionality, limited to 100 students and 100 items. This is perfect for teaching or learning classical test theory, as well as for real-world use on smaller-scale, classroom-sized tests.

The free version allows you to get going with the program immediately and start improving your assessments by flagging items that do not meet best practices.

Video Introduction to Reliability

SASS - Instrument Alignment Considerations – Psychometric Properties

This Student Affairs Assessment Support Services (SASS) page presents a quick, simple overview of reliability, with an outline of the various types, as well as a description of validity, with emphasis on some important distinctions of what validity data tells us and how it should be interpreted.

Research Methods Knowledge Base - Reliability

This excellent resource takes the form of a web-based textbook focused on introductory concepts in social research methodology. The linked “Reliability” section specifically offers conceptual and mathematical depictions of the various types of reliability.

UF - Unveiling the Mysteries of Reliability & Validity

This slide deck introduces the questions that need to be asked when building an interpretation and use argument, and why validity and reliability evidence is necessary to make the inferences we’d like to make about assessment data. The slides define validity and reliability, explain how to build a validity argument for instrument use or design choices, and show how to use equations to evaluate reliability.
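The idea of using an equation to evaluate reliability can be made concrete with coefficient (Cronbach's) alpha, the equation most often applied to multi-item scales. This is only an illustrative sketch with made-up rating data, not material from the UF slides.

```python
# Sketch of coefficient (Cronbach's) alpha:
#   alpha = k/(k-1) * (1 - sum(item variances) / variance(total scores))
# Hypothetical data: one row per respondent, one 1-5 rating per item.
ratings = [
    [4, 5, 4, 4],
    [3, 3, 2, 3],
    [5, 5, 5, 4],
    [2, 3, 2, 2],
    [4, 4, 3, 4],
]

def variance(xs):
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)  # sample variance

def cronbach_alpha(rows):
    k = len(rows[0])                      # number of items
    items = list(zip(*rows))              # one tuple per item (column)
    totals = [sum(r) for r in rows]       # total score per respondent
    return k / (k - 1) * (1 - sum(variance(i) for i in items) / variance(totals))

print(f"alpha = {cronbach_alpha(ratings):.3f}")
```

Higher values (conventionally above roughly .70-.80, depending on the stakes of the decision) indicate that the items hang together consistently enough to support score interpretation.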

External Modules:

1. UF- A Practical Guide to Assessment: Module 4b - Reliability and Validity

2. NILOA - New to Assessment Modules: Gathering Data (Ch. 2)

SASS: Instrument Alignment Considerations – Psychometric Properties

This Student Affairs Assessment Support Services (SASS) page presents a quick, simple overview of reliability, with an outline of the various types, as well as a description of validity, with emphasis on some important distinctions of what validity data tells us and how it should be interpreted. 

Holzman, Pope, & Horst: Reliability & Validity 101

This AALHE Intersections paper details the assessment responsibilities of student affairs professionals, explaining the concepts of reliability and validity with several examples. Despite its student affairs focus, this is a fairly didactic explanation of these concepts and their importance that can also be utilized by academic departments.

UF - Unveiling the Mysteries of Reliability & Validity

This slide deck introduces the questions that need to be asked when building an interpretation and use argument, and why validity and reliability evidence is necessary to make the inferences we’d like to make about assessment data. The slides define validity and reliability, explain how to build a validity argument for instrument use or design choices, and show how to use equations to evaluate reliability.

Novice

SASS - Analyzing Data

Another video in JMU's Student Affairs Assessment Support Services (SASS) walkthrough of the assessment cycle – this time focusing on understanding the type of data that has been collected and how that type relates to data collection design.

SASS - Reporting Results

SASS presents a pdf guide to documenting assessment results, with specific emphasis on practice and examples. The guide is meant to serve as a reference for student affairs practitioners when communicating quantitative assessment results, but is not intended as a comprehensive teaching tool.

UF - A Practical Guide to Assessment Modules

A resource designed for faculty at the University of Florida and all who teach in higher education, this guide presents a series of short videos on assessment topics that Dr. Brophy presents as part of his Passport to Great Teaching-Creative Assessment Faculty Development Learning Community. Explore the following resources for skills in this area. 

NILOA - New to Assessment Modules

NILOA presents a collection of open-access resources introducing the basics of assessing student learning, with resources for assessment terminology, assessment modules, and activities. Resources for this domain can be found in the modules at the bottom of the page under the content heading “Gathering Data” Ch. 2.

Intermediate

CMU - Open Learning Initiative:

Carnegie Mellon University presents introductory-level courses that teach students the basic concepts of statistics and the logic of statistical reasoning. Designed for students with no prior knowledge of statistics, their only prerequisite is basic algebra. Includes a classical treatment of probability. For skills in the current area, explore the following courses:

1. Probability & Statistics

2. Statistical Reasoning

Research Methods Knowledge Base: Analysis

Another Knowledge Base section from the web-based textbook, focused on introductory concepts in social research methodology. The linked “Analysis” section introduces some key data analysis questions and common terms for understanding. The sections revisit validity and data preparation before diving into the distinctions and respective applications of descriptive vs inferential statistics.

Step-By-Step Introduction to Descriptive Data Analysis & Excel Data Visualization

This recorded webinar introduces basic data management, descriptive statistics, and how to use Excel for graphical displays. The description of the video contains links to the PowerPoint used for the presentation as well as data sets for practicing the skills Dr. Ligia Perez walks through.
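The kinds of descriptive summaries the webinar builds with Excel formulas and pivot tables can also be sketched in code. The survey responses below are hypothetical, and this is only a minimal illustration of the same statistics.

```python
# Sketch of basic descriptive statistics for a small set of hypothetical
# survey responses (1-5 Likert ratings): count, mean, median, spread,
# and a frequency table.
from collections import Counter
from statistics import mean, median, stdev

responses = [4, 5, 3, 4, 2, 5, 4, 3, 5, 4]

summary = {
    "n": len(responses),
    "mean": mean(responses),
    "median": median(responses),
    "sd": stdev(responses),                              # sample standard deviation
    "counts": dict(sorted(Counter(responses).items())),  # frequency table
}

for key, value in summary.items():
    print(f"{key}: {value}")
```

The frequency table is the natural input for a bar chart – the same move the webinar makes when turning a pivot table into a graphical display.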

Novice

Research Methods Knowledge Base - Qualitative Measures

Another Knowledge Base section from the web-based textbook, focused on introductory concepts in social research methodology. The linked “Qualitative Measures” section introduces the idea of qualitative research (and how it is related to quantitative research) and attempts to orient one to the major types of qualitative research data, approaches and methods.

Additionally, see the following two Knowledge Base pages for key definitions in the qualitative domain:

1. Qualitative Approaches

2. Qualitative Methods

NILOA - New to Assessment Modules

NILOA presents a collection of open-access resources introducing the basics of assessing student learning, with resources for assessment terminology, assessment modules, and activities. Resources for this domain can be found in the modules at the bottom of the page under the content heading “Gathering Data” Ch. 2.

Intermediate

 

Yale - Fundamentals of Qualitative Research Methods

Qualitative research is a strategy for the systematic collection, organization, and interpretation of phenomena that are difficult to measure quantitatively. Dr. Leslie Curry leads six modules covering essential topics in qualitative research, including what qualitative research is and how to use its most common methods: in-depth interviews and focus groups. These videos are intended to enhance participants' capacity to conceptualize, design, and conduct qualitative research in the health sciences. View the following for skills in this domain:

1. Module 2: Writing a Qualitative Research Question

2. Module 5: Overview of Qualitative Data Analysis – specific emphasis on coding schema


Research Methods Knowledge Base - Qualitative Measures

A continuation of the Qualitative Knowledge Base section from the web-based textbook, focused on introductory concepts in social research methodology. This section focuses in on how to judge the quality of qualitative data, connecting the key considerations to the more well-known parallels in quantitative data.

Mixed Methods Assessment – Webinar

This recorded webinar presents a brief overview of quantitative and qualitative designs, with overall focus on mixed methods data, and examples of how mixed methods have been utilized at the presenters’ institutions.

Mixed Methods Example

This hour-long video lecture from John Creswell goes into depth on mixed methods research, the “5 Steps in Learning Mixed Methods”, and how to design a mixed methods study.

NILOA - New to Assessment Modules

NILOA presents a collection of open-access resources introducing the basics of assessing student learning, with resources for assessment terminology, assessment modules, and activities. Resources for this domain can be found in the modules at the bottom of the page under the content heading “Gathering Data” Ch. 2.

Novice

SASS - Reporting Assessment Results

Student Affairs Assessment Support Services (SASS) provides another overview of key assessment considerations. The focus here is on best practices for data reporting and data visualization, emphasizing how to most effectively convey results to others.

Duke Library: The Dos & Don’ts of Data Visualization

Duke University’s libraries present some quick and clear guidance on how to avoid messy, ineffective data visualizations, utilizing examples for each tip.

The Power of Data Visualization in Student Affairs

This webinar utilizes student affairs examples to illustrate why visuals are important in presenting data, and how we can move beyond the standard bar charts to include more creative graphical presentations.

NILOA: New to Assessment Modules

NILOA presents a collection of open-access resources introducing the basics of assessing student learning, with resources for assessment terminology, assessment modules, and activities. Resources for this domain can be found in the modules at the bottom of the page under the content heading “Gathering Data” Ch. 2.

Intermediate

International Reporting & Data Visualization - Blog

Evergreen Data’s blog contains numerous posts focused on specific aspects of data visualization, such as visualizing small data sets, titling charts and graphs, working in Google Sheets, and much more!

Step-By-Step Introduction to Descriptive Data Analysis & Excel Data Visualization

This recorded webinar introduces basic data management, descriptive statistics, and how to use Excel for graphical displays. The description of the video contains links to the PowerPoint used for the presentation as well as data sets for practicing the skills Dr. Ligia Perez walks through.

 

How to ‘Excel’ in Visualizing Data

This webinar details the basics of how to use Excel to create good, dynamic visualizations of data. Discussions include tips and tricks for graphs, with particular attention paid to how to create a dashboard to analyze data from different angles.

Assessment Quickies - Analyzing Data

This resource is a continuation of the ten-step Assessment Quickies series on assessing student learning. This video focuses on Step 8 – analyzing evidence of student learning.

SASS - Analyzing Data

Another video in JMU's Student Affairs Assessment Support Services (SASS) walkthrough of the assessment cycle – this time focusing on understanding the type of data that has been collected and how that type relates to data collection design. For this domain, focus on the brief review of possible results, how they’re connected via the assessment cycle, and what they contribute to interpretation.

 

SASS - Using Assessment Results

SASS professionals at JMU discuss use of assessment results, with particular emphasis paid to “closing the loop”, the logic of educational assessment, and integration of various other parts of the assessment cycle in this process.

 

SASS - Reporting & Use of Assessment Results

The final overview in the SASS Assessment Cycle explanations. For skills in this domain, review the best practices for effectively conveying assessment results, and focus on the “Use of Results” and “Learning Improvement” sections. 

Novice

SASS Website - Reporting & Use of Assessment Results

(1) REPORTING RESULTS

Provides brief guidelines for reporting assessment results, specifically (1) telling a meaningful story, (2) being clear, concise, and compelling, and (3) adequately addressing reasonable critiques.

(2) USE OF RESULTS

Describes the use of assessment results as determining if a program is effective, conditionally effective, or ineffective and recommends follow-up procedures for each outcome.

(3) LEARNING IMPROVEMENT

Describes the use of assessment results for learning improvement in three steps: assess, intervene, re-assess.

Example of a Program-Level Learning Improvement Report - RPA Article

Provides a structured commentary and rationale for including specific information in a programmatic assessment report.

NILOA - Evidence-Based Storytelling

Occasional Paper 50: Evidence-Based Storytelling in Assessment

“This paper provides an overview of an alternative conception of use through the lens of evidence-based storytelling—an approach that has been used at the National Institute for Learning Outcomes Assessment (NILOA) to refine and encourage evidence-based stories in assessment (Jankowski & Baker, 2019). This occasional paper serves two main purposes: to re-examine what is meant by use of assessment results and to unpack evidence-based storytelling and its connection to assessment.”

Building a Narrative via Evidence-Based Storytelling - A Toolkit for Practice

This paper is “designed to help [assessment practitioners] think through various elements in the creation of a compelling, evidence-based story. It was developed through document and narrative analysis review of accreditation reports, program reviews, and annual assessment reports. In addition to the individual questions to consider when crafting a narrative, the toolkit includes resources for undertaking a group activity to peer review reports, documents, or data visualizations. This peer review process has been field-tested over the last three years and refined with different groups including assessment professionals, faculty, and institutional research staff.”

Planning for Effective Communication of Assessment - A Toolkit for Practice

This paper is “designed to help [assessment practitioners] effectively communicate about the value and importance of assessment. Stemming from NILOA’s work around transparency and the Evidence-Based Storytelling Toolkit, this resource helps create a plan to disseminate assessment related information as well as evidence-based narratives. Here, we introduce guiding questions and activities to develop a communications strategy.”

Intermediate

SASS: Reporting Results

SASS presents a pdf guide to documenting assessment results, with specific emphasis on practice and examples. The guide is meant to serve as a reference for student affairs practitioners when communicating quantitative assessment results, but is not intended as a comprehensive teaching tool.

JMU CARS - Assessment Reporting

1. The Assessment Progress Template (APT) and Rubric

Lists information typically included in programmatic assessment reports in a rubric format and provides a template for reporting this information.

2. General Information for Contents of Each Section of the APT 

Use this resource to review the briefly described guidelines for content presented in a programmatic assessment reports.

Research Methods Knowledge Base

Depending on the audience and purpose of the assessment report, guidelines for research publication can be adapted to fit reporting needs, specifically focusing on the following areas: considering the intended audience of the report, structuring the report into a story, following stylistic elements, and including methodological information.

Write-Up

The Audience
The Story

Key Elements of Write-Up

Methods
Stylistic Elements

Learning Improvement Community

Describes, with directed questions, how to form a narrative of evidence of improvement in student learning that was informed by assessment processes and intentional change(s) to the learning environment. Provides examples of brief, narrative-oriented assessment reports from a variety of programs.

Demonstrations from the University of Hawaiʻi at Mānoa:

1. Degree Program Assessment Reports

Demonstrates how information can be presented for programmatic assessment with real reports from undergraduate and graduate programs.

2. Assessment for Curricular Improvement Poster Exhibit

Demonstrates how programmatic assessment can be reported in a brief, poster medium via examples.

Novice

 

Data One - Data Management Components – 1 Pager

Quick, general overview of the components of a data management plan – including why it’s necessary to implement one and what questions are involved in development.

UD - Data Management Definitions

The University of Delaware presents an overview page with descriptions of key data management terms and links to more in-depth descriptions, as well as guidelines, for each.

This is a great resource for navigating questions about data maintenance at one university, touching on storage, retention over time, and data sensitivity.

JMU - IRB Data Management Tips

The IRB at JMU presents some guidelines and tips for how to ethically manage data – including what key considerations should be addressed, guidance for securing various types of data, and example practices at JMU regarding how to dispose of confidential data.

JMU Libraries - Naming Tips & Data Organization

JMU libraries presents some best practices to get started with your own data management system – including some tips on how to name files and organize data for maximum efficiency.

Intermediate

Why Care About Data Maintenance & Security - Video by Dena Pastor

Data One - Full Education Modules

Data One presents free education modules aimed at helping professionals better understand and implement a data management plan. Each module includes PowerPoints with slide notes, a one-page handout reviewing the key information in the module, and a supporting exercise for practicing the new information.

ICPSR

This resource contains a guide conceptualized by the Inter-university Consortium for Political and Social Research with clear descriptions of how to handle data across the various stages of its life, emphasizing the importance of good data management and best practices for various types of data in research.

Skill Area 7: Use of Results

Use of Results to Improve Student Learning
Spotlight Resources

Novice

Assessment Quickies - Use of Student Learning Evidence for Program Improvement

Another video in the Assessment Quickies series, here specifically emphasizing Step 9 – use of student learning evidence for program improvement.

SASS - Use of Results:

1. Video: Using Assessment Results

2. Webpage Overview - Section "Use of Results"

UF - A Practical Guide to Assessment Modules

A resource designed for faculty at the University of Florida and all who teach in higher education, this guide presents a series of short videos on assessment topics that Dr. Brophy presents as part of his Passport to Great Teaching-Creative Assessment Faculty Development Learning Community. Explore the following resources for skills in this area.

Using Assessment Results Guide:

1. All programs - Using Assessment Results for Program Improvement

2. General Education Courses - Using General Education Assessment Results for Program Improvement

NILOA - New to Assessment  Modules

NILOA presents a collection of open-access resources introducing the basics of assessing student learning, with resources for assessment terminology, assessment modules, and activities. Resources for this domain can be found in the modules at the bottom of the page under the content headings “Assessment Benefits & Barriers” Ch. 1-5 and “Using Assessment Data” Ch. 1 and 3.

Intermediate

CARS - Hub of Learning Improvement Resources & Examples

The Center for Assessment and Research Studies at JMU presents numerous resources for better understanding the role of assessment data in improving student learning, highlighting the simple model for learning improvement.

Learning Analytics, Big Data, & Student Affairs – Webinar

This structured conversation provides an overview of learning analytics and how institutions have successfully leveraged them, highlighting theoretical approaches for applying analytics and considerations for implementing analytics initiatives on more university campuses.

Skill Area 8: Assessment in Practice

Assessment in Practice - Additional Skills

SAAL Webinar Topics:

1. Panel Perspectives on Assessment Committees: This webinar focuses on the various aspects of assessment committees, including forming new committees, effective management, instilling accountability, and unique structures/responsibilities on campuses. A panel of three professionals from different institutions share their perspectives from their own campus practices and known trends in the field.

2. Mindful with Measurements - Using Qualitative & Quantitative Data in Assessment: With conversation spurred by a blog post, this moderated panel seeks to explore the differences between qualitative and quantitative data in student affairs assessment. Panelists discuss the theories and methods that drive practical assessment method decisions. How do broader concepts like validity, fairness, and use impact our selection or design of tools such as rubrics and surveys? How do we build a sound process to get the data we need? Moderated by Joe Levy, the panel features a wealth of experience from Ciji Heiser of Western Michigan University, Ross Markle of DIA Higher Education Collaborators, and Kate McConnell of the Association of American Colleges & Universities. Time is set aside at the end for attendees to pose questions to the panel.

3. Practical Example: Challenging our Assumptions - Lessons that COVID-19 Measurement has Highlighted: This webinar explores how COVID-19 has impacted assessment, how its impact can be assessed, and related equity issues.

Dissertation Work: Demonstrating Validity Evidence of Meta-Assessment Scores Using Generalizability Theory

This research focused on the dependability of the ratings provided to programs by faculty raters. In order to extend the generalizability of the meta-assessment ratings, a new fully-crossed G-study was conducted with eight faculty raters to compare the dependability of their ratings to those of the previous graduate student study.

NILOA: New to Assessment Modules

NILOA presents a collection of open-access resources introducing the basics of assessing student learning, with resources for assessment terminology, assessment modules, and activities. Resources for this domain can be found in the modules at the bottom of the page under the content headings “Demystifying Assessment” Ch. 2 and “Developing Sustainable Assessment Practices” Ch. 2 and 3.

Assessment Quickies - Why Assess Student Learning?

This is the tenth in a series of ten short clips on assessing student learning, focusing on the question, "Why do it at all?".

NDSU - Sharing and Using Assessment Results

Jeremy Penn provides guidance on what to do once you've collected assessment evidence. This is why we do assessment!

Assessment Competency - Multiple Perspectives

Assessment professionals in the field share their own work, experiences, and tips for conceptualizing assessment competency. 

SAAL Webinar Topics:

1. Building a Culture of Assessment - Navigating the Politics: This webinar offers tips for navigating the politics of higher education to make changes in assessment culture.

2. Building a Culture of Co-Curricular Assessment - Where to Start: Whether you are new to a position or the university is new to co-curricular/Student Affairs assessment, getting started can be difficult. Sometimes you don’t have all the information or resources you need, but the directive to lead assessment remains. This structured conversation will explore facets of preparation in getting started, considerations for guiding the work, and suggestions for advancing culture or going beyond baseline activity. Contextual examples and lessons learned from the facilitator will be provided, while also allotting time to engage feedback and field Q&A from participants.

3. Opening the Loop - Strategies in Summative & Formative Feedback: Provides guidance on how to act as a consultant, foster assessment relationships and collaboration, and provide assessment feedback, with emphasis on building good relationships with clients and promoting assessment.

NILOA: New to Assessment Modules

NILOA presents a collection of open-access resources introducing the basics of assessing student learning, with resources for assessment terminology, assessment modules, and activities. Resources for this domain can be found in the modules at the bottom of the page under the content heading “Assessment Benefits & Barriers” Ch. 4.

 

Novice

NILOA - Occasional Papers:

1.  Equity & Assessment - Moving Towards Culturally Responsive Assessment: As colleges educate a more diverse and global student population, there is increased need to ensure every student succeeds regardless of their differences. This paper explores the relationship between equity and assessment, addressing the question: how consequential can assessment be to learning when assessment approaches may not be inclusive of diverse learners? The paper argues that for assessment to meet the goal of improving student learning and authentically document what students know and can do, a culturally responsive approach to assessment is needed. In describing what culturally responsive assessment entails, this paper offers a rationale as to why change is necessary, proposes a way to conceptualize the place of students and culture in assessment, and introduces three ways to help make assessment culturally responsive.

2.  A New Decade for Assessment - Embedding Equity into Assessment Praxis: Entering into a new decade with an even more diversified college student population will not only require more assessment models involving students but also deeper professional development of institutional representatives key to student learning. Reflecting upon the conversations over the last three years around culturally responsive assessment and related equity and assessment discussions, this occasional paper highlights questions, insights, and future directions for the decade ahead by exploring what equitable assessment is and is not; the challenges and barriers to equitable assessment work; where the decade ahead may lead; and next steps in the conversation on equity and assessment.

Critical Assessment: Philosophy and Approach

Webinar presented by SAAL, focusing on using assessment to address inequity in an ethical manner.

Emerging Dialogues in Assessment - The 8 Key Questions

Ethical Reasoning in Action presents the 8 Key Ethical Questions – a strategy for incorporating diversity, equity, and inclusion values into assessment planning, as well as for navigating external ethical dilemmas.

Intermediate

NILOA - Equity Response Papers:

1. Steps in the Right Direction - Additional Strategies for Fostering Culturally Responsive Assessment

2. Strategies for Change - Equity in Assessment Practices

Socially Just Assessment as a Tool for Institutional Equity

This SAAL structured conversation covers equity concerns and gives ideas for how to promote equitable change.

A Practical Resource - Visual Inclusion

Below are some website resources that can be used to check for colorblind accessibility:

1. Colorblind Webpage Filter

2. WebAIM

Back to Top