In CARS, we think of the assessment process as an ongoing cycle, in which assessment of student learning outcomes is used to improve programming. Student learning outcomes are things we want students (or program participants) to know, think, or do upon completing the offered program. For more information on the assessment cycle, see the resources below. The following sections include resources for each step of the cycle.

Assessment Cycle

            Video: An Overview of the Assessment Cycle
            Slides: Workshop on the Assessment Cycle
            Webinar: One Size does not Fit All: Developing Custom Assessment Solutions

Step 1: Writing Learning Outcome Objectives

            Video: Writing Objectives
            Handout: Writing Clear Objectives, the ABCD Method
            Handout: Common Mistakes in Writing Objectives
            Handout: The Do’s and Don’ts of Objective Writing
            Handout: Checklist for Effective Objectives
            Slides: Objective Writing Workshop

Step 2: Mapping Objectives to Programming

            Video: Program Theory
            Video: Mapping Objectives to Program Components

Step 3: Selecting and/or Designing Instruments

            Video: Selecting/Designing Instruments
            Handout: Overview of Selecting/Designing Instruments 
            Handout: How to Find Pre-existing Instruments 
            Handout: Comprehensive Guide to Selecting and Designing Instruments 
            Slides: Overview of Writing Instrument Items 
            Slides: Item Writing Workshop 
            Video: Designing and Using Rubrics 

Step 4: Collecting Data on Learning Outcomes and Implementation Fidelity

            Video: Collecting Data on Learning Outcomes
            Video: Evaluating Implementation Fidelity 
            Video: Introduction to Implementation Fidelity 
            Slides: Implementation Fidelity Workshop with an Applied Example 
            Article: Measuring Implementation Fidelity (Gerstner & Finney, 2013)
            Article: Practical Approach to Assessing Implementation Fidelity (Swain, Finney, & Gerstner, 2013)
            Article: Importance of Implementation Fidelity (Fisher, Smith, Finney, & Pinder, 2014)   

Step 5: Analyzing Data

            Video: Analyzing Student Learning Outcomes Data 

Step 6: Using Assessment Results to Improve Programming and Learning Outcomes

            Video: Using Assessment Results          

JMU Rubrics/Instruments

Please see our Examples of Learning Improvement for examples of how JMU programs have used assessment results to improve their courses, activities, and other programming components.


Academic Degree Programs

*For resources on the assessment process and each step of the assessment cycle click here.

*For an example of learning improvement from one of the Academic Degree Programs at James Madison University click here.

The Assessment Progress Template (APT)

The APT is to be submitted by the program with Department Head approval on or before June 1.

Program Assessment Coordinators are typically the lead authors of the APT and submit it for departmental review. To access the APT submission system, a program username and password must be entered. This information is provided in an email sent out on March 1.

Department Heads review and approve program APTs under the department site. This site requires a username and password, which are provided in an email sent out on March 1. Some department heads may also choose to co-author program APTs.

Example APTs

The purpose of the template is to provide the most current assessment-related information for each of JMU's academic programs. A separate template is completed for each academic and certificate program offered at JMU.

            Exemplar APT: College of Business  
            Exemplar APT: College of Science and Math  
            Exemplar APT: College of Health and Behavioral Studies 
            Exemplar APT: College of Visual and Performing Arts 
            Exemplar APT: College of Arts and Letters 

Resources

            APT Rubric
            General Information for Contents of Each Section of the APT 
            Including Tables and Graphs in the Report 
            Complete How-To for the APT 
            Update to Rubric Rationale 
            APT System Guide for Dept. Head 
            APT System Guide for Assessment Coordinators 

It is highly recommended that the APT be drafted in Microsoft Word before it is copied and pasted into the APT submission system as a final version (i.e., a version that department heads will review). Drafting the template in Word makes it easier to edit and to collaborate with others. Additionally, the Word document can serve as a backup copy.

Alternative Option Information

These example APT alternatives can help guide your program in designing an alternative-year project, focused on either learning improvement or the assessment process, as well as guide the writing and submission of the one-page alternative. The APT alternative is to be submitted by the program with Department Head approval on or before June 1.

            Alternative Option Information Sheet
            80's Pop Culture Alternative Example - Assessment Process
            80's Pop Culture Alternative Example - Learning Improvement
            CBIS Alternative Example - Assessment Process
            CBIS Alternative Example - Learning Improvement

Overview of Meta-Assessment

Meta-assessment is the process of evaluating and providing diagnostic feedback on academic degree programs’ assessment plans. The following presentation, given at SACSCOC in December 2016, provides a brief overview of the meta-assessment process at JMU. Please contact Program Assessment Support Services with further questions. A shorter webinar version of the presentation can be found here.

            SACSCOC Meta-Assessment JMU Workshop (ppt)

The APT Template

We understand that writing and organizing the APT can be difficult and that it may be unclear what exactly to include. PASS has developed a template for the APT that may be helpful in this regard. The APT Template is designed to make your reporting as streamlined, efficient, and organized as possible. For every element of the APT rubric, there is a corresponding section on the APT Template. Tables are provided for many of the sections; you may use them as-is or adapt them as you see fit. The APT Template is completely optional, but we encourage you to examine it and determine whether it would assist you in the reporting process. If you have any questions about the APT Template, please feel free to contact PASS (programassessment@jmu.edu).

APT Template


Student Affairs Resources

*For resources on the assessment process and each step of the assessment cycle click here.

*For an example of learning improvement in Student Affairs at James Madison University, see the Examples of Learning Improvement page.

Student Affairs Assessment Reports

Reporting Template: The LADOR


General Education

*For resources on the assessment process and each step of the assessment cycle, see the General Resources section above.

*For an example of learning improvement in General Education at James Madison University, see the Examples of Learning Improvement tab under General Education Assessment.

Examples of Cluster Reports

Cluster 1: Skills for the 21st Century:

Cluster 4: Social and Cultural Processes: Fall 14-Spring 15 Report
Cluster 5: Individuals in the Human Community: Fall 13-Spring 15 Report


Professional Development Opportunities

Assessment 101

Throughout the summer, the Center for Assessment & Research Studies, with the support of University Programs, hosts a week-long interactive workshop in assessment practice called Assessment 101. In this workshop, faculty and administrators from across campus learn about assessment in a focused environment. The program is led by graduate student consultants within CARS, typically advanced doctoral students in JMU’s Assessment & Measurement PhD program.

The week-long workshop is held multiple times throughout the year, primarily in the summer months (June and July). During this five-day, interactive workshop, participants from both academic and student affairs learn about and practice the components of the assessment cycle, with a special emphasis on learning improvement. The workshop sessions cover everything from writing student learning outcomes, mapping the curriculum, and creating instruments to analyzing data, reporting results, and using results. This workshop is ideal for those relatively new to assessment practice.

Participants from academic affairs are nominated by their Dean or Department Head. Participants from student affairs are nominated through the Student Affairs Assessment Advisory Council. These names are then submitted to CARS. If you are interested in participating in any of these professional development opportunities, please contact your Dean, Department Head, or the Student Affairs Assessment Advisory Council.


Assessment Advisory Council

The current Assessment Advisory Council Charge is:

This body is charged with advising the Provost and Senior Vice President for Academic Affairs regarding procedures and practices of assessment across the campus. The Provost has charged every member with reporting back to their constituents on current JMU assessment practice and policy.

The AAC should periodically:

  • Review the procedures and processes pertaining to first year and late sophomore/early junior assessment days;
  • Review the procedures and processes pertaining to assessment in academic and degree programs and certificates;
  • Review the procedures and processes pertaining to assessment in library and educational technologies;
  • Review the reporting of assessment data to students;
  • Review external reporting (e.g., to SCHEV) of assessment data off-campus;
  • Suggest ways to make better use of assessment data in APR, accreditation, and C&I processes;
  • Review implementation of competency requirements for on-campus applications, such as General Education, and for off-campus applications, such as community college articulation agreements;
  • Suggest ways to better display JMU program learning objectives, descriptions of assessment instruments, results, and reported uses of results;
  • Offer general recommendations regarding improvements to campus assessment practice;
  • Work closely with the Student Affairs & University Planning Assessment Advisory Council pertaining to assessment in student affairs.

Assessment Advisory Council 2018 – 2019

Chair: Dr. Herb Amato
Associate Vice Provost, University Programs

Council Members listing can be found here.

Back to Top