
By Elizabeth R. H. Sanchez (‘15M)

James Madison University’s First Year Orientation (FYO) staff understand that program effectiveness, certainly a hot topic within the academic community, is critical to the success of student learning. By collecting data, making evidence-based programmatic changes, and re-assessing students who have experienced a modified program, former Director of Orientation Tisha McCoy-Ntiamoah and assessment liaison Dr. Sara Finney help facilitate and evaluate an evolving curriculum for first year students that is unmatched in higher education.

According to Finney, “It is really difficult to know if what you are doing [as a professor, facilitator, or implementer of a program] is effective. Personally…I would not be able to tell you without assessment.” In fact, Finney believes programs need to evaluate both the outcomes of a program (what students know, think, and can do after completing a curriculum) and the student experience of a program to understand what and how students are learning. Using outcomes-based data is important in determining to what degree students are learning, while additional assessment activities, such as collecting implementation fidelity data, can illuminate why students are or are not succeeding. With this valuable information, unnecessary (and costly) programmatic changes can be avoided.

In 2009, McCoy-Ntiamoah, Finney, and the Orientation and assessment staff first started collecting implementation fidelity data: data that indicates whether and how the planned Summer Springboard matches the actual Summer Springboard experienced by students. One of the difficulties McCoy-Ntiamoah faces as director of FYO is that her developmental and oversight role does not include actual program delivery; when accepted students come to campus for Summer Springboard, the success of FYO is left in the hands of the many JMU faculty, administrators, and departments designated to present information and facilitate learning activities that are meticulously designed and aligned with an agreed-upon set of student learning outcomes. Multiple program facilitators increase the difficulty of ensuring that the program is implemented as intended, with FYO content appropriately covered and delivered.

In order to understand how students experience the curriculum, McCoy-Ntiamoah and Finney rely on implementation fidelity data as well as outcomes assessment data. As the American hip-hop duo OutKast recorded in their 2000 hit Ms. Jackson, “you can plan a pretty picnic but you can’t predict the weather.” Undoubtedly, JMU academic and student affairs programs, like First Year Orientation, are much more complicated to plan than a pretty picnic. However, the unpredictability of the weather is loosely analogous to the variability in how any program is implemented. A dedication to student learning, a willingness to make programmatic changes in order to achieve objectives, and an evaluation process that involves outcomes-based direct measures and implementation fidelity data are what make a program effective in achieving student learning outcomes, a challenge McCoy-Ntiamoah and Finney take seriously.

For example, for the 2012 Summer Springboard, three graduate student assessment consultants posed as first year undergraduate students. While experiencing the program just as the first year JMU students did, they gathered implementation fidelity information using an elaborate checklist created by the program’s stakeholders. The checklist includes FYO objectives, such as Objective 1, which states, “As a result of attending First Year Summer Springboard, students will indicate increased confidence and knowledge in making course selections.” Specific programmatic features, such as presentations and activities that support student learning, are also listed under each objective. For every aspect of programming, the graduate student auditors reported whether and to what degree these curricular features were adhered to, the actual time spent on each presentation or activity, the quality of the presentation or activity (low = confusing, medium, high = clear), and their perceptions of student responsiveness (for example, “students asked questions” or “students seemed tired after lunch”).

This implementation fidelity data was then coupled with outcomes assessment data to provide insight into why students met or failed to meet learning objectives. That is, incoming students complete outcomes assessments, created by Orientation stakeholders, that align with the agreed-upon learning objectives. These assessments are administered before students experience any part of Orientation programming, after they complete at-home programming, and again after they complete Summer Springboard. It is the pairing of the outcomes assessment and implementation fidelity data that allows Orientation staff and program implementers to make meaningful suggestions for programmatic changes in subsequent Summer Springboards (such as the time dedicated to each objective or the activities and presentations planned) and to celebrate improved student learning.

For McCoy-Ntiamoah, gathering implementation fidelity data along with outcomes data allows her to “connect the dots” as to “why [parts of First Year Orientation] are not working,” where “working” means students learning and improving as a result of program modifications. McCoy-Ntiamoah also states, “It’s a challenge to make changes to a program that [so many] have a piece of...No one change has been done in isolation.” And, as with many aspects of life, demonstrated improvements in student learning based on changes in FYO do not happen quickly or without openness and commitment. McCoy-Ntiamoah possesses a willingness and objectivity that are honorable and rare; simply stated, she says, “I do not take failure as a bad thing.”

Finney, too, understands how political programmatic changes can become when working with multiple university stakeholders. She knows that demonstrating learning improvement takes “a ton of work,” but she reminds all involved that assessment is simply an evaluation of program effectiveness and that every university stakeholder “need[s] to know where learning is happening.” Improving student learning, after all, should be the goal of all academic and student affairs programming at JMU.
