Nicholas Curtis is a first-year assessment and measurement doctoral student at James Madison University. He recently interviewed Dr. Peter Ewell, current president of the National Center for Higher Education Management Systems (NCHEMS). Dr. Ewell is among the most prolific scholars in higher education. Additionally, he has consulted with over 400 colleges and universities and 27 state systems of higher education on topics including assessment, program review, accreditation, and student retention.

Dr. Ewell shared his thoughts on higher education assessment practices. Specifically, he covered “closing the loop”, promoting assessment to accomplish more than just meeting accountability demands, and predictions about what higher education assessment will look like in 2025.

Curtis’ first question prompted Ewell to share his professional history. He began his career as a faculty member teaching political science at the University of Chicago, but he soon desired a different path. He took a job as Director of Planning at Governors State University, a Chicago-area institution, which brought him into contact with NCHEMS. Ewell became acquainted with the NCHEMS staff and was subsequently offered a job directing a four-year Kellogg grant on using information about student outcomes to improve institutional planning and decision making.

Initially, Ewell intended to return to a faculty position after the grant ended, but his work at NCHEMS fascinated him so much that he has stayed with the center since 1981. Even though travel takes up about one-third of his time, he enjoys the work. It is always changing, creating exciting new opportunities to work with different groups. For example, he currently works with the National Institute for Learning Outcomes Assessment (NILOA), the Mathematical Association of America, and the University of Charleston.

These days, Ewell focuses principally on accountability, quality assurance, and accreditation. He noted that his tasks change from day to day and that “the variety of work keeps people going”. Ewell is currently the president of NCHEMS but notes that an open search is underway for someone to fill the role starting next year.

Curtis then prompted Ewell to discuss institutional consulting. Ewell stressed that assessment planning needs to begin with the questions “What do we want answered by assessment? Who is this for? And who would use the results?” These questions should define a guiding mindset for assessment at any institution. Additionally, Ewell noted that program review “is necessary for utilization” of assessment and pointed out that graphics are frequently the best way of communicating assessment results. He stressed that data need to be engaging for the audience, for example by emphasizing “discrepancies, outliers, [and] where data deviates from central tendencies”. The user should have the chance to engage with the results and generate further questions from them.

During their conversation, Curtis referenced the new Excellence in Assessment (EIA) program released by the Voluntary System of Accountability (VSA), the National Institute for Learning Outcomes Assessment (NILOA), and the Association of American Colleges & Universities (AAC&U), which recognizes institutions for their intentional integration of campus-level learning outcomes assessment. Ewell recently coauthored chapters in the 2015 book Using Evidence of Student Learning to Improve Higher Education, which emphasizes finding ways to promote assessment that accomplishes more than accreditation. Curtis brought the topics together and asked Ewell whether he believes the Excellence in Assessment program supports the initiatives discussed in the chapters.

Ewell stated that he has mixed feelings about this program. He believes “it is great to have assessment foregrounded”, but acknowledges that programs such as the Excellence in Assessment designation and the Council for Higher Education Accreditation (CHEA) award, while allowing for recognition, also create winners and losers. He believes this competition may discourage people from applying. Additionally, Ewell mentioned that he has wondered why some of the best programs do not apply for these awards and stated there needs to be a way “for this [the award programs] to be a credible certification so everyone goes for it.”

Curtis also asked Ewell about his idea of a perfect assessment world, one in which “we would have everything we want from assessment”, and about the obstacles blocking that world. Ewell was straightforward in answering the obstacle question: “faculty buy-in”. He went on to say that faculty buy-in is a complicated issue. “Assessment is a legitimate enterprise, people know it needs to be done, and it is a necessary condition for doing business in higher education in the 21st century.” In other words, although there is an understanding that assessment is necessary, some faculty still do not want to take the time to join the effort.

Ewell tied the concept of faculty buy-in to the last question from Curtis: what are his bold predictions for assessment looking forward 10 years? Ewell stated that his vision is “faculty doing assessment without knowing it. Assessment would become a seamless process.” One exciting new development Ewell pointed out at the conclusion of the interview is the prospect of simulation-based assessments, which are frequently built on the software that underlies video games. He believes in the potential of this type of assessment because students will be confronted with actual cases faced in the workplace. Both Curtis and Ewell agreed that more evidence of their validity is needed before such assessments can expand.

As for Curtis, he remains intent on pursuing a career devoted to improving higher education. In this early stage of his career, who better to speak with than one of the most experienced professionals in the field? Looking to the future, Curtis is adamant in his belief that the types of changes and improvements Ewell discussed are possible.

Disclaimer: The views and opinions expressed in this interview are those of the interviewee and do not necessarily reflect the position of the Center for Assessment and Research Studies or James Madison University.
