Students noted in bold.

Bandalos, D.L., & Finney, S.J. (2019). Factor analysis: Exploratory and confirmatory. In G.R. Hancock, L. M. Stapleton, & R.O. Mueller (Eds.), The reviewer’s guide to quantitative methods in the social sciences (pp. 98-122). New York, NY: Routledge.

Bandalos, D.L., & Spratto, E.M. (2020). Review of the Reiss Motivation Profile (RMP) for Self-Discovery. In J. Carlson, K. Geissinger, & J. Jonson (Eds.), The Twenty-first Mental Measurements Yearbook. Lincoln, NE: The Buros Institute of Mental Measurements.

DeMars, C.E. (2020). Alignment as an alternative to anchor purification in DIF analyses. Structural Equation Modeling, 27, 56-72. doi: 10.1080/10705511.2019.1617151

DeMars, C. E. (in press). Multilevel Rasch modeling: Does misfit to the Rasch model impact the regression model? Journal of Experimental Education. doi: 10.1080/00220973.2019.1610859

Finney, S.J., Satkus, P., & Perkins, B.A. (in press). The effect of perceived test importance and examinee emotions on expended effort during a low-stakes test: A longitudinal panel model. Educational Assessment.

Finney, S.J., & Horst, S.J. (2019). Standards, standards, standards: Mapping professional standards for outcomes assessment to assessment practice. Journal of Student Affairs Research and Practice, 56, 310-325.

Finney, S.J., & Horst, S.J. (2019). The status of assessment, evaluation, and research in student affairs. In V. L. Wise & Z. Davenport (Eds.), Student affairs assessment, evaluation, and research: A guidebook for graduate students and new professionals (pp. 3-19). Springfield, IL: Charles Thomas Publisher.

Finney, S. J., Barry, C. L., Horst, S. J., & Johnston, M. M. (2019). Exploring profiles of academic help seeking: A mixture modeling approach. Learning and Individual Differences, 61, 158-171.

Fulcher, K., & Prendergast, C. (2019). Lots of assessment, little improvement? How to fix the broken system. In S.P. Hundley & S. Kahn (Eds.), Trends in assessment: Ideas, opportunities, and issues for higher education. Sterling, VA: Stylus.

Gregg, N., & Leventhal, B.C. (2020). Data visualizations: Effective evidence-based practices [Digital ITEMS Module 17]. Educational Measurement: Issues and Practice, 39(3), 239-240. doi: 10.1111/emip.12387

Hathcoat, J.D., Meixner, C., & Nicholas, M. (2019). Ontology and epistemology. In P. Liamputtong (Ed.), Handbook of Research Methods in Health Social Sciences (pp. 99-116). New York, NY: Springer Publishing.

Holzman, M. A., Pope, A.M., & Horst, S. J. (2020, Winter). Reliability and validity 101: A primer for student affairs assessment. AALHE Intersection: A Journal at the Intersection of Assessment and Learning, 5-9.

Horst, S. J., & Prendergast, C. O. (2020). The Assessment Skills Framework: A taxonomy of assessment knowledge, skills and attitudes. Research & Practice in Assessment, 15, 1-25.

Meixner, C., & Hathcoat, J.D. (2019). The nature of mixed methods research. In P. Liamputtong (Ed.), Handbook of Research Methods in Health Social Sciences (pp. 51-70). New York, NY: Springer Publishing.

Meixner, C., Pope, A. M., & Horst, S. J. (2020). Implementation fidelity in the classroom: Exploring the alignment of pedagogy to learning outcomes. Journal of Faculty Development.

Myers, A.J., Ames, A.J., Leventhal, B.C., & Holzman, M.A. (2020). Validating rubric scoring processes: An application of an item response tree model. Applied Measurement in Education, 33(4), 293-308. doi: 10.1080/08957347.2020.1789143

Myers, A. J., & Finney, S. J. (in press). Change in self-reported motivation before to after test completion: Relation with performance. Journal of Experimental Education.

Myers, A. J., & Finney, S. J. (in press). Does it matter if examinee motivation is measured before or after a low-stakes test? A moderated mediation analysis. Educational Assessment.

Pastor, D.A., & Love, P.D. (2020). University-wide assessment during COVID-19: An opportunity for innovation. AALHE Intersection, 2(1), 1-3.

Perkins, B.A., Satkus, P., & Finney, S.J. (in press). Examining the factor structure and measurement invariance of test emotions across testing platform, gender, and time. Journal of Psychoeducational Assessment.

Pope, A., Finney, S.J., & Bare, A. (2019). The essential role of program theory: Fostering theory-driven practice and high-quality outcomes assessment in student affairs. Research & Practice in Assessment, 14, 5–17.

Pope, A., Finney, S.J., & Crewe, M. (in press). Evaluating the effectiveness of an academic success program: Showcasing the importance of theory to practice. Journal of Student Affairs Inquiry.

Sauder, D.C., & DeMars, C.E. (2019). An updated recommendation for multiple comparisons. Advances in Methods and Practices in Psychological Science, 2, 26-44.

Sauder, D.C., & DeMars, C.E. (2020). Applying a multiple comparison control to IRT item-fit testing. Applied Measurement in Education, 33, 362-377. doi: 10.1080/08957347.2020.1789138.

Smith, K.L., & Finney, S.J. (2020). Elevating program theory and implementation fidelity in higher education: Modeling the process via an ethical reasoning curriculum. Research and Practice in Assessment, 15, 1-13.

Smith, K. L., Finney, S. J., & Fulcher, K. H. (2019). Connecting assessment practices with curricula and pedagogy via implementation fidelity data. Assessment and Evaluation in Higher Education, 44, 263-282.

Spratto, E. M., & Bandalos, D. L. (2019). Attitudinal survey characteristics impacting participant responses. The Journal of Experimental Education. Advance online publication.

Spratto, E. M., Leventhal, B. C., & Bandalos, D. L. (2020). Seeing the forest and the trees: Comparison of two IRTree models to investigate the impact of full versus endpoint-only response option labeling. Educational and Psychological Measurement. Advance online publication.

Waterbury, G.T., & DeMars, C.E. (2019). A user-friendly effect size: When normality matters. Journal of Experimental Education, 87, 260-268. doi: 10.1080/00220973.2018.1434757
