How an Almost Full Year of Program Assessment Made Me a Better Student Affairs Professional


SUMMARY: In this piece, CSPA student Rain Garant shares how his experiences in the Center for Assessment and Research Studies (CARS) transformed his perspective on assessment and helped him become a better student affairs professional.


By: Rain Chris Garant

When starting my student affairs graduate program, “assessment” was the last thing on my mind. Far more pressing matters included: holy crap my cohort is cool, will I fit in? (yes!); how much “casual” can I slip into “business casual”? (quite a bit); and how do I avoid burnout while still giving everything my all at all times? (you can’t; prioritize, breathe, and pick up bouldering as a hobby). I expected assessment would remain the afterthought I’d always viewed it as, or be relegated to the role of big baddie when end-of-year reporting rolled around.

[Photo: Rain with his cohort]

What actually occurred? I plunged headfirst into assessment and loved it! After discovering that assessment is the way we evidence program effectiveness, I realized that evidencing learning improvement has always been at the core of my career goals – even if the language and skills necessary to excel in it had been just out of reach.

What Changed My Perception of Assessment?

My Graduate Assistantship

I spent a fair amount of time listening at the start of my assistantship with DEEP Impact, taking care to hear why my site operated the way it did. I built rapport with my students, took notes during staff meetings, and ran around frantically trying to find a working microphone during evening programming. Once I felt like I had a pretty good handle on the current state of affairs, I confidently agreed with my students: we’re good. Reeeaal good. At the time, I was just glad I could be the cool boss.

But when I sat down with my supervisor to see how we could begin to make changes to our programming, we hit a snag: with no evidence of what programming was currently successful, it was difficult to envision a path to improvement that wasn’t pure speculation. We felt in our guts that DEEP Impact was successful; qualitative reports from my team during our debriefs indicated high engagement and, in multiple cases, a perceived shift in participants’ mindset. But without any formal assessment of our programming, we weren’t sure what was “working” and what was falling flat.

This was a dilemma my coursework wasn’t covering. My first-semester courses took a macro lens on development and theory, but my struggle to know whether DEEP Impact was successful required something more granular. After three months spent spinning in circles, I began to short-circuit. I could tell you that I believed DEEP Impact was developing students, but I couldn’t say why or how. Emotionally and physically exhausted, I went to my parents’ house over winter break ready to lick my wounds and eat a decent meal before coming back in the spring.

Assessment 101

Then, Assessment 101 happened. Three full days of assessment boot camp, complete with a complimentary breakfast buffet and a binder of printouts. Assessment 101 was hardcore; I can’t sugarcoat it. Our cohort came back from winter break to get slammed with a wall of information on unfamiliar concepts and processes. Even so, there was no way for me to anticipate how beneficial those three days would be, or how energized I would be heading into my spring as a result.

I realized those unfamiliar concepts and processes were things I probably should have known and been doing back in October during my GA. Ouch! It is no exaggeration to say that there were moments during Assessment 101 when it felt like an electric current ran through the workshop. All those ideas my DEEP student staff and I had tossed around to better our programming could now be tackled; improving DEEP could actually be realized. I remember emailing my DEEP supervisor after every day of Assessment 101 because I didn’t want the flood of ideas to leave me.

So, what exactly did I learn in Assessment 101? How to select an instrument that would measure the student learning outcome I was interested in impacting. How to interpret assessment results and make valid, data-based claims. Most importantly, it helped me fall in love with evidence-informed program planning. What is evidence-informed programming? It is using theory and research not only to identify malleable and feasible student learning outcomes, but also to build programming that should “work” based on the literature. Essentially, exactly what DEEP was searching for was handed to me on a silver platter.

After one activity, a fellow cohort member and I made eye contact, and all I could think to say was “holy @#*%.” If I could do Assessment 101 all over again, I would change only two things: I’d bring some materials from my assistantship to look at during the breaks, and I’d bring more coffee. This was stuff you want to stay awake for.

Practicum in CARS

Evidence-informed programming became my focus during my five-month spring practicum in CARS. I worked as a member of the Student Affairs Assessment Support Services (SASS) team and, with permission from my assistantship supervisor, created a learning improvement exemplar modeled after DEEP Impact. I developed a profound appreciation for the necessity of theory to ground our work as professionals. By examining research in the field, I was able to build new evidence-based programming for DEEP and “trim the fat” from our program in order to help students develop greater intercultural competence (DEEP Impact’s long-term outcome).

[Photo: the SASS team]

I left this practicum with a tangible product for my portfolio, which outlined, from inception to rollout, what a successful diversity education program looks like. I can pull the program theory from memory and lay out the logic of the curriculum. I can’t lie: this ability to defend why the program should be successful is empowering.

The best part? I gained the skills to do it all over again for another outcome and program. This new skill has transformed how I view my entry into the field. Not only does skill in evidence-based programming ooze marketability, but I now approach program design and implementation with a confidence I could previously only posture. I do not just look like a stronger candidate “on paper”; I am genuinely more qualified to accept a position where I will be developing and implementing programs that should be successful.

CARS Summer Assistantship

And, if all that wasn’t enough, I leaped at the opportunity to join CARS for a summer assistantship. I spent my summer rating essays from the university-wide assessment day, scoring academic unit assessment progress templates, and creating a digital primer on “what does it mean to infuse social justice into assessment?” These experiences not only built new skills but also showcase to employers the variety of assessment-related work I can tackle.

Oh, and I got to help LEAD a weeklong, summer version of Assessment 101! I spent a week teaching others (e.g., faculty, student affairs professionals, PhD students) about program theory, writing strong learning objectives, and evidencing learning improvement. It took an incredible amount of effort, and I am truly grateful to the co-facilitators who lifted me up along the way. And I’m equally proud of myself and not afraid to say it! Graduate school is rife with feelings of imposter syndrome. The rave reviews our team received assuaged some of the deeper, darker doubts I’ve held onto since September about whether or not I’m prepared for the field of higher education. Leading Assessment 101 helped me realize that not only do I view myself as ready for the field of student affairs, but professionals on my campus and beyond do, too.

Where Am I Now?

As I approach the end of my first full year in my graduate program, I will have spent eight of those twelve months working with CARS. I never expected that when I first started.

The ways that I’ve grown as a professional due to my experience in CARS are attributable to two things: the opportunities I was given (e.g., Assessment 101) and the opportunities I actively and persistently sought out (e.g., summer assistantship in CARS). I want to be clear: these opportunities exist for every member of my cohort.

I feel truly privileged to be afforded the time and the minds of the best in the biz (and I’m not just saying that; they’re award-winning). As a result, I experienced learning improvement. I expanded what I know, think, and can do.

To distill my new-and-improved ethos: anyone who believes the skill of programming is essential (and thus requires intentional honing) would benefit from embracing outcomes assessment. CSPA students and professionals receive a primer in assessment during Assessment 101, but if growth is to be an ongoing endeavor, then so too should be our connection to outcomes assessment.

My Unsolicited Advice

Interview for a practicum at CARS, soak up every minute of Assessment 101, and talk to your assistantship supervisor about ways that your whole team can connect with SASS for the betterment of your co-curricular programming. These resources are at your fingertips for two full years. Seek out and soak up every developmental opportunity you can.

Now is the time. It is YOUR time to improve. Embrace it.


Rain Chris Garant (he, him, his) is an alumnus of James Madison University with a degree in English and Women’s Gender & Sexuality Studies. He is expected to receive his Master of Education in College Student Personnel Administration from James Madison University in May 2020. With five years of experience designing and implementing educational programs surrounding marginalized identities, Rain Chris is an impactful public speaker who recognizes the power of storytelling to connect lives. Outside of his current work, Rain Chris considers himself a home cook, minimalist, and aspiring distance runner. He is constantly seeking new books to read and new places to travel.
