As artificial intelligence (AI) continues to evolve, its potential to streamline assessment practices within student affairs has become increasingly apparent. This page provides an overview of AI, its role in higher education, and best practices for integrating Generative AI into assessment, reducing workload while maintaining the integrity and effectiveness of the evaluation process. By leveraging AI thoughtfully, Student Affairs (SA) professionals can enhance their ability to assess programs, support student learning, and improve institutional efficiency. Specifically, at the end of the page we provide ideas on how to use AI at each of the following steps:
- Program Theory
- Student Learning Outcomes
- Program Mapping
- Instrumentation
- Implementation Fidelity
- Data Collection
- Interpretation of Results
- Use of Results
We do not cover Data Analysis, as you should NOT use AI during that step.
By following the guidelines provided on this page, professionals can responsibly incorporate AI tools into assessment efforts, ensuring ethical, data-driven decision-making that enhances student success.
What is Generative AI?
Artificial Intelligence (AI) refers to the development of computer systems that can perform tasks typically requiring human intelligence, such as problem-solving, decision-making, language understanding, and pattern recognition.
Generative AI refers to models designed to process and generate human-like text. Trained on vast datasets, these models are used for applications such as content creation, summarization, and conversational AI.
To learn more, visit JMU IT's Generative Artificial Intelligence site.
Suggested AI Tools:
Microsoft Copilot Chat
Available in Microsoft 365, Copilot Chat lets you ask questions and get help directly within your Microsoft apps. You can use it to summarize documents, draft communications, or make sense of Excel data.
Copilot Chat is the only AI tool permitted for student data use at JMU.
- Access Microsoft Copilot Chat
- Click “Sign In”
- Click “Sign in with a work or school account”
- Log in with your JMU email address and password
- To confirm you are in a secure chat, check for a green shield in the top right of the page
ChatGPT
ChatGPT is a conversational AI that helps you think through problems, write survey questions, interpret findings, and simplify complex ideas. It’s like having a collaborator who’s always ready to help.
ChatGPT is NOT permitted for use with student data at JMU, but you may wish to use it for general tasks that do not involve institutionally secure data.
- Access ChatGPT
- Login with Google or create an account to start chatting
- You will NOT have a secure chat
Scopus AI
Scopus AI helps you explore and summarize academic research. It’s especially useful when you want to see what studies already say about a topic you're exploring in your assessment work.
Prompt Engineering
What is Prompt Engineering?
Prompt engineering is the process of designing and refining inputs, or prompts, to effectively guide AI models, particularly LLMs, in generating accurate and relevant responses. This involves structuring instructions, questions, or statements in a way that maximizes the model’s ability to understand user intent. By carefully adjusting the wording, context, and format of prompts, users can improve the consistency and quality of AI-generated outputs (What Is Prompt Engineering?, 2025).
The primary goal of prompt engineering is to minimize the need for post-generation corrections by crafting precise prompts that align with desired outcomes. AI-generated content can vary in quality, often requiring iteration to refine responses. Skilled prompt engineers play a critical role in ensuring that AI models produce meaningful, contextually appropriate results, thereby enhancing their efficiency and usability (What Is Prompt Engineering?, 2023). In this way, prompt engineering serves as a key practice in optimizing interactions with AI systems, enabling users to fully harness the capabilities of generative models. Thus, at each stage of the assessment cycle, we will provide guidelines for optimizing your prompts to maximize the effectiveness of your generative AI (gAI) use in assessment.
General Tips and Tricks
Effective prompt engineering is essential for guiding AI models to generate accurate and relevant responses.
- Be specific and clear: well-defined prompts minimize ambiguity and improve the model’s understanding of the task (General Tips for Designing Prompts – Nextra, 2025).
- Provide sufficient context: this enhances response quality by giving the AI necessary background information to generate relevant outputs (Prompt Engineering Best Practices, 2024).
- Structure prompts logically: using bullet points or step-by-step instructions can improve clarity and effectiveness (Best Practices for Prompt Engineering with the OpenAI API, 2024).
- Embrace iteration: prompt engineering is an iterative process that involves experimenting with different phrasings and refinements to achieve optimal results (Bigelow, 2023).
- Understand AI limitations: recognizing model constraints helps users anticipate shortcomings and adjust prompts accordingly (Vergadia & Williams, 2023).
By applying these best practices, one can maximize the accuracy and usefulness of AI-generated content.
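For example, a prompt that applies these practices when drafting student learning outcomes might look like the following. This is a hypothetical sketch; the program it describes and the wording are placeholders to adapt to your own context.

```
Context: I work in student affairs and run a three-session financial
literacy workshop series for first-year students.

Task: Draft three student learning outcomes for this workshop series.

Format: Write each outcome as one sentence beginning with "After
completing the workshop series, students will be able to..."

Constraints: Use measurable verbs, keep each outcome under 25 words,
and do not reference any individual student or institutional data.
```

The prompt is specific about the task, supplies context, is structured into labeled parts, and states constraints up front. If the first response misses the mark, treat it as a starting point: adjust the wording or add detail and try again rather than accepting the output as written.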
The Dos and Don’ts
Dos | Don’ts |
---|---|
• Use AI as a brainstorming partner | • Overestimate AI’s capabilities |
• Treat AI as a learning tool | • Accept AI-generated outcomes without question/revision |
• Provide clear inputs | • Upload institutional data or sensitive information |
• Check outcomes for alignment to your programming | • Use AI as a replacement for assessment professionals |
• Revise and refine AI-generated SLOs | • Rely on AI for sensitive judgments without human oversight |
• Keep equity and ethics in mind | • Ignore bias or ethical risks |
• Use AI to augment, not replace, human judgment | • Assume AI is always right |
• Protect privacy and comply with data protection laws | • Use AI to deceive or manipulate |
• Be transparent about when and how AI is used | • Use AI in place of human empathy or nuance |
• Verify outputs, especially in high-stakes contexts | • Over-automate processes without human review |
• Use representative and inclusive prompts | • Share confidential or personal data unsafely |
• Understand how the AI tool works and its limitations | • Let AI operate unchecked without human context or revision |
Limitations and Considerations of AI
As AI becomes more integrated into student affairs assessment, ethical considerations must be addressed to ensure responsible and fair use. While AI offers powerful tools for data analysis and decision-making, it also introduces significant challenges related to bias, privacy, transparency, and human oversight.
- Bias and fairness
- AI systems reflect biases in training data, potentially leading to unfair treatment of students, especially those from marginalized backgrounds.
- Regular auditing of AI tools is necessary to ensure equity in AI-driven decisions.
- Data privacy and security
- AI relies on large datasets, raising concerns about consent, data protection, and security.
- Institutions must establish clear ethical guidelines for data use, ensuring compliance with privacy regulations (e.g., JMU's use of Copilot Chat for AI with student data).
- Student and staff autonomy
- Over-reliance on AI can undermine human judgment in decision-making and lead to impersonal interventions.
- AI should supplement, not replace, human expertise in assessment.
- Students must have control over their data and the ability to contest AI-generated decisions.
- Transparency and explainability
- AI-driven insights must be interpretable by both students and staff to maintain trust and fairness in assessments.
- Practical limitations of AI
- AI excels at analyzing structured responses but struggles with evaluating complex human skills like critical thinking, creativity, and emotional intelligence.
- AI may misinterpret language, tone, and deeper meaning, requiring human oversight.
- Integrity and responsible AI use
- AI should not enable professional dishonesty or replace human expertise in assessment reports.
- Appropriate permission must be obtained for AI-assisted tasks, with human oversight maintained in decision-making.
- Collaborative approach
- An ongoing evaluation of AI’s role in research and programming is needed, with collaboration between educators and AI specialists to bridge technological divides.
Should You Use Copilot or ChatGPT?
On this page, we will be specifically focusing on Microsoft Copilot and ChatGPT. ChatGPT and Microsoft Copilot are powerful tools that can tackle a wide range of everyday tasks. From drafting text and translating languages to extracting information from images, writing code, solving math problems, and even generating recipes, they handle it all with speed and efficiency. If you’ve got a few tasks in mind that an AI chatbot might streamline or simplify, there’s a good chance that either tool will handle them with ease (Guinness, 2025).
Still, Copilot and ChatGPT have different strengths:
Feature/Use Case | Copilot | ChatGPT |
---|---|---|
Integration with Microsoft | ✔️ Fully integrated (Word, Excel, Teams) | ❌ Not integrated into Microsoft products |
Document & Report Drafting | ✔️ Seamless with MS apps | — Requires copy & paste |
Data Security | ✔️ Enterprise-level compliance (FERPA) | ❌ No built-in institutional agreements |
Creativity & Brainstorming | — Capable, but more structured | ✔️ Excels at open-ended creativity |
Response Style | — Task-focused | ✔️ More conversational and exploratory |
James Madison University (JMU) has established policies regarding the use of gAI. Currently, the only AI tool approved for campus-wide use with JMU data is Microsoft Copilot Chat, which operates with commercial data protections to ensure user privacy. Faculty, staff, and students can access this service by signing in with their JMU credentials. Users must comply with JMU's data stewardship and IT policies, as well as Microsoft's terms of service. For more details, visit JMU's AI policy page.
Additional Resources
Microsoft Copilot Chat
- Privacy and protections
- 5 Ways to use Copilot (Video)
- A Quick Guide to Microsoft Copilot
- What is Copilot?
- Copilot Features
- Cite Sources
- Image Creator from Designer
- Visual Searches
- Create
- PDF Feature
- Enhance teaching and learning with Microsoft Copilot
- Learning module covering basic concepts, modes, and features to design effective prompts and analyze results
ChatGPT
References
Best practices for prompt engineering with the OpenAI API. (2024). OpenAI. https://help.openai.com/en/articles/6654000-best-practices-for-prompt-engineering-with-the-openai-api
Bigelow, S. J. (2023, September 22). 10 prompt engineering tips and best practices. TechTarget. https://www.techtarget.com/searchenterpriseai/tip/Prompt-engineering-tips-and-best-practices
General tips for designing prompts. (2025, January 7). Nextra. https://www.promptingguide.ai/introduction/tips
Prompt engineering best practices: Tips, tricks, and tools. (2024). DigitalOcean. https://www.digitalocean.com/resources/articles/prompt-engineering-best-practices
Vergadia, P., & Williams, K. (2023, August 14). Best practices for prompt engineering. Google Cloud Blog. https://cloud.google.com/blog/products/application-development/five-best-practices-for-prompt-engineering
What is prompt engineering? (2023, November 27). IBM. https://www.ibm.com/think/topics/prompt-engineering
What is prompt engineering? AI prompt engineering explained. (2025). Amazon Web Services. https://aws.amazon.com/what-is/prompt-engineering/