Stephanie Laggini Fiore
While we have dealt with many aspects of AI and teaching in this blog series, we want to end the series with the most important aspect—talking to your students about AI and learning. One of the realities of the present moment is that we are all in the midst of a disruptive change, one that neither we nor our students fully understand how to navigate. Therefore, whether or not we decide to allow the use of AI in our classes, it is vitally important to discuss these tools with our students in productive ways.
At the CAT, we have seen plenty of draconian language on syllabi over the years (“Don’t even think about cheating; you will be caught!!”), but the old adage about catching more flies with honey than with vinegar stands true here as well. Establishing trust in the learning environment, having clarifying conversations about AI and the choices you have made for the course, engaging students in thinking critically about the use of these tools and what they mean for society and for learning, and welcoming students’ thoughts will be far more effective than setting up an adversarial dynamic. We recommend dedicating time to discussing generative AI during the first week of the semester and then re-engaging students briefly before each written assignment. You should, of course, take some time to go over your AI syllabus statement, explaining your reasons for the decisions you have made, but it is important to go beyond that conversation to allow space for students to reflect on what it means to use these tools for learning.
Here are some thoughts on how to speak to your students about AI:
- Consider using an anonymous poll that asks the extent to which your students have used these tools. This will provide a window into how familiar your students are with generative AI.
- Begin the conversation by asking students what they know about generative AI. You may be surprised by what they do (or don’t) know. Continue with a clarifying conversation on how generative AI tools work, including their benefits and pitfalls. It will be most effective if you can show examples of those benefits and pitfalls—for instance, a hallucination (fabricated or inaccurate information) or biased content that these tools might reproduce.
- Engage students in thinking about how your assignments help them to achieve the goals of your course. We often recommend using Bloom’s Taxonomy for this exercise. If, for example, you have a goal that reaches the level of evaluation on the taxonomy, how will the assignments (if completed by the student) aid in their attainment of that goal?
- Think about how to connect your students to the value of what they are learning. Often students see our courses (especially our required courses) simply as hoops to jump through on the way to a degree. Can you articulate for your students the reason why what they are learning will benefit them? What relevance will it have for their professions, personal growth, future academic work, or communities? Helping students to find meaning in what they are learning will be key to managing AI use.
- Include a discussion about AI and academic integrity. Why is academic integrity important? How can we think about the use of generative AI in ethical terms? Use case studies to have students ponder whether particular uses are ethical; for instance, ask how they would feel if you offloaded all student feedback to an AI. Would that be an ethical use of the tool, or would it be a breach of your responsibility as an instructor?
- Ask students to discuss important philosophical questions that will get them thinking about the nature of learning, thought, and voice, such as:
- Why do we write? What kinds of thinking happen when we write? Query students about how they use writing outside of class: do they keep a journal, write their opinions on social media, text friends when something important happens? Why might they turn to writing to express their thoughts?
- What does it mean to cede our thinking and our voice to non-sentient machines? Do we want to live in a world where none of our passions and ideas are expressed in the way that we want to express them, and where originality of thought is replaced by a process of scraping a dataset for answers?
Talking to a student when you suspect cheating
You’ve followed our advice above and talked to your students about AI from day one of the semester, clarifying permissible use in your course. Still, you suspect that a student in your class has used AI in ways that you have not allowed. The first step is always to talk to the student. Here are some tips for tackling this discussion:
- Don’t take it personally! Cheating can often feel like a personal attack and a betrayal of all the work you’ve put into your teaching. Remember that a student’s decision to use AI to take shortcuts is probably about them, not about you.
- Check your biases. Is your suspicion of your student’s work well-founded? Would you have the same concerns if the work had been handed in by a different student?
- Beware of falsely accusing students outright. As was established in a previous post, our ability to accurately identify the use of generative AI tools is, at present, quite weak.
- Ask the student to meet with you. Simply say something like “I have some concerns about your assignment. Please come to see me.”
- When you meet with the student, try not to be confrontational (remember that you may not be certain they used AI in an unauthorized manner). Instead, start by asking them questions that give them a moment to tell the story of their writing process, such as: How were you feeling about the assignment? What do you think was challenging about it? Why don’t you tell me what your process was for getting it done? If there is research involved, you can ask what sources they used. If they were writing about something they were supposed to read or visit (an art exhibit, for instance), ask pointed questions that get at whether they actually engaged in that activity.
- Then state your concerns: I’m concerned because the writing in this assignment doesn’t seem to match the writing in your other assignments, and an AI detection tool flagged it as AI-generated. Go over any inconsistencies, odd language, repetition, or hallucinated citations with the student.
- Use developmental language. Remember that your student may have used generative AI without realizing it is considered cheating, or there may have been factors that made them feel that they needed to cheat. A conversation with your student can be a learning opportunity for them.
- Discuss with colleagues in your department what a reasonable penalty might be for unauthorized use of generative AI. Consider also when it might be necessary to contact the Office of Student Conduct and Community Standards. (Remember, however, that speaking with your student is always the first step before taking further action.) If your conclusion is that the student cheated, you’ll have to decide whether to allow them to complete the assignment again on their own (perhaps with a penalty) or to offer no opportunity to make amends. Given that we are in a developmental stage with these tools, it may be wise to offer a do-over if the student owns up to it.
- Self-reflect. Given that students often take shortcuts for reasons related to course structure, review our blog post on academic integrity and AI and consider whether your course design reflects those best practices.
In a world in which AI is here to stay, it is essential that we support students’ ethical and productive interaction with these tools. No matter the discipline, we need to take on the responsibility of developing our students to adapt to this new reality with full awareness of the implications of AI use for learning, for work, and for society.
We know that this is all new and it is not easy—the CAT is here to help. To book an appointment with a CAT educational developer or educational technology specialist, go to catbooking.temple.edu or email email@example.com.