Another Look at Active Learning, Part 4: Long-term Active Learning Techniques

Dana Dawson, Ph.D. and Cliff Rouder, Ed.D.

In part 2 of our series on active learning, we identified high-impact, easy-to-implement active learning techniques, and in part 3, we explored peer and collaborative learning. In this part, we’re going to examine high-impact active learning techniques that extend beyond a single class period, possibly spanning an entire semester or even a full academic year. These techniques require a bigger planning component and involve a significant shift in roles for the instructor and for students, as students assume more responsibility for their own learning. However, they also yield big rewards, including solutions to real-world problems, increased engagement, and a greater ability to communicate and work effectively in groups. So without further ado, let’s take a look at a variety of long-term active learning techniques.

Experiential Learning

Experiential learning encompasses activities that invite students to learn by doing. Students apply the skills and knowledge they’re learning in their courses within actual civic and work-based contexts that place them in a position to grapple with and reflect on real-world problems. This approach improves student engagement and retention because there are meaningful stakes involved and students can see how what they’re learning will be used in post-graduate situations.

Case-Based Learning

In case-based learning, students are presented with real-world scenarios and asked to apply their knowledge to come up with solutions or to advise on the best way to address the issue at hand. It is a form of guided inquiry where the problem is defined and students use knowledge and skills gained in the classroom to tackle the problem, or in some instances, choose from a set of possible solutions or analyze how others have resolved the scenario. 

Project-Based Learning

In project-based learning, students are presented with a complex real-life scenario that has multiple potential solutions. Working in groups, students develop a plan, then design and create a hands-on solution in the form of a product or artifact that addresses the problem. 

Problem-Based Learning

Here, students are presented with a case or scenario where they define the problem, explore related issues, and identify a solution. While similar to case-based and project-based learning, problem-based learning is generally less structured and more open ended, and the problem is typically not as well defined as in case-based learning. Problem-based learning focuses on the process of discovery, with students working to define and solve the problem.

Service Learning and Community-Based Learning

Community-based learning (CBL), also called service learning, integrates direct community engagement into academic courses to mutually benefit students and community partners. These approaches emphasize the development of students’ civic awareness, knowledge, skills, values, and goals to produce tangible social change and to promote students’ critical thinking, self-efficacy, interpersonal skills, civic and social responsibility, academic development, and educational success. 

Process Oriented Guided Inquiry Learning (POGIL)

POGIL is a structured protocol in which students work collaboratively to investigate a topic. This approach begins with the assumption that we learn and complete tasks more effectively in groups because everyone has gaps in their skillset and knowledge base. By pooling assets, students are more able to tackle complex problems, thereby facilitating higher order tasks and ultimately promoting better learning experiences. Key elements of the POGIL approach are the use of persistent teams who together work through challenging, inquiry-based problems and the assigning of roles within those teams that enable students to self-manage the group’s work. Faculty members serve as guides while students learn through the process of inquiry.

Team-based Learning

Team-based learning (TBL) engages students through individual testing and collaborative work in persistent teams. There are four essential elements of team-based learning:

1. Groups: Groups must be intentionally formed and managed.

2. Accountability: Students must be accountable for the quality of their individual and group work.

3. Feedback: Students must receive frequent and timely feedback.

4. Assignment design: Group assignments must promote both learning and team development.

When these four elements are implemented in a course, the stage is set for student groups to evolve into cohesive learning teams. 

The Flipped Classroom

In traditional classrooms, students consume content during class time, generally through listening to lectures. The flipped classroom moves more passive learning activities–such as listening, watching, or reading–outside of class time to make space within the classroom to practice skills and receive feedback. Students gain first exposure to content prior to class, ideally complete a knowledge check before coming to class so they and the professor can assess understanding ahead of the class period, and then participate in in-class or in-clinic activities that prompt higher-level thinking about the content through collaborative learning.

Gamification

We enjoy games because they’re fun, interactive, and, in many cases, capitalize on our competitive tendencies. There are many ways to bring the energy of play to the classroom by gamifying learning, ranging from easy-to-implement options such as using the “Competition” activity option in Poll Everywhere (for which Temple has an institutional license) to more complex activities such as Reacting to the Past (RTTP), a role-playing and immersive pedagogy. For example, our 2022 STEM Educators’ lecturer, Dr. André Thomas, described his successful creation of a video game designed to teach calculus students the concept of limits. When games are enjoyable, involve collaboration, and require students to retrieve and apply course material, they are a reliable mechanism for encouraging engagement and information recall. And when they ask students to explore, contextualize, and inhabit a subject position through an assigned role, as in RTTP, they allow for meaningful reflection on the complexities of contemporary and historical situations.

Executing long-term structured active learning

As mentioned above, long-term active learning techniques can require additional planning and intentional steps to ensure group success. Faculty often ask us whether there are ways to help students work more equitably and effectively on team projects. Asset mapping is one approach that can promote equitable teamwork. 

Asset Mapping

We know that student group projects can be a valuable experience for students. However, even with an equal distribution of work, they may not always be equitable. This can be especially true in disciplines where underrepresented and marginalized groups might be stereotyped as not capable enough to handle the project. Consistent with prior research in STEM fields, Stoddard and Pfeifer found that in Worcester Polytechnic Institute’s required first-year interdisciplinary project-based learning course, women and students of color more frequently experienced having their ideas ignored or shut down, being assigned less important tasks, dealing with an overpowering teammate, and having their work go unacknowledged or claimed by others.

They employed an equity-based approach using asset mapping, originally developed by Kretzmann and McKnight in 1993. In essence, asset mapping gives students the opportunity to get to know their own and their teammates’ strengths, interests, identities, and needed areas of growth related to the project. But it goes well beyond that. Asset mapping is just the initial step of a process that enables students to take a deeper dive into bias and stereotyping as they evaluate their own behaviors and the dynamics of their teams. As a way to operationalize asset mapping, Stoddard and Pfeifer developed this toolkit containing three modules that include the tools, activities, assignments, and rubrics needed at different points in the semester. 

If you’re interested in trying one of these longer-term active learning techniques, don’t hesitate to reach out to a CAT staff member for support. Stay tuned for the next installment of our active learning blog series which will focus on active learning in large classes!

___________________________________________________________________________

References

Amaral, G. (November 11, 2019). Using “Reacting to the Past” Role-Playing Games to Foster Vigorous Active Learning. EDvice Exchange Blog. Center for the Advancement of Teaching. 

Association for Experiential Education. “What is Experiential Education?” 

Bringle, R. G., & Clayton, P. H. (2012). Civic education through service learning: What, how, and why? In Higher education and civic engagement: Comparative perspectives (pp. 101-124). New York: Palgrave Macmillan US.

Guo, P., Saab, N., Post, L. S., & Admiraal, W. (2020). A review of project-based learning in higher education: Student outcomes and measures. International Journal of Educational Research, 102, 101586. https://doi.org/10.1016/j.ijer.2020.101586

McLean, S. F. (2016). Case-Based Learning and its Application in Medical and Health-Care Fields: A Review of Worldwide Literature. Journal of Medical Education and Curricular Development, 3. https://doi.org/10.4137/JMECD.S20377 

Pereira, O. P., & Costa, C. A. (2019). Service Learning: Benefits of Another Learning Pedagogy. Economic Research, 3(9), 17-33.

Pfeifer, G., & Stoddard, E. A. (2018). Diversity, equity, and inclusion tools for teamwork: Asset mapping and team processing handbook.

Poorvu Center for Teaching and Learning. Team-Based Learning.

Rouder, C. (March 21, 2022). A Game-based Approach to Teaching Calculus: Implications of the Research for STEM Courses. EDvice Exchange, Center for the Advancement of Teaching.

Team-Based Learning Collaborative. What is TBL: Overview.

The POGIL Project. General POGIL Book

The POGIL Project. What is POGIL?

Thomson, B. (September 24, 2019) Flipping the Classroom. EDvice Exchange Blog. Center for the Advancement of Teaching. 

Yew, E. H., & Goh, K. (2016). Problem-based learning: An overview of its process and impact on learning. Health Professions Education, 2(2), 75-79. https://www.sciencedirect.com/science/article/pii/S2452301116300062

Another Look at Active Learning, Part 3: Peer & Collaborative Learning In The Classroom

Dana Dawson, Ph.D. and Jennifer Zaylea, MFA

In Another Look At Active Learning, part 2 of this series, we put forward several easy-to-implement active learning activities. Here we will focus on activities geared towards accomplishing depth of knowledge through peer and collaborative learning – encouraging a space for synthesizing rather than memorizing. Peer learning allows students to teach one another while expanding and solidifying their knowledge. Collaborative learning is similar to peer learning, but students are grouped and work towards a common goal, all the while developing communication skills, learning from each other, and sharing their own unique perspectives. 

Benefits of Peer and Collaborative Learning

One of the biggest benefits of using intentional peer and collaborative learning in the classroom is that it promotes a sense of belonging, self-efficacy, and community among our students. Several decades of empirical research have demonstrated the positive relationship between effectively implemented collaborative learning and not only our students’ emotional health, but also their achievement, effort, persistence, and motivation (Scager, 2016). Students get to know one another and forge connections. Their discussions help challenge false beliefs that get in the way of learning, such as the belief that they are the only one who doesn’t understand a particular concept or is struggling with the material. A student’s peer may find ways of explaining a concept that are more comprehensible than the explanations or examples we use. And the process of peer learning promotes the development of so-called “soft skills” such as empathetic listening, communication, collaboration, and problem solving.

Teaching Students How to Do Group Work Successfully

Learning with and from peers requires skills that our students are often still developing. It may be necessary to teach students some of the skills required for successful peer learning such as assigning tasks, asking follow-up questions or expressing disagreement. Here are some suggestions and activities to develop peer learning skills:

  • Talk explicitly with students about skills required for successful group work, particularly where they are tasked with completing a longer term project (more on that in the next blog). Discuss what active listening looks like in practice. Point out to students how you model these skills in your teaching. Stephen Brookfield’s “Conversational Moves” activity, described in Brookfield and Preskill’s Discussion As A Way Of Teaching: Tools and Techniques for Democratic Classrooms, can be a great way to introduce students to strategies for promoting effective discussion.
  • Explore the role of emotion in peer learning. How do students feel when a teammate drops the ball? How does disagreement impact the experience of working in teams?
  • Address specific approaches to managing disagreement and communicating when another student’s point is not clear. Discuss ground rules for discussion early in the semester and consider asking students to participate in the creation of guidelines. The Hopes and Fears Protocol can be a helpful tool for this process. You might also provide examples of language to use in order to disagree respectfully (“What I hear you say is… The way I think about this topic is…”). 
  • Help students develop skills pertinent to collaboration by assigning roles such as leader, recorder, reporter, devil’s advocate and/or time-keeper. Process Oriented Guided Inquiry Learning encourages the use of an Analyst or Reflector role: an individual who considers and periodically reports to the group on how successfully the group is communicating and collaborating.

Considerations When Implementing Peer/Collaborative Learning Activities

  • Articulate the “why” of the activity and how it relates to larger goals within the course and within students’ respective professions. For example, explain that a job in advertising is not done in isolation; rather, it is highly collaborative work in which ideas and tasks are shared, each person brings their own knowledge and skills, and everyone participates in a reliable fashion. A manager will select group members as they deem appropriate, and it will often not be the grouping of friends that students might be imagining. Remind students that practicing collaboration in the classroom will hone skills that make them more effective and successful professionals.
  • Clear instructions are essential when it comes to active learning, and they are especially important as individual activities build into peer/collaborative activities. Early on, you might need to help students by breaking out the tasks associated with the activity so that they can determine the most useful path to follow in order to achieve the goal. Giving students the opportunity to write out a few of their task-related skills on an index card before groups are formed can help ensure that students do not get stuck trying to figure out who can contribute what. Ensure students know whether the activity is part of their grade for the course, whether that grade is individual or shared by the group, and how the grade is determined.
  • Make the deliverable something that is meaningful to the course and to the peer/collaborative groups. If the deliverable is considered “throw away” material, then the students are less likely to put much effort into the activity. Ensuring that the deliverable is something of value, like publishing their project on a public facing website, will encourage the students to actively participate as a whole rather than letting one or two students do all the work.
  • Regular group check-ins are a necessity in peer/collaborative work. However, feedback can also create negative division within the group dynamic. To keep the group dynamic healthy, consider holding a brief meeting at the start of each class in which group members identify items on a checklist that have been, or need to be, completed and by whom. Then, at the end of each class, the task list is revisited to ensure that all tasks are captured. This might mean that group members add to the list as they find they need additional work in one area or another to achieve the goal. The task list keeps each member accountable for their contribution and acts as a reminder of that accountability. In addition to a group task list, encouraging students to offer one another constructive feedback during their sessions is a great way to further build healthy communication skills.

Activities that encourage peer and collaborative learning

Investigating AI 

  • Compare output – Students work in small groups using their own prompts to see how differently they might approach researching a topic.
  • Work on prompts together – Students work in small groups to co-write prompts to get the best (most closely related) response to a question they know the answer to.
  • Play “spot the AI generated content” – Faculty provide pre-generated content to student groups and ask the groups to vet the content.
  • Prompt Engineering – Students compete to engineer prompts to generate a result you’ve tested and obtained beforehand.
  • Prompting Competition – Have students compete in creating the best prompt to elicit the most complete, useful, or interesting output to a course-related question or topic. Formulating a useful prompt requires clear articulation of the student’s own understanding, and comparing results allows students to practice their analytical skills. (This one comes from our Survival Guide to AI and Teaching pt.4: Make AI Your Friend.)

Role play: Involve students in activities where they assume different roles and play out scenarios that you or they have created. Students might shy away from this activity if the focus is only on one group at a time, so you might consider having several groups role-play in front of just one other group rather than in front of the whole class.

Interpreted lecture: Ask individual students or small groups to provide a short summary of the lecture at regular intervals (every 15-20 minutes). This works best if you inform students ahead of time that you will be calling on them for this purpose so that they will be prepared. You may need to call on other students to fill in any gaps, or fill in the gaps yourself. 

Case studies: Provide students with a case study or problem. Break students into groups of 3-5. Students work through the problem and present a proposed solution to the class. Note: students can be working on the same problem, or each team can receive a different problem. 

Debates: Form teams of students. Each team takes a particular stance on an issue. Ask debaters to debate an issue based on evidence, to clearly state points, to logically organize their argument, and to be persuasive. Those not on a team are the judges. 

Create a study guide: In pairs or groups, have students review their notes and create their own study guide. Review with the class as a whole.

Create possible board questions: In pairs or groups, have students draft questions that might appear on their board exams or other high stakes assessments.

Rotating Stations: Four to six white boards or poster-sized sticky notes are arranged around the room. Each has a different prompt at the top. Students circulate around the room, reading responses and adding their own. 

Teach each other / Update your classmate: In groups, have students take turns trying to teach the rest of their group a section of material. This will help them (and you!) gauge the depth of their understanding on a particular topic/concept. You may also ask students to write a memo to a real or fictional student who missed the last class session. In the memo, they describe the missed content and anticipate why the information might be important for understanding new content. 

Jigsaw: 

Step 1: Organize students into groups of 4-6 people. 

Step 2: Divide the day’s reading or lesson into 4-6 parts, and assign one student in each group to be responsible for a different segment. 

Step 3: Give students time to learn and process their assigned segment independently.

Step 4: Put students who completed the same segment together into an “expert group” to talk about and process the details of their segment. 

Step 5: Have students return to their original “Jigsaw” groups and take turns sharing the segments they’ve become experts on. 

Snowball: 

Step 1: Have students work in pairs for a few minutes to discuss a response to a prompt that you’ve given them. 

Step 2: Direct each pair to sit with another pair and now share amongst the four of you. 

Step 3: Repeat to form a group of 8. 

Step 4: Repeat until you have your whole class as one group discussing the issue. 

Formulate a report: Individually, students draft a report such as an incident report, a project report, meeting minutes, or a progress report. In small groups, peers review the report and suggest improvements. Individuals then revise their reports and submit them. This activity was originally designed around incident reports, but the reports could cover a variety of subjects, such as meetings, projects, medical reports, or other activities. 

Peer review: In pairs or small groups, have students peer review student-created materials such as documentation, treatment plans, exercise programs, or discharge recommendations. Students can meet as a group to discuss the findings. Once feedback is provided, students revise the materials before submitting.

Operate a tool: In groups of 2 to 4, students practice operating a tool or piece of equipment. As each student takes a turn the rest of the group provides them with feedback on their use of the tool.  

Pass the Answer: Students write the answer to a prompt on an index card. They then swap answers with a nearby colleague, swap again with someone else, and then swap one more time with another student nearby. The instructor then calls on students to read the answer they are holding. This makes it easier for students to speak out in a large lecture hall because they are offering someone else’s answer rather than their own.

As we mentioned in blog post 2 of this series, you do not need to implement all of the activities listed above. We encourage you to use an activity or two during your class sessions and to select activities that are designed to meet the learning goals of your course. Once the activities are completed, reflect on whether they helped accomplish peer and collaborative learning. Peer and collaborative learning are learned skills, just like any course content you might be delivering, and students face a similar learning curve when it comes to communication. Build in time for this learning when you are creating peer/collaborative learning activities.

References:

Scager, Karin, et al. “Collaborative Learning in Higher Education: Evoking Positive Interdependence.” CBE–Life Sciences Education, vol. 15, no. 4, 2016, ar69. doi:10.1187/cbe.16-07-0219

Brookfield, Stephen D., and Stephen Preskill. Discussion As A Way Of Teaching: Tools and Techniques for Democratic Classrooms. 2nd ed., Jossey-Bass, 2005.

Another Look at Active Learning: A Blog Series, Part I

Stephanie Laggini Fiore

If I had a quarter for every time someone said something to me like “I had 4 years of French in high school and I don’t remember a word of it,” I could pay for a trip to Hawaii right now. As a world language instructor, I hate hearing that. I want everyone who studies a language to feel as if they can travel and communicate in the target language in everyday situations. But, of course, the way language was taught when I was younger was not effective in making this ultimate goal a reality. Plug-and-chug homework exercises, rote drills, and mandates to memorize long lists of vocabulary and verb tenses were the dominant modes of teaching. As anyone who has memorized loads of content for an exam and then promptly forgotten it can attest, mastery and retention of information, and the ability to apply it in varied situations, are not served well by these methods. 

What we know from decades of research on learning is that students learn best through active learning. (If you love lectures, don’t stop reading!! I’ll get to you in a minute.) Active learning engages students in the work of learning. It asks them to do more than just absorb information by listening to the experts tell them what to think. Instead, they participate in a wide variety of activities that ask them to cognitively engage with the course content by assessing their own knowledge of a topic, collaborating with others to solve problems, discussing key points, analyzing and evaluating information, and more. In active learning environments, students are moved beyond the remembering level of Bloom’s Taxonomy of the Cognitive Domain to higher order levels of thinking that lead to deep learning. 

In 2014, Scott Freeman et al. published a meta-analysis of 225 studies on active learning (Freeman et al., 2014), in which they found that average exam grades in active learning classes increased by half a letter grade, and that failure rates were 55% higher in classes that relied on traditional lecturing than in classes that used active learning. (Again, hold on, lecturers. I’m getting to you!) Freeman’s comment in an interview in Wired is telling: “The impact of these data should be like the Surgeon General’s report on ‘Smoking and Health’ in 1964–they should put to rest any debate about whether active learning is more effective than lecturing” (Wired, May 12, 2014). Since that time, research continues to add evidence pointing to the benefits of “hands-on” and “minds-on” learning (Yannier et al., 2021) and the increased benefits of active learning environments for underrepresented students (Eddy & Hogan, 2017), while a recent literature review adds that active learning environments also benefit students’ well-being (Ribeiro-Silva et al., 2022). 

But what about lectures, you say? (See, I told you I’d get to you!) Students aren’t experts, so they would learn better from an expert, right? There is a place for lectures done well, and the false dichotomy that embracing active learning means throwing lectures out entirely is unhelpful. As Stephen Brookfield reminds us, lectures are useful for explaining complex concepts with clarifying examples, introducing alternative perspectives, and modeling intellectual attitudes and behaviors. But a traditional lecture (that is, all-lecture-all-the-time) showcases what you know as the expert and does not work well to bring novice learners along for the intellectual ride. When the dominant voice you hear in the room is your own, it’s time to stop and take stock of the teaching methods you are using. The best lecturers pause to ask questions, use demonstrations and media to support the point they are making, and give students time for reflection; in short, they embed active learning techniques in their lectures to help students learn.

Today, in world language classes, students begin using the language from day one. They still need to memorize information, but it is in service to the actual use of the language. They make plenty of mistakes, but I remind them that struggle is part of learning and that they will come to a point where they are feeling less struggle and more triumph in their learning journey. It’s also scary. Instead of reading from pre-written scripts, they are creating language on the spot without a safety net. That mirrors authentic use of the language and helps them navigate real situations when they are faced with them. A supportive atmosphere that encourages experimentation and active effort in using the language, de-emphasizes perfection, and supports resilience in learning is what it’s all about. The best part is that we all enjoy learning so much more. 

You and your students can too! There is a continuum of active learning strategies available to instructors, starting from simple and informal think-pair-share activities and brief polling questions that take a few minutes, all the way to semester-long and highly structured team-based learning activities that require intentional preparation. You can dip your toe in, starting out small with just a few strategies, or dive into active learning wholeheartedly. In this blog series, we will guide you through your choices, explore how to employ active learning in online and large class environments, and also consider how to set up active learning for success. We hope you’ll join us and try some active learning strategies on for size.

Stephanie Fiore, PhD, is Associate Vice Provost and Senior Director of Temple’s Center for the Advancement of Teaching.

A Guide to Creating an Analytic Rubric

Dana Dawson, Ph.D.

Rubrics are tools used by faculty to guide our assessment of student performance and to make our expectations transparent for students. Using a rubric can help make grading more efficient for faculty and fair for students, but when constructed well and shared along with assignment or activity descriptions, they also benefit student learning. Rubrics explicitly represent our performance expectations and allow students to direct their effort toward the intended goal of an activity or assignment. By asserting ahead of time our highest expectations, we encourage students to reach toward those high standards. The use of rubrics promotes more specific feedback and guidance on future performance which allows students to target specific areas for improvement. When we encourage students to review the rubric ahead of time and reflect on feedback after the fact, we can help our students develop the habit of reflecting on their learning.

Rubrics take a variety of forms, from checklists of attributes that would demonstrate competence to analytic rubrics featuring descriptions of levels of competence in relation to different criteria. In this post, I will guide you through the steps of creating an analytic rubric (a rubric that features discrete dimensions and descriptions of performance standards for each of those dimensions), as featured in our rubric creation worksheet.

Parts of an Analytic Rubric

Analytic rubrics consist of dimensions, scale labels, and descriptions of performance standards. A rubric may feature any number of dimensions, but including too many may make the rubric difficult for a student to interpret. Dimensions of a rubric may be weighted differently to indicate to students which are most crucial to success on the assignment. Scales generally range from three to five levels.

|                                | Scale label 1                        | Scale label 2                        | Scale label 3                        |
| Dimension 1 (number of points) | Description of performance standards | Description of performance standards | Description of performance standards |
| Dimension 2 (number of points) | Description of performance standards | Description of performance standards | Description of performance standards |

Rubric Scale Example

Steps to Create a Rubric

1) Reflect

Begin with a freeform reflection on your goals for the assignment or activity. What are the main things you want this activity/assignment to accomplish; in other words, what are your goals? What content knowledge and skills is/are needed to productively complete this assignment/activity? What behaviors demonstrate achievement of the assignment’s goals? What are the highest expectations you have for students on this assignment? What evidence can students provide that would show they have accomplished what you hoped they would accomplish when you created the assignment/activity? What would the worst demonstration of this assignment look like? 

2) List

Use your reflection to formulate a list of the most important attributes of success on the activity/assignment. What would an excellent submission or performance look like? What specific characteristics would it have? What are the most important attributes of success for this assignment/activity? Include a description of the highest level of performance you expect for the item.

3) Group

Group items with similar performance criteria and give your groups titles. These groups will become your rubric dimensions. For example, your list may look something like this:

  • Presentation is cogent
  • Presentation is organized
  • Thesis demonstrates thoughtful analysis of the text
  • Thesis and evidence demonstrate familiarity with the text
  • There is evidence for the thesis
  • Presentation anticipates counter-points
  • Specific position (perspective, thesis/hypothesis) is imaginative, taking into account the complexities of an issue*
  • Limits of position (perspective, thesis/ hypothesis) are acknowledged
  • Organizes and synthesizes evidence to reveal insightful patterns, differences, or similarities related to focus
  • Central message is compelling (precisely stated, appropriately repeated, memorable, and strongly supported)

The above list may be grouped as follows:

Thesis

  • Thesis demonstrates thoughtful analysis of the text
  • There is evidence for the thesis
  • Central message is compelling (precisely stated, appropriately repeated, memorable, and strongly supported)

Textual analysis

  • Thesis and evidence demonstrate familiarity with the text
  • Addresses complexity of text

Supporting points

  • Presentation anticipates counter-points
  • Limits of position (perspective, thesis/ hypothesis) are acknowledged

Creativity

  • Specific position (perspective, thesis/hypothesis) is imaginative, taking into account the complexities of an issue
  • Organizes and synthesizes evidence to reveal insightful patterns, differences, or similarities related to focus

4) Apply

You may now use your groups to fill in the left-hand side of the rubric and use your notes from the reflecting and listing phases of this exercise to establish your performance criteria. This worksheet includes a blank table that you may use to begin drafting your rubric. Remember to consider whether you want to assign point values to each of the dimensions in your rubric.

A Note on Scale Labels

Too often, the language we use for scale labels can read as harsh and judgmental to students. For example, scale labels such as “Weak,” “Poor,” or “Unacceptable” do not convey to our students a belief that they can improve. Here are some suggestions for scale label language that is less likely to be discouraging.

Advanced, intermediate high, intermediate, novice

Exceeds expectations, meets expectations, developing towards expectations

Exceeds expectations, meets expectations, progressing, not there yet

Distinguished, proficient, intermediate, novice

Mastery, partial mastery, progressing, emerging

Sophisticated, highly competent, fairly competent, not yet competent

Concluding Thoughts

Taking the time to reflect on your goals for an activity or assignment and to concretely articulate your expectations will not only improve the quality of the rubric you create, but will help guide your instruction. Clearly identifying what you expect your students to know or be able to do will allow you to work backwards from those expectations to the exercises and materials needed in order for students to build the necessary skills and content knowledge.

For help designing and implementing rubrics, feel free to book an appointment with a CAT educational developer or educational technology specialist. Go to catbooking.temple.edu or email cat@temple.edu.

*Some of the performance criteria description language used here is borrowed from the AAC&U VALUE Rubrics.

Dana Dawson serves as Associate Director of Teaching and Learning at Temple University’s Center for the Advancement of Teaching

Towards Shining Teaching Moments

Stephanie Laggini Fiore

I love the beginning of the fall semester, but this fall, I’m feeling especially optimistic! The fall semester always brings such promise with it, as we plan what we’ll teach to a fresh crop of students that we hope (fingers crossed!) will respond to what we have to offer in positive and, dare I say it, soul-satisfying ways. There is nothing like that moment when you see the spark in a student’s eyes or the triumph they feel when they’ve mastered a difficult concept or skill. Every fall, we hope for more of these moments that reconnect us to the joy of teaching. 

I still have imprinted on my memory one of these moments in an Italian course I created to improve student literacy in the language. In addition to the required reading for the class, I had students choose a book from a lending library I had created of all kinds of reading material in Italian, from romance novels to non-fiction to classic works of literature. They were to read 20 minutes a night, keep a log of what they were reading, and swap out the book for a new one when they had finished that one. The point was, of course, to get them reading frequently enough to develop their literacy and to cultivate the belief that they could read in the language. One day early in the semester, a student brought in the book she had just finished to exchange it for a new one and, as she was putting it in the box, she held it up and said for all to hear, “I can’t believe it, but I actually read this WHOLE book!” Her sense of accomplishment and, at the same time, disbelief, was palpable – a true moment of joy in her learning.  

At the CAT, those moments often come in my work with faculty, and this August has been truly special, leaving me with a sense of great optimism for this academic year. Perhaps it has been the energy I’ve derived from large in-person events where I’ve been able to reconnect with colleagues, discuss teaching, and drink in your energy as you anticipate a promising new semester. The first-in-a-very-long-time university-wide New Faculty Orientation and the annual TA Orientation that the CAT hosts were generative events, full of new faces from all over the world, veteran instructors and staff who came to welcome them, and great conversations about teaching. The two interdisciplinary cohorts of faculty who completed the Teaching with AI Teaching Circle brought inspiring creativity and brave openness to change as they gathered together over two days to consider how to intentionally incorporate generative AI tools into teaching. And my numerous visits to collegial assemblies and departmental gatherings to discuss generative AI and teaching have meant for me reconnection, rethinking, and renewal. The long and the short of it is that I feel the same positive and soul-satisfying vibes derived from gratifying moments with students when you, my dear colleagues, experience a spark about teaching in reimagined ways. 

I am keeping my fingers and toes crossed that you experience many of those shining moments in your classes this semester. Let’s go create some sparks! 

Stephanie Laggini Fiore serves as Associate Vice Provost and Senior Director at Temple University’s Center for the Advancement of Teaching 

Full Disclosure of the Terms of Success: Nine Things to Tell Your Students

Dana Dawson, Ph.D.

In a 1997 essay entitled “For Openers… an Inclusive Course Syllabus,” Terence Collins argues for the importance of what he calls “full disclosure of the terms of success” – making explicit the “befuddling mores, assumptions, work habits, background knowledge, key terms, or other markers of the academic subculture too often left implicit, inaccessible to outsiders.” By the time most college instructors or TAs teach a course, lab, studio, or recitation for the first time, we have been embedded in the context of higher education for long enough to have forgotten what we found mystifying and incomprehensible in those early days on campus. It’s important to periodically remind ourselves that what is obvious to us needs to be made explicit to our students.

So, in service of encouraging full disclosure of the terms of success, and in keeping with a genre of pseudo-journalism I often find irresistible, I present nine things all professors should tell their students (including their graduate and professional students).

1. It’s normal to feel like an imposter. 

What we have come to call “imposter syndrome” is the feeling that we do not have the requisite skills or knowledge to be where we are and that we have somehow tricked others into believing we are something we’re not (Clance and Imes, 1978). Unfortunately, such negative self-beliefs, however unfounded, can have very real effects on learning and persistence (Holden et al., 2021). Reassure your students that it was not an accident that they found their way into your classroom or program. Share experiences you or your colleagues have had with feelings that you don’t belong and how you overcame them. 

2. You can ask for things. 

You may have noticed that while some students don’t hesitate to ask for extensions, help, accommodations or clarifications, others suffer in silence even where there are supports they could be taking advantage of. Your students may worry that asking for help is a sign that they don’t belong (see #1 above) or feel unsure of what they can ask you about and when it’s appropriate to ask. Make it clear that they can ask, even if the answer may not always be yes. 

3. Treat your learning as a never-ending research project.

There is no one-size-fits-all approach to succeeding in one’s studies, so it’s important that our students regularly ask themselves whether what they’re doing is working. Encourage your students to use metacognitive strategies to interrogate their study practices and find opportunities for improvement (McGuire and McGuire, 2015).

4. All students can benefit from academic support.

The best way to ensure students who need academic support will seek it out is to reinforce the idea that all students benefit from academic support (Thomas and Tagler, 2019). Remind your students that even star athletes receive coaching. Academic support will benefit any student and will most benefit those who seek support early and often. Remember that students coming to your campus from high school may be completely unfamiliar with student support centers, mental health counseling centers, student health clinics and other student supports. Transfer and graduate students who are new to your campus might be familiar with such supports but not where to find them. Be sure to include this information in your syllabus and course site, and to bring it up in class.

5. We are all still learning.

Another way of saying this is that there are no bad questions. Be transparent about your on-going learning, for example, research findings that surprised you and changed how you thought about your field or an article, book or conference presentation that taught you something new. 

6. What your discipline does and how your course fits into that framework.

When I started my undergraduate degree, I had never heard of Sociology, the discipline I ultimately chose as a major. As soon as I started taking Sociology courses, I knew I was in the right place but struggled to explain to my family what I was going to do with the degree because I wasn’t entirely sure how the content taught in my courses was applied outside of an academic context (or in an academic context, for that matter). Pull back the curtain on your discipline. What are the big questions? Why do they matter? Where does what your course covers fit into the fabric of your discipline? How do people use the skills and knowledge specific to your field in non-academic contexts?

7. What you assume your students already know and can do at the start of your course and what to do if they’re missing any pieces.

Are there concepts, authors, formulas, procedures, methods, etc. that your students should be familiar with? Are there courses you’re assuming they’ve taken? Being explicit about anticipated prior knowledge in a pre-semester questionnaire or early in the semester will give your students an opportunity to fill in gaps sooner rather than later.

8. Preferred communication guidelines.

Do you expect to be called Dr. ___? Would you rather not be called Dr. ___? Do you refuse to read emails that don’t begin with “Dear ___,” and end with a period? Should students nudge you if they haven’t received a response to an email within a couple of days? A couple of weeks? In addition to ensuring that students communicate with you in a manner you are comfortable with, this is an important part of our students’ professional development.

9. You’re glad they’re in your class.

I’m glad you read this far! There. Now didn’t it feel good to read that?

References

Clance, Pauline Rose, and Suzanne Ament Imes. “The Imposter Phenomenon in High Achieving Women: Dynamics and Therapeutic Intervention.” Psychotherapy: Theory, Research and Practice, vol. 15, no. 3, 1978, pp. 241-247.

Collins, Terence. “For Openers, An Inclusive Course Syllabus.” New Paradigms for College Teaching, edited by W. E. Campbell & K. A. Smith, Interaction Book Company, 1997, pp. 79-102.

Holden, Chelsey L., et al. “Imposter Syndrome Among First- and Continuing-Generation College Students: The Roles of Perfectionism and Stress.” Journal of College Student Retention: Research, Theory & Practice, 2021. https://doi.org/10.1177/15210251211019379

McGuire, Saundra Yancy, with Stephanie McGuire. Teach Students How to Learn: Strategies You Can Incorporate Into Any Course to Improve Student Metacognition, Study Skills, and Motivation. Foreword by Thomas Angelo. Routledge, 2015.

Thomas, Christopher L., and Michael J. Tagler. “Predicting Academic Help-Seeking Intentions Using the Reasoned Action Model.” Frontiers in Education, vol. 4, 2019.

Dana Dawson serves as Associate Director of Teaching and Learning at Temple University’s Center for the Advancement of Teaching

Survival Guide to AI and Teaching, pt. 10: Talking to Your Students About AI and Learning

Stephanie Laggini Fiore

While we have dealt with many aspects of AI and teaching in this blog series, we want to end the series with the most important aspect—talking to your students about AI and learning. One of the realities of the present moment is that we are all in the midst of a disruptive change, one that neither we nor our students fully understand how to navigate. Therefore, whether or not we decide to allow the use of AI in our classes, it is vitally important to discuss these tools with our students in productive ways. 

At the CAT, we have seen plenty of draconian language on syllabi over the years (“Don’t even think about cheating; you will be caught!!”), but the old adage about catching more flies with honey than with vinegar holds true here as well. Establishing trust in the learning environment, having clarifying conversations about AI and the choices you have made for the course, engaging students in thinking critically about the use of these tools and what they mean for society and for learning, and welcoming students’ thoughts will be far more effective than setting up an adversarial dynamic. We recommend dedicating time to discussing generative AI during the first week of the semester and then re-engaging students briefly before each written assignment. You should, of course, take some time to go over your AI syllabus statement, explaining your reasons for the decisions you have made, but it is important to go beyond that conversation to allow space for students to reflect on what it means to use these tools for learning. 

Here are some thoughts on how to speak to your students about AI:

  • Consider using an anonymous poll that asks the extent to which your students have used these tools. This will provide a window into how familiar your students are with generative AI.
  • Begin the conversation by asking students what they know about generative AI. You may be surprised by what they do (or don’t) know. Continue with a clarifying conversation on how generative AI tools work, including their benefits and pitfalls. It will be most effective if you can show examples of those benefits and pitfalls: for instance, a hallucination (a confidently stated inaccuracy) or biased content that these tools might reproduce. 
  • Engage students in thinking about how your assignments help them achieve the goals of your course. We often recommend using Bloom’s Taxonomy for this exercise. If, for example, you have a goal that reaches the level of evaluation on the taxonomy, how will the assignments (if completed by the student) aid in their attainment of that goal? 
  • Think about how to connect your students to the value of what they are learning. Often students see our courses (especially our required courses) simply as hoops to jump through on the way to a degree. Can you articulate for your students the reason why what they are learning will benefit them? What relevance will it have for their professions, personal growth, future academic work, or communities? Helping students to find meaning in what they are learning will be key to managing AI use.
  • Include a discussion about AI and academic integrity. Why is academic integrity important? How can we think about the use of generative AI in ethical terms? Use case studies to have students ponder whether particular uses are ethical; for instance, ask how they would feel if you offloaded all student feedback to an AI. Would that be an ethical use of the tool, or would it be a breach of your responsibility as an instructor?
  • Ask students to discuss important philosophical questions that will get them thinking about the nature of learning, thought, and voice, such as:
    • Why do we write? What kinds of thinking happen when we write? Query students about how they use writing outside of class: do they keep a journal, write their opinions on social media, text friends when something important happens? Why might they turn to writing to express their thoughts? 
    • What does it mean to cede our thinking and our voice to non-sentient machines? Do we want to live in a world where none of our passions and ideas are expressed in the way that we want to express them, and where originality of thought is replaced by a process of scraping a dataset for answers? 

Talking to a student when you suspect cheating

You’ve followed our advice above and talked to your students about AI from day one of the semester, clarifying permissible use in your course. Still, you suspect that a student in your class has used AI in ways that you have not allowed. The first step is always to talk to the student. Here are some tips for tackling this discussion: 

  • Don’t take it personally! Cheating can often feel like a personal attack and a betrayal of all the work you’ve put into your teaching. Remember that a student’s decision to use AI to take shortcuts is probably about them, not about you. 
  • Check your biases. Is your suspicion of your student’s work well-founded? Would you have the same concerns if the work had been handed in by other students? 
  • Beware of falsely accusing students outright. As was established in a previous post, our ability to accurately identify the use of generative AI tools is, at present, quite weak.  
  • Ask the student to meet with you. Simply say something like “I have some concerns about your assignment. Please come to see me.” 
  • When you meet with the student, try not to be confrontational (remember that you may not be certain they used AI in an unauthorized manner). Instead, start by asking them questions that will give them a moment to tell the story of their writing process, such as: How were you feeling about the assignment? What do you think was challenging about it? Why don’t you tell me what your process was for getting it done. If there is research involved, you can ask what research they used. If they were writing on something they were supposed to read or visit (an art exhibit, for instance), ask pointed questions that get at whether they actually engaged in that activity.  
  • Then state your concerns: “I’m concerned because the writing in this assignment doesn’t seem to match the writing in your other assignments, and the AI detection tool flagged it as AI-written.” Walk through any inconsistencies, odd language, repetition, or hallucinated citations with the student.  
  • Use developmental language. Remember that your student may have used generative AI without realizing it is considered cheating, or there may have been factors that made them feel that they needed to cheat. A conversation with your student can be a learning opportunity for them. 
  • Discuss with colleagues in your department what a reasonable penalty might be for unauthorized use of generative AI. Consider also when it might be necessary to contact the Office of Student Conduct and Community Standards. (Remember, however, that speaking with your student is always the first step before taking further action.) If you conclude that the student cheated, you’ll have to decide whether to allow them to complete the assignment again on their own (perhaps with a penalty) or to offer no opportunity to right the ship. Consider that we are in a developmental stage with these tools, and it may be worth allowing a do-over if the student owns up to it.
  • Self-reflect. Given that students often take shortcuts for reasons related to the course structure, review our blog post on academic integrity and AI in order to take steps to promote academic integrity and consider whether your course is designed to reflect these best practices.

In a world in which AI is here to stay, it is essential that we support students’ ethical and productive interaction with these tools. No matter the discipline, we need to take on the responsibility of developing our students to adapt to this new reality with full awareness of the implications of AI use for learning, for work, and for society. 

We know that this is all new and it is not easy—the CAT is here to help. To book an appointment with a CAT educational developer or educational technology specialist, go to catbooking.temple.edu or email cat@temple.edu.

A Survival Guide To AI and Teaching pt. 9: AI and Equity in the Classroom

Dana Dawson, Ph.D.

In previous posts in this series, we noted how generative AI can perpetuate biases and exacerbate the digital divide. Here, we will explore in more depth the potential for these tools to widen the institutional performance gaps that impact learning in higher education, but also the potential for generative AI to create a more equitable learning environment. We conclude with suggestions for what you can do to minimize possible negative impacts of generative AI for students in your courses. 

Rapid improvements in the capabilities of generative AI have a tendency to provoke doom spiraling, and there are indeed some very real concerns we will have to grapple with in the coming years. While generative AI at times produces helpful summaries of content or concepts, it is prone to error. Students with tenuous confidence in higher education or in their ability to succeed in their studies are less likely to deeply engage in their coursework (Biggs, 2011; Carver, 1998) and may rely excessively or uncritically on AI tools. Over-reliance on generative AI to reduce effort, rather than as a mechanism for jumpstarting or supporting conceptual work, robs students of opportunities to practice and develop the very creativity, critical thinking, and analysis skills that are likely to become increasingly valued as AI becomes more widely available. In addition, where we neglect to carefully vet content created by AI, we run the risk of repeating erroneous information or perpetuating disinformation.

The prospect of bias and stereotypes impacting students’ experience in higher education arises not only from the content generative AI produces (Bender et al., 2021; Ferrara, 2023), but also from the challenge of determining whether a student has appropriately used the tools. AI detectors cannot reliably differentiate human- from AI-generated content. Faculty must be aware that judgments of whether students relied excessively on AI may be influenced by assumptions that have more to do with factors such as race, gender, or spoken language fluency than with student performance.

Finally, faculty who wish to encourage students to experiment with and integrate the use of AI tools must be aware that inequitable access to broadband internet and digital tools, along with varying levels of preparation to use the tools effectively, may differentially impact students. Variable access to broadband prior to, or during, students’ postsecondary studies raises digital equity concerns. Some students will come to our classes well-equipped to engineer prompts and vet generated content, while others will be encountering these technologies for the first time. The fact that high-quality AI applications are often behind paywalls compounds these issues. 

On the other hand, some scholars and policy-makers have pointed to ways that these tools can be productively used to support student learning and success. AI tools such as ChatGPT can be used to fill in knowledge gaps, whether about a field of study or about being a college student more generally, that are particularly salient for first-generation students or those whose previous educational experiences insufficiently addressed certain skills or topics. GPT-3 responses to prompts such as “What are the best ways to study?” and “How do I succeed in college?” generate strategies that are useful and can be expanded upon with additional prompts. Warschauer et al. point out that for second language learners, the ability to quickly generate error-free email messages or to get feedback on one’s writing reduces the extra burden of studying disciplinary content in a second language. Students can prompt generative AI tools to explain concepts using relatable analogies and examples. For students with disabilities, generative AI can serve as an assistive technology, for example by improving ease of communication for those who must economize words, assisting with prioritizing tasks, helping them practice social interactions, or modeling types of communication.

RECOMMENDATIONS

1. Reduce the potential for bias to impact your assessment of unauthorized student use of generative AI tools by determining the following before the start of the coming semester:

  • Which assessments have the most potential for unauthorized use?
  • Is there an alternative mechanism for assessing student learning for those assessments most prone to unauthorized use?
  • What are my guidelines for appropriate use of generative AI tools in this class?
  • Can I reliably detect inappropriate use?
  • Is my determination of inappropriate use subject to bias? 
  • What will my next steps be if I suspect inappropriate use?

If you’re not sure whether to allow use of generative AI tools, review our decision tree tool.

2. Clearly communicate your classroom policies on use of generative AI and talk with (not to) your students about those policies, ensuring they understand acceptable limits of use.

3. If you are encouraging the use of generative AI tools as learning tools, consider questions of access by:

  • Assessing the extent to which your students know how to use and have access to the tools; and
  • Showing students how to use the tools in ways that will benefit their education (for example, using follow-up prompts to focus initial queries). Temple University Libraries has created an AI Chatbots and Tools guide to help our students learn to judiciously use these tools.

4. Educate students on how generative AI tools may be biased, can perpetuate stereotypes, and can be used to increase the dissemination of mis- and disinformation.

5. Help students find their own voice and value a diversity of voices in writing and other content that could otherwise be generated by AI tools.

6. Consider a Scholarship of Teaching and Learning (SoTL) project.

In the next (and final) installment of our series, we’ll focus on how to talk to your students about generative AI. In the meantime, if you’d like to discuss AI or any other topic related to your teaching, please book an appointment for a one-on-one consultation with a member of the CAT staff.

Works Referenced

Bender, E. M., Gebru, T., McMillan-Major, A., & Shmitchell, S. (2021, March). On the dangers of stochastic parrots: Can language models be too big? In Proceedings of the 2021 ACM Conference on Fairness, Accountability, and Transparency (pp. 610-623).

Biggs, J. (2012). What the student does: Teaching for enhanced learning. Higher Education Research & Development, 31(1), 39-55.

Carver, C. S., & Scheier, M. (1998). On the self-regulation of behavior. Cambridge, UK: Cambridge University Press.

Ferrara, E. (2023). Should ChatGPT be biased? Challenges and risks of bias in large language models. arXiv preprint arXiv:2304.03738.

Dana Dawson serves as Associate Director of Teaching and Learning at Temple University’s Center for the Advancement of Teaching.

A Survival Guide To AI and Teaching pt. 8: Academic Integrity and AI: Is Detection the Answer?

Stephanie Laggini Fiore, Associate Vice Provost

Even if you’ve done your due diligence in clarifying acceptable use of AI in your course, you may still suspect that students are using these tools in unauthorized ways. While unauthorized AI use is not considered plagiarism, it is still cheating and a violation of the university’s standards on academic honesty, as it both uses “sources beyond those authorized by the instructor in writing papers, preparing reports, solving problems, or carrying out other assignments” and engages “in any behavior specifically prohibited by a faculty member in the course syllabus, assignment, or class discussion.” The sticky question is, therefore, “How can I be sure that students have indeed inappropriately used these tools to complete their work?” We may be tempted to lean on detection methods as a solution, but is that the answer to this conundrum?

Can Humans Detect AI Work Unaided?

In playing with AI tools, you may have noticed some quirks in the output they provide (based on your prompts): they can be repetitive, go off on tangents unrelated to the topic at hand, or simply produce generic or illogical text. Generative AI can also “hallucinate” citations or quote text that simply doesn’t exist. These “AI tells” can sometimes tip us off to unauthorized AI use by our students. But how good are we at accurately identifying these tells? Our colleagues at the University of Pennsylvania conducted an investigation into human ability to detect AI text. They found that participants in their study were significantly better than random chance at detecting AI output, but that there was large variability in ability among the participants. The good news is that their findings suggest detection is a skill that can be developed with training over time (Dugan et al., 2023). At this point, however, few of us have had the targeted training referenced by the authors, nor have we dedicated the time necessary to improve. Barring glaring hallucinations or illogical content, most of us are simply not yet familiar enough with the features of AI text to be confident that our hunches are accurate. Try the test the researchers used; you may find, like me, that identifying AI text can be pretty darn challenging. And, of course, these tools will continue to evolve and improve, so our ability to detect non-human content may dwindle as generative AI advances.

Can AI Detectors Do the Job?

Don’t we all wish that AI detectors (such as Turnitin, GPTZero, Copyleaks, or Sapling) were the answer to all of our generative AI concerns? Sadly, the simple and definitive answer to whether AI detectors can reliably detect AI-generated writing is “not at this time.” The reality is that these detector tools are flawed, delivering both false positives and false negatives. In addition, unlike with plagiarism detection tools, there is no way to verify that a detector’s conclusions are correct, as the results do not link back to source material. The CAT and the Student Success Center are conducting an investigation into error rates in a variety of AI detectors; early indications are concerning. In the meantime, others have pointed to the unreliability of the tools in both formal and informal investigations (here’s another), and in explanations of why these tools fail. Companies creating AI detectors themselves include disclaimers such as Turnitin’s statement that it “does not make a determination of misconduct…rather, we provide data for educators to make an informed decision.” They then go on to advise us to apply our “professional judgment” to these situations. That professional judgment, though, can itself be flawed.

Some faculty have been advised to run student work through multiple detectors, but bias (both positive and negative) may come into play as we decide which detector to believe when they return different results (which, in our experience, they most likely will). My wonderful student couldn’t possibly have used AI, so I believe the detector that says it’s human-written. OR I don’t doubt for a minute that this student cheated, so I believe the detector that says it is AI-written. Importantly, these detector tools can’t tell us whether students have used AI in the ways we have outlined in our syllabi as permissible. Let’s say I am allowing students to use AI for idea generation or for writing an outline, but not for writing full drafts of papers. The detector cannot tell me whether students have stayed within those limits. Finally, there are already hacks out there with advice on how to beat the detectors; for example, videos that demonstrate how to run AI-generated content through a rephraser in order to fool AI detectors. All this adds up to inconsistent and unreliable results: catching those who have engaged in academically dishonest behavior is hit or miss, and a positive result does not provide incontrovertible proof of misconduct. Most importantly, we have to consider the very real and potentially damaging effects of wrongfully accusing students of cheating when they have not.*

What’s a Harried Faculty Member To Do?

If detectors aren’t reliable and our own skills at detecting AI writing are not mature, what’s the answer? While we will all be adjusting to this new reality for a while, we can keep some fundamental principles in mind to nudge our students towards transparency and academic honesty, the first of which is to give up on a surveillance mentality as it simply won’t be effective (and you don’t want to police students anyway, right?). Instead, think developmentally and pedagogically by taking these steps:

1. Shift from a reactive to a proactive stance. Test your assessments in a generative AI tool to see how vulnerable they are to AI use. Then make some intentional decisions about whether to change assessments or create new ones. In the long run, of course, it is all about our assessments. We may have used these same types of assessments for decades, but they simply may not work the way we want them to in the age of AI. Review blog posts #4, #5 and #6 to think about changes you may make to your assessments, or if you missed our Using P.I. to Manage A.I. series, see our suggestions there. Remember you can also make an appointment with a CAT developer to help you think this through.

2. Put a statement in your syllabus clarifying acceptable use of AI! I can’t repeat this enough. Our colleagues at The Office of Student Conduct and Community Standards have expressed to us that it is essential to have clear guidelines clarifying what is and isn’t acceptable use of AI in our courses.

3. Engage your students in a discussion about generative AI and academic integrity, including why you have set the standards you have in your course. Remind them periodically about the ethics of generative AI use. (Look for an upcoming blog post for guidance on how to speak with your students about AI.)

4. Design courses that reduce the factors that induce students to cheat. James Lang, in his excellent book Cheating Lessons: Learning From Academic Dishonesty, reminds us that the literature on cheating points to an emphasis on performance, high stakes riding on the outcome, an extrinsic motivation for success, and a low expectation of success as factors that promote academic dishonesty. The good news is that we also know from the literature on learning that evidence-based teaching practices such as formative assessments, scaffolded assignments, ample opportunity for practice and feedback, development of a positive learning environment, and helping students to find relevance and value in what they are learning will both deter cheating by reducing these factors and improve learning. Need help in reducing the temptation to cheat? Make an appointment with a CAT developer.

5. Plan thoughtfully for how you will manage situations where you suspect unauthorized use of generative AI, starting with a conversation with the student. (We’ll include advice on how to speak to students in the aforementioned future blog post.)

There is no doubt that generative AI is a disruptor in the educational space. Our response to that disruption matters for learning and for our relationship with students. Let’s work together thoughtfully towards a productive and forward-looking response. The answer is not detection; it is development.

*Note: If I haven’t convinced you to avoid relying on these flawed detectors when accusing students of cheating, I agree with Sarah Eaton that it is essential to state transparently in your syllabus that you will be using detectors. Do not resort to deceptive practices in an effort to “catch” students. In addition, never use detectors as the sole source of evidence because, of course, the results may not be reliable.

Stephanie Laggini Fiore serves as Associate Vice Provost at Temple University’s Center for the Advancement of Teaching.

A Survival Guide to AI and Teaching pt.7: Inoculating Our Students (and Ourselves!) Against Mis- and Disinformation in the Age of AI

Dana Dawson

In a previous blog post in this series, we suggested making generative AI a subject of critical analysis in your courses. Here, we will focus on the importance of teaching our students to critically engage with content generated by AI tools and with the implications of generative AI use for our information environment. This topic lies at the intersection of digital literacy, information literacy and the newly emerging field of AI literacy (Ng et al.; Wuyckens, Landry and Fastrez). Our students will need not only the digital literacy required to solve problems in a technology-rich environment characterized by the regular use of AI tools, but also the information literacy skills to navigate a complex information ecosystem. Though generative AI tools are digital tools that generate information, we tend to interact with them as if they were social beings (Wang, Rau and Yuan, 1325-1326), and the manner in which they generate information requires special attention to issues of authorship, the impact of dataset bias and the potential automation of disinformation dissemination.

As the efficacy and availability of generative AI tools advance, both we and our students will face a variety of information-related challenges. Generative AI can be used to automate the generation of online misinformation and propaganda, significantly increasing the amount of mis- and disinformation we are exposed to online. Flooding our information environment with disinformation not only increases exposure to bad information, but also distracts from accurate information and increases skepticism toward content generated by credible scholarly and journalistic sources. Even where users do not intend to propagate misinformation, Ferrara and others have pointed out that bias creeps into text and images produced by generative AI through the source material used for training data, the design of a model’s algorithm, data labeling processes, product design decisions and policy decisions (Ferrara, 2). These limitations can also result in the creation of content that seems accurate but is entirely made up, a phenomenon known as AI hallucination.

Our task as educators is to prepare our students to navigate an information environment characterized by the use of generative AI: inoculating them against disinformation, helping them develop the skill and habit of verifying information, and building their conception of what makes for a healthy information environment.

Tools for Inoculation

Inoculating ourselves and our students against mis- and disinformation functions much the same as inoculating ourselves against viruses through controlled exposure. By “pre-bunking” erroneous content that students may themselves create using generative AI tools or may encounter online, we can help reduce the potential for them to be misled in later encounters.

  • Ask students to use ChatGPT to outline one side of a contemporary debate and then to outline the other side of the debate. Have them experiment with prompting the tool to write in the voice of various public figures or to modify the message for different audiences. Analyze what the tool changes with each prompt. Look for similar messages in social and news media.
  • Use the resources of the Algorithmic Justice League to explore how algorithms reproduce race- and gender-based biases.
  • If you assign discussion board entries to your students, secretly select one student each week to use ChatGPT or another generative AI tool to write their response. Ask students to discuss who they believe used AI that week and why.
  • Have students experiment with Typecast or other AI voice generators to create messages in the voice of public figures that are aligned or misaligned with that individual’s stance on contemporary issues.
  • Have students investigate instances of the use of tools such as Adobe Express to create misleading images that circulated online (for example, fake viral images of explosions at the Pentagon and the White House). The News Literacy Project keeps a list here. Analyze who circulated the images and why. How were they discovered to be fake? Ask students to experiment with the image generating and editing tools used in the instances they discover, or with free alternatives.

Tools for Verifying Information

Zeynep Tufekci argues that the proliferation of generative AI tools will create a demand for advanced skills including “the ability to discern truth from the glut of plausible-sounding but profoundly incorrect answers.” Help your students hone their analytical skills, understand the emotional aspects of information consumption and develop a habit of questioning and verifying.

  • Increase students’ self-awareness of their own information consumption habits and their methods for verifying information they are exposed to. Ask students to keep a journal for a week of what they shared, liked, up- or down-voted, reposted, etc. on social media. What kind of content do they tend to engage with? What feelings motivated them to share or interact with content, and how did they feel afterward? If shared content included information or took a stance on a topic, did they verify it before sending? What do they notice about their information consumption after observing their habits for a week, and what might they consider changing?
  • Introduce students to the SIFT method (Stop, Investigate the source, Find better coverage, Trace claims, quotes and media to the original context). Note that some students may already know of this popular approach to assessing information online, so be sure to first ask if anyone can describe the method for others. Discuss how this approach may need to be modified in the age of AI. Challenge your students to design a modified method that accounts for the difficulty of finding a source and tracing claims where generative AI tools are involved.
  • Given the difficulty, or even impossibility, of differentiating AI-generated content from human-generated content and tracing AI-generated content to its source, help students focus on analyzing the content itself. Teach students lateral reading strategies and have them investigate claims in articles posted online using these strategies.
  • Develop your students’ habit of asking questions by utilizing tools such as the Question Formulation Technique (registration is free) and the Ultimate Cheatsheet for Critical Thinking.

Tools for Shared Understanding

One of the most insidious consequences of AI-generated disinformation is the way in which it can undermine our confidence in the reality of anything we see or hear. While it’s important that we prepare students to confront disinformation and to be aware of how generative AI will impact their information environment, we must also reinforce the importance of trust and shared understanding for the functioning of a healthy democracy.

  • Help students recognize and overcome simplistic and dualistic thinking. Developing an awareness of the criteria and procedures used by different disciplines to verify claims will give students a framework for establishing their own ways of verifying claims. One approach might be to analyze the basis upon which generative AI tools such as ChatGPT make claims.
  • If confronted with a clear instance of mis- or disinformation in the context of a classroom or course-related interaction (for example, a student asserts the truth of a blatantly false conspiracy theory in a discussion board post), correct the inaccuracy as soon as possible. Point to established evidence for your claim. Help students see the difference between topics upon which we can engage in fruitful debate and topics where there is broad agreement, and help them identify bad-faith approaches to argumentation.
  • Ask students to create a healthy media diet for themselves. Where might they find verifiable information on topics of interest? What constitutes a good source of information on that topic?
  • Promote empathy for others. We are more likely to believe inaccurate information about others if we are already predisposed to think of those individuals or groups negatively.
  • Encourage students to see themselves as actors within their information environment. Have them reflect on all of the sources of information they access and contribute to, including those within your class. Ask them to consider how they are using generative AI tools to inject content into that environment, and what the implications of their decisions, and of similar decisions by others, may be for that information environment overall.

In the next installment of our series, we’ll dive a little deeper into the issue of bias and equity as it relates to AI. In the meantime, if you’d like to discuss digital literacy, artificial intelligence, or any other topic related to your teaching, please book an appointment for a one-on-one consultation with a member of the CAT staff.

References

Carolus, A., Augustin, Y., Markus, A., & Wienrich, C. (2023). Digital interaction literacy model: Conceptualizing competencies for literate interactions with voice-based AI systems. Computers and Education: Artificial Intelligence, 4, 100114.

Ecker, U. K., Lewandowsky, S., Cook, J., Schmid, P., Fazio, L. K., Brashier, N., … & Amazeen, M. A. (2022). The psychological drivers of misinformation belief and its resistance to correction. Nature Reviews Psychology, 1(1), 13-29.

Ferrara, E. (2023). Should ChatGPT be biased? Challenges and risks of bias in large language models. arXiv preprint arXiv:2304.03738.

Goldstein, J. A., Sastry, G., Musser, M., DiResta, R., Gentzel, M., & Sedova, K. (2023). Generative language models and automated influence operations: Emerging threats and potential mitigations. arXiv preprint arXiv:2301.04246.

Ng, D. T. K., Leung, J. K. L., Chu, S. K. W., & Qiao, M. S. (2021). Conceptualizing AI literacy: An exploratory review. Computers and Education: Artificial Intelligence, 2, 100041.

Organization for Economic Co-operation and Development, 2013

Wang, B., Rau, P.-L. P., & Yuan, T. (2022). Measuring user competence in using artificial intelligence: Validity and reliability of artificial intelligence literacy scale. Behaviour & Information Technology, 42(9), 1324-1337.

Wuyckens, G., Landry, N., & Fastrez, P. (2022). Untangling media literacy, information literacy, and digital literacy: A systematic meta-review of core concepts in media education. Journal of Media Literacy Education, 14(1), 168-182. https://doi.org/10.23860/JMLE-2022-14-1-12

Dana Dawson serves as Associate Director of Teaching and Learning at Temple University’s Center for the Advancement of Teaching.