Grouping by Strengths

by Meg Steinweg and Melanie Trexler

Faculty frequently design team projects to enable students to accomplish tasks they cannot complete alone and to build teamwork skills. The latter, according to the National Association of Colleges and Employers (NACE), is one of the top eight career-readiness skills that students need to learn in college (NACE 2022). Yet, instructors face a common challenge: How do you put students in groups that work well together?

The following activity helps instructors create groups that incorporate students' self-determined strengths, student choice, and instructor matchmaking. Additionally, the assignment invites students to reflect on their strengths and express agency in choosing their group members.

Part 1: Strengths Assessment

  1. Go to https://high5test.com/ and select “Find Your Top Strengths.”
  2. Take the 100-question High5 Test (8–15 minutes). Answer as best you can.
  3. Read and reflect on your results.

Part 2: Write the Paper

Reflect critically on the five strengths as they relate to your life and to your role in a group. For each of the five strengths:

  1. State the strength
  2. Describe it. Copy and paste the paragraph about your strength from the High5 website.
  3. Write a paragraph noting where you see this strength appear in your own life and in how you work in groups. Use examples of group work in other classes, on teams (ex: sports, volunteering, etc.), and/or in internships or jobs.
  4. Conclusion: Do you think these describe your core strengths as an individual? Why?

Part 3: Presentation

Present your strengths to the class in a 2–3-minute presentation.* Highlight at least 2 strengths you possess. How do you use these strengths in a group? Why are you a valuable team member? What are strengths you are looking for in a group member? Why?

*This could be recorded, and presentations viewed by students outside of class.

Part 4: Listening and group member selection write-up

As you listen to your peers’ presentations, consider how their skills and strengths complement your own. You have a voice in choosing potential group members, though the instructor determines the final groups. You will be on a team with at least one person you select.

  1. In order, list four group members you would like to work with.
  2. In one paragraph per person (3–4 sentences), explain:
    • How do your strengths complement each other in a group project?
    • What is one possible way your strengths could clash, and how could you overcome that challenge?

Meg Steinweg is Associate Professor of Biology at Roanoke College. Melanie Trexler is Associate Professor of Religion at Roanoke.

This article was released under the Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International (CC BY-NC-SA 4.0) license as part of the Teaching Messages Collection 2023–24.

Snack Baskets for Better Learning

Wren Mills, PhD

For many years, I have brought a snack basket with me to my classes. Some colleagues have given me side-eye for this, but food insecurity on college campuses is a real problem, and it appears (in my experience) to have been made worse by the pandemic. We have known for a long time that school children have trouble concentrating when they do not have proper nutrition, hence free breakfast and lunch programs nationwide. College students are no different. While many institutions require on-campus students to purchase meal plans if they live in the residence halls, struggling students usually purchase the bare minimum, which might be as little as one meal per day. Some students stack their classes back-to-back with no breaks (especially common with first-year students) and sometimes forget to grab a snack to tide them over before they head to class for the day. Other students work all day and come to campus without time to grab something to eat before their long night classes. With food insecurity and forgetfulness in mind, my snack basket began.

On the first day of class, I bring the snack basket and talk about it, about food insecurity, and about how I never want anyone to sit in my class hungry, thinking about what or when they will next eat or trying to ignore their rumbling tummies. I want them to take a snack if they need it, but if they don’t, please leave it for those who do. It is rare that anyone takes what I’d call “more than a fair share,” but sometimes they do. If I notice a student doing that, I ask them privately after class if they know about the food pantry on campus (which is conveniently across from my building). [At Temple, the appropriate resource to refer students to is The Barnett & Irvine Cherry Pantry, located in room 224A of the Howard Gittis Student Center. -Ed.]

My snacks are varied. I ask on my information sheet if there are any food allergies, too, so that if we have someone with a peanut allergy, for example, I can share with the class as a whole to avoid the things that might set off their neighbor.  (Students usually happily “out” themselves and their allergies and let people know if it’s a “not in the room with me, please” or a “just make sure I’m not next to you” allergy.)  I always have a breakfast bar or granola of some kind.  I also include chips, cookies, and trail mix.  There are suckers.  There is chocolate.  There are gummies and hard candies.  There are applesauce sleeves.  I get cheese and crackers, too.  It is rare that a hungry soul can’t find something to help them out.

How do I afford this?  Like probably all of you, I’m certainly not wealthy or reimbursed.  I watch the clearance areas of my local groceries—they often put boxes there that are damaged, but the food inside is perfectly fine and 75% off or more.  I use coupons.  I watch for sales.  I live for the weeks after Valentine’s, Easter, Halloween, and Christmas!  If I bring fresh fruit, I do so on Mondays so that hopefully it’s gone by Friday. And I remind the students that this is out of my own pocket, and that I do my best to keep the snack basket full, but sometimes it might be a bit empty.  I’ve had students bring things to contribute that they bought (or their parents did) and decided they don’t like.  I’ve had colleagues contribute, too.  (The student workers and graduate assistants always know they can come and get a snack, too, if they need one).

I know not everyone will be interested in doing this. Some will say this isn’t part of their job. And it’s not—it does go beyond normal teaching, service, and scholarly duties. Alternatively, it would only take a moment of your time to look up what help is available on your campus and in your community for food-insecure students. Most campuses now have food pantries. In my city, the churches near campus offer meals and food pantries to our student population just as they do for everyday folks trying to make ends meet. A resource sheet that you can hand to students will be appreciated and remembered.

Articles of Interest

College student hunger statistics and research. (n.d.). Feeding America.org. https://www.feedingamerica.org/research/college-hunger-research

McCoy et al. (2022). Food insecurity on college campuses: The invisible epidemic. Health Affairs. https://www.healthaffairs.org/content/forefront/food-insecurity-college-campuses-invisible-epidemic

Wren Mills is Assistant Professor in the School of Leadership and Professional Studies at Western Kentucky University.


A Guide to Creating an Analytic Rubric

Dana Dawson, Ph.D.

Rubrics are tools used by faculty to guide our assessment of student performance and to make our expectations transparent for students. Using a rubric can help make grading more efficient for faculty and fair for students, but when constructed well and shared along with assignment or activity descriptions, they also benefit student learning. Rubrics explicitly represent our performance expectations and allow students to direct their effort toward the intended goal of an activity or assignment. By asserting ahead of time our highest expectations, we encourage students to reach toward those high standards. The use of rubrics promotes more specific feedback and guidance on future performance which allows students to target specific areas for improvement. When we encourage students to review the rubric ahead of time and reflect on feedback after the fact, we can help our students develop the habit of reflecting on their learning.

Rubrics take a variety of forms, from checklists of attributes that would demonstrate competence to analytic rubrics featuring descriptions of levels of competence in relation to different criteria. In this post, I will guide you through the steps of creating an analytic rubric (a rubric that features discrete dimensions and descriptions of performance standards for each of those dimensions), as featured in our rubric creation worksheet.

Parts of an Analytic Rubric

Analytic rubrics consist of dimensions, scale labels, and descriptions of performance standards. A rubric may feature any number of dimensions, but including too many may make the rubric difficult for a student to interpret. Dimensions may be weighted differently to indicate to students which are most crucial to success on the assignment. Scales generally include 3–5 levels.

|                                | Scale label 1                        | Scale label 2                        | Scale label 3                        |
|--------------------------------|--------------------------------------|--------------------------------------|--------------------------------------|
| Dimension 1 (number of points) | Description of performance standards | Description of performance standards | Description of performance standards |
| Dimension 2 (number of points) | Description of performance standards | Description of performance standards | Description of performance standards |

Rubric Scale Example

Steps to Create a Rubric

1) Reflect

Begin with a freeform reflection on your goals for the assignment or activity. What are the main things you want this activity/assignment to accomplish; in other words, what are your goals? What content knowledge and skills is/are needed to productively complete this assignment/activity? What behaviors demonstrate achievement of the assignment’s goals? What are the highest expectations you have for students on this assignment? What evidence can students provide that would show they have accomplished what you hoped they would accomplish when you created the assignment/activity? What would the worst demonstration of this assignment look like? 

2) List

Use your reflection to formulate a list of the most important attributes of success on the activity/assignment. What would an excellent submission or performance look like? What specific characteristics would it have? What are the most important attributes of success for this assignment/activity? Include a description of the highest level of performance you expect for the item.

3) Group

Group items with similar performance criteria and give your groups titles. These groups will become your rubric dimensions. For example, your list may look something like this:

  • Presentation is cogent
  • Presentation is organized
  • Thesis demonstrates thoughtful analysis of the text
  • Thesis and evidence demonstrate familiarity with the text
  • There is evidence for the thesis
  • Presentation anticipates counter-points
  • Specific position (perspective, thesis/hypothesis) is imaginative, taking into account the complexities of an issue*
  • Limits of position (perspective, thesis/ hypothesis) are acknowledged
  • Organizes and synthesizes evidence to reveal insightful patterns, differences, or similarities related to focus
  • Central message is compelling (precisely stated, appropriately repeated, memorable, and strongly supported)

The above list may be grouped as follows:

Thesis

  • Thesis demonstrates thoughtful analysis of the text
  • There is evidence for the thesis
  • Central message is compelling (precisely stated, appropriately repeated, memorable, and strongly supported)

Textual analysis

  • Thesis and evidence demonstrate familiarity with the text
  • Addresses complexity of text

Supporting points

  • Presentation anticipates counter-points
  • Limits of position (perspective, thesis/ hypothesis) are acknowledged

Creativity

  • Specific position (perspective, thesis/hypothesis) is imaginative, taking into account the complexities of an issue
  • Organizes and synthesizes evidence to reveal insightful patterns, differences, or similarities related to focus

4) Apply

You may now use your groups to fill in the left-hand side of the rubric and your notes from the reflecting and listing phases of this exercise to establish your criteria. This worksheet includes a blank table that you may use to begin drafting your rubric. Remember to consider whether you want to assign point values to each of the dimensions in your rubric.

A Note on Scale Labels

Too often, the language we use for scale labels can read as harsh and judgmental to students. For example, labels such as “Weak,” “Poor,” or “Unacceptable” do not convey to our students a belief that they can improve. Here are some suggestions for scale-label language that is less likely to be discouraging.

Advanced, intermediate high, intermediate, novice

Exceeds expectations, meets expectations, developing towards expectations

Exceeds expectations, meets expectations, progressing, not there yet

Distinguished, proficient, intermediate, novice

Mastery, partial mastery, progressing, emerging

Sophisticated, highly competent, fairly competent, not yet competent

Concluding Thoughts

Taking the time to reflect on your goals for an activity or assignment and to concretely articulate your expectations will not only improve the quality of the rubric you create, but will help guide your instruction. Clearly identifying what you expect your students to know or be able to do will allow you to work backwards from those expectations to the exercises and materials needed in order for students to build the necessary skills and content knowledge.

For help designing and implementing rubrics, feel free to book an appointment with a CAT educational developer or educational technology specialist. Go to catbooking.temple.edu or email cat@temple.edu.

*Some of the performance criteria description language used here is borrowed from the AACU Value Rubrics.

Dana Dawson serves as Associate Director of Teaching and Learning at Temple University’s Center for the Advancement of Teaching.

Using Reading Prompts to Promote Students’ Academic Reading

Zeenar Salim

Are you concerned that students come to class without doing the assigned reading? Have you ever wondered how you can get them to read? Students in higher education are expected to comprehend a text, connect their prior experiences with it, evaluate it, and consider alternative viewpoints. Reading prompts are one way to motivate students to read: they improve students’ comprehension and critical thinking skills by engaging them actively with the reading material.

Providing reading cues or prompts helps learners read actively and analyze their own thoughts during and after reading, so they can expand, clarify, or modify their existing thinking about the concept or idea at hand. Reading prompts can be grouped into six categories: a) identification of a problem or issue, b) making connections, c) interpretation of evidence, d) challenging assumptions, e) making applications, and f) taking a different point of view. Sample questions for each category are as follows:

  1. What is the key issue/concept explained in the article? What are the complexities of the issue? (Identification of problem or issue)
  2. How is what you are reading different from your prior knowledge around the issue/topic? (Making connections)
  3. What inferences can you draw from the evidence presented in the reading? (Interpretation of evidence)
  4. If you got a chance to meet the author, what are the key questions that you would ask the author? (Challenging assumptions)
  5. What are the lessons for your practice that you have drawn from this reading? (Application)
  6. If you wrote a letter to a friend who has no expertise in this subject area, how would you explain the theoretical concept presented in the article? (Taking a different point of view)

Generally, students are asked to complete the reading prompts before the next class by writing a paragraph-long response to each question. Teachers may ask some or all questions depending upon the learning objectives of the session and may adapt the question(s) to elicit specific information about the text. For more sample questions and a detailed review of the literature on reading prompts, see Tomasek (2009).

Reference

Tomasek, T. (2009). Critical reading: Using reading prompts to promote active engagement with text. International Journal of Teaching and Learning in Higher Education. https://files.eric.ed.gov/fulltext/EJ896252.pdf

Zeenar Salim is a Fulbright PhD Candidate at Syracuse University, where she works at RIDLR (Research in Design Learning Resources).


Towards Shining Teaching Moments

Stephanie Laggini Fiore

I love the beginning of the fall semester, but this fall, I’m feeling especially optimistic! The fall semester always brings such promise with it, as we plan what we’ll teach to a fresh crop of students that we hope (fingers crossed!) will respond to what we have to offer in positive and, dare I say it, soul-satisfying ways. There is nothing like that moment when you see the spark in a student’s eyes or the triumph they feel when they’ve mastered a difficult concept or skill. Every fall, we hope for more of these moments that reconnect us to the joy of teaching. 

I still have imprinted on my memory one of these moments in an Italian course I created to improve student literacy in the language. In addition to the required reading for the class, I had students choose a book from a lending library I had created of all kinds of reading material in Italian, from romance novels to non-fiction to classic works of literature. They were to read 20 minutes a night, keep a log of what they were reading, and swap out the book for a new one when they had finished that one. The point was, of course, to get them reading frequently enough to develop their literacy and to cultivate the belief that they could read in the language. One day early in the semester, a student brought in the book she had just finished to exchange it for a new one and, as she was putting it in the box, she held it up and said for all to hear, “I can’t believe it, but I actually read this WHOLE book!” Her sense of accomplishment and, at the same time, disbelief, was palpable – a true moment of joy in her learning.  

At the CAT, those moments often come in my work with faculty, and this August has been truly special, leaving me with a sense of great optimism for this academic year. Perhaps it has been the energy I’ve derived from large in-person events where I’ve been able to reconnect with colleagues, discuss teaching, and drink in your energy as you anticipate a promising new semester. The first-in-a-very-long-time university-wide New Faculty Orientation and the annual TA Orientation that the CAT hosts were generative events, full of new faces from all over the world, veteran instructors and staff who came to welcome them, and great conversations about teaching. The two interdisciplinary cohorts of faculty who completed the Teaching with AI Teaching Circle brought inspiring creativity and brave openness to change as they gathered together over two days to consider how to intentionally incorporate generative AI tools into teaching. And my numerous visits to collegial assemblies and departmental gatherings to discuss generative AI and teaching have meant for me reconnection, rethinking, and renewal. The long and the short of it is that I feel the same positive and soul-satisfying vibes derived from gratifying moments with students when you, my dear colleagues, experience a spark about teaching in reimagined ways.

I am keeping my fingers and toes crossed that you experience many of those shining moments in your classes this semester. Let’s go create some sparks! 

Stephanie Laggini Fiore serves as Associate Vice Provost and Senior Director at Temple University’s Center for the Advancement of Teaching.

Full Disclosure of the Terms of Success: Nine Things to Tell Your Students

Dana Dawson, Ph.D.

In a 1997 essay entitled “For Openers… an Inclusive Course Syllabus,” Terence Collins argues for the importance of what he calls “full disclosure of the terms of success” – making explicit the “befuddling mores, assumptions, work habits, background knowledge, key terms, or other markers of the academic subculture too often left implicit, inaccessible to outsiders.” By the time most college instructors or TAs teach a course, lab, studio or recitation for the first time, we have been embedded in the context of higher education for long enough to have forgotten what we found mystifying and incomprehensible in those early days on campus. It’s important to periodically remind ourselves that what is obvious to us needs to be made explicit to our students.

So, in service of encouraging full disclosure of the terms of success and in keeping with a genre of pseudo-journalism I often find irresistible, I present nine things all professors should tell their students (including their graduate and professional students).

1. It’s normal to feel like an imposter. 

What we have come to call “imposter syndrome” is the feeling that we do not have the requisite skills or knowledge to be where we are and that we have somehow tricked others into believing we are something we’re not (Clance and Imes, 1978). Unfortunately, such negative self-beliefs, however unfounded, can have very real effects on learning and persistence (Holden et al., 2021). Reassure your students that it was not an accident that they found their way into your classroom or program. Share experiences you or your colleagues have had with feelings that you don’t belong and how you overcame them. 

2. You can ask for things. 

You may have noticed that while some students don’t hesitate to ask for extensions, help, accommodations or clarifications, others suffer in silence even where there are supports they could be taking advantage of. Your students may worry that asking for help is a sign that they don’t belong (see #1 above) or feel unsure of what they can ask you about and when it’s appropriate to ask. Make it clear that they can ask, even if the answer may not always be yes. 

3. Treat your learning as a never-ending research project.

There is no one-size-fits-all approach to succeeding in one’s studies, so it’s important that our students regularly ask themselves whether what they’re doing is working. Encourage your students to use metacognitive strategies to interrogate their study practices and find opportunities for improvement (McGuire and McGuire, 2015).

4. All students can benefit from academic support.

The best way to ensure students who need academic support will seek it out is to reinforce the idea that all students benefit from academic support (Thomas and Tagler, 2019). Remind your students that even star athletes receive coaching. Academic support will benefit any student and will most benefit those who seek support early and often. Remember that students coming to your campus from high school may be completely unfamiliar with student support centers, mental health counseling centers, student health clinics and other student supports. Transfer and graduate students who are new to your campus might be familiar with such supports but not where to find them. Be sure to include this information in your syllabus and course site, and to bring it up in class.

5. We are all still learning.

Another way of saying this is that there are no bad questions. Be transparent about your on-going learning, for example, research findings that surprised you and changed how you thought about your field or an article, book or conference presentation that taught you something new. 

6. What your discipline does and how your course fits into that framework.

When I started my undergraduate degree, I had never heard of Sociology, the discipline I ultimately chose as a major. As soon as I started taking Sociology courses, I knew I was in the right place but struggled to explain to my family what I was going to do with the degree because I wasn’t entirely sure how the content taught in my courses was applied outside of an academic context (or in an academic context, for that matter). Pull back the curtain on your discipline. What are the big questions? Why do they matter? Where does what your course covers fit into the fabric of your discipline? How do people use the skills and knowledge specific to your field in non-academic contexts?

7. What you assume your students already know and can do at the start of your course and what to do if they’re missing any pieces.

Are there concepts, authors, formulas, procedures, methods, etc. that your students should be familiar with? Are there courses you’re assuming they’ve taken? Being explicit about anticipated prior knowledge in a pre-semester questionnaire or early in the semester will give your students an opportunity to fill in gaps sooner rather than later.

8. Preferred communication guidelines.

Do you expect to be called Dr. ___? Would you rather not be called Dr. ___? Do you refuse to read emails that don’t begin with “Dear ___,” and end with a period? Should students nudge you if they haven’t received a response to an email within a couple of days? A couple of weeks? In addition to ensuring you are communicated with in a manner with which you are comfortable, this is an important part of our students’ professional development.

9. You’re glad they’re in your class.

I’m glad you read this far! There. Now didn’t it feel good to read that?

References

Clance, Pauline Rose, and Suzanne Ament Imes. “The Imposter Phenomenon in High Achieving Women: Dynamics and Therapeutic Intervention.” Psychotherapy: Theory, Research and Practice, vol. 15, no. 3, 1978, pp. 241-247.

Collins, Terence. “For Openers, An Inclusive Course Syllabus.” New Paradigms for College Teaching, edited by W. E. Campbell & K. A. Smith, Interaction Book Company, 1997, pp. 79-102.

Holden, Chelsey L., et al. “Imposter Syndrome Among First- and Continuing-Generation College Students: The Roles of Perfectionism and Stress.” Journal of College Student Retention: Research, Theory & Practice, 2021. DOI: 10.1177/15210251211019379.

McGuire, Stephanie, Saundra Yancy McGuire, and Thomas Angelo. Teach Students How to Learn: Strategies You Can Incorporate Into Any Course to Improve Student Metacognition, Study Skills, and Motivation. Routledge, 2015.

Thomas, Christopher L., and Michael J. Tagler. “Predicting Academic Help-Seeking Intentions Using the Reasoned Action Model.” Frontiers in Education. Vol. 4. Frontiers Media SA, 2019.

Dana Dawson serves as Associate Director of Teaching and Learning at Temple University’s Center for the Advancement of Teaching

Survival Guide to AI and Teaching, pt. 10: Talking to Your Students About AI and Learning

Stephanie Laggini Fiore

While we have dealt with many aspects of AI and teaching in this blog series, we want to end the series with the most important aspect—talking to your students about AI and learning. One of the realities of the present moment is that we are all in the midst of a disruptive change, one that neither we nor our students fully understand how to navigate. Therefore, whether or not we decide to allow the use of AI in our classes, it is vitally important to discuss these tools with our students in productive ways. 

At the CAT, we have seen plenty of draconian language on syllabi over the years (“Don’t even think about cheating; you will be caught!!”), but the old adage about catching more flies with honey than with vinegar stands true here as well.  Establishing trust in the learning environment, having clarifying conversations about AI and the choices you have made for the course, engaging students in thinking critically about the use of these tools and what they mean for society and for learning, and welcoming students’ thoughts will be far more effective than setting up an adversarial dynamic. We recommend dedicating time to discussing generative AI during the first week of the semester and then re-engaging students briefly before each written assignment. You should, of course, take some time to go over your AI syllabus statement, explaining your reasons for the decisions you have made, but it is important to go beyond that conversation to allow space for students to reflect on what it means to use these tools for learning. 

Here are some thoughts on how to speak to your students about AI:

  • Consider using an anonymous poll that asks the extent to which your students have used these tools. This will provide a window into how familiar your students are with generative AI.
  • Begin the conversation by asking students what they know about generative AI. You may be surprised about what they do (or don’t) know. Continue with a clarifying conversation on how generative AI tools work, including their benefits and pitfalls. It will be most effective if you can show examples of those benefits and pitfalls—for instance, an example of a hallucination (inaccuracies) or biased content that it might reproduce. 
  • Engage students in thinking about how your assignments help them achieve the goals of your course. We often recommend using Bloom’s Taxonomy for this exercise. If, for example, you have a goal that reaches the level of evaluation on the taxonomy, how will the assignments (if completed by the student) aid in their attainment of that goal? 
  • Think about how to connect your students to the value of what they are learning. Often students see our courses (especially our required courses) simply as hoops to jump through on the way to a degree. Can you articulate for your students the reason why what they are learning will benefit them? What relevance will it have for their professions, personal growth, future academic work, or communities? Helping students to find meaning in what they are learning will be key to managing AI use.
  • Include a discussion about AI and academic integrity. Why is academic integrity important? How can we think about the use of generative AI in ethical terms? Use case studies to have them ponder whether particular uses are ethical; for instance, ask how they would feel if you offloaded all student feedback to an AI. Would that be an ethical use of the tool, or would it be a breach of your responsibility as an instructor?
  • Ask students to discuss important philosophical questions that will get them thinking about the nature of learning, thought, and voice, such as:
    • Why do we write? What kinds of thinking happen when we write? Query students about how they use writing outside of class: do they keep a journal, write their opinions on social media, text friends when something important happens? Why might they turn to writing to express their thoughts? 
    • What does it mean to cede our thinking and our voice to non-sentient machines? Do we want to live in a world where none of our passions and ideas are expressed in the way that we want to express them, and where originality of thought is replaced by a process of scraping a dataset for answers? 

Talking to a student when you suspect cheating

You’ve followed our advice above and talked to your students about AI from day one of the semester, clarifying permissible use in your course. Still, you suspect that a student in your class has used AI in ways that you have not allowed. The first step is always to talk to the student. Here are some tips for tackling this discussion: 

  • Don’t take it personally! Cheating can often feel like a personal attack and a betrayal of all the work you’ve put into your teaching. Remember that a student’s decision to use AI to take shortcuts is probably about them, not about you. 
  • Check your biases. Is your suspicion of your student’s work well-founded? Would you have the same concerns if the work had been handed in by other students? 
  • Beware of falsely accusing students outright. As was established in a previous post, our ability to accurately identify the use of generative AI tools is, at present, quite weak.
  • Ask the student to meet with you. Simply say something like “I have some concerns about your assignment. Please come to see me.” 
  • When you meet with the student, try not to be confrontational (remember that you may not be certain they used AI in an unauthorized manner). Instead, start by asking questions that give them a moment to tell the story of their writing process, such as: How were you feeling about the assignment? What do you think was challenging about it? Why don’t you tell me what your process was for getting it done? If research was involved, you can ask what sources they used. If they were writing about something they were supposed to read or visit (an art exhibit, for instance), ask pointed questions that get at whether they actually engaged in that activity.
  • Then state your concerns: I’m concerned because the writing in this assignment doesn’t seem to match the writing in your other assignments, and the AI detector tool flagged it as AI-written. Review any inconsistencies, odd language, repetition, or hallucinated citations with the student.
  • Use developmental language. Remember that your student may have used generative AI without realizing it is considered cheating, or there may have been factors that made them feel that they needed to cheat. A conversation with your student can be a learning opportunity for them. 
  • Discuss with colleagues in your department what a reasonable penalty might be for unauthorized use of generative AI. Consider also when it might be necessary to contact The Office of Student Conduct and Community Standards. (Remember, however, that speaking with your student is always the first step before taking further action.) If you conclude that the student cheated, you’ll have to decide whether to allow them to complete the assignment again on their own (perhaps with a penalty) or to offer no opportunity to right the ship. Consider that we are in a developmental stage with these tools, and it may be worth offering the do-over if the student owns up to it.
  • Self-reflect. Given that students often take shortcuts for reasons related to course structure, review our blog post on academic integrity and AI and consider whether your course is designed to reflect the practices that promote academic integrity.

In a world in which AI is here to stay, it is essential that we support students’ ethical and productive interaction with these tools. No matter the discipline, we need to take on the responsibility of developing our students to adapt to this new reality with full awareness of the implications of AI use for learning, for work, and for society. 

We know that this is all new and it is not easy—the CAT is here to help. To book an appointment with a CAT educational developer or educational technology specialist, go to catbooking.temple.edu or email cat@temple.edu.

A Survival Guide To AI and Teaching pt. 9: AI and Equity in the Classroom

Dana Dawson, Ph.D.

In previous posts in this series, we noted how generative AI can perpetuate biases and exacerbate the digital divide. Here, we will explore in more depth the potential for these tools to widen the institutional performance gaps that impact learning in higher education, but also the potential for generative AI to create a more equitable learning environment. We conclude with suggestions for what you can do to minimize possible negative impacts of generative AI for students in your courses. 

Rapid improvements in the capabilities of generative AI have a tendency to provoke doom spiraling, and there are indeed some very real concerns we will have to grapple with in coming years. While generative AI at times produces helpful summaries of content or concepts, it is prone to error. Students with tenuous confidence in higher education or in their ability to succeed in their studies are less likely to engage deeply in their coursework (Biggs, 2012; Carver & Scheier, 1998) and may rely excessively or uncritically on AI tools. Over-reliance on generative AI to reduce effort, rather than as a mechanism for jumpstarting or supporting conceptual work, robs students of opportunities to practice and develop the very creativity, critical thinking, and analysis skills that are likely to become increasingly valued as AI becomes more widely available. In addition, where we neglect to carefully vet content created by AI, we run the risk of repeating erroneous information or perpetuating disinformation.

The prospect of bias and stereotypes impacting students’ experience in higher education arises not only from the content generative AI produces (Bender et al., 2021; Ferrara, 2023), but also from the challenge of determining whether a student has appropriately used the tools. AI detectors cannot reliably differentiate human- from AI-generated content. Faculty must be aware that judgments of whether students relied excessively on AI may be influenced by assumptions that have more to do with factors such as race, gender, or spoken-language fluency than with student performance.

Finally, faculty who wish to encourage students to experiment with and integrate the use of AI tools must be aware that inequitable access to broadband internet and digital tools, along with varying levels of preparation to use the tools effectively, may differentially impact students. Variable access to broadband prior to, or during, their postsecondary studies raises digital equity concerns. Some students will come to our classes well-equipped to engineer prompts and vet generated content, while others will be encountering these technologies for the first time. That high-quality AI applications are often behind paywalls compounds these issues.

On the other hand, some scholars and policy-makers have pointed to ways that these tools can be productively used to support student learning and success. AI tools such as ChatGPT can be used to fill in knowledge gaps, related either to a field of study or to being a college student more generally, that are particularly salient for first-generation students or those whose previous educational experiences insufficiently addressed certain skills or topics. GPT-3 responses to prompts such as “What are the best ways to study?” and “How do I succeed in college?” generate strategies that are useful and can be expanded upon with additional prompts. Warschauer et al. point out that for second language learners, the ability to quickly generate error-free email messages or to get feedback on one’s writing reduces the extra burden of studying disciplinary content in a second language. Students can prompt generative AI tools to explain concepts using relatable analogies and examples. For students with disabilities, generative AI can serve as an assistive technology, for example by improving ease of communication for those who must economize words, assisting with prioritizing tasks, helping practice social interactions, or modeling types of communication.

RECOMMENDATIONS

1. Reduce the potential for bias to impact your assessment of unauthorized student use of generative AI tools by determining the following before the start of the coming semester:

  • Which assessments have the most potential for unauthorized use?
  • Is there an alternative mechanism for assessing student learning for those assessments most prone to unauthorized use?
  • What are my guidelines for appropriate use of generative AI tools in this class?
  • Can I reliably detect inappropriate use?
  • Is my determination of inappropriate use subject to bias? 
  • What will my next steps be if I suspect inappropriate use?

If you’re not sure whether to allow use of generative AI tools, review our decision tree tool.

2. Clearly communicate your classroom policies on use of generative AI and talk with (not to) your students about those policies, ensuring they understand acceptable limits of use.

3. If you are encouraging the use of generative AI tools as learning tools, consider questions of access by:

  • Assessing the extent to which your students know how to use and have access to the tools; and
  • Showing students how to use the tools in ways that will benefit their education (for example, using follow-up prompts to focus initial queries). Temple University Libraries has created an AI Chatbots and Tools guide to help our students learn to judiciously use these tools.

4. Educate students on how generative AI tools may be biased, can perpetuate stereotypes and can be used to increase dissemination of mis- and dis-information.

5. Help students find their own voice and value a diversity of voices in writing and other content that has the potential to be generated by AI tools.

6. Consider a Scholarship of Teaching and Learning (SoTL) project to study how generative AI use affects learning in your own courses.

In the next (and final) installment of our series, we’ll focus on how to talk to your students about generative AI. In the meantime, if you’d like to discuss AI or any other topic related to your teaching, please book an appointment for a one-on-one consultation with a member of the CAT staff.
 

Works Referenced

Bender, E. M., Gebru, T., McMillan-Major, A., & Shmitchell, S. (2021, March). On the dangers of stochastic parrots: Can language models be too big? In Proceedings of the 2021 ACM Conference on Fairness, Accountability, and Transparency (pp. 610-623).

Biggs, J. (2012). What the student does: Teaching for enhanced learning. Higher Education Research & Development, 31(1), 39-55.

Carver, C. S., & Scheier, M. (1998). On the self-regulation of behavior. Cambridge, UK: Cambridge University Press.

Ferrara, E. (2023). Should ChatGPT be biased? Challenges and risks of bias in large language models. arXiv preprint arXiv:2304.03738.

Dana Dawson serves as Associate Director of Teaching and Learning at Temple University’s Center for the Advancement of Teaching.

A Survival Guide To AI and Teaching pt. 8: Academic Integrity and AI: Is Detection the Answer?

Stephanie Laggini Fiore, Associate Vice Provost

Even if you’ve done your due diligence in clarifying acceptable use of AI in your course, you may still suspect that students are using these tools in unauthorized ways. While unauthorized AI use is not considered plagiarism, it is still cheating and a violation of the university’s standards on academic honesty, as it both uses “sources beyond those authorized by the instructor in writing papers, preparing reports, solving problems, or carrying out other assignments” and engages “in any behavior specifically prohibited by a faculty member in the course syllabus, assignment, or class discussion.” The sticky question is, therefore, “How can I be sure that students have indeed inappropriately used these tools to complete their work?” We may be tempted to lean on detection methods as a solution, but is that the answer to this conundrum?

Can Humans Detect AI Work Unaided?

In playing with AI tools, you may have noticed some quirks in the output they provide (based on your prompts): they can be repetitive, go off on tangents unrelated to the topic at hand, or simply produce generic or illogical text. Generative AI can also “hallucinate” citations or quote text that simply doesn’t exist. These “AI tells” can sometimes tip us off to unauthorized AI use by our students. But how good are we at accurately identifying these tells? Our colleagues at the University of Pennsylvania conducted an investigation into human ability to detect AI text. They found that participants in their study were significantly better than random chance at detecting AI output, but that there was large variability in ability among the participants. The good news is that their findings suggest detection is a skill that can be developed with training over time (Dugan et al., 2023). At this point, however, few of us have had the targeted training referenced by the authors, nor have we been able to dedicate the time necessary to improve. Barring glaring hallucinations or illogical content, most of us are simply not yet familiar enough with the features of AI text to be confident that our hunches are accurate. Try the test the researchers used; you may find, like me, that identifying AI text can be pretty darn challenging. And, of course, these tools will continue to evolve and improve, so our ability to detect non-human content may dwindle as generative AI advances.

Can AI Detectors Do the Job?

Don’t we all wish that AI Detectors (such as Turnitin, GPTZero, Copyleaks, or Sapling) were the answer to all of our generative AI concerns? Sadly, the simple and definitive answer to whether AI detectors can reliably detect AI-generated writing is “not at this time.” The reality is that these detector tools are flawed, delivering both false positives and negatives. In addition, unlike plagiarism detection tools, there is no way to verify that the detector’s conclusions are correct as the results do not link to source material in the same way. The CAT and the Student Success Center are conducting an investigation into error rates in a variety of AI detectors; early indications are concerning. In the meantime, others have pointed to the unreliability of the tools in both formal and informal investigations (here’s another), and in explanations of why these tools fail. Companies creating AI detectors themselves include disclaimers such as Turnitin’s statement that it “does not make a determination of misconduct…rather, we provide data for educators to make an informed decision.” They then go on to advise us to apply our “professional judgment” to these situations. That professional judgment, though, can itself be flawed. 

Some faculty have been advised to run student work through multiple detectors, but the potential for (both positive and negative) bias may come into play as we make decisions about which detector to believe when they return different results (which, from our experience, they most likely will). My wonderful student couldn’t possibly have used AI so I believe the detector that says it’s human-written. OR I don’t doubt for a minute that this student cheated, so I believe the detector that says it is AI-written. Importantly, these detector tools can’t tell us if students have used AI in the ways we have outlined in our syllabi as permissible. Let’s say I am allowing students to use AI for idea generation or for writing an outline, but not for writing full drafts of papers. The detector cannot tell me whether students have used AI in permissible ways. Finally, there are already hacks out there with advice on how to beat the detectors; for example, videos that demonstrate how to run AI-generated content through a rephraser in order to fool AI detectors. All this adds up to inconsistent and unreliable results whereby catching those who have engaged in academically dishonest behavior is hit or miss and does not provide incontrovertible proof of misconduct. Most importantly, we have to consider the very real and potentially damaging effects of wrongfully accusing students of cheating when they have not.*

What’s a Harried Faculty Member To Do?

If detectors aren’t reliable and our own skills at detecting AI writing are not mature, what’s the answer? While we will all be adjusting to this new reality for a while, we can keep some fundamental principles in mind to nudge our students towards transparency and academic honesty, the first of which is to give up on a surveillance mentality as it simply won’t be effective (and you don’t want to police students anyway, right?). Instead, think developmentally and pedagogically by taking these steps:

1. Shift from a reactive to a proactive stance. Test your assessments in a generative AI tool to see how vulnerable they are to AI use. Then make some intentional decisions about whether to change assessments or create new ones. In the long run, of course, it is all about our assessments. We may have used these same types of assessments for decades, but they simply may not work in the way we want them to in the age of AI. Review blog posts #4, #5, and #6 to think about changes you may make to your assessments, or if you missed our Using P.I. to Manage A.I. series, see our suggestions there. Remember, you can also make an appointment with a CAT developer to help you think this through.

2. Put a statement about AI in your syllabus clarifying acceptable use of AI! I can’t repeat this enough. Our colleagues at The Office of Student Conduct and Community Standards have expressed to us that it is essential to have clear guidelines clarifying what is and isn’t acceptable use of AI in our courses.

3. Engage your students in a discussion about generative AI and academic integrity, including why you have set the standards you have in your course. Remind them periodically about the ethics of generative AI use. (Look for an upcoming blog post for guidance on how to speak with your students about AI.)

4. Design courses that reduce the factors that induce students to cheat. James Lang, in his excellent book Cheating Lessons: Learning From Academic Dishonesty, reminds us that the literature on cheating points to an emphasis on performance, high stakes riding on the outcome, an extrinsic motivation for success, and a low expectation of success as factors that promote academic dishonesty. The good news is that we know also from the literature on learning that evidence-based teaching practices such as formative assessments, scaffolded assignments, ample opportunity for practice and feedback, development of a positive learning environment, and helping students to find relevance and value in what they are learning will both deter cheating by reducing these factors, and improve learning. Need help in reducing the temptation to cheat? Make an appointment with a CAT developer.

5. Plan thoughtfully for how you will manage situations where you suspect unauthorized use of generative AI, starting with a conversation with the student. (We’ll include advice on how to speak to students in the aforementioned future blog post.)

There is no doubt that generative AI is a disruptor in the educational space. Our response to that disruption matters for learning and for our relationship with students. Let’s work together thoughtfully towards a productive and forward-looking response. The answer is not detection; it is development.

*Note: If I haven’t convinced you to avoid these flawed detectors in accusing students of cheating, I agree with Sarah Eaton that it is essential to transparently state in your syllabus that you will be using detectors. Do not resort to deceptive practices in an effort to “catch” students. In addition, never use detectors as the sole source of evidence as, of course, the results may not be reliable.

Stephanie Laggini Fiore serves as Associate Vice Provost at Temple University’s Center for the Advancement of Teaching.

A Survival Guide to AI and Teaching pt.7: Inoculating Our Students (and Ourselves!) Against Mis- and Disinformation in the Age of AI

Dana Dawson

In a previous blog post in this series, we suggested making generative AI a subject of critical analysis in your courses. Here, we will focus on the importance of teaching our students to engage critically with content generated by AI tools and with the implications of generative AI use for our information environment. This topic lies at the intersection of digital literacy, information literacy, and the newly emerging field of AI literacy (Ng et al.; Wuyckens, Landry and Fastrez). Our students will need to develop the digital literacy required to solve problems in a technology-rich environment characterized by the regular use of AI tools, as well as the information literacy skills to navigate a complex information ecosystem. Though generative AI tools are digital tools that generate information, we have a tendency to interact with them as if they were social beings (Wang, Rau and Yuan, 1325-1326), and the manner in which they generate information requires special attention to issues of authorship, the impact of data-set bias, and the potential automation of disinformation dissemination.

As the efficacy and availability of generative AI tools advance, both we and our students will face a variety of information-related challenges. Generative AI can be used to automate the generation of online misinformation and propaganda, significantly increasing the amount of mis- and disinformation we are exposed to online. Flooding our information environment with disinformation not only increases exposure to bad information, but also distracts from accurate information and increases skepticism toward content generated by credible scholarly and journalistic sources. Even where users do not intend to propagate misinformation, Ferrara and others have pointed out that bias creeps into text and images produced by generative AI through the source material used for training data, the design of a model’s algorithm, data labeling processes, product design decisions, and policy decisions (Ferrara, 2). These limitations can result in the creation of content that seems accurate but is entirely made up, a phenomenon known as AI hallucination.

Our task as educators is to prepare our students to navigate an information environment characterized by the use of generative AI by inoculating against disinformation, helping them develop the skill and habit of verifying information, and building a conception of the components of a healthy information environment.

Tools for Inoculation

Inoculating ourselves and our students against mis- and dis-information functions much the same as inoculating ourselves against viruses through controlled exposure. By “pre-bunking” erroneous content students may themselves create using generative AI tools or may encounter online, we can help reduce the potential for them to be misled in later encounters.

  • Ask students to use ChatGPT to outline one side of a contemporary debate and then to outline the other side of the debate. Have them experiment with prompting the tool to write in the voice of various public figures or to modify the message for different audiences. Analyze what the tool changes with each prompt. Look for similar messages in social and news media.
  • Use the resources of the Algorithmic Justice League to explore how algorithms reproduce race- and gender-based biases.
  • If you assign discussion board entries to your students, secretly select one student each week to use ChatGPT or another generative AI tool to write their response. Ask students to discuss who they believe used AI that week and why.
  • Have students experiment with Typecast or other AI voice generators to create messages in the voice of public figures that are aligned or misaligned with that individual’s stance on contemporary issues.
  • Have students investigate instances of the use of tools such as Adobe Express to create misleading images that circulated online (for example, fake viral images of explosions at the Pentagon and the White House). The News Literacy Project keeps a list here. Analyze who circulated the images and why. How were they discovered to be fake? Ask students to experiment with the image generating and editing tools used in the instances they discover, or with free alternatives.

Tools for Verifying Information

Zeynep Tufekci argues that the proliferation of generative AI tools will create a demand for advanced skills including “the ability to discern truth from the glut of plausible-sounding but profoundly incorrect answers.” Help your students hone their analytical skills, understand the emotional aspects of information consumption and develop a habit of questioning and verifying.

  • Increase students’ self-awareness of their own information consumption habits and their methods for verifying the information they are exposed to. Ask students to keep a journal for a week of what they shared, liked, up- or down-voted, and reposted on social media. What kind of content do they tend to engage with? What feelings motivated them to share or interact with content, and how did they feel afterward? If shared content included information or took a stance on a topic, did they verify it before sending? What do they notice about their information consumption after observing their habits for a week, and what might they consider changing?
  • Introduce students to the SIFT method (Stop, Investigate the source, Find better coverage, Trace claims, quotes and media to the original context). Note that some students may already know of this popular approach to addressing information online, so be sure to first ask if anyone can describe the method for others. Discuss how this approach may need to be modified in the age of AI. Challenge your students to design a modified method that accounts for the difficulty of finding a source and tracing claims where generative AI tools are involved.
  • Given the difficulty or even impossibility of differentiating AI generated content from human generated content and tracing AI generated content to its source, help students focus on analyzing the content itself. Teach students lateral reading strategies and have them investigate claims in articles posted online using these strategies.
  • Develop your students’ habit of asking questions by utilizing tools such as the Question Formulation Technique (registration is free) and the Ultimate Cheatsheet for Critical Thinking.

Tools for Shared Understanding

One of the most insidious consequences of AI-generated disinformation is the way in which it can undermine our confidence in the reality of anything we see or hear. While it’s important that we prepare students to confront disinformation and to be aware of how generative AI will impact their information environment, we must also reinforce the importance of trust and shared understanding for the functioning of a healthy democracy.

  • Help students recognize and overcome simplistic and dualistic thinking. Developing an awareness of the criteria and procedures used by different disciplines to verify claims will provide a framework for students to establish their own ways of verifying claims. One approach might be to analyze the basis upon which generative AI tools such as ChatGPT make claims.
  • If confronted by a clear instance of mis- or disinformation in the context of a classroom or course-related interaction (for example, a student asserts the truth of a conspiracy theory that is blatantly false in a discussion board post), correct the inaccuracy as soon as possible. Point to established evidence for your claim. Help students see the difference between topics upon which we can engage in fruitful debate and topics where there is broad agreement, and to identify bad-faith approaches to argumentation.
  • Ask students to create a healthy media diet for themselves. Where might they find verifiable information on topics of interest? What constitutes a good source of information on that topic?
  • Promote empathy for others. We are more likely to believe inaccurate information about others if we are already predisposed to think of those individuals or groups negatively.
  • Encourage students to see themselves as actors within their information environment. Have them reflect on all of the sources of information they access and contribute to, including those within your class. Ask them to consider how they are using generative AI tools to inject content into that environment, and what the implications of their decisions, and of similar decisions by others, may be for that information environment overall.

In the next installment of our series, we’ll dive a little deeper into the issue of bias and equity as it relates to AI. In the meantime, if you’d like to discuss digital literacy, artificial intelligence, or any other topic related to your teaching, please book an appointment for a one-on-one consultation with a member of the CAT staff.

References

Carolus, A., Augustin, Y., Markus, A., & Wienrich, C. (2023). Digital interaction literacy model: Conceptualizing competencies for literate interactions with voice-based AI systems. Computers and Education: Artificial Intelligence, 4, 100114.

Ecker, U. K., Lewandowsky, S., Cook, J., Schmid, P., Fazio, L. K., Brashier, N., … & Amazeen, M. A. (2022). The psychological drivers of misinformation belief and its resistance to correction. Nature Reviews Psychology, 1(1), 13-29.

Ferrara, E. (2023). Should ChatGPT be biased? Challenges and risks of bias in large language models. arXiv preprint arXiv:2304.03738.

Goldstein, J. A., Sastry, G., Musser, M., DiResta, R., Gentzel, M., & Sedova, K. (2023). Generative language models and automated influence operations: Emerging threats and potential mitigations. arXiv preprint arXiv:2301.04246.

Ng, D. T. K., Leung, J. K. L., Chu, S. K. W., & Qiao, M. S. (2021). Conceptualizing AI literacy: An exploratory review. Computers and Education: Artificial Intelligence, 2, 100041.

Organization for Economic Co-operation and Development, 2013

Wang, B., Rau, P.-L. P., & Yuan, T. (2022). Measuring user competence in using artificial intelligence: Validity and reliability of artificial intelligence literacy scale. Behaviour & Information Technology, 42(9), 1324-1337.

Wuyckens, G., Landry, N., & Fastrez, P. (2022). Untangling media literacy, information literacy, and digital literacy: A systematic meta-review of core concepts in media education. Journal of Media Literacy Education, 14(1), 168-182. https://doi.org/10.23860/JMLE-2022-14-1-12

Dana Dawson serves as Associate Director of Teaching and Learning at Temple University’s Center for the Advancement of Teaching.