Post-Election Resources for Faculty

As human beings, we can be deeply affected by ongoing global and national events such as elections, wars, and political conflict. Even if you do not teach a course directly related to current events, you may find that your students are discussing these issues with each other and want you to acknowledge what is happening and its impact on them and their communities. Whatever the outcome of this election, it will be on everyone's minds, so preparing ourselves and our students for any moments of conflict and emotional intensity that may arise is helpful.


The CAT has created a collection of election resources to assist you in planning for election day and the days following. Also available to you is the CAT Tip Video Series, Teaching in Tumultuous Times, which was created to provide concrete steps and strategies for preparing ourselves and our students for challenging conversations and hot moments. These include:

Begin the semester by setting the tone for your classroom. Dana walks you through some suggestions for doing just this.

Linda offers suggestions to help us think about how we might prepare ourselves for challenging conversations in the classroom by using the Hopes, Fears and Agreements activity to build community and set guidelines for interactions within our courses.

Knowing your triggers and what to do about them is important, and so is building a community of support for yourself. Jeff outlines what happens when we feel triggered and some things you might do to learn to recognize and handle those triggers.

Elizabeth presents three ways one might slow things down when a conversation begins to get heated.

We all know that hot moments can happen in the classroom. In this video, Stephanie offers some tips for how to manage them and how to get your class back on track.

While we recommend taking a little time to review the resources above, we’ve also created a quick-start guide to managing hot moments: The Five Rs: Remind, Reflect, Regroup, Recess, Refer. Please take a moment to review these ideas. A few moments of preparation can make a difference in your class. As part of that preparation, make sure to have the Dean of Students Post-Election Resources webpage handy to provide resources to students who may need them.

Remember too that self-care is important at this time. We encourage you to find a community of people you can go to for support and perspective, as these events can impact your own well-being. The CAT will be hosting Drop-In Water Cooler sessions to provide space for open discussion and community with CAT consultants and your colleagues on the following dates (no registration required):

  • Thursday, November 7: 11:00am-12:30pm (Zoom)
  • Friday, November 8: 12:30pm-1:30pm (TECH 107)
  • Monday, November 11: 11:00am-1:00pm (TECH 107)

As always, we invite you to connect with us at the CAT, where we are here to support you in person and online through 1-1 consults, open lab hours, and workshops.


Beyond Jamboard: Top Collaborative Whiteboard Alternatives for Teaching

By Jonah Chambers

Whether you’re a current Jamboard user or looking to integrate new tools into your teaching, collaborative online whiteboards offer students a shared virtual space for brainstorming ideas, visualizing concepts, and engaging in group work. As Google plans to retire its popular Jamboard service on December 31, 2024, here are some excellent replacement whiteboarding tools to consider.

Zoom Whiteboard

For a Temple-supported option, Zoom Whiteboard stands out as an excellent choice. Although it can be used within Zoom meetings, a Whiteboard can also be created and shared outside of meetings for use during in-person classes or for students to work asynchronously.

Zoom Whiteboard mirrors many essential features of Jamboard, such as posting digital Post-It notes for brainstorming and gathering quick feedback. Participants can also comment on each other’s posts and draw connecting lines between elements, which is ideal for diagramming and visualizing complex concepts. For group work, educators can create multiple pages, providing dedicated workspaces for each team. Another significant advantage of Zoom Whiteboard is the ability for creators to lock their content, preventing accidental deletions—an enhancement over Jamboard’s limitations.

If you’re interested in having your students use AI, Zoom has integrated its AI Companion tools directly into Whiteboard, allowing educators and students to generate content as a jumping-off point for further engagement or discussion. Its combination of features makes Zoom Whiteboard a robust tool for fostering engagement and creativity whether you are teaching in person, online, or asynchronously.

Miro

For educators seeking a more feature-rich experience, Miro presents a compelling alternative. While Temple does not provide support or licenses for this platform, educators can apply for a free Education Plan. Miro offers a wide array of functionalities, including the ability to post digital sticky notes, draw connections, mind map, embed images and videos, and create detailed diagrams, flowcharts, and wireframes.

One standout feature of Miro is its Frames tool, allowing users to organize boards into sections designated for different groups or different activities such as voting or brainstorming. The Presentation mode provides a structured approach for reviewing the fruits of group work or the outcomes of an activity with the entire class at the conclusion of a session.

Miro makes collaboration easy since students do not need individual accounts to participate. Educators can simply generate a link that allows anyone to interact with the board, similar to Google’s sharing features. For those who have previously used Jamboard, Miro also allows for the importation of existing Jamboard projects, ensuring that valuable work is preserved.

Padlet

Unlike the free-form environments of Jamboard, Zoom Whiteboard, or Miro, which can sometimes feel overwhelming due to their open layout, Padlet offers preset board layouts to help you effectively organize student contributions. For example, during a class debate where you ask students to offer supporting evidence or reasoning on a position, you can create distinct columns on the board for each position and ask students to contribute their ideas (using sticky notes, audio, video, images, and documents) in clearly defined sections. This structured method makes Padlet particularly valuable for facilitated discussions and activities that involve categorization.

While Temple does not support or hold a license with Padlet, free accounts enable users to create up to three Padlets. Once you’ve completed an activity, you can easily export these Padlets as images, allowing you to clear the board and recycle it for future classes or activities.

While the retirement of Jamboard is disappointing, we’re excited to share these alternatives that, in many cases, provide even more robust features and functionality. If you’d like to dive deeper into any of these tools or discuss how to integrate whiteboarding solutions into your teaching, feel free to schedule a one-on-one consultation or stop by one of our EdTech Labs.

 

Jonah Chambers works at Temple University’s Center for the Advancement of Teaching as Senior Educational Technology Specialist.

Set for the Semester: Essential CAT Resources for Fall Teaching

As you ready yourself and your courses for the start of the fall semester, we offer this resource round-up to help get you rolling.

Preparing Your Syllabus

The Fall 2024 Syllabus Guidance has been posted and includes sections on generative AI and the recording and distribution of recordings of class sessions. 

Using Canvas

Drop into one of our upcoming workshops on teaching with Canvas. If you’re new to Canvas, take our self-paced Ready, Set, Canvas course to get you up to speed. 

Getting Ready for the Election

Plan ahead by creating discussion guidelines and preparing for hot moments in the classroom with our Election Resource Guide.

Teaching and Learning with Generative A.I.

Visit our Faculty Guide to AI webpage.

This EDvice Exchange blog post rounds up past posts on teaching and generative AI. Sample syllabus statements outlining ideas for your policy on generative AI can be found here.

CAT Services

The Center for the Advancement of Teaching (CAT) offers an array of services and resources that are tailored to your teaching. Whether you teach in person or online, undergraduate or graduate students, small classes or large lectures, we are here to support your instructional needs. As you begin the 2024-2025 academic year, know that the CAT is here to help.

CAT Technology Labs and Workspaces

Visit us at the following locations for:

  • Drop-in expert assistance with your teaching and educational technology questions (no appointment necessary)
  • Quiet workspaces
  • Access to computers, scanners, and printers
  • Comfortable lounge area with complimentary coffee for informal discussions with colleagues

Main Campus CAT: TECH Center (Bell Building), Suite 112

Hours: Monday through Friday, 8:30am-5:00pm

Health Sciences Campus CAT: Student Faculty Center, Room 200

Hours: Monday through Friday, 8:30am-5:00pm

Ambler Campus CAT: Ambler Learning Center, Room 301

Hours: Monday through Thursday: 8:30am-10:00pm; Friday: 8:30am-5:00pm

Note: This self-service lab and workspace is for faculty use only. No CAT consultants are on site.

Virtual Drop-in EdTech Lab: available online via Zoom. The virtual lab is staffed by our Educational Technology Specialists during our regular business hours. Visit our Educational Technology Labs and Workspaces page for access. 

Hours: Monday through Friday, 8:30am-5:00pm.

For more information about our locations, see our website.

Room Reservations

The CAT offers meeting spaces for faculty and staff at our Main Campus location. Visit CATbooking to reserve time in one of the following spaces:

  • Room 107: Breakout Room (Capacity: 12)
  • Room 110A: Collaboration Room (Capacity: 5)
  • Room 112G: Consult Room (Capacity: 2)

One-on-One Consultation Appointments

If you would like uninterrupted, dedicated time with a CAT pedagogy or educational technology specialist, please visit CATbooking to schedule a one-hour in-person or virtual consultation. When making an appointment with a consultant, you will have the option to choose an in-person meeting at the Main Campus CAT or Health Sciences Campus CAT, or a virtual (Zoom) meeting. Within the locations, appointment times are categorized by consultation topics offered at the CAT. If the hours available in the booking system are not feasible for you, please email cat@temple.edu to arrange an alternate time. 

Teaching Observations

Whether you are implementing a new teaching strategy, trying to solve a teaching challenge, or simply would like to check in with a colleague in order to reflect on your teaching, the CAT offers a variety of services. 

  • Classroom Observations: Set up an observation of your in-person or virtual class.
  • Mid-semester Instructional Diagnosis: A CAT specialist visits your class to gather student consensus on their learning experience and delivers that feedback to you.
  • Course Design: Work with a pedagogy specialist to design a course using evidence-based principles for delivering significant learning opportunities to your students.
  • Curriculum Mapping and Program Assessment: Get assistance designing and assessing a programmatic curriculum that will provide a pathway for student achievement of your program's goals.

Please visit our Consultations and Observations web page for more information.

Get in touch with us!

We’re here to help! If you need anything as you begin your semester, please email us at cat@temple.edu or call our main campus office at 215-204-8761.

Get Ready for Fall! A Course Design Round-Up

By Jeff Rients

Can you believe it is August again?!? If you are anything like me, then you already feel behind in prepping for the fall semester. But the CAT is here to help! We’ve got a wide variety of resources to help you start the new semester off right. 

If you are building a new course or substantially revising one, our Course Design blog series may be of assistance. Design Your Course for Significant Learning! provides a big picture overview of the design process, while Context Matters: Considering Situational Factors in Course Design can help you think about the specific challenges and key details in your course. (The big situational factor we’re all facing right now is generative artificial intelligence.) Our post Learning Goals: Dream Big! provides a framework for formulating learning goals that motivate students for success, while Aligning Assessments with Goals connects those goals with how you will assess and provide feedback. The post The Heart of the Course: Learning Activities! reviews the importance of providing students with the appropriate content and opportunities for practice before you assess them. And finally, Houston, We Have Liftoff! Successfully Implementing Your Course Design helps you put it all together.

Finally, if you want to do some thinking about alternative assessment plans, try our post A Brief Introduction to Ungrading.

You don’t have to design or redesign your course by yourself! You can make an appointment for a one-on-one consultation or visit any of our Ed Tech labs for drop-in assistance. Please join us for any of the many workshops the CAT is offering this fall. And if you need any other support in your mission of teaching our wonderful Temple students, just let us know!

Navigating AI: Essential EDvice Exchange Reads for Fall

by Dana Dawson, Ph.D.

With the beginning of the fall semester steadily approaching, you may be pondering how you will address the use of generative AI in your courses. To help with your decision making, your drafting of student AI guidelines, and your design of activities and assessments, we encourage you to take a look back at EDvice Exchange posts on the topic of planning for AI use in your classes.

A Survival Guide to AI and Teaching

Our series “A Survival Guide To AI and Teaching” featured posts on everything from what generative AI is to the ways these tools will impact equity in education. 

Faculty Adventures in the AI Learning Frontier

Our spring 2024 series “Faculty Adventures in the AI Learning Frontier” showcased how Temple faculty members used and talked about AI in their classrooms during the previous semester.

In the final post in the series, Michael Schirmer, who teaches in the Fox School of Business, shared his experiences with generative AI both with his students and as part of his personal scholarly practice.

Using PI to Manage AI

Our series “Using PI to Manage AI” considered the most fundamental element of addressing AI in our courses: sound pedagogy. Posts in this series focused on evidence-based ways of designing assessments of student learning that encourage academic honesty, motivation, and a desire to learn.

CAT Tips Season 5 – Generative AI Tools

Finally, our most recent series of CAT Tips, short videos offering teaching tips and suggestions, focused on how generative A.I. tools can be used to support student learning.

CAT staff are available for consultation throughout the summer. To schedule an appointment, visit our consultation booking page. The CAT’s Ed Tech labs are also open Monday through Friday, 8:30am-5:00pm, if you’d like to drop in and chat with a staff person about AI tools (or any other educational technology question).

Our Faculty Guide to A.I. webpage features helpful information and resources, including information on Temple’s policy. 

We also welcome you to sign up for one of our pre-semester AI workshops.

AI or Nay? Deciding the Role of Generative AI in Your Classroom

Thursday, August 8, 2024, 11:00AM-12:00PM via Zoom

Register for this workshop

Generative AI tools to help with research, writing, ideation and creative work are now part of our educational landscape and cannot be ignored, but you may be feeling unsure how to address them in your classes. Should you encourage their use but place parameters on how they are used? Will allowing students to use AI tools reduce the efficacy of your classes? Or is AI use now an essential skill for our students? In this workshop, we will explore the implications of allowing students to use generative AI tools and help you work through the question of whether and how you will permit their use in your classes.

AI Assignment Re-Do Bootcamp

Register for this workshop

Monday, August 12, 2024 and Wednesday, August 14, 2024, 1:00PM-4:00PM, Tech 109

We have been learning more about generative AI tools such as ChatGPT and what it means to teach in a world where generative AI is available to us and our students. In this 2-day intensive workshop, faculty will collaborate to revise and/or develop assignments for the AI era. You will learn how to intentionally leverage AI tools for learning and development as well as how to modify assignments to make them more AI resistant. You will also learn a framework for discussing the assignment and the role of AI in your classroom with students. This is an opportunity for you to develop the best assignments you’ve ever used in your classroom. Bring an assignment you already use and stride boldly into the future with us!

AI Sandbox

Register for this workshop

Wednesday, August 14, 2024, 12:00PM-1:00PM, Tech 109

Have you been putting off familiarizing yourself with AI tools? Or have you tried some of the more commonly available apps but would like to learn more about the wide variety of tools that are available? Join us for a hands-on exploration of the AI tools that are changing the way we live, work and study.

Mindful Management of AI During Finals

by Dana Dawson

As we near the end of the semester, it’s important to carefully consider your plan of action should you suspect students have used generative AI in a manner that you explicitly prohibited. In past blog posts, we strongly encouraged faculty members to start with a conversation, meeting with the student whenever you suspect unacceptable use of AI. However, in the case of final exams and projects, you may feel you don’t have time for that course of action. In this post, we offer some suggestions for how to prepare for and address AI use during the finals period.

Ensure Guidelines Are Clear

Review your final exams and final project instructions to determine whether you have clearly outlined where the use of generative AI is and is not allowed. Build guidelines into assignments as well as the syllabus to ensure students have them readily available. Have a conversation with your classes to ensure they understand the limitations of acceptable generative AI use, and state the steps that will be taken if you suspect students have used generative AI (more on that below).

Test Your Final Exams and Final Projects Using Generative AI

Run final exam questions or final project prompts through tools such as ChatGPT and Claude.AI and prompt the tools to take the exam or complete the project. Note that in ChatGPT, you can simply copy and paste the entire exam or project prompt and rubric into the tool and ask it to generate a response. Claude.AI allows you to upload a pdf and enter a prompt. If you find that the tools can successfully complete your exams or assignments, reconsider the questions and prompts. Can you link questions or project prompts to in-class work that will draw on students’ past experiences? Can you add reflective or metacognitive questions that are difficult to replicate using generative AI? See this EDvice Exchange blog post for assessment ideas that are less prone to AI use.

Be Wary of AI Detectors

It is well established that AI detectors cannot reliably differentiate between human- and AI-written text. Assessments we conducted of Turnitin’s AI detector and four other applications available for free online show that these detectors are prone to false positives (identifying human-written text as generated by AI) and false negatives (identifying AI-written text as generated by humans). AI detectors should never be used as the sole basis for a judgment about whether a student has used AI; companies such as Turnitin acknowledge this, saying in their own explanatory materials that detector predictions should be taken with a grain of salt and that the instructor must ultimately make their own interpretation. Notably, Turnitin also indicates that a score of 20% or less AI-created should not be considered valid. As you assess AI detector reports, keep in mind that there are currently no completely reliable detectors of generative AI use in writing available to instructors.

Step on the Brakes

Confronting possible cheating is always stress-inducing. We see a block of text or a pattern of answers that seem unlikely to have been generated by a student and the stress response kicks in. This is not the optimal time to make a decision. Take a breath, step away. Consider factors that might be influencing your assessment of the student’s work or your willingness to accept the results of an AI detector. Talk to a colleague or a CAT consultant and carefully consider all factors before making a determination as to your course of action.

You Can Still Have a Conversation with Students

If you strongly suspect a student of using generative AI in a manner you have stated is not acceptable, ask the student to meet, by Zoom if they are already off campus. If they are not able to meet prior to the end of the grading period, issue an Incomplete for the course and do not grade the final exam or project until you have met with the student. 

Have a Back-Up Plan

If you speak with the student and they do not admit to using generative AI, have an actionable plan for how to proceed. Consider how you might replicate the element you suspect they used AI to complete. Can you conduct an oral exam? Can they write an essay or a reflective statement on their process of solving the exam question or completing the project in-person? To talk over your plan for considering possible AI use in these final weeks of the semester, don’t hesitate to reach out to schedule a consultation with a CAT specialist. 

Err On the Side of Caution

The suspicion that a student may be taking shortcuts can be upsetting, and we are all struggling to manage course design and delivery in the age of AI, but the risk of falsely accusing a student should be taken very seriously. A false accusation can derail a student’s entire educational trajectory, and not only because of the possible impact on their GPA; more importantly, it can shake their trust in their faculty members, their experience with higher education, and their motivation to continue, particularly where their sense of belonging is tenuous. Turnitin has acknowledged that its detector is more likely to generate a false positive in the case of English language learners or developing writers, as some of the writing patterns more common among these populations are the same patterns AI detectors look for in identifying AI-generated text. We must exercise the utmost caution in accusing any student and be sure to give them the benefit of the doubt when engaging in these conversations.

Plan for Next Semester

Finally, once finals are over and your grades are in, make an appointment with a CAT specialist to explore how to revise assignments that are particularly vulnerable to AI use. We can often avoid these problems in the future by revising our current assessments into ones that work better in the age of AI.

2024 STEM Educators’ Lecture Recap

By Cliff Rouder, Ph.D.

The CAT’s STEM Educators’ Lecture, held on April 10, 2024, featured guest speakers Dr. Tara Nkrumah and Cornelio “Coky” Aguilera. Dr. Nkrumah is an Assistant Professor in the Department of Teacher Preparation, Mary Lou Fulton Teachers College at Arizona State University. Her research is on equitable teaching practices for anti-oppressive discourse in education and science, technology, engineering, and mathematics (STEM). Coky Aguilera studied as an Acting Specialist at UW Madison, works professionally with Tampa-area theater companies, and, along with Dr. Nkrumah and colleagues, has brought the Theatre of the Oppressed to different universities to engage academic audiences in critical investigations of inequities. Check out this YouTube video to learn more about the historical roots of Theatre of the Oppressed.

We were delighted to have their colleagues Dr. Vonzell Agosto, Dr. Deirdre Cobb-Roberts, and doctoral candidate Maria Migueliz Valcarlos join as they engaged Temple STEM and theater faculty in an interactive and engaging session titled Unmasking the “Isms” in STEM Education to Promote Equitable Teaching and Learning. The speakers began by introducing a framework for the session: Iris Marion Young’s Five Faces of Oppression. They used this framework to help us think about how “isms” such as racism, ableism, or genderism can manifest through the five faces of oppression:

  • Exploitation
  • Marginalization
  • Powerlessness
  • Cultural Imperialism
  • Violence

For a more in-depth look at this framework, see Young’s “Five Faces of Oppression” in Geographic Thought: A Praxis Perspective.

After participants worked through definitions of these facets of oppression and shared examples of how they can manifest in our disciplines, departments, and classrooms, the speakers engaged them in a series of theater-based exercises that encouraged them to use mimicry and the creation of human tableaus to explore and address physical and emotional aspects of oppression.

For more on Dr. Nkrumah’s research, check out these recent publications:

  • Nkrumah, T. (2023). The Inequities Embedded in Measures of Engagement in Science Education for African American Learners from a Culturally Relevant Science Pedagogy Lens. Education Sciences, 13(7), 739.
  • Nkrumah, T., & Scott, K. A. (2022). Mentoring in STEM higher education: a synthesis of the literature to (re) present the excluded women of color. International Journal of STEM Education, 9(1), 1-23.
  • Nkrumah, T., & Mutegi, J. (2022). Exploring racial equity in the science education journal review process. Science Education, 1-15. https://doi.org/10.1002/sce.21719

As always, our CAT staff is ready to help you! To explore how to incorporate this work into your STEM courses or how to design and implement classroom-based research in this area, book a consultation appointment or email a CAT staff member directly.

Faculty Adventures in the AI Learning Frontier: Assignments and Activities that Address Ethical Considerations of Generative AI Use

by Benjamin Brock, Ph.D. and Dana Dawson, Ph.D.

Title card: Faculty Adventurers in the AI Learning Frontier

In response to our fall 2023 survey on the use of generative AI (GenAI) in the classroom, we received a number of assignments and activities faculty members have designed to tackle the ethical issues raised by GenAI. Ethical concerns related to GenAI include such considerations as the implications for privacy when these tools are used, the possibility of over-reliance on GenAI for analytics and decision making, and exposure to inaccurate or biased information (Brown & Klein, 2020; Masters, 2023; Memarian & Doleck, 2023). The following activities and assignments equip students with the capacity to critically evaluate when and how it is appropriate to use GenAI tools and to protect themselves against possible risks of AI use.

Sherri Hope Culver, Media Studies and Production faculty member and Director of the Center for Media and Information Literacy (CMIL) at Temple University, asks students in her GenEd course, Media in a Hyper-Mediated World, to complete a reflection on the implications of AI use. She first asks them to listen to an episode of the podcast Hard Fork centered on data privacy and image manipulation and to read the Wired article “The Call to Halt ‘Dangerous’ AI Research Ignores a Simple Truth” (Luccione, 2023). Students are then instructed to write a 300-word reflection referencing the assigned material that addresses both concerns they have about use of AI and ways in which it could make their lives or society better. Professor Culver provides the following prompts to help students’ thinking:

  • What does critical thinking mean in a tech-centric, AI world?    
  • How might AI affect your free will?    
  • How might AI affect your concerns about privacy or surveillance?    
  • How should we prepare ourselves for an increasingly AI world?    
  • How might AI influence the notion of a public good?   
  • How might AI influence K-12 education?    
  • How might AI influence family life?    
  • What worries you about AI?    
  • What excites you about AI?    
  • What is our responsibility as media creators when we use AI?    
  • It has been said that AI will make life more “fast, free and frictionless.” Should everything first be “fast, free and frictionless”? Should that be the aim?
  • Is AI the end of truth?

In a dynamic, interactive, reflection-oriented honors course aimed at exploring the four pillars of Temple’s Honors Program (inclusive community, intellectual curiosity, integrity in leadership, and social courage), Dr. Amanda Neuber, Director of the Honors Program, is using AI as the discussion anchor for their unit on “integrity in leadership.” By way of multiple media modalities, students delve into the ethical and unethical uses of AI in academia. Students are asked to read “How to Use ChatGPT and Still Be a Good Person” and watch a related video exploring the meaning of integrity. Students then discuss whether or not AI can be used with integrity, how academic culture might frame one’s decision to use AI, and the “peaks and pitfalls” of AI use. Beyond the many important conversations focused on AI itself, the technology is used as a reference point as to what it means to lead with integrity and how to promote said quality in teams and organizations.

In another interactive, thought-based classroom initiative, mechanical engineer Dr. Philip Dames is bringing ethics and AI to Temple’s College of Engineering. Dr. Dames has reimagined for the modern era the “trolley problem,” a classic philosophical exercise in which one is faced with an ethical dilemma: students in his class consider what it means to have AI make decisions, using autonomous cars as the basis for deliberation. They are prompted to think about how a vehicle should be programmed to respond to different scenarios by using examples from MIT Media Lab’s Moral Machine website. Students then reflect upon their scenario-based activities and experiences and engage in prompt-guided written reflection. Prompts include questions such as:

  • How does the ownership model of autonomous vehicles affect how they should behave? For example, does it make a difference if a vehicle is owned by a single private citizen vs. publicly owned by the city and hired by individuals? 
  • What surprised you about the aggregated responses from different people shown to you at the end of the exercise? 
  • Are there other factors that you feel are important but were not considered in Moral Machine?

In this way, students not only explore elements to consider when designing autonomous vehicles but also, through critical thinking and hands-on engagement, make concrete what was once only abstract.

If you’d like more guidance on exploring how to use AI tools in your class, please visit our Faculty Guide to A.I. and/or book an appointment for a one-on-one consultation.


Faculty Adventures in the AI Learning Frontier: Teaching with Generative AI in Health Sciences Education 

by Jonah Chambers, MA and Cliff Rouder, EdD 


As part of our fall 2023 survey on generative AI (GenAI) in the classroom, we heard back from a wide variety of Temple faculty who teach a broad range of courses. In this installment, we’re going to take a look at how three health science instructors are incorporating GenAI tools like ChatGPT into their teaching.

Scott Burns, Professor of Instruction in the Department of Health and Rehabilitation Sciences, had his graduate physical therapy students prompt ChatGPT to create a generic plan of care for a specific health condition and then provide a detailed explanation of how the exercises it prescribes may or may not properly address the condition in the scenario. In addition to having students demonstrate their knowledge of what constitutes a good plan of care by evaluating and critiquing the AI-generated plan, Professor Burns explains that the goal of the activity is to highlight that while generative AI may be useful for broad recommendations, it “currently lacks the ability to provide decision-making and rationale backed by anatomy, neuroscience, motor control/learning, and physiology.” 

Before he launched the assignment, Dr. Burns surveyed his class about their experiences with and perceptions of GenAI. He also wanted to gauge the level of anxiety surrounding it, given the concern in health-related fields that AI could replace the human provider. Students reported that they appreciated the opportunity to interact with AI, since experience levels varied and some had never used it before. Dr. Burns plans to administer a more formal survey at the end of the semester to see if student perceptions of AI have shifted.

Alissa Smethers, Assistant Professor in the Department of Social and Behavioral Sciences, had her nutrition students prompt ChatGPT to create a 1-day, 2,000 kcal dietary pattern for a popular diet of their choice (Keto, Paleo, Atkins, etc.) and then submit the outputs to an established dietary analysis program and answer the following questions:

  • Does the plan provide 2,000 kcal? If not, how far off was it?
  • Do the macronutrient composition and food choices reflect the popular diet you chose? If not, what foods would you add/remove?

Her students were surprised at how far off ChatGPT was at times, in some cases generating plans that differed by over 800 kcal from what the dietary analysis program provided. The goal was not only to ensure that students are learning the correct information but also that they develop critical thinking and research skills crucial to their work as nutrition professionals. In the future, she is considering having students evaluate how well ChatGPT is able to tailor the dietary patterns based on culture, income level, or other more personalized factors as well as reflect on the limitations of using a generative AI tool to create dietary patterns vs. working with a nutrition professional like a Registered Dietitian.

Leah Schumacher, Assistant Professor in the Department of Social and Behavioral Sciences, invited Health Science and Human Behavior students to roleplay as someone who either wants to avoid or already has a chronic disease and has turned to ChatGPT to provide answers or advice. She first asked students to pick one of the diseases they covered in her class and then pose questions about it to ChatGPT such as “Why did I have a stroke?” or “How do I avoid getting cancer?” She then had students prepare a submission for the assignment that included: 

  1. The full prompt they submitted to ChatGPT
  2. The full response ChatGPT provided
  3. A short 5-7-sentence reflection that compared the ChatGPT response to what they had learned in class through textbook readings, lectures, videos, etc. Specifically, she asked students to reflect upon the extent to which ChatGPT’s response hit upon aspects of the biopsychosocial model they studied in class, whether it touched upon major risk factors they covered, and if ChatGPT presented any information that was new to them.

Dr. Schumacher was careful to have students clearly distinguish between text generated by ChatGPT and their own written work in their submission. Not only did this assignment have students apply their understanding of the biopsychosocial model to a diverse set of cases, it also gave them the opportunity to reflect upon (and illuminate problematic aspects of) how people may use ChatGPT in their everyday lives.

Each of these professors has illuminated one of the most powerful ways of using GenAI in teaching: instead of taking its outputs at face value, they have their students question, evaluate, analyze, and verify them using a variety of methods. Not only does this give students an opportunity to apply their knowledge (a proven way to promote deep learning), but it also helps them sharpen their critical thinking skills surrounding the use of GenAI. These skills will likely prove helpful not only now but also in their future professional lives.

In the next installment, we’ll be looking at ethics in AI. In the meantime, if you’d like more guidance on exploring how to use AI tools in your class or assistance running your assignments through GenAI to better assess the value of using it, please visit our Faculty Guide to A.I., attend a workshop on using generative AI for teaching and learning, or book an appointment for a one-on-one consultation.

Faculty Adventures in the AI Learning Frontier: AI and (First Year) Writing

by Jeff Rients


As part of our fall 2023 survey on AI in the classroom, we heard back from a wide variety of Temple faculty who teach a broad range of courses. In this installment, we’re going to take a look at what three First Year Writing instructors are doing with AI tools like ChatGPT.

 

First year writing instructor Jacob Ginsburg incorporated “AI and education” as a theme in his course. His students read Ted Chiang’s “ChatGPT Is a Blurry JPEG of the Web,” Matteo Wong’s “AI Doomerism Is a Decoy,” and some academic articles about the role of AI in education. In class, each student writes a paragraph about what it means to them to be a member of their generation. As homework, they then give ChatGPT four tasks:

  1. Respond to the same prompt the students wrote about in class (i.e., what it means to be a member of their generation).

  2. Make an argument FOR the use of AI in education.

  3. Make an argument AGAINST the use of AI in education.

  4. Complete a “silly” or “fun” task of each student’s own devising.

Afterwards, everyone discusses their prompts and results in class.

 

Professor Amy Friedman challenges her students to write an essay in which they summarize several disparate, current articles on generative AI in education and learning. She has used articles such as Valerie Pisano’s “Label AI-Generated Content,” Allison R. Chen’s “Research training in an AI world,” and Naomi S. Baron’s “How ChatGPT Robs Students of Motivation to Write and Think for Themselves.” Her goal is for each student to formulate and articulate their own opinion about the role of generative AI in their own learning and education. Beforehand, students explore ChatGPT in class, including asking it to write in response to previous essay prompts. The class then collectively assesses the results and compares them to their own writing.

Meanwhile, at Temple’s Japan campus, Ryan Rashotte has developed two activities for his first year writing students. In the first, students writing essays about a film ask ChatGPT to write a paragraph on how a specified element of the film supports a theme they are exploring; students then write about the strengths and weaknesses of ChatGPT’s argument. In the second assignment, students working in groups explore which art form they think is superior – television or film. As part of this investigation, they query ChatGPT for reasons in support of their choice. Students identify new and/or interesting arguments and assess their strengths and weaknesses. They are asked to consider how well the ChatGPT output would work if it were incorporated into their essay.


In the next installment, we’ll be looking at the way AI tools are being used in a variety of health sciences learning environments. In the meantime, if you’d like more guidance on exploring how to use AI tools in your class, please visit our Faculty Guide to A.I. and/or book an appointment for a one-on-one consultation.