Lunch and Learn: Practical AI Pedagogies

On Thursday, February 19th, the Center for Teaching Excellence and Innovation (CTEI) hosted a Lunch and Learn featuring two faculty members discussing Practical AI Pedagogies. Emily Fisher, Director of Undergraduate Studies and Associate Teaching Professor of Biology, and Illysa Izenberg, Associate Teaching Professor in JHU’s Center for Leadership Education, presented two approaches to integrating AI into their courses. Caroline Egan, Teaching Academy Program Manager, moderated the discussion.

Fisher began the presentation by describing how she and co-instructors Nichole Broderick and Chiara De Luca redesigned an exam preparation assignment, known as the “AI Study Buddy,” to intentionally incorporate generative AI in their Molecular Biology course. Their goal was not to outsource thinking to AI, but to foster metacognition; they wanted students to appreciate the role of retrieval in learning, view exam prep as a learning opportunity, and practice self-directed learning and studying.

In previous iterations of this assignment (2020–2024), students were asked to write and answer their own exam questions without AI. While the exercise was useful, it was unclear how many students had improperly used AI to complete their work, and Fisher noted that many of the student-written questions were of low quality.

In 2025, the instructors reframed the activity: instead of avoiding the use of AI, they integrated it deliberately. Students were asked to use HopGPT (GPT-4o mini) to complete the following steps:

  1. Prompt AI to generate a challenging exam question about molecular biology (without providing the answer).
  2. Answer the AI-generated question themselves.
  3. Ask AI to answer the same question.
  4. Evaluate the AI’s response for:
     - Accuracy
     - Similarity to their own answer
     - Alignment with the style of course exams
     - Overall usefulness as a study tool

The assignment was worth 2 points out of a total of 106 for the course. A non-AI alternative was available for students who preferred not to use generative AI. (Two students selected this option.)

Overall, students rated the AI answers as highly accurate, though their own answers often differed from the AI’s. Questions and answers were often very broad; the course content is much more detailed and includes experimental techniques that the AI answers did not cover. Still, student feedback was mostly positive, with students rating the exercise about “medium” in usefulness for helping them study. Fisher added that it was a valuable exercise, shifting the focus away from writing exam questions and toward answering, analyzing, and critically evaluating content.

There were a few questions from the audience about the AI Study Buddy:

Q: Are there any data or future plans to compare these results with having students study together, asking each other questions?
EF: Not at the moment. This would be a challenge. I would have to trust that they weren’t using AI.

Q: Did students do better on the exam after doing this exercise?
EF: Students usually do well on exams in this class. I did not look to compare.

Q: Have you thought about asking HopGPT for additional study topics? For example, telling students, “Here is what we think you should focus on; how does that compare with what AI recommends?”
EF: This is a good idea. Students are telling us they use AI to help them study. I guess I just wanted them to wonder how well it was helping.

Q: What about equitable student access to AI platforms?
EF: That’s why I used HopGPT, since all students have access to it. I would select Claude if I did it again.

Q: Next time, would you keep it the same or expand it? Also, next time would you continue to focus on the answers?
EF: I thought it was useful the way it went. I would keep it. I love that there are ways to go deeper, but this is the right number of points for this class.

Illysa Izenberg continued the presentation by sharing her approach to using AI in her Engineering Management and Leadership course. Like many instructors, she wants to help students use AI tools productively without becoming overly dependent on them. Izenberg believes in total AI transparency between instructors and students and provides very clear guidance for students about its use. She uses color-coded symbols to show when and how students may use AI, and requires them to cite all sources (including AI tools) while providing detailed instructions on proper citation.

Izenberg has developed a unique framework that encourages students to view AI as “teammates” they can rely on for different types of support, depending on the task. Each teammate has a clearly defined role:

  • The Tasker handles repetitive work like formatting citations, organizing notes, or cleaning data — tasks that are routine but still require students to verify accuracy.
  • The Draftsmith can help refine writing, generate study materials, or suggest improvements. Students need discipline when using this role, as allowing AI to draft too much early on can lead to missed learning opportunities. For example, Izenberg notes that students need to develop their own writer’s voice, which will suffer if they rely on AI to draft everything for them.
  • The Facilitator acts as a thinking partner, asking questions that help the student consider alternatives, evaluate plans, and expand and sharpen their analyses. This role also requires discipline so that learning is not undermined. When used thoughtfully, this role can promote reflection and deeper understanding.

In addition to the AI teammates framework, Izenberg also introduced students to an “AI Gatekeeper,” a tool similar to a rubric that guides students in regulating their use of AI. The AI Gatekeeper asks students to first define their own criteria for using AI and then rate each task on a scale from 1 to 5 to decide whether to proceed (1 = no AI use, 5 = AI use is appropriate). Izenberg recommends a rating of at least “3” for students to consider using AI for assistance. Example student criteria and ratings:

  • Practice and expertise:
    • 1 if this task gives me essential practice in a skill I’ll need later.
    • 5 if I’ve done it many times and already have the needed expertise.
  • Creativity vs. rote work:
    • 1 if the task requires creativity or original thought.
    • 5 if it is mostly rote.

Students were not required to use AI or these tools in class, but they were available if they chose to use them. According to Izenberg, students who used the tools realized that they knew less about AI than they initially thought and learned a great deal in the process: to push themselves when articulating their reasoning, to avoid the temptation of letting AI make decisions for them, and to resist using it too early in the process so their creativity would not be compromised. Several students found the tools extremely helpful and suggested that they be made available and integrated across all of their courses.

The presentation concluded with additional questions from the audience and facilitator Caroline Egan:

Q: Given that the paid-for LLMs seem to do a better job, what are the equity issues around giving students GPTs and prompts to use for coursework?
II: This is something we need to acknowledge. First, no assignment required AI use, and students could succeed in the course without it. For those who chose to use it, many didn’t realize that HopGPT can access various ChatGPT models as well as Claude and others. I tried to mitigate the equity issues by either giving them prompts that they could edit and use in HopGPT or giving them access to GPTs I created for my engineering students working on senior design. Every time I create a GPT for students, I hire 2-3 people to try it out at least 3 times, and at least one of those times has to be with the free version of ChatGPT. I made a rubric to evaluate GPTs so I can see where a GPT might not work well for someone with a free account, and edit it before I launch. For now, I think that’s the best I can do. I’m going to be working on porting my GPTs over to another tool during the next few months in the hopes of resolving this and other issues.

Q: Maybe [the AI Gatekeeper] could be addressed as part of the introductions for all classes.
II: I love that idea. Students are harming themselves [with their unregulated approach to using AI] and they don’t know it.

CE: A question for Emily and Illysa: both of you have had students critically reflect on the use of AI. What would you recommend for faculty who haven’t done that yet?
II: It is important to decide for yourself and your course where the lines are. Ask yourself, what do students need to be able to do? From there, draw the lines. Then talk to the students about what happens if they go outside of the lines.
EF: I agree. In another course, I showed students data from a paper about what happens when you use AI summaries vs. a web search. Participants in the experiment learned less overall with the AI summaries.
II: Also, in my opinion, students are overloaded with classes, credits, and extracurriculars. It is costing them the ability to reflect and learn. So we have to help them and ourselves by planning ahead on what we want to turn to AI for and what we want to do on our own. Without clear guidance from faculty, students are never sure if they are doing something they should hide. This harms trust. Make the lines clear, relate them to your learning outcomes, and then teach them Total AI Transparency. You don’t have to hide what you’re allowed to do!

CE: Emily, the case study you brought up was on metacognition. It was an opportunity for students to practice metacognition. Illysa, did you also do this?
II: Yes – anytime students used AI, they had to explain why they chose to use it, why they made those choices, and why they used or didn’t use the output. They had to evaluate the output compared to their original goal. They weren’t just doing it, they were explaining why and how. The point is to teach them discernment.

Amy Brusini, Senior Instructional Designer
Center for Teaching Excellence and Innovation
 

Image source: Lunch and Learn logo, Unsplash

Clickers: Beyond the Basics

On Friday, February 5, the Center for Educational Resources hosted the third Lunch and Learn—Faculty Conversations on Teaching. For this session, three presenters discussed their experiences using clickers (classroom polling systems).

Leah Jager and Margaret Taub are both Assistant Scientists and Lecturers who co-teach Public Health Biostatistics in the Department of Biostatistics at the Johns Hopkins Bloomberg School of Public Health. This is a required course for Public Health majors and regularly sees enrollments of 170-plus students. The course focuses on quantitative methods used in public health research. Jager reported that many students feel intimidated by the math. There is no textbook for the course; instead, students watch short videos before class meetings.

Jager started the presentation, Clickers in Public Health Biostatistics, with a hands-on demo in which the audience used clickers to answer example questions. A basic use of clickers might include checking class attendance or giving a quick quiz on an assignment. Taub and Jager seek a dynamic classroom environment, using clickers to “provide fodder for interaction between students” and to gain formative assessment of student learning as new concepts are taught. In their teaching, clickers are used daily to promote problem solving and peer discussion. They start with “warm up questions” to review material from previous classes, then move on to checking newly introduced concepts. Jager showed examples of poll results (these may be called results charts, plots, or histograms) and discussed how she and Taub respond when it is clear that many students do or do not understand a concept. When students are not clear on the answer to a question, the instructors have them pair up and discuss the question and their answers. The students re-vote, then Taub and Jager review the concept and the correct answer. Even when it is apparent that most students understand the material, the instructors briefly review the question to be sure that no one is left behind.

[Image: Example of a case report form used to capture data in a course survey, “Cocoa Content in Chocolate Tasting Trial.”]

Jager and Taub use clickers for data entry as well (see above), a practice that qualifies as beyond the basics. The JHU clicker system (i>clicker) is integrated with Blackboard, the JHU course management system. Using the survey tool in Blackboard as a data recording form allows the instructors to record student responses question by question. It then takes minimal effort to output a spreadsheet with data that can be shared with the class and used for exercises and assignments.

Emily Fisher, Director of Undergraduate Studies and Lecturer in the Department of Biology, uses clickers in her classes (Biochemistry, Cell Biology, Genetics). Her presentation was titled Clickers Beyond the Basics. Fisher began with a discussion of what she considered to be basic use: a question at the beginning of class to gauge understanding of a pre-class assignment, a formative assessment question midway through class, and a question at the end of class to “place today’s topic in the bigger picture.” This use encourages students to attend class (if answers count toward the grade) and acts as a means to “reset the attention span clock.”

Going beyond the basics, Fisher uses clickers throughout the class period to help students evaluate data, understand how biological systems work, and engage in higher-level critical thinking through complex problem solving. She also uses the questions to identify student misconceptions. Monitoring student responses and the results charts allows her to make sure that students don’t get lost as she works through building a model for problem solving. Fisher led the audience through a series of slides (see presentation) demonstrating her process.

Fisher noted that using clickers for teaching higher level problem solving takes time to implement but is worthwhile. She explains to students at the beginning of each course how and why she is using clickers in order to ensure buy-in. By developing a model, students get a preview for the type of thinking that will be required to answer exam questions. Students get to practice in class by articulating answers to peers. Fisher has found that the process motivates student engagement, breaks up the lecture structure with active learning, and allows students to see real-world situations.

In the discussion that followed, faculty attendees expressed concern about the amount of time that clicker questions take away from content delivery. Advice from clicker users was to move some content to videos and outside of class assignments. Quizzing can be used to motivate students to complete this coursework.

Johns Hopkins Krieger School of Arts & Sciences and Whiting School of Engineering faculty will receive email invitations for the upcoming Lunch and Learn presentations. We will be reporting on all of the sessions here at The Innovative Instructor.

*********************************************************************************************************

Macie Hall, Senior Instructional Designer
Center for Educational Resources

Image source: Lunch and Learn logo by Reid Sczerba, Center for Educational Resources. Other images were taken from the presentations by Leah Jager, Margaret Taub, and Emily Fisher.