Lunch and Learn: Active Learning Techniques

On Tuesday, March 12th, the Center for Teaching Excellence and Innovation (CTEI) hosted a Lunch and Learn on Active Learning Techniques: Advice and Guidance from Experienced Faculty. Faculty panelists included Nate Brown, Senior Lecturer, University Writing Program (KSAS); Robert Leheny, Professor and Department Chair, Department of Physics and Astronomy (KSAS); and Michael Falk, Vice Dean of Undergraduate Education and Professor, Department of Materials Science and Engineering (WSE). Caroline Egan, Teaching Academy Program Manager, moderated the discussion.

Caroline began the session by asking panelists how they got started with active learning and what they would recommend to those who were thinking about implementing it.

Nate Brown described how he heavily relied on his lecture notes to deliver content to students when he first started teaching. As he’s gained more experience, he’s moved away from using lecture notes and actively involves students in their learning. Brown now structures his classes so that the students drive the conversation and solve problems collectively, leading to greater retention of information and increased engagement. He makes a point of having students think about “why they are doing what they are doing.”

Robert Leheny recalled how the Provost’s Gateway Sciences Initiative from several years ago provided funding to support the redesign of gateway science courses, such as the Introduction to Physics course, which he teaches. The goal of the Gateway Sciences Initiative was to evolve the pedagogy in large introductory science courses to produce better student outcomes.

The Intro to Physics redesign, which was modeled after similar large lecture style courses at NC State and MIT, moved away from a traditional lecture style to a much more interactive experience. Students are divided into groups and sit at round tables instead of in rows, and they solve problem sets together during class rather than at home. This in-class work is partly enabled by a flipped classroom approach which enables students to review the content before coming to class. Leheny said the department now offers two versions of Introduction to Physics that students can select from: an active learning style and a more traditional auditorium/lecture style.

Michael Falk first started implementing active learning by using clickers in a 200-student Introduction to Computer Programming course at the University of Michigan several years ago. Since he’s been at Hopkins, his classes have been smaller, allowing him to approach active learning in different ways. Falk gave an example of how he flipped an upper-level materials science course (which is now also part of the Gateway Computing program) so that students work in an online textbook outside of class and do more collaborative work during class. Another example is a First-Year Seminar class taught by Falk, Turing’s Shadow: Uncovering What’s Hidden in STEM. This discussion-based course covers a range of topics, some of which are sensitive, and students are often afraid to speak up. To address this, Falk created a series of discussion cards to help ease students’ discomfort. The cards provide discussion prompts for students, such as “Clarification: Ask for further explanation about something,” and they also inject some fun elements into the conversation by asking respondents to present their response “in the form of a song,” or “while walking around the room very quickly,” for example. It turns the exercise into a game and helps students to feel more comfortable participating in class.

Caroline continued by asking the panelists for their definition of active learning, along with a counterexample that would bring the definition into better relief.

NB: I think it involves giving students a stake in what we’re doing. For example, helping to define the parameters of a paper we’re going to write. I see the professorial role as one of support, like “air traffic control.” With active learning, students are involved in the creation of their own learning.

RL: The primary component of active learning in physics is peer instruction. Students need to be able to solve problems. We don’t use class time to introduce students to concepts, but instead give students an opportunity to practice solving problems where there are resources to help facilitate these skills. For example, students are divided into groups of three and explain to each other how they would go about solving a problem. The act of explaining the problem to someone else helps to solidify their own understanding. A counterexample would be the old way of the instructor speaking from the blackboard, talking uninterrupted for most of the class period.

MF: Active learning is learning by doing. Students are engaging with content in a supportive environment. We are teaching a different group of students at Hopkins now – there are many more first-generation, limited-income, and/or underrepresented students with very different backgrounds. We need to think proactively about leveling the playing field for students. This is evident in the data around class outcomes: classes taught using active learning techniques have lower levels of students failing or dropping out. This is even more true for students in underrepresented groups.

RL: We also see this in Intro to Physics. We have the two versions of the course: one in the auditorium (which may have some active learning elements in it), and one designed specifically as an active learning course. The homework and exams are the same in both courses. The outcomes show that failing grades are much less likely to occur in the active learning course.

MF: I used to think my job as an instructor was to deliver content and material. Now, with active learning, I think my job is to deliver an experience.

Caroline continued by asking panelists for a simple active learning technique that instructors can implement right away.

NB: This may sound crazy – it comes from a writer colleague of mine. I was having students read out loud in class and noticed they were struggling: they didn’t feel comfortable, they were shy, or were experiencing language barriers, etc. I then asked them all to read in chorus (at the same time). No one could make out what they were hearing, but the shock of it made sharing feel okay. It really helped them get over their nerves.

RL: Think-Pair-Share. This technique works very well in a large lecture environment. We give students a multiple choice question and have everyone vote on the answer. Next, they have to find someone who voted differently and try to convince them to change their answer. We then ask students to vote again. The results are that there are usually more correct answers the second time. You do need good questions for this to be effective.

MF: An idea for STEM classes: figure out a way to shorten your lecture and hand out the problem set at the end of class. Allow students to work on it with each other. Ask students to write down the steps for how they would solve each problem, but not actually solve it. Allow time for a report-out at the end. This gives them a chance to support each other while organizing information.

CE: A complementary Humanities example: In my first-year writing class, I hand out labeled strips of paper with our class readings on them and ask students to organize the strips in a way that would help them use the readings in their papers. Also, I give students writing prompts, break them into groups, and ask them to determine where they would find the answers to the prompts. This helps get them in the right mindset for locating good sources.

Two other examples of active learning were mentioned by faculty guests. One instructor explained how she has students use Legos to construct the analysis of an argument. They connect more and more Legos to build supportive elements of their argument and take away those that they disagree with. Another instructor mentioned that she has students act out responses in class.

The session continued with questions from the audience for the panelists:

Q: In reference to Think-Pair-Share, have you observed any competitiveness among students or reluctance to participate in these activities?
MF: We tell students it has nothing to do with their grade.
RL: We do the same. We also tell them there is no curve and it is possible for everyone to get an A, which reduces overall competitiveness.
NB: One of the great things about this exercise, where students are engaging with each other, is that they get to hear from peers that are from all over the world. We turn it into a social space where they can feel comfortable sharing.

Q: (From a librarian) I recently had about 30 minutes to work with students in a research class. I received feedback from a student that I didn’t do enough active learning in the class, despite doing a brainstorming exercise with them. What do you do when you need more active learning in such a short amount of time?
RL: Explain to students why you structured the class the way you did. It will help if you get their buy-in. Maybe the answer is to announce at the beginning that what you’re doing is in fact active learning.
MF: Students like playing – it makes for a positive learning experience. Perhaps turn part of it into a game/play. And then explain what and why you’re doing it this way.

Q: Are there any active learning experiences to share when you’re guest lecturing? Do you use the same or different strategies?
MF: It needs to be a different strategy. As a guest, you don’t have the advantage of repetition or control of the environment. Explain to students what you’re doing and do the best you can with the constraints that you’re under.
CE: Be very intentional about your choices. At the end, ask them one thing they will remember from the class. This is a good recall exercise.
NB: As a guest speaker, you already are a bit novel since your presence is different than their regular day. Maybe use a novel activity that they will remember.

Q: Could you each share how you put groups together intentionally instead of having students self-form?
RL: Students are put into groups of three. Groups are engineered this way – we switch a few times during the semester. The students don’t know it, but we add them to groups according to their performance on the midterm. In each group, there is one person who scored at the top, one from the middle, and one from a low level. The top-level person gets more practice articulating ideas. The lower-level person gets the benefit of working with someone who has command of the material. We also group according to gender: we avoid placing two men and one woman in a group to avoid women being excluded. There is research that supports this.
NB: We also do a lot of group work. Halfway through the semester, I ask students to work with someone they haven’t worked with before. I also ask them to sit next to someone different. It results in a richer peer review experience.
MF: I have students do a self-assessment at the very beginning of the course and use the results of the assessment to group students.

For more information about the active learning topics discussed at the event, please see this Active Learning For Distribution folder of materials developed by Caroline Egan.

Amy Brusini, Senior Instructional Designer
Center for Teaching Excellence and Innovation
 

Image source: Lunch and Learn logo, Unsplash, Pixabay

Lunch and Learn: Canvas Show and Tell

On Wednesday, November 1st, the Center for Teaching Excellence and Innovation (CTEI) hosted a Canvas Show and Tell: Share and Learn about Engaging and Effective Uses of Canvas. Alison Papadakis, Teaching Professor and Director of Clinical Psychological Studies in the Department of Psychological & Brain Sciences, hosted the discussion. She was joined by Emily Braley, Assistant Dean for Undergraduate Academic Affairs and Associate Teaching Professor in the Department of Mathematics, and Jamie Young, Lecturer in the Department of Chemistry. Beth Hals, Brian Cole, and Caroline Egan from the CTEI helped facilitate the event.

Alison Papadakis opened the discussion by describing how her interest in Canvas began with her kids, who were using it during COVID. (JHU was still using Blackboard at that time.) Watching her kids struggle with poorly designed Canvas classroom interfaces influenced the way she organized her own Canvas classroom once JHU adopted it as its new learning management system (LMS). One big decision she made was to stay away from the Module function, the most common way to organize content in Canvas. Instead, Papadakis explained how she used the Canvas Page function to create a page with a table outlining her course schedule, with hyperlinks to the rest of her content. The homepage of her Canvas site looks like a calendar with hyperlinks for each class day. She regularly checks in with her students, asking if they have trouble finding anything in the course, and they always assure her that they do not. Papadakis also makes the Files area in Canvas available to her students as an additional way to access course content, but they tell her they don’t use it. She says the course schedule page is not the “prettiest” display of content, but the functionality works very well for her course, and students can easily find what they need for each class period.

Papadakis also does a lot of student advising and needed a place to post links and share information with students. She decided to use a community site, which is similar to a website, but built inside of Canvas. All majors and minors have access to the site as well as other faculty; it is also possible to add other users to the site if necessary. Brian Cole clarified that the key difference between a standard Canvas course and a community site is that a standard site is for credited courses and is automatically generated by JHU’s Student Information System (SIS). Community sites, which all faculty have the ability to request, are for non-credit activities and are intended to share information and resources across multiple populations.

Emily Braley described how the mathematics department is using a community site to host their math placement exam. The university’s switch to Canvas provided an opportunity to revise the exam, which was previously hosted in Blackboard. In Canvas, students are provided with more information about why they are taking the exam as they are guided through a series of steps to help them decide which exam to take. Braley described how, with the help of CTEI staff, they embedded a Microsoft Form inside of Canvas that asks students what math courses they took in high school, including AP courses. The branching feature of the form then directs students to the appropriate placement exam based on their answers. There are also practice tests that students can take before the actual exam.

The exam itself is set up using a Canvas feature called Mastery Paths. This feature allows an instructor to set up to three ranges of scores for the exam; once they take the exam, student scores are translated into a recommendation for enrollment. Braley also created a customized grading scheme for the exam, which contains information about interpreting the results as well as the actual score for the students.

Braley is very excited about the potential for data analytics with the revised exam process. Using the form provides the department with data which can help identify trends and determine if students are being placed correctly. All incoming math students are encouraged to take a math placement exam; so far this fall, close to 1100 students have taken the placement exam.

Jamie Young was looking for a way to avoid answering the same questions repeatedly from the 640 students in his Introduction to Chemistry lab course. Using HTML code, he was able to create a dropdown FAQ page in Canvas containing embedded links. He estimates he has received 50-60% fewer questions so far this semester since posting the FAQ page. He also used HTML to add buttons and links to his syllabus that link out to everything in the course, similar to Alison Papadakis’s course schedule. He believes this saves time for students, as they are able to find many things very quickly. Additionally, Young embedded a live Google Document into the course that contains his course schedule. This makes it easy to update the schedule when necessary, as any changes made are immediately pushed to Canvas – no need to upload an edited document each time a change is made.
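The post doesn’t reproduce Young’s actual markup, but a no-JavaScript dropdown FAQ entry and a live Google Doc embed of the kind he describes can be sketched in plain HTML. The question text, link, and document ID below are placeholders, not his, and whether particular tags survive Canvas’s HTML sanitizer can vary by instance.

```html
<!-- One collapsible FAQ entry. <details>/<summary> needs no JavaScript,
     which matters because Canvas strips <script> tags from pages. -->
<details>
  <summary><strong>When are lab reports due?</strong></summary>
  <p>Reports are due 48 hours after your lab section ends. See the
     <a href="https://example.instructure.com/courses/123/assignments">assignments page</a>.</p>
</details>

<!-- A live Google Doc (published to the web) embedded in the page,
     so edits made in the Doc appear in Canvas automatically. -->
<iframe src="https://docs.google.com/document/d/PLACEHOLDER_DOC_ID/pub?embedded=true"
        width="100%" height="600" title="Course schedule"></iframe>
```

Because Canvas removes script tags, patterns that work without JavaScript, like `<details>`/`<summary>`, are generally the safest choice for this kind of page.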

In another course, with a combined lecture and lab, Young struggled with displaying a large amount of content. He initially put everything into modules but wasn’t happy with how disorganized they became after adding so much material. He has since turned each module into its own page and links everything from the page. This has been working out much better – again, students are able to find things quickly and easily. Young insists you don’t need much coding knowledge to take advantage of these features in Canvas; you do need to know – or have access to – a few HTML commands.

The discussion included the following questions from the audience:

Q (for Alison Papadakis): Do you need coding experience to create this [the course schedule]?
AP: I just created it in Word and cut and pasted it in – no coding necessary.

Q (for Alison Papadakis): How do you link the “tone” of your course to the course schedule?
AP: This is an in-person course, so there is a lot of in-class discussion around the course and how it works at the beginning. The course schedule is just the pragmatic piece so we can keep things organized.

Q (for Alison Papadakis): It looks like you assign readings before the semester begins – do you plan everything ahead of the semester, before it starts?
AP: I have taught this course over ten times, so I know basically what’s coming. I put placeholders in for things I don’t know yet. You’ll notice it says ‘Tentative Schedule’ so I can allow for shifting things around if needed. I do need to remember to update the Canvas calendar when making changes to my course schedule.

Q (for Alison Papadakis): Can anyone access the community site?
AP: No, they have to be added to the roster.

Q: (For Beth Hals, CTEI’s Sr. Instructional Technologist) Can you explain Mastery Paths? Is it the same as locking/unlocking a Module?
BH: Mastery Paths are affiliated with some sort of assessment in Canvas. As the instructor, you can set three different sets of score ranges that you use to then send students on their next ‘path’ based on their results. Unlocking modules is a little different – you first set prerequisites on a module that must be completed before the module will unlock.

Q (for Jamie Young): To a neophyte, it’s a little overwhelming to see what you’ve done – there seem to be many ways of doing the same thing. Could you compare and contrast the ways of organizing your syllabus?
JY: You can use the Rich Content Editor (RCE) in Canvas to build your syllabus. If you want to add something like buttons, you would then toggle the RCE to view the HTML editor. Using HTML is more complicated for sure, but with some basic knowledge you can do it. I would be happy to share what I’ve done and then you can just fill in your information and cut and paste it into your course. To embed the Google Form, I followed online directions that I googled.

Brian Cole, CTEI’s Associate Director for Instructional Technology: You don’t need any HTML knowledge to embed anything into Canvas. You can use the Rich Content Editor (RCE) to do this. There is an “embed” option in the menu of the editor. You also don’t have to do every page. You can pick and choose what parts of your course to make pretty.

Q: Did Jamie build his syllabus in AEFIS?
BC: No, Jamie built his syllabus using the Canvas Syllabus page. You can still use your own syllabus in conjunction with the AEFIS syllabus – they can coexist. (Note: AEFIS has since been renamed HelioCampus.)

Q (for Jamie Young): Could you provide a little more information on creating tabs?
JY: They are just HTML code. I used HTML5. You have to go into the HTML editor in Canvas and use “div” tags to build tabs. Start with the blank tabs in HTML, then go back to the RCE and fill in the text as needed. You can use copy and paste to make it easier.
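Young’s exact tab markup isn’t shown. A widely shared pattern in the Canvas community wraps “panel” divs behind a list of anchors, with the tab behavior supplied by Canvas’s own page scripts; the `enhanceable_content tabs` class below is an assumption based on that pattern, and whether your institution’s Canvas theme still renders it as tabs may vary.

```html
<!-- Tab headers: each anchor points at the id of a panel div below. -->
<div class="enhanceable_content tabs">
  <ul>
    <li><a href="#fragment-1">Lecture</a></li>
    <li><a href="#fragment-2">Lab</a></li>
  </ul>
  <!-- Tab panels: build these blank in the HTML editor first, then
       switch back to the Rich Content Editor to fill in the text. -->
  <div id="fragment-1">
    <p>Lecture slides and readings go here.</p>
  </div>
  <div id="fragment-2">
    <p>Lab handouts and safety notes go here.</p>
  </div>
</div>
```

This matches the workflow Young describes: lay out the empty div structure in the HTML editor, then fill in each panel’s content in the RCE, duplicating and pasting as needed.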

Q: Can I move JavaScript headers into Canvas?
BC: No, Canvas will strip them out. An alternative is to embed the page into the Canvas page.
BH: There is something called the Redirect tool that may help. This tool adds an item to your navigational menu. You pick the text for what will display in your menu and it will link to a particular page.

Q: Any ideas about making grading easier?
EB: We use auto grading on all quizzes. We also use banks of questions, so that each quiz pulls from different banks. New Quizzes has matching question types that are more work for students and more robust, but still auto graded. Another thing about New Quizzes is the ability to render LaTeX [a typesetting system for math notation]. This has been very useful for us – it’s so much cleaner for students. It renders as accessible MathML, which can be read by a screen reader. This is much better than posting a PDF that is read as an image.
We also use Gradescope, which is an external tool that helps us streamline grading. Students upload their work to Gradescope (inside of Canvas) and you can set it up to help auto grade problems.
JY: We also use Gradescope extensively in Chemistry. We scan written work into Gradescope and it is automatically graded. The system has gotten better at reading handwriting. It has made handwritten assignments so much easier to grade. One caveat about Canvas quizzes: they don’t allow for numbers past 4 decimal places, which we need.

A word about accessibility in Canvas:
EB: You can have Canvas tell you if your material is accessible or not. Use the accessibility checker in the RCE to help you with this.
BH: I also wanted to mention that it’s very easy to duplicate pages in Canvas – build it once, duplicate the page, then fill in what you need to change. It’s like building a template for yourself and reusing it.

For more information about topics discussed at the event, please see this Canvas resource developed by Beth Hals.

Amy Brusini, Senior Instructional Designer
Center for Teaching Excellence and Innovation
 

Image source: Canvas logo

Community Conversation: Facilitating Difficult Conversations in the Classroom

The Center for Teaching Excellence and Innovation (CTEI) hosted a community conversation on Facilitating Difficult Conversations in the Classroom on Thursday, November 9th, as a follow-up to our recent blog post on the same topic. The faculty panel included Sherita Golden, Chief Diversity Officer at the School of Medicine & Hugh P. McCormick Family Professor of Endocrinology and Metabolism; Andrew Perrin, SNF Agora Professor and Department Chair of Sociology; and Mike Reese, Associate Dean of the Center for Teaching Excellence and Innovation & Associate Teaching Professor of Sociology. Caroline Egan from the CTEI facilitated the event.

Mike Reese opened the conversation by acknowledging that current events are an opportunity to apply course concepts, but that the instructor’s goal is to maintain civility around sensitive topics. He described some of the strategies he uses in his classroom, such as setting ground rules for class discussions. Reese explains the intention of the rules to his students: to create a space in which everyone feels comfortable participating. Some of his ground rules include:

  • Support arguments with evidence
  • Use ‘I’ statements – do not speak for others in the class
  • Do not generalize about groups
  • Allow students to speak without interruption (with the caveat that the instructor can cut them off if they go on too long)
  • Listen actively – be open to what others are saying
  • Name-calling, sarcasm, inflammatory accusations are not permitted

Reese noted that in his classes, historically, the issue has been less that conversations become heated and more that students are hesitant to talk about politically charged topics. He mentioned strategies to spark engagement, such as having students first work in small groups to discuss a topic. He also uses structured debate activities in which students are assigned a specific perspective. In the debate activity, the pressure is taken off the student since the role/perspective is assigned by the instructor.

In cases of traumatic events, Reese stated that instructors are not required to discuss the topic if they are not comfortable doing so, but should at least acknowledge the event. Research suggests students want events to be acknowledged, not ignored. Reese shared that a student told him this past week that they don’t need to discuss the issue in every course, as the constant reminder may be counterproductive, but would prefer to discuss the event in courses that speak directly to the issue. The main message: be intentional about how you plan to discuss traumatic events and what boundaries you will place on the conversation before class begins.

Andrew Perrin continued the conversation by describing two principles he subscribes to:

  1. While all people deserve safety, no ideas deserve safety. The way we honor ideas is by submitting them to rigorous argument and evidence and testing them out.
  2. Emotions running high is not a reason to avoid discussions – it’s a reason to have better discussions. Explain and demonstrate to students how to listen thoughtfully to ideas and make judgments based on evidence.

While he supports Reese’s ground rules, Perrin takes a slightly different approach: he will often set the context for students, explain a scenario, and have them listen to their peers discuss the issue, recognizing their own ideas may turn out to be wrong. His pedagogical goal is to make sure that all reasonable ideas are raised so that they may be examined and challenged. At times Perrin will make arguments that he doesn’t believe in because he feels it’s important for them to be part of the conversation.

He believes most students politically are not committed either to the left or the right; they might be in between, they might not have thought about it much, or they might have mixed opinions. The instructor’s job is to make sure the debate includes more ideas than just those from the 5-10% on the right and left. Perrin acknowledged that it might be hard to engage on difficult topics and students might be uncomfortable, but that is an acceptable outcome in a university classroom. He stated, “Too often we think the goal is to come to consensus. I think the goal is to understand why people feel the way that they do.”

Sherita Golden teaches medical students in a clinical environment as well as staff from all parts of the medical establishment. One of the issues she addresses in her classroom is why we continue to see inequities in health – for example, non-whites consistently having a higher prevalence of diabetes than whites. Golden explains to her students how historical discrimination and racism dating back to the time of enslavement have led to the current situation: eugenics theories suggesting the biological inferiority of non-white populations; trust violations by the medical establishment due to unconsented experimentation on enslaved and marginalized communities; and healthcare clinician bias against minoritized patients. These are all factors adversely affecting healthcare quality and access today. Another example is the historic practice of redlining, which made it more difficult for African-Americans to obtain mortgages and build wealth through their homes. Golden uses these historical contexts to explain the roots of current social movements to her students and health system staff.

Golden acknowledged that conversations can become politically charged and offered the following suggestions to help defuse the situation:

  • Acknowledge the humanity of the person speaking – listen with compassion and intent to respond rather than react.
  • Commit to providing information to the person if you aren’t sure how to address the issue in the moment.
  • Adopt a learner’s mindset:
    • What is the historical context of the issue?
    • What do you know about the lived experiences of those expressing the concern?
    • How might you benefit from knowing more in your teaching/leadership role?
  • Recognize that there are two sides to every story and the truth is somewhere in the middle.

The discussion continued with panelists taking questions from participants.

Q: I appreciate your [Perrin’s] idea of exploring other ideas of a particular subject, but at the same time, as we examine these charged issues and try to be objective, sometimes the discussion becomes politically neutral. I’m struggling with how I can resolve this issue of technical neutrality.

AP: I don’t think there are 2 sides, but actually 4, 5, or even more sides to every story. There are lots of different dimensions. One reason I don’t ask students to debate positions according to what I’ve set is because I think it is important for them to feel like what they’re saying matters and that they care about the issue. I’m not trying to say, “all ideas are fine” but instead, “all ideas deserve to be listened to.”  I do think historical context is important, as well as real world evidence, so it’s important not to let them stick with things that aren’t true.

MR: The classroom space allows students to voice lots of different ideas, and sometimes students come out not knowing where to go from there. Here is where reflection may help – ask them to articulate how they have moved on a particular issue, if at all.

SG: The goal is not to come to a resolution, but help them improve their argument and use of evidence. I encourage students to read constantly – learning is a lifelong process. Reading will help me (and them) back up why I feel a certain way, with evidence.

Q: How do you create space to honor opinions that you feel are wrong or will harm others?

AP: The key is how we say things. The idea is there, so there is reason to challenge it – and it is subject to evidence. The person should be able to explain why they think what they think. It really is important for people to be pushed to explain why they think this or that, where the evidence is, what makes this true. I like to ask, “Is there anything you could learn that, if true, would end up changing your position on this?” It is important to distinguish between what is true and what people think. So even potentially harmful ideas need to be discussed, if only to bring evidence to show why they are harmful. That said, it is also important to protect students who may be hurt. I will sometimes remind students that there are probably other students in the room who may be negatively affected by what they’re saying.

Guest: This makes me think about a case in class: we watched a film about a kidnapped woman in China who was sold to a villager as a wife. In the film there was a sympathetic attitude to the men in the villages that I disagreed with. This kind of sentiment was hard to watch.

MR: This is why reflection as the instructor is important, too. If the discussion did not address your goals for the class, then perhaps reflect on how you might structure it differently next time.

SG: One of my favorite phrases to use in these situations: “Help me understand why you made that comment… What is at the root of what you’re saying?” I work in a clinical setting as well as an educational setting. We must show dignity and respect to all patients. We fall back to the core values of JH medicine: we need to be respectful of different points of view and perspectives.

Q: How do you de-escalate a conversation when it becomes highly charged?

AP: I like charged topics, I don’t feel scared of them. I like to ask students: “Why do you think that, what makes you say that, what do you think your opponents think?” It’s not useful for me to throw around “flashpoint” words; it works better for me to stick to the questions I just mentioned. In the classroom, you do have to move forward at some point. I ask the questions and try to keep the conversation moving.

MR: It’s also ok to take a break.

SG: I suggest listening sessions. This is a very important way to learn – let the person talk, we can all learn from that.

 

Amy Brusini, Senior Instructional Designer
Center for Teaching Excellence and Innovation
 

Image Source: Unsplash

Lunch and Learn: Generative AI – Teaching Uses, Learning Curves, and Classroom Guidelines

On Tuesday, October 3rd, the Center for Teaching Excellence and Innovation (CTEI) hosted its first Lunch and Learn of the academic year, a panel discussion titled “Generative AI: Teaching Uses, Learning Curves, and Classroom Guidelines.” The three panelists were Jun Fang, Assistant Director of the Instructional Design and Technology Team in the Carey Business School; Carly Schnitzler, instructor in the University Writing Program (KSAS); and Sean Tackett, Associate Professor in the School of Medicine. The discussion was moderated by Caroline Egan, Project Manager in the CTEI. Mike Reese, Director of the CTEI, also helped to facilitate the event.

The panelists began by introducing themselves and describing their experiences with generative AI. Jun Fang loves new technology and has been experimenting with AI since its inception. He noticed that the faculty he works with generally fall into two categories when it comes to using AI: some are quite concerned about students using it to cheat and are not ready to adopt it, while others see a great deal of potential and are very excited to use it in the classroom. In speaking with colleagues from across the institution, Fang quickly realized these are common sentiments among faculty in all JHU divisions. This motivated him to lead an effort to create a set of AI guidelines specifically geared toward faculty. The document contains a number of strategies for using AI, including designing engaging course activities, providing feedback for students on their assignments, and redesigning course assessments. The section on redesigning course assessments describes two approaches: the “avoidance approach,” which involves deliberately designing assessments without AI, and the “activation approach,” which intentionally integrates AI tools into the curriculum. The document includes specific examples of many of the strategies mentioned as well as links to widely used generative AI tools.

Fang described a recent scenario in which a faculty member was concerned that students were using ChatGPT to generate answers to online discussion board questions. To address this, Fang suggested the faculty member revise the questions so that they were tied to a specific reading or to a topic generated in one of his online synchronous class sessions. Another suggestion was to have students submit two answers for each question – one original answer and one generated by ChatGPT – and then have the students compare the two. The faculty member was not comfortable with either suggestion and ended up making the discussion a synchronous activity rather than an asynchronous one. Fang acknowledged that everyone has a different comfort level with using AI and that one approach is not necessarily better than another.

Carly Schnitzler currently teaches two introductory writing courses to undergraduates and is very open to using generative AI in her classroom. At the start of the semester, she asked students to fill out an intake survey that included questions about previous writing experiences and any technologies used, including generative AI. She found that students were reluctant to admit they had used these technologies, such as ChatGPT, for anything other than ‘novelty’ purposes because they associated the tools with cheating. After seeing the results of the survey, Schnitzler thought it would be beneficial for students to explore the potential use of generative AI in class. She asked students to complete an assignment in which they created standards of conduct for a first-year writing class, which included discussing their expectations of the course, the instructor, their peers, and how AI would fit in among these expectations. The class came up with three standards:

  1. AI tools should support (and not distract from) the goals of the class, such as critical thinking, analytical skills, developing a personal voice, etc.  
  2. AI tools can be used for certain parts of the writing process, such as brainstorming, revising, or editing, but students must disclose that AI tools were used. 
  3. If there appears to be an over-use of or over-reliance on AI tools, a discussion will take place to address the situation rather than immediate disciplinary action. (Schnitzler wants students to feel safe exploring the tools without fear of repercussion.) 

This assignment comes from an open collection of cross-disciplinary assignments that use text generation technologies, mostly in a writing context. TextGenEd: Teaching with Text Generation Technologies, co-edited by Schnitzler, consists of freely accessible assignments submitted by scholars from across the nation. Assignments are divided into categories, such as AI literacy, rhetorical engagements, professional writing, creative explorations, and ethical considerations. Most are designed so that the technologies used are explored by students and instructors together, requiring very little ‘expert’ technological skill. Schnitzler noted that there is a call for new submissions twice each year and encouraged instructors to consider submitting their own assignments that use text generation AI.

Sean Tackett was initially fearful of ChatGPT when it was released last year. Reading article after article stating how generative AI was going to “take over” pushed him to learn as much as he could about this new technology. He began experimenting with it and initially did not find it easy to use or even necessarily useful in his work with medical school faculty. However, he and some colleagues recognized potential in these tools and ended up applying for and receiving a JHU DELTA grant to find ways they could apply generative AI to faculty development in the medical school. Tackett described how they are experimenting with generative AI in a curriculum development course that he teaches to the med school faculty. For example, one of the tasks is for faculty to learn to write learning objectives, so they’ve been developing prompts that can be used to specifically critique learning objectives. Another example is developing prompts to critique writing. Most of Tackett’s students are medical professionals who do not have a lot of time to learn new technologies, so his team is continually trying to refine prompts in these systems to make them as useful and efficient as possible. Despite being so busy, Tackett noted the faculty are generally enthusiastic about having the opportunity to use these tools.     

The discussion continued with a question and answer session with audience members: 

Q: How do we transfer and integrate this knowledge with teaching assistants who help manage the larger sized classes? What about grading?
ST: I would advocate for the potential of AI to replace a TA in terms of grading, but not in terms of a TA having a meaningful dialogue with a student. 
JF: Generative AI tools can be used to provide valuable feedback on assessments. There are a lot of tools out there to help make grading easier for your TAs, but AI can be used for the feedback piece. 

Q: How might professors provide guidelines to students to use generative AI to help them study better for difficult and complex topics?
MR: One possibility is to generate quiz questions and then have students follow up by checking the accuracy of the generated questions.
CS: Using a ChatGPT or other text generation tool as a reading comprehension aid is something that has been useful for non-native English speakers. For example, adding a paragraph from an academic article into ChatGPT and asking what this means in plain language can be helpful.

CE: This gets to what I call ‘prompt literacy,’ which is designing better prompts to get better answers. There is a very good series about this on YouTube from the University of Pennsylvania.
Sean, what have you experienced with prompting right now, in terms of challenges and opportunities?
ST: We’re trying to put together advice on how to better prompt the system to get more refined and accurate answers. After a few iterations of prompting the system, we refine the prompt and put it into a template for our faculty, leaving a few ‘blanks’ for them to fill in with their specific variables. The faculty are experts in their subject areas, so they can tell if the output is accurate or not. We’re in the process of collecting their output, to put together best practices about what works, what does not work.  
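Tackett's template idea can be sketched as a fill-in-the-blank prompt: the refined wording stays fixed, and faculty supply only their course-specific variables. The template text and field names below are illustrative assumptions, not the actual prompts his team developed.

```python
from string import Template

# Hypothetical prompt template: the fixed wording encodes the refined
# instructions; faculty fill in only the course-specific "blanks".
CRITIQUE_TEMPLATE = Template(
    "You are reviewing a learning objective for a $course_level course "
    "in $subject. Critique the following objective for specificity, "
    "measurability, and alignment with $framework:\n\n$objective"
)

def build_prompt(course_level: str, subject: str,
                 framework: str, objective: str) -> str:
    """Fill the template's blanks with one instructor's variables."""
    return CRITIQUE_TEMPLATE.substitute(
        course_level=course_level,
        subject=subject,
        framework=framework,
        objective=objective,
    )

prompt = build_prompt(
    course_level="graduate",
    subject="medical education",
    framework="Bloom's taxonomy",
    objective="Students will understand curriculum development.",
)
print(prompt)
```

Because the subject-matter experts only touch the blanks, the refined instructions survive intact from one faculty member to the next, which is what makes collecting comparable output for "best practices" feasible.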

CE: What would you all like to see in terms of guidelines and best practices for AI on a web page geared towards using AI in the classroom?
Guest: And along those lines, how do we move forward with assigning research projects, knowing that these tools are available to students?
ST: I think it could be useful for students to learn research skills. They could use the tools to research something, then critique the results and explain how they verified those results. It can also be useful for generating ideas and brainstorming. Another thought is that there are a number of domain specific generative AI databases, such as Open Evidence which is useful in the medical field.  
CS: To Sean’s point, I think a comparative approach is useful with these tools. The tools are very good at pattern matching genre conventions, so doing comparative work within a genre could be useful.
JF: I think ChatGPT and other generative AI tools can be useful for different parts of the research process, such as brainstorming, structure, and editing. But not for something like providing or validating evidence.  

Q: As a grad student, I’m wondering how the presence of AI might force us to refine the types of questions and evaluations that we give our students. Are there ways to engineer our own questions so that the shift of the question is changed to avoid the problem [of having to refine and update the question] in the first place?
CS: There is an assignment in our collection that talks about bringing an assignment from past to present. Again, thinking in terms of a comparative approach, ask ChatGPT the question, and then ask your students the same question and see how they compare, if there are any patterns.  I think it can be helpful to think of ChatGPT as adding another voice to the room.
JF: We have a section in the guidelines on how to redesign assessment to cope with generative AI related issues. We suggest two approaches: the avoidance approach and the activation approach. The avoidance approach is for faculty who are not yet comfortable using this technology and want to avoid having students use it.  One example of this approach is for faculty to rework their assignments to focus on a higher level of learning, such as creativity or analysis, which will hopefully reduce or eliminate the opportunity for students to use AI tools. The activation approach encourages faculty to proactively integrate AI tools into the assessment process. One example of this approach I mentioned earlier is when I suggested to a faculty member to rework their discussion board questions to allow students to submit two versions of the answers, one created by them and the other by ChatGPT, and then analyze the results. 

Q: What is the ultimate goal of education? We may have different goals for different schools. Also, AI may bridge people from different social backgrounds. In China, where I grew up, the ability to read or write strongly depends on the social status of the family you come from. So there is some discomfort using it in the classroom.
CS: I feel some discomfort also, and that’s what led to the development of the guidelines in my classroom. I posed a similar question to my students: if we have these tools that can allegedly write for us, what is the point of taking a writing class?  They responded by saying things like, “writing helps to develop critical thinking and analytical skills,” to which I added, “being here is an investment in yourself as a student, a scholar, and a thinker.” I think asking students to articulate the value of the education that they want to get is really helpful in determining guidelines for AI.
ST: Going to school and getting an education is an investment of your time. You pay now so you can be paid later. But it’s not as transactional as that. AI is already in the work environment and will become more prevalent. If we’re not preparing students to succeed in the work environment, we are doing them a disservice. We teach students to apply generative AI in their classes so they are prepared to use it in the workforce.
JF: In the business school, everything is market driven. I think education can fit into that framework as well. We’re trying to provide graduates with the confidence they need to finish the work and meet the market’s need. We know that generative AI tools have really changed the world and they’re starting to emerge in every part of our life. We need to train students to realize that ChatGPT might be part of their education, part of life in the future, and part of the work in the future as well. There are things AI can help us do, but there are still fundamentals that students need to learn. One example is calculators: we still need to learn from the beginning that 1 + 1 = 2. 
CE: This question also reminded me of asking your students, what is the ultimate purpose of a research paper? Where do they think ChatGPT should fit into the research process?  

Q: I work at the library and we’re getting lots of questions about how to detect if students are using AI. And also, how do you determine if students are relying too heavily on AI?
JF: We also get this question from our faculty. The most used detection tool right now is Turnitin, which is embedded in Canvas. But the level of accuracy is not reliable. We encourage faculty to always validate before accepting the results.  For faculty who are actively using AI in the classroom, we also encourage them to provide clear guidance and expectations to students on how they are allowed to use it.  This may make it a little easier to determine if they are using it correctly or not.
MR: There are some other tools out there, such as GPTZero and ZeroGPT, but to Jun’s point, the difficulty is that this is different from plagiarism detection, which says, “this is copied, and here’s the source.” These tools say there’s a probability that part of the text was AI-generated, but you can’t point to a direct source. It’s up to instructors whether or not to use these tools, but consider using them to facilitate a conversation with students. In my own classes, if I suspect academic misconduct, I usually start by asking students to explain, to talk to me about what is happening, before I make accusations. With these tools, there tends to be no hard evidence, just probabilities that something may have happened. This is definitely an area we’re all still learning about.
Guest: I was just thinking that having a conversation with students about why they are turning to the tool in the first place might prevent misconduct.  Instead of sending them to an academic misconduct committee, we could have these conversations, like Carly mentioned. Making students aware of the limitations of the tool could also be helpful.
CS: Yes, I state in our guidelines that I’m prioritizing conferences with students over immediate disciplinary action. I try to pre-empt any anxiety students might feel around using these tools. Designing your assignments in a way that reduces anxiety is also helpful. For example, I tend to design assignments that build on one another throughout the semester in smaller pieces, rather than one giant chunk all at once.

Q: Is there any discussion around combining AI with teaching, such as generating personalized explanations of a topic? Students will have different levels of expertise and comfort with different topics.
ST: We’re trying to do this, to create a teaching aid for the future. We’re planning to use it to create assessment items.  

Amy Brusini, Senior Instructional Designer
Center for Teaching Excellence and Innovation
 

Image Source: Pixabay, Unsplash

 

Adapting to AI in the Classroom for Time-Strapped Instructors

In the past few months, we have spoken to many instructors – faculty, graduate students, even undergraduate teaching assistants –  who are doing very interesting things with artificial intelligence tools in their classes this coming fall. Some are writing grants to support research into classroom uses of AI, some are designing interactive online modules to help teach about the ethics of AI, and some are integrating AI tools into their instructional activities.

This blog post is for another instructor population: those that have not had the time or capacity to redevelop their courses, their assessments, or their activities to accommodate an AI world. “Redesigning assessments with AI in mind” might be the 20th item on a long list of to-dos for the coming semester. Adapting to new technologies that could change the classroom experience – and AI is certainly one of them – seems like an overwhelming task. Classes start in one week, and wrestling with the teaching and learning opportunities and challenges of artificial intelligence may not be an achievable goal.

However, there are some concrete steps and curated resources to take into account in terms of AI when planning and teaching your courses.

Recommendations for Starting with AI

Here are six recommendations (and one extra credit assignment). Following all of these suggestions will put you on good footing with the learning curve associated with AI in the classroom, but even doing one or two is a good way to start.

  1. Experiment with ChatGPT and other AI tools. Just get in there and start using them and see what they produce. In an article for the Chronicle of Higher Education, one writer said, “I started by reminding myself, anytime I was about to Google something, to ask ChatGPT.”[1] ChatGPT-ing (or using Google Bard) instead of Google-ing is a good on-ramp to AI usage. You may even find them useful to you as an instructor. Here are four basic generative AI models to start with along with prompt suggestions:
    1. ChatGPT – The first (and by some reports, still the most accurate) text-based generative AI. Prompt suggestion: Ask a basic question about teaching, e.g., “How can I grade exams more efficiently?” or “How can I provide written feedback more efficiently?”
    2. Google Bard – Less text-heavy than ChatGPT; potentially geared towards more logic-based questions, e.g., “How do I create a website in WordPress?”
    3. Microsoft Bing – Able to generate images as well as text and simultaneously harness the power of a search engine. Potential question: “Name the characteristics of neo-classical architecture and provide an example.”
    4. Fotor.com – Image-generating AI. Potential prompt: “Provide an illustration for my chemistry class syllabus.”
  2. Run your assignments through an AI tool. This will help benchmark possible AI-generated responses to your assignments. More sophisticated AI users will engage in prompt engineering that could make uncited or incorrect usage of AI harder to detect, but getting at least one example of an AI response is helpful. It will not only provide a sightline into possible academic integrity issues but also point to whether your assignment may need to be revised or redeveloped, which could include integrating AI itself. Derek Bruff, a writer and higher education consultant, provides good guidance on assessment design in light of AI:
    1. Why does this assignment make sense for this course?
    2. What are specific learning objectives for this assignment?
    3. How might students use AI tools while working on this assignment?
    4. How might AI undercut the goals of this assignment? How could you mitigate this?
    5. How might AI enhance the assignment? Where would students need help figuring that out?
    6. Focus on the process. How could you make the assignment more meaningful for students or support them more in the work? [2]
  3. Add an AI policy to your syllabus. This may require doing some or all of the recommendations above, but even if you do not have the capacity to take a deep dive into AI tools before courses start, it is a good idea to take a stab at a policy, even if it is brief. You will likely be adapting this policy fairly quickly; the sooner you develop a benchmark policy and determine what works and what does not, the better. Lance Eaton, a doctoral student in higher education at the University of Massachusetts at Boston, has crowdsourced a Google Document with many helpful examples of AI policies for syllabi. This is an excellent place to start.
  4. Determine your academic integrity policy for AI. This may be part of your general AI policy or it could be separate. Regardless, this will probably be V.1 of your academic integrity policy, but again, starting now will put you in a good position to iterate as needed. To start, review Academic Integrity Policies for Johns Hopkins Schools. Lance Eaton’s Google Document (above) has many examples of AI policies that include academic integrity statements.
  5. Teach your students how to cite AI tools. This information could be incorporated into a syllabus policy and/or academic integrity policy, but correct citation – at least according to August 2023 recommendations of these style guides – is step number one. Making your students aware that they need to cite uses of AI tools and giving them the tools for doing that will (hopefully) incentivize compliance with your academic integrity policies.
    1. APA Citation Guidance – ChatGPT
    2. MLA Citation Guidance – Generative AI
    3. Chicago Style Citation Guidance – ChatGPT
    4. Johns Hopkins Library Guide on Citation
  6. Talk to your local center for teaching and learning. All Hopkins Schools have teaching and learning centers, some have been publishing guidance on how to teach and learn with artificial intelligence tools, and many have been considering the possible consequences of AI in the classroom. Here’s a list of teaching and learning centers at Hopkins, and here are two rich resources developed by two CTLs at Hopkins:
    1. Teaching & Learning in the ChatGPT Era. This website was created by the Center for Learning Design & Technology at the Whiting School of Engineering. It provides a great overview of generative AI as well as guidance on academic integrity questions, student use of AI, and assessment design with AI. Kelly Orr, Nathan Graham, Olysha Magruder, Mel Rizzuto, and Edward Queen of the CLDT all contributed to the website, as did adjunct faculty member David Porter.
    2. Johns Hopkins University Generative AI Tool Implementation Guidance and Best Practices. Jun Fang, Assistant Director in Teaching & Learning@Carey in the Carey School of Business, led the development of this resource with contributions from representatives of other schools and teaching and learning centers at Hopkins. The guide offers substantial guidance on using generative AI to design engaging course activities and provide assignment feedback, and it includes a list of AI tools for higher education.

Extra credit assignment for those with a little more capacity:

  1. Learn a little about prompt engineering. Prompt engineering is developing and refining questions and statements for AI models such that they generate results with the desired specificity, tone, length, citations, etc. This will give you a sightline into AI capacities beyond a simple one-time command (e.g., “Compare and contrast models of femininity in Shakespeare’s Much Ado About Nothing and Taming of the Shrew”) which may yield an overly broad answer that lacks specificity and nuance. Prompt engineering will also help you learn to direct and guide AI models and not just react to them. For a useful beginner’s guide to prompt engineering, check out the brief video on prompting AI from Wharton School instructors.
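As a rough illustration of the difference between a one-time command and an engineered prompt, the sketch below appends explicit constraints (audience, length, structure) to the same base question. The constraint wording and the helper function are hypothetical, not drawn from any particular prompting guide.

```python
def engineer_prompt(base_question: str, audience: str = "",
                    length: str = "", structure: str = "") -> str:
    """Append any supplied constraints to a base question, one per line."""
    constraints = [
        f"Audience: {audience}" if audience else "",
        f"Length: {length}" if length else "",
        f"Structure: {structure}" if structure else "",
    ]
    lines = [base_question] + [c for c in constraints if c]
    return "\n".join(lines)

# A broad one-time command, likely to yield an overly general answer...
simple = engineer_prompt(
    "Compare and contrast models of femininity in Shakespeare's "
    "Much Ado About Nothing and Taming of the Shrew."
)

# ...versus an engineered prompt that pins down audience, length, and shape.
refined = engineer_prompt(
    "Compare and contrast models of femininity in Shakespeare's "
    "Much Ado About Nothing and Taming of the Shrew.",
    audience="undergraduates in an introductory Shakespeare course",
    length="about 300 words",
    structure="thesis, two contrasting examples per play, conclusion",
)
print(refined)
```

The point is not the helper itself but the habit it encodes: treating specificity, tone, length, and structure as parameters you deliberately set, rather than leaving them to the model's defaults.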

Why You Should Do This

Here is why you should take the (small) leap: Artificial intelligence will change the way we teach and learn. The internet did this, email did this, and so will AI. Taking small steps to acculturate to this new reality is the best way to build the flexibility needed to successfully teach and learn with AI – and, very importantly, teach your students how to teach and learn with AI. Here are more reasons to begin to shift your behavior:

  • You can start small. Take this semester as an opportunity to begin to build your AI teaching and learning skills. You do not have to overhaul your syllabi or classroom activities to accommodate AI; you just have to begin to think through the implications of teaching in a world where AI tools are easily available and could pass your homework assignments. Ask yourself how you would coach students encountering your subject matter for the first time, and then apply those principles to your own learning about AI.
  • You will have to learn to adapt quickly. Artificial intelligence tools are evolving rapidly; your course design and instructional approach will do so, too. Each semester will require additional revisions to your syllabi to accommodate our increasing use of AI tools and AI’s increasing capacities. Starting to build those muscles now with lower-effort activities will pay off in the long run.
  • You actually know how to do this. Researching? Developing hypotheses? Evaluating resources? Check, check, and check. Iterating, revising, and adapting as you go along? Teaching students how to evaluate resources? Guiding students to think about the definitions of “artificial,” “intelligence,” and “human”? Check all that, too. The skills required to become AI-literate from a teaching and learning perspective are skills you already have. It is just a matter of applying them to this particular challenge/opportunity/problem (however you frame it).

Finally, give yourself and your students some grace. This is a huge part of beginning to learn how to teach and learn in an AI world; most likely, neither you nor your students will be proficient AI practitioners this semester. You may miss an academic integrity issue or overlook good opportunities to use AI in a classroom activity. Your students may not cite AI correctly or may not cite it at all. They may be far more fluent with AI than you are, or they may be too trusting of AI. Whatever happens, try to remember that you all are new at this and, as new learners, you all may take missteps and make mistakes with the technology.

Caroline Egan
Caroline Egan is a Project Manager in the Center for Teaching Excellence and Innovation, supporting instructional training and development for Hopkins faculty, graduate students, post-doctoral fellows, and staff.

[1] Darby, Flower. (27 June 2023). 4 steps to help you plan for ChatGPT in your classroom. The Chronicle of Higher Education. https://www-chronicle-com.proxy1.library.jhu.edu/article/4-steps-to-help-you-plan-for-chatgpt-in-your-classroom

[2] Bruff, D. (19 July 2023). Assignment makeovers in the AI age: Essay edition. Agile learning: Derek Bruff’s blog on teaching and learning. https://derekbruff.org/?p=4105


Image Source: Unsplash

Panel Discussion: “Teaching and Learning in the Age of Chatbots and Artificial Intelligence”

On April 4th, the Center for Teaching Excellence and Innovation hosted “Teaching and Learning in the Age of Chatbots and Artificial Intelligence,” a panel discussion on the implications of artificial intelligence in Hopkins classrooms. This discussion, open to attendees from all schools and divisions in Hopkins, yielded insights into the opportunities and limitations of Chatbots, particularly ChatGPT; identified ways to frame its pedagogical uses for students and faculty; and gave guidance for integrating it into classrooms.

The five-person panel consisted of Victoria Harms, DAAD Visiting Assistant Professor, History; Austin Heath, PhD Candidate, Philosophy; Mike Todasco, MFA student, Writing Seminars and former PayPal executive; and Opal Sitzman and Timothy Huang, first-year students taking the Reintroduction to Writing seminar with Alex Lewis, a Post-Doctoral Fellow in the University Writing Program who is using ChatGPT in his courses.

The discussion produced several incisive observations about chatbots and their role in higher education classrooms.

Here is a summary of the main points:

  • Teaching and learning: There was broad consensus that instructors should engage in active inquiry into artificial intelligence (AI) with their students and leverage the tool to help students think critically about evaluating texts, the accuracy of texts, and what a Chatbot’s opportunities and limitations are as a source, creator, and partner in their work.
  • A metacognitive tool: Both instructors and students said one of the best ways to use ChatGPT is as a tool to help students think about their learning and knowledge, from helping to improve writing to assessing the substance of texts.
  • Academic Integrity: Panelists thought that the written work produced by ChatGPT fell below standards for a finished product; it could be inaccurate, incorrect, and overly broad.
  • Academic Integrity and Assessments: One student urged faculty to identify the core issues driving the need for assessment and use those ideas to motivate students to produce original work. This assessment design contrasts with more mechanical, easily plagiarized assignments.
  • The students were teaching the faculty: Opal and Tim provided a huge amount of guidance to faculty, including recommended readings, results from their individual research projects, and thoughts on assessment design.

And words of wisdom from some of the panelists:

  • Austin Heath urged attendees to conceptualize ChatGPT as “a tool inquiry vs. a received text or received piece” of truth.
  • Opal Sitzman warned against a “tend[ency] to overestimate ChatGPT’s current prowess.”
  • Mike Todasco compared ChatGPT’s current capabilities to “mansplaining,” with all of the attendant drawbacks of the term.

Tim and Opal kicked off the conversation, describing the ways that students are using AI technology. Opal assured people that AI is not a “nefarious actor” in student lives: “In general, students like playing around with it like writing a Seinfeld episode, but it’s used more for inspiration than cheating.” Tim said, “You can use it to create the first draft of a paper,” and he’s using it as a self-tutoring tool “to adjust how I write.” Mike, in his MFA classes, used it “to be the voice of a computer in a story I was writing. The key is to always acknowledge it.”

Austin and Victoria discussed how they are guiding students to use and think about artificial intelligence. Austin thought of Chatbots “as a student’s student,” a way for students to learn how to evaluate and critique writing. He gives students output from a chatbot explaining a concept and invites them to grade it and offer suggestions for improvement. In Victoria’s class on Europe since 1945, she asked the Chatbot, “Why did the Soviet Union collapse?” Her students critique the answer for “accuracy and substance,” which taught “students that they know something, too.” She urged the audience “to teach students to be critical digesters of information.”

The panelists also weighed in on how their subject matter expertise influenced the way they used and thought about artificial intelligence. Mike, who has been writing about it for a while, said, “I felt like a Cassandra in that no one was listening and now everyone is talking about it.” He then talked about how “People who don’t have access to JHU resources can use it to learn […] the more people use it – not just for teaching, but for life – will help us learn.” Victoria teaches her students “to fact check results, like I do with Wikipedia. We need to integrate these tools into our assessments so they will use them appropriately.”

Opal, who’s interested in neuroscience, wrote a paper considering whether AI is conscious. Her verdict: “[I]t’s still much more simple than our brain,” but, importantly, “it helps us understand the concept of consciousness even if it isn’t conscious itself.” Austin, as a philosopher, applauded Opal’s interest in consciousness before explaining his own interest in “generat[ing] alternative thoughts about writing and giving credit,” saying, “I’m interested in exploring what it means to give attribution. Did a student write this work? Or did AI write this? Or did students work with AI to write this?”

When queried about chatbots and academic integrity, the panelists mostly talked about ChatGPT’s limitations as an easily accessible cheating tool. Opal said, “ChatGPT has a bad reputation for helping students cheat, but people overestimate its abilities. You still have to do a lot of work that requires critical thinking when using it because it doesn’t produce sophisticated results. It might help with a basic prompt.” Mike and Victoria echoed Opal’s opinion. Mike said, “If you were teaching middle schoolers, you might be concerned with cheating,” though he went on to add, “That said, the future version will get better.” Victoria added, “The pandemic taught us that not all students are excited about technology or are tech savvy.”

When Kwame Kutton, a Lecturer in Biomedical Engineering, raised a question about students using ChatGPT to plagiarize code in a computing course, Tim offered a thoughtful response. Currently enrolled in a computer science course himself, Tim said, “In BME there are unique opportunities to write code that saves lives. Therefore, students need to tackle the core issue to solve before they even write code. We want faculty to teach us how to think about the logic of the problem, not just writing code.” His comment encouraged instructors to think deeply about first framing and identifying the problem for students, which will help motivate them to produce original and independent work.

Mike offered another perspective: “I don’t know any programmer who doesn’t use Copilot,” GitHub’s AI-powered assistant that suggests code as developers type. “My analogy is calculators,” he said. “You need to know how to do math without a calculator, but once you are doing the calculations after setting up the problem, you should use a calculator to help solve the problem.”

A question from the audience about languages, accents, and ChatGPT turned the discussion to issues of accessibility and political bias. Tim saw one of his friends using the Chatbot to translate English to Japanese and then used it himself to translate a Spanish article he was familiar with. His opinion: “It does a better job than Google Translate” though “there are lots of metaphors that get lost in translation by these tools.”

Mike then gave two excellent examples of how ChatGPT is providing access and support to people with divergent and impaired abilities. He said, “ChatGPT 4 is available, but they haven’t yet released the picture-to-text feature. They shared video of someone with visual impairment using ChatGPT 4 to learn what was in the fridge using their phone. It will be able to do amazing things in the future to help us.” He went on to describe a friend’s acquaintance in San Francisco whose lawn care business suffered because he struggled to communicate via email. The owner now uses ChatGPT “to help polish his emails,” thus improving his client relationships.

Opal talked about how ChatGPT struggles with dialects, which turned the conversation to political bias. She’s using ChatGPT to write a short story “in the style of Kate Chopin,” a 19th-century American writer known for writing about Louisiana Creole culture. Opal said, “[Chopin] used a lot of Louisiana dialect” and ChatGPT “struggles” with this because it “is filtered so it doesn’t mimic the racist language used during that time.” She said that people have found ChatGPT to be “an establishment liberal” in its political biases. Victoria brought up “issues of bias in Silicon Valley” and wondered how ChatGPT would address Critical Race Theory (CRT). Mike decided to ask ChatGPT whether we should ban CRT and pasted its response in the Zoom chat:

As an AI language model, I don’t have personal opinions. However, I can provide you with an analysis. Whether Critical Race Theory (CRT) should be banned in schools is a subjective question, often debated among educators, policymakers, and parents. Supporters argue that CRT promotes understanding of systemic racism and its impacts, while opponents believe it can be divisive and foster racial animosity. Ultimately, the decision to include or exclude CRT in schools depends on the goals and values of the educational community involved.[1]

The conversation ended with speculation about how quickly ChatGPT would progress. Mike said, “The current GPT4 has been remarkable. I’ve written fiction in each version and I’d say it’s getting two grade levels better in each version.” Opal also weighed in: “It will be quick, but I’m not wary yet. We need to keep considering these questions, but I think it’s less something to be scared of and more something to utilize. I don’t see anything being more powerful than humans in the near future.”

Recommended reading and activities:

[1] OpenAI. (2023). ChatGPT (Apr 4 version) [Large language model]. https://chat.openai.com/

Caroline Egan
Caroline Egan is a Project Manager in the Center for Teaching Excellence and Innovation, supporting instructional training and development for Hopkins faculty, graduate students, post-doctoral fellows, and staff.

Image source: Pixabay, Unsplash

A Faculty Follow-up Discussion: Re-engaging Students for the Fall Semester

On Tuesday, November 8th, the Center for Teaching Excellence and Innovation (CTEI) hosted a discussion on re-engaging students for the fall semester. At faculty request, this discussion was a continuation of one initially held in August, when participants explored the challenges they faced with the return to in-person teaching in Spring semester 2022. During that session, faculty offered potential ways to address disengagement in a student population who reported high levels of “stress, fatigue, and anxiety” in a post-pandemic world. This phenomenon has been noted in many media outlets, including The Chronicle of Higher Education, which recently hosted a webinar on addressing student disengagement and summarized it in a follow-up article. Mike Reese, Associate Dean and Director of the CTEI, moderated the conversation.

The session kicked off with instructors offering their general sense of how student engagement in their Fall courses compared to their Spring courses. The overall assessment was that problems remained, though there were some bright spots:

  • One instructor noted that attendance in his course’s Friday session, led by teaching assistants, was down almost 50% in the recent week.  
  • Another noted that Fall was “a little bit” better than Spring, when she was still teaching online via Zoom, but she continued to observe a lot of “struggle” among her students, exacerbated by a lack of knowledge of how to address it.  
  • One participant, who regularly polled his students on their overall well-being on a scale from one to five, with five being the highest score, said he was seeing a lot of ones and twos among his students. However, he started this practice during the pandemic, so he had no pre-pandemic data to use as a baseline.
  • A fourth participant had observed that her students’ behavior was better, but they also had large gaps in their subject-matter knowledge due to the instructional disruptions incurred by the pandemic. 

Time management issues quickly became the dominant topic when one faculty member pointed out that this was a particular problem for his students. Other participants also offered examples of students struggling with time management: one faculty member said that she had received a lot of requests for extensions from students who admitted these were due to poor time management, and another said that she observed an all-senior class – usually a population with a good sense of time management – also contending with this issue. The reason for this, attendees speculated, may have to do with the full return to on-site courses and residential campus life. Students may be excited to dive back into campus life, trying to take advantage of opportunities, like lab-based research, not available during the pandemic, and becoming over-committed as a result. Another reason offered was that the time management skills needed to negotiate pandemic life and instruction needed to be re-adjusted for more typical university life.

The post-pandemic gap in content-specific knowledge, particularly in the STEM disciplines, has prompted some academic programs to start looking at ways to make changes to their large introductory or gateway courses. One participant said her program was looking to make data-based adjustments informed by placement tests, in-person attendance at office hours, and data from Canvas classrooms and learning-support software, such as ALEKS. 

As a group, the participants generated several useful ideas to enhance engagement in both large lecture-style courses and smaller seminar courses:  

  • Increasing structure for small-group discussions in large classrooms: One instructor had added question prompts and a pre-identified spokesperson to her small-group break-out discussions to increase student focus, participation, and output during these sessions.  
  • Flipping one class meeting a week to start homework: Another instructor had flipped one class meeting a week to provide students with a pre-determined timeslot in which to start their homework each week and receive real-time instructional feedback. This helped students with time management and on-time completion of the homework.  
  • Requiring a one-to-one meeting outside class: An attendee required that seminar students meet with him one-on-one at least once outside of class, which helped build relationships and comfort with class participation.  
  • Requiring student socialization outside class: A participant volunteered one approach that she heard about via the Chronicle Webinar: to require that students meet and socialize outside of class twice a month to work on “conversation fundamentals” – how to have a balanced conversation, how to use open-ended questions – to build “social sophistication and stamina” in in-person environments post-pandemic. 
  • Mid-semester surveys: Two instructors distributed mid-semester surveys to students that specifically targeted issues of classroom engagement, and one queried participants about their time-on-task for assignments and activities. Though survey participation was low in one course, both instructors were reviewing and integrating appropriate feedback.  
  • Panels of former students: One attendee noted that he had invited a panel of former students to talk about their experiences in the class and what contributed to their success. The credibility of the speakers and the authenticity of the guidance resonated with the current students.  
  • Strategic use of Learning Assistants or Course Assistants: Some instructors in large or introductory courses used Learning Assistants or Course Assistants – undergraduate students successful in the subject area who are trained to provide in-class instructional support – to scale up instructional reach and feedback. These assistants had been particularly crucial in courses that needed more hands-on instructional support, structure, and feedback.    

Many instructors found themselves structuring tasks and activities for students that, pre-pandemic, may not have required direct guidance and direction. Given this need, the importance of student meta-cognition – knowing how to learn something – was raised, which resulted in the following suggestions:       

  • Using learning science data to persuade students: One participant noted that her students were very responsive to research-based arguments. When she offered students evidence-based examples of effective ways to learn (she cited The Learning Scientists blog as a good source of information), they responded affirmatively to these suggestions. Leveraging learning science research when suggesting better ways to study – retain, recall, and synthesize content – might be one way to help bolster meta-cognition.  
  • Building in self-reflection on effective learning approaches: An attendee recommended integrating opportunities for students to self-reflect on the usefulness of teaching interventions, such as the one-course-meeting-a-week flipped classroom for starting homework. Such reflection on why a certain approach worked (in this case, in-class time dedicated to starting homework with in-person instructional feedback) may help students build (or re-build) their meta-cognitive muscles.  

The conversation turned to tools that could support both targeted in-class instruction and meta-cognition skill development. Brian Cole, Associate Director of the CTEI, said that he had been investigating different technologies that would enable real-time assessment of content comprehension and upvoting of particularly confusing content areas. Melo Yap, the new Sr. Educational Research Consultant at the CTEI, volunteered Kahoot as a tool that could offer such flexibility. 

A faculty member suggested developing a toolkit of proven meta-cognitive strategies that could be inserted into the Canvas sections of each course. Instructors and students could access this toolkit on demand and integrate it into their course design for both “just-in-time” support (e.g., before a high-stakes test) and more long-term development. The CTEI offered to collect any already-available guidance on helping students learn more effectively, in an effort to start collating this information in one place.

Caroline Egan
Caroline Egan is a Project Manager in the Center for Teaching Excellence and Innovation, supporting instructional training and development for Hopkins faculty, graduate students, post-doctoral fellows, and staff.

Mike Reese
Mike Reese is Associate Dean of the Center for Teaching Excellence and Innovation and associate teaching professor in Sociology.

Image Source: Unsplash