Adapting to AI in the Classroom for Time-Strapped Instructors

In the past few months, we have spoken with many instructors – faculty, graduate students, even undergraduate teaching assistants – who are doing interesting things with artificial intelligence tools in their classes this coming fall. Some are writing grants to support research into classroom uses of AI, some are designing interactive online modules to help teach the ethics of AI, and some are integrating AI tools into their instructional activities.

This blog post is for another instructor population: those who have not had the time or capacity to redevelop their courses, assessments, or activities to accommodate an AI world. “Redesigning assessments with AI in mind” might be the 20th item on a long list of to-dos for the coming semester. Adapting to new technologies that could change the classroom experience – and AI is certainly one of them – can seem like an overwhelming task. Classes start in one week, and wrestling with the teaching and learning opportunities and challenges of artificial intelligence may not be an achievable goal.

There are, however, some concrete steps you can take – and curated resources you can consult – to account for AI when planning and teaching your courses.

Recommendations for Starting with AI

Here are six recommendations (and one extra credit assignment). Following all of these suggestions will put you on good footing with the learning curve associated with AI in the classroom, but even doing one or two is a good way to start.

  1. Experiment with ChatGPT and other AI tools. Just get in there, start using them, and see what they produce. In an article for the Chronicle of Higher Education, one writer said, “I started by reminding myself, anytime I was about to Google something, to ask ChatGPT.”[1] ChatGPT-ing (or using Google Bard) instead of Google-ing is a good on-ramp to AI usage. You may even find these tools useful in your own work as an instructor. Here are four basic generative AI tools to start with, along with prompt suggestions:
    1. ChatGPT – The best-known (and by some reports, still the most accurate) text-based generative AI. Prompt suggestion: Ask a basic question about teaching, e.g., “How can I grade exams more efficiently?” or “How can I provide written feedback more efficiently?”
    2. Google Bard – Less text-heavy than ChatGPT; potentially geared toward more logic-based questions, e.g., “How do I create a website in WordPress?”
    3. Microsoft Bing – Able to generate images as well as text and to simultaneously harness the power of a search engine. Potential prompt: “Name the characteristics of neo-classical architecture and provide an example.”
    4. Fotor.com – An image-generating AI. Prompt suggestion: “Provide an illustration for my chemistry class syllabus.”
  2. Run your assignments through an AI tool. This will help you benchmark possible AI-generated responses to your assignments. More sophisticated AI users will engage in prompt engineering that can make uncited or incorrect use of AI harder to detect, but getting at least one example of an AI response is helpful (see the sketch after this list for one way to generate several). It will not only provide a sightline into possible academic integrity issues but also indicate whether your assignment may need to be revised or redeveloped, which could include integrating AI itself. Derek Bruff, a writer and higher education consultant, provides good guidance on assessment design in light of AI:
    1. Why does this assignment make sense for this course?
    2. What are specific learning objectives for this assignment?
    3. How might students use AI tools while working on this assignment?
    4. How might AI undercut the goals of this assignment? How could you mitigate this?
    5. How might AI enhance the assignment? Where would students need help figuring that out?
    6. Focus on the process. How could you make the assignment more meaningful for students or support them more in the work?[2]
  3. Add an AI policy to your syllabus. This may require doing some or all of the recommendations above, but even if you do not have the capacity to take a deep dive into AI tools before courses start, it is a good idea to take a stab at a policy, even if it is brief. Expect to adapt this policy fairly quickly; the sooner you develop a benchmark policy and determine what works and what does not, the better. Lance Eaton, a doctoral student in higher education at the University of Massachusetts at Boston, has crowdsourced a Google Document with many helpful examples of AI policies for syllabi. This is an excellent place to start.
  4. Determine your academic integrity policy for AI. This may be part of your general AI policy or it could be separate. Regardless, this will probably be V.1 of your academic integrity policy, but again, starting now will put you in a good position to iterate as needed. To start, review Academic Integrity Policies for Johns Hopkins Schools. Lance Eaton’s Google Document (above) has many examples of AI policies that include academic integrity statements.
  5. Teach your students how to cite AI tools. This information could be incorporated into a syllabus policy and/or academic integrity policy, but correct citation – at least according to the August 2023 recommendations of these style guides – is step number one. Making your students aware that they need to cite uses of AI tools, and giving them the means to do so, will (hopefully) incentivize compliance with your academic integrity policies.
    1. APA Citation Guidance – ChatGPT
    2. MLA Citation Guidance – Generative AI
    3. Chicago Style Citation Guidance – ChatGPT
    4. Johns Hopkins Library Guide on Citation
  6. Talk to your local center for teaching and learning. All Hopkins schools have teaching and learning centers; some have been publishing guidance on how to teach and learn with artificial intelligence tools, and many have been considering the possible consequences of AI in the classroom. Here’s a list of teaching and learning centers at Hopkins, and here are two rich resources developed by Hopkins CTLs:
    1. Teaching & Learning in the ChatGPT Era. This website was created by the Center for Learning Design & Technology at the Whiting School of Engineering. It provides a great overview on generative AI as well as providing guidance on academic integrity questions, student use of AI, and assessment design with AI. Kelly Orr, Nathan Graham, Olysha Magruder, Mel Rizzuto, and Edward Queen of the CLDT all contributed to the website as did adjunct faculty David Porter.
    2. Johns Hopkins University Generative AI Tool Implementation Guidance and Best Practices. Jun Fang, Assistant Director of Teaching & Learning@Carey in the Carey School of Business, led the development of this resource with contributions from representatives of other schools and teaching and learning centers at Hopkins. The guide provides substantial guidance on using generative AI to design engaging course activities and provide assignment feedback, and it includes a list of AI tools for higher education.
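
For those comfortable with a little scripting, recommendation 2 can also be done programmatically rather than by pasting an assignment into a chat window. Below is a minimal sketch – an illustration, not official guidance – for generating several sample AI responses to an assignment prompt so you can benchmark what uncited AI use might look like. It assumes the openai Python package (v0.x API), an API key stored in the OPENAI_API_KEY environment variable, and a placeholder assignment prompt and output filename that you would replace with your own.

```python
# Minimal benchmarking sketch (illustrative; replace the placeholder
# assignment prompt and output filename with your own).
import os

import openai

openai.api_key = os.environ["OPENAI_API_KEY"]

# Placeholder assignment prompt -- paste your actual assignment text here.
assignment = "In 500 words, explain the main causes of the 2008 financial crisis."

samples = []
for _ in range(3):
    response = openai.ChatCompletion.create(
        model="gpt-3.5-turbo",  # assumed model; any chat model works here
        messages=[{"role": "user", "content": assignment}],
        temperature=1.0,  # nonzero temperature varies the samples
    )
    samples.append(response.choices[0].message["content"])

# Save the samples for side-by-side review against student submissions.
with open("ai_benchmark_responses.txt", "w") as f:
    f.write("\n\n=== SAMPLE ===\n\n".join(samples))
```

Generating a few samples rather than one gives you a range of plausible responses, which is a more realistic benchmark than a single data point.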

Extra credit assignment for those with a little more capacity:

  1. Learn a little about prompt engineering. Prompt engineering is the practice of developing and refining the questions and statements given to AI models so that they generate results with the desired specificity, tone, length, citations, etc. It will give you a sightline into AI capacities beyond a simple one-time command (e.g., “Compare and contrast models of femininity in Shakespeare’s Much Ado About Nothing and Taming of the Shrew”), which may yield an overly broad answer that lacks specificity and nuance. Prompt engineering will also help you learn to direct and guide AI models and not just react to them (see the sketch below). For a useful beginner’s guide, check out the brief video on prompting AI from Wharton School instructors.
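
To make the contrast concrete, here is a minimal sketch of what directing and guiding a model looks like in practice, sending the bare Shakespeare prompt above and an engineered version of it to the same model. As with the earlier sketch, it assumes the openai Python package (v0.x API) and an OPENAI_API_KEY environment variable; the engineered wording is an illustrative assumption, not a canonical recipe.

```python
# Contrast a bare one-time command with an engineered prompt that adds a
# role, an audience, a length target, and a required structure.
import os

import openai

openai.api_key = os.environ["OPENAI_API_KEY"]

# The simple one-time command from the example above.
bare_prompt = (
    "Compare and contrast models of femininity in Shakespeare's "
    "Much Ado About Nothing and Taming of the Shrew."
)

# An engineered version of the same request (illustrative wording).
engineered_prompt = (
    "You are a Shakespeare scholar writing for first-year undergraduates. "
    "In about 300 words, compare and contrast models of femininity in "
    "Much Ado About Nothing and Taming of the Shrew. Organize the answer as "
    "three contrasts, each grounded in a specific scene, and end with one "
    "open question students could debate in class."
)

for prompt in (bare_prompt, engineered_prompt):
    response = openai.ChatCompletion.create(
        model="gpt-3.5-turbo",
        messages=[{"role": "user", "content": prompt}],
    )
    print(response.choices[0].message["content"])
    print("=" * 60)
```

Comparing the two outputs side by side shows what specifying role, audience, length, and structure buys you; iterating on those levers, rather than re-asking the same broad question, is the heart of prompt engineering.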

Why You Should Do This

Here is why you should take the (small) leap: Artificial intelligence will change the way we teach and learn. The internet did this, email did this, and so will AI. Taking small steps to acculturate to this new reality is the best way to build the flexibility needed to successfully teach and learn with AI – and, very importantly, teach your students how to teach and learn with AI. Here are more reasons to begin to shift your behavior:

  • You can start small. Take this semester as an opportunity to begin to build your AI teaching and learning skills. You do not have to overhaul your syllabi or classroom activities to accommodate AI; you just have to begin to think through the implications of teaching in a world where AI tools are easily available and could pass your homework assignments. Ask yourself how you would coach students encountering your subject matter for the first time, and then apply those principles to your own learning about AI.
  • You will have to learn to adapt quickly. Artificial intelligence tools are evolving rapidly; your course design and instructional approach will do so, too. Each semester will require additional revisions to your syllabi to accommodate our increasing use of AI tools and AI’s increasing capacities. Starting to build those muscles now with lower-effort activities will pay off in the long run.
  • You actually know how to do this. Researching? Developing hypotheses? Evaluating resources? Check, check, and check. Iterating, revising, and adapting as you go along? Teaching students how to evaluate resources? Guiding students to think about the definitions of “artificial,” “intelligence,” and “human”? Check all that, too. The skills required to become AI-literate from a teaching and learning perspective are skills you already have. It is just a matter of applying them to this particular challenge/opportunity/problem (however you frame it).

Finally, give yourself and your students some grace. This is a huge part of beginning to learn how to teach and learn in an AI world; most likely, neither you nor your students will be proficient AI practitioners this semester. You may miss an academic integrity issue or overlook good opportunities to use AI in a classroom activity. Your students may not cite AI correctly or may not cite it at all. They may be far more fluent with AI than you are, or they may be too trusting of AI. Whatever happens, try to remember that you all are new at this and, as new learners, you all may take missteps and make mistakes with the technology.

Caroline Egan
Caroline Egan is a Project Manager in the Center for Teaching Excellence and Innovation, supporting instructional training and development for Hopkins faculty, graduate students, post-doctoral fellows, and staff.

[1] Darby, Flower. (27 June 2023). 4 steps to help you plan for ChatGPT in your classroom. The Chronicle of Higher Education. https://www-chronicle-com.proxy1.library.jhu.edu/article/4-steps-to-help-you-plan-for-chatgpt-in-your-classroom

[2] Bruff, D. (19 July 2023). Assignment makeovers in the AI age: Essay edition. Agile learning: Derek Bruff’s blog on teaching and learning. https://derekbruff.org/?p=4105


Image Source: Unsplash

Panel Discussion: “Teaching and Learning in the Age of Chatbots and Artificial Intelligence”

On April 4th, the Center for Teaching Excellence and Innovation hosted “Teaching and Learning in the Age of Chatbots and Artificial Intelligence,” a panel discussion on the implications of artificial intelligence in Hopkins classrooms. This discussion, open to attendees from all schools and divisions at Hopkins, yielded insights into the opportunities and limitations of chatbots, particularly ChatGPT; identified ways to frame their pedagogical uses for students and faculty; and gave guidance for integrating them into classrooms.

The five-person panel consisted of Victoria Harms, DAAD Visiting Assistant Professor, History; Austin Heath, PhD Candidate, Philosophy; Mike Todasco, MFA student, Writing Seminars and former PayPal executive; and Opal Sitzman and Timothy Huang, first-year students taking the Reintroduction to Writing seminar with Alex Lewis, a Post-Doctoral Fellow in the University Writing Program who is using ChatGPT in his courses.

The discussion produced several incisive observations about chatbots and their role in higher education classrooms.

Here is a summary of the main points:

  • Teaching and learning: There was broad consensus that instructors should engage in active inquiry into artificial intelligence (AI) with their students and leverage the tool to help students think critically about how to evaluate texts and their accuracy, and about a chatbot’s opportunities and limitations as a source, creator, and partner in their work.
  • A metacognitive tool: Both instructors and students said one of the best ways to use ChatGPT is as a tool to help students think about their learning and knowledge, from helping to improve writing to assessing the substance of texts.
  • Academic integrity: Panelists thought that the written work produced by ChatGPT fell below standards for a finished product; it could be inaccurate and overly broad.
  • Academic integrity and assessments: One student urged faculty to identify the core issues driving the need for assessment and use those ideas to motivate students to produce original work. This assessment design contrasts with more mechanical and easily plagiarized assignments.
  • The students were teaching the faculty: Opal and Tim provided a huge amount of guidance to faculty, including recommended readings, results from their individual research projects, and thoughts on assessment design.

And words of wisdom from some of the panelists:

  • Austin Heath urged attendees to conceptualize ChatGPT as “a tool [of] inquiry vs. a received text or received piece” of truth.
  • Opal Sitzman warned against a “tend[ency] to overestimate ChatGPT’s current prowess.”
  • Mike Todasco compared ChatGPT’s current capabilities to “mansplaining,” with all of the attendant drawbacks of the term.

Tim and Opal kicked off the conversation, describing the ways that students are using AI technology. Opal assured people that AI is not a “nefarious actor” in student lives: “In general, students like playing around with it like writing a Seinfeld episode, but it’s used more for inspiration than cheating.” Tim said, “You can use it to create the first draft of a paper,” and he’s using it as a self-tutoring tool “to adjust how I write.” Mike, in his MFA classes, used it “to be the voice of a computer in a story I was writing. The key is to always acknowledge it.”

Austin and Victoria discussed how they are guiding students to use and think about artificial intelligence. Austin thought of chatbots “as a student’s student,” a way for students to learn how to evaluate and critique writing. He gives students output from a chatbot explaining a concept and invites them to grade it and offer suggestions for improvement. In her class on Europe since 1945, Victoria asked the chatbot, “Why did the Soviet Union collapse?” Her students critiqued the answer for “accuracy and substance,” which taught “students that they know something, too.” She urged the audience “to teach students to be critical digesters of information.”

The panelists also weighed in on how their subject matter expertise influenced the way they used and thought about artificial intelligence. Mike, who has been writing about it for a while, said, “I felt like a Cassandra in that no one was listening and now everyone is talking about it.” He then talked about how “People who don’t have access to JHU resources can use it to learn […] the more people use it – not just for teaching, but for life – will help us learn.” Victoria teaches her students “to fact check results, like I do with Wikipedia. We need to integrate these tools into our assessments so they will use them appropriately.”

Opal, who’s interested in neuroscience, wrote a paper considering whether AI is conscious. Her verdict: “[I]t’s still much more simple than our brain,” but, importantly, “it helps us understand the concept of consciousness even if it isn’t conscious itself.” Austin, as a philosopher, applauded Opal’s interest in consciousness before explaining his own interest in “generat[ing] alternative thoughts about writing and giving credit,” saying, “I’m interested in exploring what it means to give attribution. Did a student write this work? Or did AI write this? Or did students work with AI to write this?”

When queried about chatbots and academic integrity, the panelists mostly talked about their limitations as easily accessible cheating tools. Opal said, “ChatGPT has a bad reputation for helping students cheat, but people overestimate its abilities. You still have to do a lot of work that requires critical thinking when using it because it doesn’t produce sophisticated results. It might help with a basic prompt.” Mike and Victoria echoed Opal’s opinion. Mike said, “If you were teaching middle schoolers, you might be concerned with cheating,” though he went on to add, “That said, the future version will get better.” Victoria added, “The pandemic taught us that not all students are excited about technology or are tech savvy.”

When Kwame Kutton, a Lecturer in Biomedical Engineering, raised a question about students using ChatGPT to plagiarize code in computing courses, Tim offered a thoughtful response. Currently taking a computer science course himself, Tim said, “In BME there are unique opportunities to write code that saves lives. Therefore, students need to tackle the core issue to solve before they even write code. We want faculty to teach us how to think about the logic of the problem, not just writing code.” His comment encouraged instructors to think deeply about first framing and identifying the problem for students, which will help motivate them to produce original and independent work.

Mike offered another perspective: “I don’t know any programmer who doesn’t use Copilot,” GitHub’s AI-powered assistant that suggests code as developers type. “My analogy is calculators,” he said. “You need to know how to do math without a calculator, but once you are doing the calculations after setting up the problem, you should use a calculator to help solve the problem.”

A question from the audience about languages, accents, and ChatGPT turned the discussion to issues of accessibility and political bias. Tim saw one of his friends using the chatbot to translate English to Japanese and then used it himself to translate a Spanish article he was familiar with. His opinion: “It does a better job than Google Translate,” though “there are lots of metaphors that get lost in translation by these tools.”

Mike then gave two excellent examples of how ChatGPT is providing access and support to people with divergent and impaired abilities. He said, “ChatGPT 4 is available, but they haven’t released the picture-to-text feature that exists yet. They shared video of someone with visual impairment using ChatGPT 4 to learn what was in the fridge using their phone. It will be able to do amazing things in the future to help us.” He went on to talk about a friend of his who knew someone in San Francisco with a lawn-care business who struggled to communicate via email. The owner of the business now uses ChatGPT “to help polish his emails,” thus improving his client relationships.

Opal talked about how ChatGPT struggles with dialects, which turned the conversation to political bias. She’s using ChatGPT to write a short story “in the style of Kate Chopin,” a 19th-century American writer known for writing about Louisiana Creole culture. Opal said, “[Chopin] used a lot of Louisiana dialect,” and ChatGPT “struggles” with this because it “is filtered so it doesn’t mimic the racist language used during that time.” She said that people have found ChatGPT to be “an establishment liberal” in its political biases. Victoria brought up “issues of bias in Silicon Valley” and wondered how ChatGPT would address Critical Race Theory (CRT). Mike asked ChatGPT whether we should ban CRT and pasted its response into the Zoom chat:

As an AI language model, I don’t have personal opinions. However, I can provide you with an analysis. Whether Critical Race Theory (CRT) should be banned in schools is a subjective question, often debated among educators, policymakers, and parents. Supporters argue that CRT promotes understanding of systemic racism and its impacts, while opponents believe it can be divisive and foster racial animosity. Ultimately, the decision to include or exclude CRT in schools depends on the goals and values of the educational community involved.[1]

The conversation ended with speculation about how quickly ChatGPT would progress. Mike said, “The current GPT4 has been remarkable. I’ve written fiction in each version and I’d say it’s getting two grade levels better in each version.” Opal also weighed in: “It will be quick, but I’m not wary yet. We need to keep considering these questions, but I think it’s less something to be scared of and more something to utilize. I don’t see anything being more powerful than humans in the near future.”


[1] OpenAI. (2023). ChatGPT (Apr 4 version) [Large language model]. https://chat.openai.com/

Caroline Egan
Caroline Egan is a Project Manager in the Center for Teaching Excellence and Innovation, supporting instructional training and development for Hopkins faculty, graduate students, post-doctoral fellows, and staff.

Image source: Pixabay, Unsplash

A Faculty Follow-up Discussion: Re-engaging Students for the Fall Semester

On Tuesday, November 8th, the Center for Teaching Excellence and Innovation (CTEI) hosted a discussion on re-engaging students for the fall semester. At faculty request, this discussion was a continuation of one initially held in August, when participants explored the challenges they faced with the return to in-person teaching in the spring semester of 2022. During that session, faculty offered potential ways to address disengagement in a student population that reported high levels of “stress, fatigue, and anxiety” in a post-pandemic world. This phenomenon has been noted in many media outlets, including The Chronicle of Higher Education, which recently hosted a webinar on addressing student disengagement and summarized it in a follow-up article. Mike Reese, Associate Dean and Director of the CTEI, moderated the conversation.

The session kicked off with instructors offering their general sense of how student engagement in their fall courses compared to their spring courses. The overall assessment was that problems remained, though there were some bright spots:

  • One instructor noted that attendance in his course’s Friday session, led by teaching assistants, was down almost 50% in the most recent week.
  • Another noted that fall was “a little bit” better than spring, when she was still teaching online via Zoom, but she continued to observe a lot of “struggle” among her students, exacerbated by her uncertainty about how to address it.
  • One participant, who regularly polled his students on their overall well-being on a scale from one to five, with five being the highest score, said he was seeing a lot of ones and twos among his students. However, he started this practice during the pandemic, so he had no pre-pandemic data against which to baseline the responses.
  • A fourth participant observed that her students’ behavior was better, but they also had large gaps in their subject-matter knowledge due to the instructional disruptions caused by the pandemic.

Time management issues quickly became the dominant topic when one faculty member pointed out that this was a particular problem for his students. Other participants also offered examples of students struggling with time management; one faculty member said that she had received a lot of requests for extensions from students who admitted these were due to poor time management, and another said that she observed an all-senior class – usually a population with a good sense of time management – also contending with this issue. The reason for this, attendees speculated, may have to do with the full return to on-site courses and residential campus life. Students may be excited to dive back into campus life, trying to take advantage of opportunities, like lab-based research, that were not available during the pandemic, and becoming over-committed as a result. Another reason offered was that the time management skills needed to negotiate pandemic life and instruction had to be readjusted for more typical university life.

The post-pandemic gap in content-specific knowledge, particularly in the STEM disciplines, has prompted some academic programs to start looking at ways to make changes to their large introductory or gateway courses. One participant said her program was looking to make data-based adjustments informed by placement tests, in-person attendance at office hours, and data from Canvas classrooms and learning-support software, such as ALEKS. 

As a group, the participants generated several useful ideas to enhance engagement in both large lecture-style courses and smaller seminar courses:  

  • Increasing structure for small-group discussions in large classrooms: One instructor had added question prompts and a pre-identified spokesperson to her small-group break-out discussions to increase student focus, participation, and output during these sessions.  
  • Flipping one class meeting a week to start homework: Another instructor had flipped one class meeting a week to provide students with a pre-determined timeslot in which to start their homework each week and receive real-time instructional feedback. This helped students with time management and on-time completion of the homework.  
  • Requiring a one-to-one meeting outside class: An attendee required that seminar students meet with him one-on-one at least once outside of class, which helped build relationships and comfort with class participation.  
  • Requiring student socialization outside class: A participant volunteered one approach that she heard about via the Chronicle webinar: requiring that students meet and socialize outside of class twice a month to work on “conversation fundamentals” – how to have a balanced conversation, how to use open-ended questions – to build “social sophistication and stamina” in in-person environments post-pandemic.
  • Mid-semester surveys: Two instructors distributed mid-semester surveys to students that specifically targeted issues of classroom engagement, and one queried participants about their time-on-task for assignments and activities. Though survey participation was low in one course, both instructors were reviewing and integrating appropriate feedback.  
  • Panels of former students: One attendee noted that he had invited a panel of former students to talk about their experiences in the class and what contributed to their success. The credibility of the speakers and the authenticity of the guidance resonated with the current students.  
  • Strategic use of Learning Assistants or Course Assistants: Some instructors in large or introductory courses used Learning Assistants or Course Assistants – undergraduate students successful in the subject area who are trained to provide in-class instructional support – to scale up instructional reach and feedback. These assistants had been particularly crucial in courses that needed more hands-on instructional support, structure, and feedback.    

Many instructors found themselves structuring tasks and activities for students that, pre-pandemic, may not have required such direct guidance. Given this need, the importance of student meta-cognition – knowing how to learn something – was raised, which resulted in the following suggestions:

  • Using learning science data to persuade students: One participant noted that her students were very responsive to research-based arguments. When she offered students evidence-based examples of effective ways to learn (she cited The Learning Scientists blog as a good source of information), they responded affirmatively to these suggestions. Leveraging learning science research when suggesting better ways to study – retain, recall, and synthesize content – might be one way to help bolster meta-cognition.
  • Building in self-reflection on effective learning approaches: An attendee recommended integrating opportunities for students to self-reflect on the usefulness of teaching interventions, such as the one-course-meeting-a-week flipped classroom for starting homework. Such reflection on why a certain approach worked (in this case, in-class time dedicated to starting homework with in-person instructional feedback) may help students build (or re-build) their meta-cognitive muscles.  

The conversation turned to tools that could support both targeted in-class instruction and meta-cognition skill development. Brian Cole, Associate Director of the CTEI, said that he had been investigating technologies that enable real-time assessment of content comprehension and upvoting of particularly confusing content areas. Melo Yap, the new Senior Educational Research Consultant at the CTEI, suggested Kahoot as a tool that could offer such flexibility.

A faculty member suggested developing a toolkit of proven meta-cognitive strategies that could be inserted into the Canvas sections of each course. Instructors and students could access this toolkit on demand and integrate it into their course design for both “just-in-time” support (e.g., before a high-stakes test) and more long-term development. The CTEI offered to start collating this information in one place by collecting any already-available guidance on helping students learn more effectively.

Caroline Egan
Caroline Egan is a Project Manager in the Center for Teaching Excellence and Innovation, supporting instructional training and development for Hopkins faculty, graduate students, post-doctoral fellows, and staff.

Mike Reese
Mike Reese is Associate Dean of the Center for Teaching Excellence and Innovation and associate teaching professor in Sociology.

Image Source: Unsplash