Lunch and Learn: Generative AI Uses in the Classroom

On Tuesday, April 23rd, the Center for Teaching Excellence and Innovation (CTEI) hosted a Lunch and Learn on Generative AI Uses in the Classroom. Faculty panelists included Louis Hyman, Dorothy Ross Professor of Political Economy in History and Professor at the SNF Agora Institute, Jeffrey Gray, Professor of Chemical and Biomolecular Engineering in the Whiting School, and Brian Klaas, Assistant Director for Technology and instructor at the Bloomberg School of Public Health. Caroline Egan, Teaching Academy Program Manager, moderated the discussion.  

Louis Hyman began the presentation by reminding the audience what large language models (LLMs) like ChatGPT can and cannot do. For example, ChatGPT does not “know” anything and is incapable of reasoning. It generates text that it predicts will best answer the prompt it was given, based on how it was trained. In addition to his course work, Hyman mentioned several tasks he uses ChatGPT to assist with, including summarizing text, writing complicated Excel formulas, writing and editing drafts, making PowerPoint tables, and rotating image files to the correct orientation.

In Hyman’s course, AI and Data Methods in History, students are introduced to a variety of tools (e.g., Google Sheets, ChatGPT, Python) that help them analyze and think critically about historical data. Hyman described how students used primers from LinkedIn Learning as well as generative AI prompts to build their technical skills, which enabled them to take a deeper dive into data analysis. For example, while writing code on their own would have been too complicated for most students, they learned how to prompt ChatGPT to write code for them. By the end of the semester, students used application programming interface (API) calls to send data to Google, used OpenAI tools to clean up historical documents and images processed with optical character recognition (OCR), and used ChatGPT and Python to plot and map historical data.

Two maps of 1850 New England showing the number of congregational churches and the value of congregational property. Data points plotted by students using AI.
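The OCR-cleanup and statistics steps might look something like the following minimal Python sketch. The data, the noise patterns, and the record layout here are invented for illustration; they are not taken from the course materials.

```python
import re
import statistics

# Hypothetical OCR output from an 1850 census-style table; the values
# contain common OCR noise (capital O for zero, stray "$" and commas).
ocr_rows = [
    "Boston,     12 churches, $25O,000",
    "Hartford,    9 churches, $180,0OO",
    "Providence,  7 churches, $140,000",
]

def clean_value(raw: str) -> int:
    """Fix common OCR digit confusions and strip formatting."""
    return int(raw.replace("O", "0").replace(",", "").lstrip("$"))

records = []
for row in ocr_rows:
    # Split into town / church count / property value
    town, churches_part, value_part = (p.strip() for p in row.split(",", 2))
    churches = int(re.search(r"\d+", churches_part).group())
    records.append((town, churches, clean_value(value_part)))

# A simple statistic of the kind students computed on their cleaned data
mean_value = statistics.mean(value for _, _, value in records)
```

From cleaned records like these, plotting and mapping is a short additional step with any charting library.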

Hyman noted that one of the most challenging parts of the course was convincing students that using ChatGPT was allowed, that they were not cheating. Another challenge was that many students lacked basic computer literacy skills, so getting everyone up to speed took some time. Students also did not share a common computing platform. The successes of the course include students’ ability to use libraries and APIs to make arguments in their data analysis, apply statistical analysis to the data, and ask historical questions about the results they were seeing in the data.

Jeff Gray continued by describing his Computational Protein Structure Prediction and Design course, which he has taught for over 18 years. In this course, students use molecular visualization and prediction tools like PyRosetta, an interactive Python-based interface that allows them to design custom molecular modeling algorithms. Recently, Gray has introduced open-source AI tools into the curriculum (AlphaFold and RoseTTAFold), which predict 3D models of protein structures.

Example of protein folding using AlphaFold.

One of the challenges Gray mentioned was the diversity of student academic backgrounds. There were students from engineering, biology, bioinformatics, computer science, and applied math, among others. To address this challenge, Gray used specifications grading, a grading method in which students are graded pass/fail on individual assessments that align directly with learning goals. In Gray’s class, students were presented with a bundle of problem sets categorized at various difficulty levels. Students selected which ones they wanted to complete and had the option of resubmitting them a second time for full credit. Gray is undecided about using this method going forward, noting that half of the students dropped the course after trying to complete all of the problems instead of just a few and finding the workload too heavy. Another challenge was balancing the fundamental depth of the subject matter against its application. To address this, Gray structured the twice-weekly class with a lecture on one day and a hands-on workshop on the other, which seemed to work well.

Brian Klaas teaches a one-credit pass/fail course called Using Generative AI to Improve Public Health. The goal of this course is to allow students to explore AI tools, gain a basic understanding of how they work, and then apply them to their academic work and research. In addition to using the tools, students discussed the possible harms of generative AI, such as confabulations and biases, the impact of these tools on public health research, and future concerns such as the impact on the environment and copyright law. Klaas shared his syllabus statement regarding the usage of AI tools in class, something he strongly recommends all faculty share with their students.

Hands-on assignments included various ways of using Generative AI. In one assignment, students were asked to write a summary of a journal article and then have GenAI write a summary of the same article geared towards different audiences (academics vs. high school students). Students were then asked to analyze the differences between the summaries.

Sample Instagram post created using AI showing people from different cultures dressed as medical professionals.

For another assignment, students were asked to pick from a set of topics and use Generative AI to teach them about the selected topic, noting any confabulations or biases present. They then asked GenAI to create a five-question quiz on the topic and take the quiz. A final assignment was to create an Instagram post on the same topic including a single image and a few sentences explaining the topic to a lay audience. All assignments included a reflection piece, which often required peer review.

Lessons learned: Students loved the interdisciplinary approach to the course, confabulations reinforce core data research skills, and learning from each other is key.

The discussion continued with questions from the audience: 

Q: What would you recommend to an instructor who is considering implementing GenAI in the classroom? How do they start thinking about GenAI?
JG: Jupyter notebooks are pretty easy to use. I think students should just give it a try.
LH: I recommend showing students what “bad” examples look like. The truth is, we can still write better than computers. Use AI to draft papers and then use it as an editing tool – it’s very good as an editing tool. Students can learn a lot from that.
BK: I recommend having students experiment and see where the strengths lie, get an overall awareness of it. Reflect on that process, see what went well, not so well. Feed in an assignment and see what happens. Use a rubric to evaluate the assignment. Put a transcript in and ask it to create a quiz on that information. It can save you some time.
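The transcript-to-quiz idea boils down to assembling a prompt around the transcript text and handing it to whatever chat model you use. A minimal Python sketch, with hypothetical wording and a made-up function name:

```python
def build_quiz_prompt(transcript: str, n_questions: int = 5) -> str:
    """Assemble a prompt asking a chat model to write quiz questions
    from a lecture transcript. (Illustrative only; the model call itself
    is left out and would depend on the tool being used.)"""
    return (
        f"Below is a lecture transcript. Write {n_questions} multiple-choice "
        "questions that test its key points. For each question, list four "
        "options and indicate the correct answer.\n\n"
        f"Transcript:\n{transcript}"
    )

prompt = build_quiz_prompt(
    "Today we discussed confirmation bias in study design.",
    n_questions=3,
)
```

The same pattern works for the rubric suggestion above: paste the assignment text into a prompt alongside the rubric criteria.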

Q for Brian Klaas: What version of GPT were you using?
BK: Any of them – I didn’t prescribe specific tools or versions. We have students all over the world, so they used whatever they had. ChatGPT, Claude, Midjourney, etc. I let the students decide and allowed them to compare differences.

Q for Jeff Gray: Regarding the number of students who dropped, is the aim of the course to have as many students as possible, or a group who is wholly into it?
JG: I don’t know, I’m struggling with this. I want to invite all students but also need to be able to dig into the math and material. It feels like we just scratched the surface. Maybe offering an intersession course to learn the tools before they take this class would be helpful. There is no standard curriculum yet for AI. Where to begin…we’re all over the map as far as what should be included in the curriculum.
LH: I guess it depends on what your goals are. Students are good at “plug and chug,” but bad at asking questions like, “what does this mean?”
BK: We didn’t get to cover everything, either – there is not enough time in a one-credit class. There are just so many things to cover.

Q: What advice do you have for faculty who are not computer scientists? Where should we start learning? What should we teach students?
LH: You can ask it to teach you Python, or how to do an API call. It’s amazing at this. I don’t know coding as well as others, but it helps. Just start asking it [GenAI]. Trust it for teaching something like getting PyTorch running on your PC. Encourage students to be curious and just start prompting it.
BK: If you’re not interested in Jupyter notebooks, or some of the more complicated functions, you can use these tools without dealing in data science. It can do other things. It’s about figuring out how to use it to save time, for ideation, for brainstorming.
JG: I have to push back – what if I want to know about what’s going on in Palestine and Israel? I don’t know what I don’t know. How do I know what it’s telling me is correct?
LH: I don’t use it for history – but where is the line of what it’s good and not good at?
BK: I would use it for task lists, areas to explore further, but remember that it has no concept of truth. If you are someone who knows something about the topic, it does get you over the hurdles.
JG: You have to be an expert in the area to rely on it.
LH: Students at the end of my course made so much progress in coding. It depends on what you ask it to do – protein folding is very different than history that already happened.

Q: How can we address concerns with fairness and bias with these tools in teaching?
BK: Give students foundational knowledge about how the tools work. Understand that these are prediction machines that make stuff up. There have been studies done that show how biased they are, with simple prompts. Tell students to experiment – they will learn from this. I suggest working this in as a discussion or some practice for themselves.

Q: Students have learned to ask questions better – would you rather be living now with these tools, or without them?
JG: Students are brainstorming better. They are using more data and more statistics.
BK: AI requires exploration and play to get good responses. It really takes time to learn how to prompt well. You have to keep trying. Culturally, our students are optimized for finding the “right answer;” AI programs us to think that there are multiple answers. There is no one right answer for how to get there.
LH: Using AI is just a different process to get there. It’s different than what we had to do in college. It was hard to use computers because many of us had to play with them to get things to work. Now it all works beautifully with smart phones. Students today aren’t comfortable experimenting. How do we move from memorization to asking questions? It’s very important to me that students have this experience. It’s uncomfortable to be free and questioning, and then go back to the data. How do we reconcile this?

JG: What age is appropriate to introduce AI to kids?
LH: Students don’t read and write as much as they used to. I’m not sure about the balance.
Guest: I work with middle and high school teachers. Middle school is a great time to introduce AI. Middle school kids are already good at taking information in and figuring out what it means. Teachers need time to learn the tools before introducing them to students, including how the tools can be biased, etc.

Q: How can we encourage creative uses of AI?
BK: Ethan Mollick is a good person to follow regarding creative uses of AI in education and what frameworks are out there. To encourage creativity, the more we expose AI to students, the better. They need to play and experiment. We need to teach them to push through and figure things out.
LH: AI enables all of us to do things now that weren’t possible. We need to remember it’s an augment to what we do, not a substitute for our work.

Resources:
Hyman slides
Gray slides
Klaas slides

Amy Brusini, Senior Instructional Designer
Center for Teaching Excellence and Innovation
 

Image source: Lunch and Learn logo, Hyman, Gray, and Klaas presentation slides, Unsplash

Lunch and Learn: Canvas Show and Tell

On Wednesday, November 1st, the Center for Teaching Excellence and Innovation (CTEI) hosted a Canvas Show and Tell: Share and Learn about Engaging and Effective Uses of Canvas. Alison Papadakis, Teaching Professor and Director of Clinical Psychological Studies in the Department of Psychological & Brain Sciences, hosted the discussion. She was joined by Emily Braley, Assistant Dean for Undergraduate Academic Affairs and Associate Teaching Professor in the Department of Mathematics, and Jamie Young, Lecturer in the Department of Chemistry. Beth Hals, Brian Cole, and Caroline Egan from the CTEI helped facilitate the event.

Alison Papadakis opened the discussion describing how her interest in Canvas began with her kids, who were using it during COVID. (JHU was still using Blackboard at that time.) Watching her kids struggle with poorly designed Canvas classroom interfaces influenced the way she organized her own Canvas classroom once JHU adopted it as its new learning management system (LMS). One big decision she made was to stay away from the Module function, which is the most common way to organize content in Canvas. Instead, Papadakis explained how she used the Canvas Page function to create a page with a table outlining her course schedule, with hyperlinks to the rest of her content. The homepage of her Canvas site looks like a calendar with hyperlinks for each class day. She regularly checks in with her students, asking if they have trouble finding anything in the course, and they always assure her that they do not. Papadakis also makes the Files area in Canvas available to her students as an additional way for them to access course content, but they tell her they don’t use it. She says the course schedule page is not the “prettiest” display of content, but the functionality works very well for her course, and students can easily find what they need for each class period.

Papadakis also does a lot of student advising and needed a place to post links and share information with students. She decided to use a community site, which is similar to a website but built inside of Canvas. All majors and minors have access to the site, as do other faculty; it is also possible to add other users to the site if necessary. Brian Cole clarified that the key difference between a standard Canvas course site and a community site is that a standard site is for credit-bearing courses and is automatically generated by JHU’s Student Information System (SIS). Community sites, which all faculty have the ability to request, are for non-credit activities and are intended to share information and resources across multiple populations.

Emily Braley described how the mathematics department is using a community site to host their math placement exam. The university’s switch to Canvas provided an opportunity to revise the exam, which was previously hosted in Blackboard. In Canvas, students are given more information about why they are taking the exam as they are guided through a series of steps to help them decide which exam to take. Braley described how, with the help of CTEI staff, they embedded a Microsoft Form inside of Canvas that asks students which math courses they took in high school, including AP courses. The branching feature of the form then directs students to the appropriate placement exam based on their answers. There are also practice tests that students can take before the actual exam.

The exam itself is set up using a Canvas feature called Mastery Paths. This feature allows an instructor to set up to three ranges of scores for the exam; once students take the exam, their scores are translated into a recommendation for enrollment. Braley also created a customized grading scheme for the exam, which contains information about interpreting the results as well as the actual score for the students.
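The branching logic behind the score ranges is straightforward. In Canvas it is configured in the Mastery Paths interface rather than in code, but a Python sketch with made-up cutoffs and course names illustrates the idea:

```python
def placement_recommendation(score: int) -> str:
    """Translate a placement-exam score into an enrollment recommendation.
    The cutoffs and course names here are invented for illustration,
    not JHU's actual placement scheme."""
    if score >= 80:
        return "Recommended: Calculus II"
    elif score >= 50:
        return "Recommended: Calculus I"
    return "Recommended: Precalculus"
```

Each of the (at most three) score ranges maps to one outcome, which is exactly what the Mastery Paths setup asks the instructor to specify.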

Braley is very excited about the potential for data analytics with the revised exam process. Using the form provides the department with data that can help identify trends and determine whether students are being placed correctly. All incoming students are encouraged to take a math placement exam; so far this fall, close to 1,100 students have done so.

Jamie Young was looking for a way to avoid answering the same questions repeatedly from the 640 students in his Introduction to Chemistry lab course. Using HTML code, he was able to create a dropdown FAQ page in Canvas containing embedded links. He estimates he has received 50-60% fewer questions this semester since posting the FAQ page. He also used HTML to add buttons and links to his syllabus that link out to everything in the course, similar to Alison Papadakis’s course schedule. He believes this saves time for students, as they are able to find many things very quickly. Additionally, Young embedded a live Google Document into the course that contains his course schedule. This makes it easy to update the schedule when necessary, as any changes made are immediately pushed to Canvas – no need to upload an edited document each time a change is made.

In another course, with a combined lecture and lab, Young struggled with displaying a large amount of content. He initially put everything into modules but wasn’t happy with how disorganized they became after adding so much material. He has since turned each module into its own page and links everything from the page. This has been working out much better – again, students are able to find things quickly and easily. Young insists you don’t need much coding knowledge to take advantage of these features in Canvas; you do need to know – or have access to – a few HTML commands.

The discussion included the following questions from the audience:

Q (for Alison Papadakis): Do you need coding experience to create this [the course schedule]?
AP: I just created it in Word and cut and pasted it in – no coding necessary.

Q (for Alison Papadakis): How do you link the “tone” of your course to the course schedule?
AP: This is an in-person course, so there is a lot of in-class discussion around the course and how it works at the beginning. The course schedule is just the pragmatic piece so we can keep things organized.

Q (for Alison Papadakis): It looks like you assign readings before the semester begins – do you plan everything ahead of the semester, before it starts?
AP: I have taught this course over ten times, so I know basically what’s coming. I put placeholders in for things I don’t know yet. You’ll notice it says ‘Tentative Schedule’ so I can allow for shifting things around if needed. I do need to remember to update the Canvas calendar when making changes to my course schedule.

Q (for Alison Papadakis): Can anyone access the community site?
AP: No, they have to be added to the roster.

Q: (For Beth Hals, CTEI’s Sr. Instructional Technologist) Can you explain Mastery Paths? Is it the same as locking/unlocking a Module?
BH: Mastery Paths are affiliated with some sort of assessment in Canvas. As the instructor, you can set three different sets of score ranges that you use to then send students on their next ‘path’ based on their results. Unlocking modules is a little different – you first set prerequisites on a module that must be completed before the module will unlock.

Q (for Jamie Young): To a neophyte, it’s a little overwhelming to see what you’ve done – there seem to be many ways of doing the same thing. Could you compare and contrast the ways of organizing your syllabus?
JY: You can use the Rich Content Editor (RCE) in Canvas to build your syllabus. If you want to add something like buttons, you would then toggle the RCE to view the HTML editor. Using HTML is more complicated for sure, but with some basic knowledge you can do it. I would be happy to share what I’ve done and then you can just fill in your information and cut and paste it into your course. To embed the Google Form, I followed online directions that I googled.

Brian Cole, CTEI’s Associate Director for Instructional Technology: You don’t need any HTML knowledge to embed anything into Canvas. You can use the Rich Content Editor (RCE) to do this. There is an “embed” option in the menu of the editor. You also don’t have to do every page. You can pick and choose what parts of your course to make pretty.

Q: Did Jamie build his syllabus in AEFIS?
BC: No, Jamie built his syllabus using the Canvas Syllabus page. You can still use your own syllabus in conjunction with the AEFIS syllabus – they can coexist. (Note: New name for AEFIS is Heliocampus.)

Q (for Jamie Young): Could you provide a little more information on creating tabs?
JY: They are just HTML code. I used HTML5. You have to go into the HTML editor in Canvas and use “div” tags to build tabs. Start with the blank tabs in HTML, then go back to the RCE and fill in the text as needed. You can use copy and paste to make it easier.

Q: Can I move JavaScript headers into Canvas?
BC: No, Canvas will strip them out. An alternative is to embed the page into the Canvas page.
BH: There is something called the Redirect tool that may help. This tool adds an item to your navigational menu. You pick the text for what will display in your menu and it will link to a particular page.

Q: Any ideas about making grading easier?
EB: We use auto grading on all quizzes. We also use banks of questions, so that each quiz pulls from different banks. New Quizzes has matching question types that are more work for students, more robust, but still auto graded. Another thing about New Quizzes is the ability to render LaTeX [a typesetting system for math symbols]. This has been very useful for us – it’s so much cleaner for students. It renders as accessible MathML, which can be read by a screen reader. This is much better than posting a PDF that is read as an image.
We also use Gradescope, which is an external tool that helps us streamline grading. Students upload their work to Gradescope (inside of Canvas) and you can set it up to help auto grade problems.
JY: We also use Gradescope extensively in Chemistry. We scan written work into Gradescope and it is automatically graded. The system has gotten better at reading handwriting. It has made handwritten assignments so much easier to grade. One caveat about Canvas quizzes: they don’t allow for numbers past 4 decimal places, which we need.

A word about accessibility in Canvas:
EB: You can have Canvas tell you if your material is accessible or not. Use the accessibility checker in the RCE to help you with this.
BH: I also wanted to mention that it’s very easy to duplicate pages in Canvas – build it once, duplicate the page, then fill in what you need to change. It’s like building a template for yourself and reusing it.

For more information about topics discussed at the event, please see this Canvas resource developed by Beth Hals.

Amy Brusini, Senior Instructional Designer
Center for Teaching Excellence and Innovation
 

Image source: Canvas logo

Lunch and Learn: Generative AI – Teaching Uses, Learning Curves, and Classroom Guidelines

On Tuesday, October 3rd, the Center for Teaching Excellence and Innovation (CTEI) hosted its first Lunch and Learn of the academic year, a panel discussion titled, “Generative AI: Teaching Uses, Learning Curves, and Classroom Guidelines.” The three panelists included Jun Fang, Assistant Director of the Instructional Design and Technology Team in the Carey Business School, Carly Schnitzler, KSAS instructor in the University Writing Program, and Sean Tackett, Associate Professor in the School of Medicine.  The discussion was moderated by Caroline Egan, project manager in the CTEI. Mike Reese, director of the CTEI, also helped to facilitate the event. 

The panelists began by introducing themselves and then describing their experiences with generative AI. Jun Fang loves new technology and has been experimenting with AI since its inception. He noticed that the faculty he works with generally fall into two categories when it comes to using AI: some are quite concerned about students using it to cheat and are not ready to use it, while others see a great deal of potential and are very excited to use it in the classroom. In speaking with colleagues from across the institution, Fang quickly realized these are common sentiments expressed by faculty in all JHU divisions. This motivated him to lead an effort to create a set of AI guidelines specifically geared toward faculty. The document contains a number of strategies for using AI, including designing engaging course activities, providing feedback for students on their assignments, and redesigning course assessments. The section on redesigning course assessments describes two approaches: the “avoidance approach,” which involves deliberately designing assessments without AI, and the “activation approach,” which intentionally integrates AI tools into the curriculum. The document includes specific examples of many of the strategies mentioned as well as links to widely used generative AI tools.

Fang described a recent scenario in which a faculty member was concerned that students were using ChatGPT to generate answers to online discussion board questions.  To mitigate this situation, Fang suggested the faculty member revise the questions so that they were tied to a specific reading or perhaps to a topic generated in one of his online synchronous class sessions.  Another suggestion was to have students submit two answers for each question – one original answer and one generated by ChatGPT – and then have the students compare the two answers.  The faculty member was not comfortable with either of these suggestions and ended up making the discussion more of a synchronous activity, rather than asynchronous.  Fang acknowledged that everyone has a different comfort level with using AI and that one approach is not necessarily better than another.     

Carly Schnitzler currently teaches two introductory writing courses to undergraduates and is very open to using generative AI in her classroom. At the start of the semester, she asked students to fill out an intake survey which included questions about previous writing experiences and any technologies used, including generative AI. She found that students were reluctant to admit that they had used these technologies, such as ChatGPT, for anything other than ‘novelty’ purposes because they associated these tools with cheating. After seeing the results of the survey, Schnitzler thought it would be beneficial for students to explore the potential use of generative AI in class. She asked students to complete an assignment in which they created standards of conduct for a first-year writing class, which included discussing their expectations of the course, the instructor, their peers, and how AI would fit in among these expectations. The class came up with three standards:

  1. AI tools should support (and not distract from) the goals of the class, such as critical thinking, analytical skills, developing a personal voice, etc.  
  2. AI tools can be used for certain parts of the writing process, such as brainstorming, revising, or editing, but students must disclose that AI tools were used. 
  3. If there appears to be an over-use of or over-reliance on AI tools, a discussion will take place to address the situation, rather than immediate disciplinary action. (Schnitzler wants students to feel safe exploring the tools without fear of repercussion.)

This assignment comes from an open collection of cross-disciplinary assignments that use text generation technologies, mostly in a writing context. TextGenEd: Teaching with Text Generation Technologies, co-edited by Schnitzler, consists of freely accessible assignments submitted by scholars from across the nation. Assignments are divided into categories, such as AI literacy, rhetorical engagements, professional writing, creative explorations, and ethical considerations. Most are designed so that the technologies used are explored by students and instructors together, requiring very little ‘expert’ technological skills.  Schnitzler noted that there is a call for new submissions twice each year and encouraged instructors to consider submitting their own assignments that use text generation AI.

Sean Tackett was initially fearful of ChatGPT when it was released last year. Reading article after article stating how generative AI was going to “take over” pushed him to learn as much as he could about this new technology. He began experimenting with it and initially did not find it easy to use or even necessarily useful in his work with medical school faculty. However, he and some colleagues recognized potential in these tools and ended up applying for and receiving a JHU DELTA grant to find ways they could apply generative AI to faculty development in the medical school. Tackett described how they are experimenting with generative AI in a curriculum development course that he teaches to the med school faculty. For example, one of the tasks is for faculty to learn to write learning objectives, so they’ve been developing prompts that can be used to specifically critique learning objectives. Another example is developing prompts to critique writing. Most of Tackett’s students are medical professionals who do not have a lot of time to learn new technologies, so his team is continually trying to refine prompts in these systems to make them as useful and efficient as possible. Despite being so busy, Tackett noted the faculty are generally enthusiastic about having the opportunity to use these tools.     

The discussion continued with a question and answer session with audience members: 

Q: How do we transfer and integrate this knowledge with teaching assistants who help manage larger classes? What about grading?
ST: I would advocate for the potential of AI to replace a TA in terms of grading, but not in terms of a TA having a meaningful dialogue with a student. 
JF: Generative AI tools can be used to provide valuable feedback on assessments. There are a lot of tools out there to help make grading easier for your TAs, but AI can be used for the feedback piece. 

Q: How might professors provide guidelines to students to use generative AI to help them study better for difficult and complex topics?
MR: One possibility is to generate quiz questions – and then have students follow up by checking the accuracy of the questions that were generated.
CS: Using a ChatGPT or other text generation tool as a reading comprehension aid is something that has been useful for non-native English speakers. For example, adding a paragraph from an academic article into ChatGPT and asking what this means in plain language can be helpful.

CE: This gets to what I call ‘prompt literacy,’ which is designing better prompts to give you better answers. There is a very good series about this on YouTube from the University of Pennsylvania.
Sean, what have you experienced with prompting right now, in terms of challenges and opportunities?
ST: We’re trying to put together advice on how to better prompt the system to get more refined and accurate answers. After a few iterations of prompting the system, we refine the prompt and put it into a template for our faculty, leaving a few ‘blanks’ for them to fill in with their specific variables. The faculty are experts in their subject areas, so they can tell if the output is accurate or not. We’re in the process of collecting their output, to put together best practices about what works, what does not work.  
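The “template with a few blanks” workflow Tackett describes maps naturally onto simple string templating. A minimal Python sketch, where the template wording and placeholder names are invented for illustration rather than taken from his materials:

```python
from string import Template

# Hypothetical faculty-facing prompt template; the $-placeholders are
# the "blanks" faculty fill in with their own course details.
critique_template = Template(
    "You are reviewing learning objectives for a course on $topic, "
    "taught to $audience. Critique the following objective for "
    "specificity and measurability, then suggest one revision:\n\n"
    "$objective"
)

# A faculty member fills in the blanks with their specific variables.
prompt = critique_template.substitute(
    topic="curriculum development",
    audience="medical school faculty",
    objective="Learners will understand curriculum design.",
)
```

The filled-in prompt is then pasted into whichever chat model the faculty member uses; keeping the refined wording in a template is what makes the iteration reusable.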

CE: What would you all like to see in terms of guidelines and best practices for AI on a web page geared towards using AI in the classroom?
Guest: And along those lines, how do we move forward with assigning research projects, knowing that these tools are available to students?
ST: I think it could be useful for students to learn research skills. They could use the tools to research something, then critique the results and explain how they verified those results. It can also be useful for generating ideas and brainstorming. Another thought is that there are a number of domain-specific generative AI databases, such as Open Evidence, which is useful in the medical field.
CS: To Sean’s point, I think a comparative approach is useful with these tools. The tools are very good at pattern matching genre conventions, so doing comparative work within a genre could be useful.
JF: I think ChatGPT and other generative AI tools can be useful for different parts of the research process, such as brainstorming, structure, and editing. But not for something like providing or validating evidence.  

Q: As a grad student, I’m wondering how the presence of AI might force us to refine the types of questions and evaluations that we give our students. Are there ways to engineer our own questions so that the shift of the question is changed to avoid the problem [of having to refine and update the question] in the first place?
CS: There is an assignment in our collection that talks about bringing an assignment from past to present. Again, thinking in terms of a comparative approach, ask ChatGPT the question, and then ask your students the same question and see how they compare, if there are any patterns.  I think it can be helpful to think of ChatGPT as adding another voice to the room.
JF: We have a section in the guidelines on how to redesign assessment to cope with generative AI related issues. We suggest two approaches: the avoidance approach and the activation approach. The avoidance approach is for faculty who are not yet comfortable using this technology and want to avoid having students use it.  One example of this approach is for faculty to rework their assignments to focus on a higher level of learning, such as creativity or analysis, which will hopefully reduce or eliminate the opportunity for students to use AI tools. The activation approach encourages faculty to proactively integrate AI tools into the assessment process. One example of this approach I mentioned earlier is when I suggested to a faculty member to rework their discussion board questions to allow students to submit two versions of the answers, one created by them and the other by ChatGPT, and then analyze the results. 

Q: What is the ultimate goal of education? We may have different goals for different schools. Also, AI may bridge people from different social backgrounds. In China, where I grew up, the ability to read or write strongly depends on the social status of the family you come from. So there is some discomfort using it in the classroom.
CS: I feel some discomfort also, and that’s what led to the development of the guidelines in my classroom. I posed a similar question to my students: if we have these tools that can allegedly write for us, what is the point of taking a writing class?  They responded by saying things like, “writing helps to develop critical thinking and analytical skills,” to which I added, “being here is an investment in yourself as a student, a scholar, and a thinker.” I think asking students to articulate the value of the education that they want to get is really helpful in determining guidelines for AI.
ST: Going to school and getting an education is an investment of your time. You pay now so you can be paid later. But it’s not as transactional as that. AI is already in the work environment and will become more prevalent. If we’re not preparing students to succeed in the work environment, we are doing them a disservice. We teach students to apply generative AI in their classes so they are prepared to use it in the workforce.
JF: In the business school, everything is market driven. I think education can fit into that framework as well. We’re trying to provide graduates with the confidence they need to finish the work and meet the market’s need. We know that generative AI tools have really changed the world and they’re starting to emerge in every part of our life. We need to train students to realize that ChatGPT might be part of their education, part of life in the future, and part of the work in the future as well. There are things AI can help us do, but there are still fundamentals that students need to learn. One example is calculators: we still need to learn from the beginning that 1 + 1 = 2. 
CE: This question also reminded me of asking your students, what is the ultimate purpose of a research paper? Where do they think ChatGPT should fit into the research process?  

Q: I work at the library and we’re getting lots of questions about how to detect if students are using AI. And also, how do you determine if students are relying too heavily on AI?
JF: We also get this question from our faculty. The most used detection tool right now is Turnitin, which is embedded in Canvas. But the level of accuracy is not reliable. We encourage faculty to always validate before accepting the results.  For faculty who are actively using AI in the classroom, we also encourage them to provide clear guidance and expectations to students on how they are allowed to use it.  This may make it a little easier to determine if they are using it correctly or not.
MR: There are some other tools out there, such as GPTZero and ZeroGPT, but to Jun’s point, the difficulty is that this is different from plagiarism detection, which says this is copied, and here’s the source. These tools say there’s a probability that part of this was AI-generated, but you can’t point to a direct source. It’s up to instructors whether or not to use these tools, but consider using them to facilitate a conversation with students. In my own classes, if I suspect academic misconduct, I usually start by asking students to explain, to talk to me about what is happening, before I make accusations. With these tools, there tends to be no hard evidence, just probabilities that something may have happened. This is definitely an area we’re all still learning about.
Guest: I was just thinking that having a conversation with students about why they are turning to the tool in the first place might prevent misconduct.  Instead of sending them to an academic misconduct committee, we could have these conversations, like Carly mentioned. Making students aware of the limitations of the tool could also be helpful.
CS: Yes, our guidelines state that I prioritize conferences with students over immediate disciplinary action. I try to pre-empt anxiety students might feel around using these tools. Designing your assignments in a way that reduces anxiety is also helpful. For example, I tend to design assignments that build on one another throughout the semester in smaller pieces, rather than one giant chunk all at once.

Q: Is there any discussion around combining AI with teaching, such as generating personalized explanations of a topic? Students will have different levels of expertise and comfort with different topics.
ST: We’re trying to do this, to create a teaching aid for the future. We’re planning to use it to create assessment items.  

Amy Brusini, Senior Instructional Designer
Center for Teaching Excellence and Innovation
 

Image Source: Pixabay, Unsplash

 

Adapting to AI in the Classroom for Time-Strapped Instructors

In the past few months, we have spoken to many instructors – faculty, graduate students, even undergraduate teaching assistants –  who are doing very interesting things with artificial intelligence tools in their classes this coming fall. Some are writing grants to support research into classroom uses of AI, some are designing interactive online modules to help teach about the ethics of AI, and some are integrating AI tools into their instructional activities.

This blog post is for another instructor population: those who have not had the time or capacity to redevelop their courses, their assessments, or their activities to accommodate an AI world. “Redesigning assessments with AI in mind” might be the 20th item on a long list of to-dos for the coming semester. Adapting to new technologies that could change the classroom experience – and AI is certainly one of them – seems like an overwhelming task. Classes start in one week, and wrestling with the teaching and learning opportunities and challenges of artificial intelligence may not be an achievable goal.

However, there are some concrete steps you can take and curated resources you can draw on to account for AI when planning and teaching your courses.

Recommendations for Starting with AI

Here are six recommendations (and one extra credit assignment). Following all of these suggestions will put you on good footing with the learning curve associated with AI in the classroom, but even doing one or two is a good way to start.

  1. Experiment with ChatGPT and other AI tools. Just get in there and start using them and see what they produce. In an article for the Chronicle of Higher Education, one writer said, “I started by reminding myself, anytime I was about to Google something, to ask ChatGPT.”[1] ChatGPT-ing (or using Google Bard) instead of Google-ing is a good on-ramp to AI usage. You may even find them useful to you as an instructor. Here are four basic generative AI models to start with along with prompt suggestions:
    1. ChatGPT – The first (and by some reports, still the most accurate) text-based generative AI. Prompt suggestion: Ask a basic question about teaching, e.g., “How can I grade exams more efficiently?” or “How can I provide written feedback more efficiently?”
    2. Google Bard – Less text-heavy than ChatGPT; potentially geared towards more logic-based questions, e.g., “How do I create a website in WordPress?”
    3. Microsoft Bing – Able to generate images as well as text and simultaneously harness the power of a search engine. Potential question: “Name the characteristics of neo-classical architecture and provide an example.”
    4. Fotor.com – Image-generating AI. Potential prompt: “Provide an illustration for my chemistry class syllabus.”
  2. Run your assignments through an AI tool. This will help benchmark possible AI-generated responses to your assignments. More sophisticated AI users will engage in prompt engineering that could make uncited or incorrect usage of AI harder to detect, but getting at least one example of an AI response is helpful. It will not only provide a sightline into possible academic integrity issues but also point to whether your assignment may need to be revised or redeveloped, which could include integrating AI itself. Derek Bruff, a writer and higher education consultant, provides good guidance on assessment design in light of AI:
    1. Why does this assignment make sense for this course?
    2. What are specific learning objectives for this assignment?
    3. How might students use AI tools while working on this assignment?
    4. How might AI undercut the goals of this assignment? How could you mitigate this?
    5. How might AI enhance the assignment? Where would students need help figuring that out?
    6. Focus on the process. How could you make the assignment more meaningful for students or support them more in the work? [2]
  3. Add an AI policy to your syllabus. This may require doing some or all of the recommendations above, but even if you do not have the capacity to take a deep dive into AI tools before courses start, it is a good idea to take a stab at a policy, even if it is brief. As mentioned above, you will be adapting this policy fairly quickly. The sooner you develop a benchmark policy and determine what works and what does not, the better. Lance Eaton, a doctoral student in higher education at the University of Massachusetts at Boston, has crowdsourced a Google Document with many helpful examples of AI policies for syllabi. This is an excellent place to start.
  4. Determine your academic integrity policy for AI. This may be part of your general AI policy or it could be separate. Regardless, this will probably be V.1 of your academic integrity policy, but again, starting now will put you in a good position to iterate as needed. To start, review Academic Integrity Policies for Johns Hopkins Schools. Lance Eaton’s Google Document (above) has many examples of AI policies that include academic integrity statements.
  5. Teach your students how to cite AI tools. This information could be incorporated into a syllabus policy and/or academic integrity policy, but correct citation – at least according to August 2023 recommendations of these style guides – is step number one. Making your students aware that they need to cite uses of AI tools and giving them the tools for doing that will (hopefully) incentivize compliance with your academic integrity policies.
    1. APA Citation Guidance – ChatGPT
    2. MLA Citation Guidance – Generative AI
    3. Chicago Style Citation Guidance – ChatGPT
    4. Johns Hopkins Library Guide on Citation
  6. Talk to your local center for teaching and learning. All Hopkins Schools have teaching and learning centers, some have been publishing guidance on how to teach and learn with artificial intelligence tools, and many have been considering the possible consequences of AI in the classroom. Here’s a list of teaching and learning centers at Hopkins, and here are two rich resources developed by two CTLs at Hopkins:
    1. Teaching & Learning in the ChatGPT Era. This website was created by the Center for Learning Design & Technology at the Whiting School of Engineering. It provides a great overview of generative AI as well as guidance on academic integrity questions, student use of AI, and assessment design with AI. Kelly Orr, Nathan Graham, Olysha Magruder, Mel Rizzuto, and Edward Queen of the CLDT all contributed to the website, as did adjunct faculty member David Porter.
    2. Johns Hopkins University Generative AI Tool Implementation Guidance and Best Practices. Jun Fang, Assistant Director in Teaching & Learning@Carey in the Carey School of Business, led the development of this resource with contributions from representatives of other schools and teaching and learning centers at Hopkins. The guide provides substantial guidance on using generative AI to design engaging course activities and provide assignment feedback, and it includes a list of AI tools for higher education.

Extra credit assignment for those with a little more capacity:

  1. Learn a little about prompt engineering. Prompt engineering is developing and refining questions and statements for AI models such that they generate results with the desired specificity, tone, length, citations, etc. This will give you a sightline into AI capacities beyond a simple one-time command (e.g., “Compare and contrast models of femininity in Shakespeare’s Much Ado About Nothing and Taming of the Shrew”) which may yield an overly broad answer that lacks specificity and nuance. Prompt engineering will also help you learn to direct and guide AI models and not just react to them. For a useful beginner’s guide to prompt engineering, check out the brief video on prompting AI from Wharton School instructors.
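To make the contrast concrete, here is a minimal sketch comparing a one-time command with an engineered version of the same Shakespeare prompt. Both prompt strings are invented examples, not prescribed wordings.

```python
# A basic one-shot prompt vs. an "engineered" prompt that also specifies
# audience, length, focus, and output format. Both are invented examples.
basic = (
    "Compare and contrast models of femininity in Shakespeare's "
    "Much Ado About Nothing and Taming of the Shrew."
)

engineered = (
    "You are a Shakespeare scholar writing for first-year undergraduates. "
    "In roughly 300 words, compare models of femininity in Much Ado About "
    "Nothing and Taming of the Shrew. Focus on Beatrice and Katherina, "
    "quote one line from each play, and close with one open-ended "
    "discussion question."
)

# The engineered prompt adds constraints (tone, length, focus, structure),
# which tends to yield a more specific and nuanced response.
for name, p in [("basic", basic), ("engineered", engineered)]:
    print(f"{name}: {len(p)} characters")
```

The extra sentences are doing the work: each constraint (audience, word count, named characters, required quotations) narrows the space of plausible responses, which is what distinguishes directing the model from reacting to it.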

Why You Should Do This

Here is why you should take the (small) leap: Artificial intelligence will change the way we teach and learn. The internet did this, email did this, and so will AI. Taking small steps to acculturate to this new reality is the best way to build the flexibility needed to successfully teach and learn with AI – and, very importantly, teach your students how to teach and learn with AI. Here are more reasons to begin to shift your behavior:

  • You can start small. Take this semester as an opportunity to begin to build your AI teaching and learning skills. You do not have to overhaul your syllabi or classroom activities to accommodate AI; you just have to begin to think through the implications of teaching in a world where AI tools are easily available and could pass your homework assignments. Ask yourself how you would coach students encountering your subject matter for the first time, and then apply those principles to your own learning about AI.
  • You will have to learn to adapt quickly. Artificial intelligence tools are evolving rapidly; your course design and instructional approach will do so, too. Each semester will require additional revisions to your syllabi to accommodate our increasing use of AI tools and AI’s increasing capacities. Starting to build those muscles now with lower-effort activities will pay off in the long run.
  • You actually know how to do this. Researching? Developing hypotheses? Evaluating resources? Check, check, and check. Iterating, revising, and adapting as you go along? Teaching students how to evaluate resources? Guiding students to think about the definitions of “artificial,” “intelligence,” and “human”? Check all that, too. The skills required to become AI-literate from a teaching and learning perspective are skills you already have. It is just a matter of applying them to this particular challenge/opportunity/problem (however you frame it).

Finally, give yourself and your students some grace. This is a huge part of beginning to learn how to teach and learn in an AI world; most likely, neither you nor your students will be proficient AI practitioners this semester. You may miss an academic integrity issue or overlook good opportunities to use AI in a classroom activity. Your students may not cite AI correctly or may not cite it at all. They may be far more fluent with AI than you are, or they may be too trusting of AI. Whatever happens, try to remember that you all are new at this and, as new learners, you all may take missteps and make mistakes with the technology.

Caroline Egan
Caroline Egan is a Project Manager in the Center for Teaching Excellence and Innovation, supporting instructional training and development for Hopkins faculty, graduate students, post-doctoral fellows, and staff.

[1] Darby, Flower. (27 June 2023). 4 steps to help you plan for ChatGPT in your classroom. The Chronicle of Higher Education. https://www-chronicle-com.proxy1.library.jhu.edu/article/4-steps-to-help-you-plan-for-chatgpt-in-your-classroom

[2] Bruff, D. (19 July 2023). Assignment makeovers in the AI age: Essay edition. Agile learning: Derek Bruff’s blog on teaching and learning. https://derekbruff.org/?p=4105

Selected Resources

From Hopkins:

Additional resources:

Image Source: Unsplash

Panel Discussion: “Teaching and Learning in the Age of Chatbots and Artificial Intelligence”

On April 4th, the Center for Teaching Excellence and Innovation hosted “Teaching and Learning in the Age of Chatbots and Artificial Intelligence,” a panel discussion on the implications of artificial intelligence in Hopkins classrooms. This discussion, open to attendees from all schools and divisions in Hopkins, yielded insights into the opportunities and limitations of Chatbots, particularly ChatGPT; identified ways to frame its pedagogical uses for students and faculty; and gave guidance for integrating it into classrooms.

The five-person panel consisted of Victoria Harms, DAAD Visiting Assistant Professor, History; Austin Heath, PhD Candidate, Philosophy; Mike Todasco, MFA student, Writing Seminars and former PayPal executive; and Opal Sitzman and Timothy Huang, first-year students taking the Reintroduction to Writing seminar with Alex Lewis, a Post-Doctoral Fellow in the University Writing Program who is using ChatGPT in his courses.

The discussion produced several incisive observations about chatbots and their role in higher education classrooms.

Here is a summary of the main points:

  • Teaching and learning: There was broad consensus that instructors should engage in active inquiry into artificial intelligence (AI) with their students and leverage the tool to help students think critically about evaluating texts, the accuracy of texts, and what a Chatbot’s opportunities and limitations are as a source, creator, and partner in their work.
  • A metacognitive tool: Both instructors and students said one of the best ways to use ChatGPT is as a tool to help students think about their learning and knowledge, from helping to improve writing to assessing the substance of texts.
  • Academic Integrity: Panelists thought that the written work produced by ChatGPT fell below standards for a finished product; it could be inaccurate, incorrect, and overly broad.
  • Academic Integrity and Assessments: One student urged faculty to identify the core issues driving the need for assessment and use those ideas to motivate students to produce original work. This assessment design contrasts with more mechanical and easily-plagiarizable assignments.
  • The students were teaching the faculty: Opal and Tim provided a huge amount of guidance to faculty, including recommended readings, results from their individual research projects, and thoughts on assessment design.

And words of wisdom from some of the panelists:

  • Austin Heath urged attendees to conceptualize ChatGPT as “a tool inquiry vs. a received text or received piece” of truth.
  • Opal Sitzman warned against a “tend[ency] to overestimate ChatGPT’s current prowess.”
  • Mike Todasco compared ChatGPT’s current capabilities to “mansplaining,” with all of the attendant drawbacks of the term.

Tim and Opal kicked off the conversation, describing the ways that students are using AI technology. Opal assured people that AI is not a “nefarious actor” in student lives: “In general, students like playing around with it like writing a Seinfeld episode, but it’s used more for inspiration than cheating.” Tim said, “You can use it to create the first draft of a paper,” and he’s using it as a self-tutoring tool “to adjust how I write.” Mike, in his MFA classes, used it “to be the voice of a computer in a story I was writing. The key is to always acknowledge it.”

Austin and Victoria discussed how they are guiding students to use and think about artificial intelligence. Austin thought of Chatbots “as a student’s student,” a way for students to learn how to evaluate and critique writing. He gives students output from a chatbot explaining a concept and invites them to grade it and offer suggestions for improvement. In Victoria’s class on Europe since 1945, she asked the Chatbot, “Why did the Soviet Union collapse?” Her students critique the answer for “accuracy and substance,” which taught “students that they know something, too.” She urged the audience “to teach students to be critical digesters of information.”

The panelists also weighed in on how their subject matter expertise influenced the way they used and thought about artificial intelligence. Mike, who has been writing about it for a while, said, “I felt like a Cassandra in that no one was listening and now everyone is talking about it.” He then talked about how “People who don’t have access to JHU resources can use it to learn […] the more people use it – not just for teaching, but for life – will help us learn.” Victoria teaches her students “to fact check results, like I do with Wikipedia. We need to integrate these tools into our assessments so they will use them appropriately.”

Opal, who’s interested in neuroscience, wrote a paper considering whether AI is conscious. Her verdict: “[I]t’s still much more simple than our brain,” but, importantly, “it helps us understand the concept of consciousness even if it isn’t conscious itself.” Austin, as a philosopher, applauded Opal’s interest in consciousness before explaining his own interest in “generat[ing] alternative thoughts about writing and giving credit,” saying, “I’m interested in exploring what it means to give attribution. Did a student write this work? Or did AI write this? Or did students work with AI to write this?”

When queried about Chatbots and academic integrity, the panelists mostly talked about its limitations as an easily accessible cheating tool. Opal said, “ChatGPT has a bad reputation for helping students cheat, but people overestimate its abilities. You still have to do a lot of work that requires critical thinking when using it because it doesn’t produce sophisticated results. It might help with a basic prompt.” Mike and Victoria echoed Opal’s opinion. Mike said, “If you were teaching middle schoolers, you might be concerned with cheating,” though he went on to add, “That said, the future version will get better.” Victoria added, “The pandemic taught us that not all students are excited about technology or are tech savvy.”

When Kwame Kutton, a Lecturer in Biomedical Engineering, raised a question about students using ChatGPT to plagiarize code in a computing course, Tim offered a thoughtful response. Currently in a computer science course himself, Tim said, “In BME there are unique opportunities to write code that saves lives. Therefore, students need to tackle the core issue to solve before they even write code. We want faculty to teach us how to think about the logic of the problem, not just writing code.” His comment encouraged instructors to think deeply about first framing and identifying the problem for students, which will help motivate them to produce original and independent work.

Mike offered another perspective: “I don’t know any programmer who doesn’t use Copilot,” an AI-powered coding assistant from GitHub that suggests code as developers type. “My analogy is calculators,” he said. “You need to know how to do math without a calculator, but once you are doing the calculations after setting up the problem, you should use a calculator to help solve the problem.”

A question from the audience about languages, accents, and ChatGPT turned the discussion to issues of accessibility and political bias. Tim saw one of his friends using the Chatbot to translate English to Japanese and then used it himself to translate a Spanish article he was familiar with. His opinion: “It does a better job than Google Translate” though “there are lots of metaphors that get lost in translation by these tools.”

Mike then gave two excellent examples of how ChatGPT is providing access and support to people with divergent and impaired abilities. He said, “ChatGPT 4 is available, but they haven’t yet released the picture-to-text feature. They shared video of someone with visual impairment using ChatGPT 4 to learn what was in the fridge using their phone. It will be able to do amazing things in the future to help us.” He went on to talk about a friend who knew someone in San Francisco with a lawn care business who struggled to communicate via email. The owner of the business now uses ChatGPT “to help polish his emails,” thus improving his client relationships.

Opal talked about how ChatGPT struggles with dialects, which turned the conversation to political bias. She’s using ChatGPT to write a short story “in the style of Kate Chopin,” a 19th Century American writer known for writing about Louisiana Creole culture. Opal said, “[Chopin] used a lot of Louisiana dialect” and ChatGPT “struggles” with this because it “is filtered so it doesn’t mimic the racist language used during that time.” She said that people have found ChatGPT to be “an establishment liberal” in its political biases. Victoria brought up “issues of bias in Silicon Valley” and wondered how ChatGPT would address Critical Race Theory (CRT). Mike decided to ask ChatGPT whether we should ban CRT and copied and pasted ChatGPT’s response in the Zoom chat:

As an AI language model, I don’t have personal opinions. However, I can provide you with an analysis. Whether Critical Race Theory (CRT) should be banned in schools is a subjective question, often debated among educators, policymakers, and parents. Supporters argue that CRT promotes understanding of systemic racism and its impacts, while opponents believe it can be divisive and foster racial animosity. Ultimately, the decision to include or exclude CRT in schools depends on the goals and values of the educational community involved.[1]

The conversation ended with speculation about how quickly ChatGPT would progress. Mike said, “The current GPT4 has been remarkable. I’ve written fiction in each version and I’d say it’s getting two grade levels better in each version.” Opal also weighed in: “It will be quick, but I’m not wary yet. We need to keep considering these questions, but I think it’s less something to be scared of and more something to utilize. I don’t see anything being more powerful than humans in the near future.”

Recommended reading and activities:

[1] OpenAI. (2023). ChatGPT (Apr 4 version) [Large language model]. https://chat.openai.com/

Caroline Egan
Caroline Egan is a Project Manager in the Center for Teaching Excellence and Innovation, supporting instructional training and development for Hopkins faculty, graduate students, post-doctoral fellows, and staff.

Image source: Pixabay, Unsplash

ChatGPT: A Brief Introduction and Considerations for Academic Integrity

I’ve been reading about the potential impact of artificial intelligence (AI) on teaching and learning for some time. A close family friend gave me a book entitled In the Mind of the Machine by Kevin Warwick in 1998. The Education Horizon Report Advisory Committee, of which I was a member, first listed artificial intelligence as an emerging technology likely to have an impact on learning, teaching, and creative inquiry in education in 2017. November 2022 brought the long-anticipated arrival of the ChatGPT beta, with accompanying media attention.

What is OpenAI and ChatGPT?

OpenAI is an artificial intelligence research lab. OpenAI developed a chatbot called ChatGPT (GPT = Generative Pre-trained Transformer) and an image generation application, DALL-E 2. ChatGPT is a natural language processing model trained on hundreds of billions of words drawn from documents and websites. Users can interact by asking it questions or submitting statements to which it generates responses. For example, here is ChatGPT answering a question about itself:

Prompt: What is ChatGPT?
“ChatGPT is a large language model developed by OpenAI. It is trained on a diverse range of internet text and is able to generate human-like text in response to various prompts. The model can be fine-tuned for various natural language processing tasks such as language translation, question answering, and conversation.”
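Programmatic interaction follows the same question-and-response pattern as the example above. The helper below only builds the role-tagged message list that chat-style models expect; the actual network call (left commented out) requires the `openai` package and an API key, and its exact interface varies by package version.

```python
# Build the role-tagged message list that chat-style models expect.
# Only the payload is constructed here; no network call is made.
def build_chat_payload(question, system_role="You are a helpful assistant."):
    """Return a messages list for a chat completion request."""
    return [
        {"role": "system", "content": system_role},
        {"role": "user", "content": question},
    ]

messages = build_chat_payload("What is ChatGPT?")
print(messages)

# With the `openai` package installed and an API key configured, sending
# the payload would look roughly like this (interface varies by version):
# from openai import OpenAI
# reply = OpenAI().chat.completions.create(
#     model="gpt-3.5-turbo", messages=messages
# )
```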

While ChatGPT received most of the media attention in winter 2022–23, other chatbots exist, such as Jasper and Chinchilla.

What are the main concerns?

The main concern for instructors is students asking OpenAI applications to complete assignments for them. This includes writing essays or research papers along with coding assignments for which ChatGPT is trained. Students can also ask ChatGPT to answer test questions.

Things to Consider

While the capabilities of artificial intelligence applications will continue to evolve, there are currently some limitations. For example, current models do not include articles behind paywalls (e.g., subscription journals). This makes it harder for students to generate essays based on peer-reviewed research.  While the models are trained on a large number of documents, the applications’ responses to specific, focused inquiries tend to be vague.  My colleagues and I asked ChatGPT to write a strategic plan for the Center for Teaching Excellence and Innovation. It suggested relevant ideas, but it was generic and too broad to be useful. That said, we could have used it as a starting point for brainstorming a draft.

Some applications, like Turnitin, are claiming they can detect if students used ChatGPT, but like any technology, these applications are not perfect and students can work around them (e.g., editing the essay produced to make it closer to their own writing style).

Academic Integrity

Use of OpenAI applications can fall under academic integrity policies like plagiarism, but the gray zone between clearly plagiarized work and an academic support tool is large. For example, most instructors would consider it plagiarism for students to ask ChatGPT to write a paper based on a writing prompt from class. But is it acceptable for students to ask ChatGPT for a summary of research on a topic, which they then use to generate a bibliography as the basis for a research paper they write themselves? Instructors should learn how ChatGPT and other AI technologies work so they can tell students what is considered appropriate use of AI technologies and what is not. Here are additional strategies to consider to help you and your students navigate this new territory:

  • Scaffold the activity by asking students to turn in an outline and iterative drafts that address comments and feedback from the instructor or teaching assistants. This requires students to show progression in a way that is difficult for tools like ChatGPT to produce.
  • Ask students to write papers in a shared Microsoft Word document through OneDrive so you can see the version history.
  • Use writing prompts that are more specific or require students to cite specific texts.
  • Use AI tools to teach students. For example, generate essays in ChatGPT and have students critique them.
  • Discuss with students what is considered acceptable use of AI technologies (e.g., generating a summary of a field) and what is not (e.g., responding to a specific assignment prompt).

A colleague also commented that as we engage with ChatGPT and other AI technologies, we are feeding them data they can use to improve their models. Under the terms of agreement accepted when an account is created, OpenAI owns these submissions. Explain to students that they may be signing over their intellectual property when they use these tools. Likewise, if they submit your tests for ChatGPT to answer, they may be violating your intellectual property rights.

Where to Learn More

Here are some resources to learn more about AI technologies:

We are all orienting ourselves to this new technology and best practices are evolving. The CTEI will continue to share more information and host discussions over the semester.

Mike Reese
Mike Reese is Associate Dean of the Center for Teaching Excellence and Innovation and associate teaching professor in Sociology.

Image Source: OpenAI Logo, Pixabay

Teaching Cinema with Omeka

Since the death of the DVD player, several challenges have emerged for media-based courses: How can we give students access to a wide range of audiovisual, image, and text sources located on multiple different online platforms? What is the most efficient way for the instructor to access these materials in class spontaneously, and for students to be able to work with the materials on their own? Can we do this in a way that allows for critical engagement and sparks new associations? Can we make that engagement interactive? To address these challenges, graduate fellow Hale Sirin and I discovered Omeka, an open-source exhibition software tool developed at George Mason University. We found the Omeka platform optimal for creating media-rich digital collections and exhibitions.

[Image: Omeka website home page for Comparative Cinema]

In Fall 2019, funded by a Center for Teaching Excellence and Innovation (CTEI) Technology Fellowship Grant, we created and customized an instance of Omeka with the specific goal of designing a web-based environment to teach comparative cinema courses. We implemented the Omeka site in Spring 2020 for the course “Cinema of the 1930s: Communist and Capitalist Fantasies,” further supported by a CTEI Teaching Innovation Grant. This course compares films of the era in a variety of genres (musical, epic, Western, drama) from different countries, examining the intersections between politics and aesthetics as well as the lasting implications of the films themselves in light of theoretical works on film as a medium, ethics and gender. We adapted the online publishing software package into an interactive media platform on which the students could watch the assigned films, post comments with timestamps, and help expand the platform by sharing their own video essays. We built this platform with sustainability in mind, choosing open-source software with no recurring costs so that it could be used over the years and serve as a model for future interdisciplinary and comparative film and media courses.

When building this website, our first task was to organize the digital archive of film clips and film stills for the course. These materials were then uploaded to Panopto, the online streaming service used by JHU, and embedded in the Omeka site. [Image: screenshot of an embedded film hosted in Kanopy] We also embedded the films that were publicly available on YouTube, Kanopy, and other archives, such as the online film archive of the production studio Mosfil’m, designing the Omeka site to serve as a single platform to stream this content. Each film, clip, text, or image was tagged with multiple identifiers to allow students to navigate the many resources for the course via search and sort functions, tags and hyperlinks, creating an interactive and rich learning environment. We added further functionality to the website by customizing interactive plugins, such as the “Comments” function, which allowed us to create a thread for each film in which students could respond to the specific prompts for the week and timestamp the specific parts of the film to which their comments referred.
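The tagging scheme described above can be pictured with a small sketch. This is illustrative Python, not Omeka's actual code, and the titles and tags below are hypothetical examples chosen to match the course's comparative 1930s focus: each item carries several identifiers, and a search simply filters on any combination of them.

```python
# Illustrative sketch of tag-based navigation (not Omeka code): each item in
# the collection carries multiple identifiers, and a search filters on any
# combination of tag and field values.

films = [
    {"title": "Circus", "director": "Aleksandrov", "year": 1936,
     "country": "USSR", "tags": ["musical", "propaganda"]},
    {"title": "Gold Diggers of 1933", "director": "LeRoy", "year": 1933,
     "country": "USA", "tags": ["musical", "Depression"]},
]

def search(items, tag=None, **fields):
    """Return items matching a tag and/or exact field values (e.g. country)."""
    results = []
    for item in items:
        if tag is not None and tag not in item["tags"]:
            continue  # tag filter failed
        if all(item.get(k) == v for k, v in fields.items()):
            results.append(item)
    return results

musicals = search(films, tag="musical")                 # both films
soviet = search(films, tag="musical", country="USSR")   # only "Circus"
```

Because every item carries several identifiers, the same collection can be re-sorted by director, year, country, or theme rather than being fixed to the week-by-week syllabus.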

In order to abide by copyright laws, only films in the public domain were streamed in their entirety. For other films, we provided selected short clips on Omeka, which we were able to easily access during class. Students were able to access the films available on Kanopy through our website by entering their JHU credentials.

Teaching comparative cinema with the interactive website powered by Omeka provided the students with a novel way of accessing comparative research in film studies. The website served as a single platform, interconnecting the digital material (video, image and text) and creating an interactive and rich learning environment to enhance student learning both in and outside of class time. Rather than the materials being fixed to the syllabus week to week, students could search film clips by director, year, country, or theme. Students were thus able to compare and contrast many images and films from across cultural divides on a unified online platform.

Students were not only able to access the course materials on the Omeka site, but also to expand and restructure the content. [Image: screenshot of The Scarlet Empress film clip in Omeka site] Over the course of the semester, students contributed to the annotation of film clips by uploading their comments to the films and timestamping important sequences. Since they were also required to draw their presentations from material in the exhibition, their engagement on the site was quantifiable on an ongoing basis. As their final projects, they had the option of creating a video essay, which involved editing together clips from the films and recording an interpretive essay over them, like a commentary track. Their video essays were shared with their peers on the Omeka site.

After switching to online learning in Spring 2020 due to COVID-19, the Omeka site not only performed its original task but also proved flexible enough to let us build an asynchronous, alternative educational environment, hosting not only the course materials and discussion forums but also the weekly recorded lectures, recordings of our Zoom discussion sessions, and students’ final video essays.

We thank the Center for Teaching Excellence and Innovation (previously known as the Center for Educational Resources) and the Sheridan Libraries for their support and continual guidance during this project.

Additional Resources:

https://omeka.org/

https://blogs.library.jhu.edu/2016/08/omeka-for-instruction/

Authors’ Backgrounds:

Anne Eakin Moss was an Assistant Professor in JHU’s Department of Comparative Thought and Literature, a board member of the program in Women, Gender, and Sexuality and of the Center for Advanced Media Studies. She was the 2017 recipient of the KSAS Excellence in Graduate Teaching/Mentorship Award and a Mellon Arts Innovation Grant, and a 2019 KSAS Discovery Award winner. Since the fall of 2021, she has been at the University of Chicago where she is an Assistant Professor in the Department of Slavic Languages & Literatures.

Hale Sirin is a Ph.D. Candidate in the Department of Comparative Thought and Literature. A recipient of the Dean’s Teaching Fellowship and the Women, Gender, and Sexuality teaching fellowship, she has taught courses in comparative literature, philosophy, and intellectual history. Her research interests include early 20th-century philosophy and literature, theories of representation and media in modernity, and digital humanities.

Image source: Hale Sirin

Transitioning to Canvas: an Update on the University’s LMS Migration Process

On Tuesday, December 7, 2021, the Center for Educational Resources (CER) hosted a virtual Lunch and Learn that reviewed the university’s upcoming migration to Canvas, the new learning management system (LMS) that will replace Blackboard. Brian Cole, Associate Director for the CER and leader of the LMS migration committee, provided an update and summary of the process which was followed by faculty Q and A.

Cole began with a brief explanation of why the university is migrating to Canvas: the university’s contract with Blackboard will end at the end of this fiscal year so we must choose a new system. In addition, our current version of Blackboard is outdated and increasingly unreliable, especially when using the more complex tools such as tests. Last spring, a university-wide LMS evaluation process was held and the majority of stakeholders (faculty, staff, and students) selected Canvas as the future LMS.

Key dates for faculty to keep in mind:

  • Spring 2022 – The university will run a small Canvas pilot, with approximately 50 courses from across all JHU divisions.
  • Summer 2022 – ASEN summer courses will be ‘opt-in:’ ASEN instructors can choose between staying in Blackboard or using Canvas for Summer 2022 courses. (If you are teaching a summer course that is not in ASEN, please check with your division’s teaching and learning center.)
  • Fall 2022 – No courses offered in Blackboard, everything in Canvas.
  • December 1, 2022 – Blackboard access turned off.

Faculty will have a choice between building a new course from scratch in Canvas, or migrating existing course content over from Blackboard. Cole highly encouraged faculty to build from scratch if the course is mostly content-based and does not use many complicated tools; this will give faculty an opportunity to learn their way around Canvas.  If a course uses more complicated features, such as tests, faculty might want to consider a migration, either on their own, or with CER assistance.

The CER will provide multiple opportunities for training and help throughout the migration process. Training for Homewood faculty will begin in late spring of 2022. There will also be live and on-demand trainings led by Canvas professionals available to JHU faculty.

Cole concluded with a brief demo of Canvas which led to a Q and A session:

Q: How much time will it take to migrate a course?
A: This will depend on how complex your course is and how many tools you currently use.

Q: Can faculty use the free Canvas site?
A: Yes, if faculty want to experiment and get to know Canvas, they can create a free account at https://canvas.instructure.com/register, but anything posted there becomes the intellectual property of Canvas. We will not support migrating content created in a free account to your JHU Canvas account. Also, not all features available in the free environment will be available in the JHU environment.

Q: What about third party tools (VoiceThread, Panopto, Turnitin, etc.)?
A: Most third party tools will be available in Canvas. The appearance of a tool may differ from the way it appears in Blackboard, but the functionality should be similar.

Q: What about section merges? Will we still be able to do that?
A: Yes. Sections will work differently in Canvas – child courses will be more integrated with parent courses. The process of creating merged sections will change, but it should work more efficiently.

Q: Can you copy directly into the content editor?
A: Yes, you can copy directly into the content editor and it should work better than when you copy into Blackboard’s content editor.

Q: Does Canvas have a good discussion board that can replace Piazza?
A: There are two versions of discussion boards in Canvas. The older one is very similar to Blackboard’s. The newer one is updated and closer to the way Piazza works: a TA or instructor can mark an answer as approved, and there is a ‘mentions’ feature like Piazza’s, but it does not have everything Piazza offers.

Q: How will courses that are not migrated from Blackboard be archived?
A: Blackboard archive files are .zip files that are only readable by the Blackboard system. We will advise exporting certain pieces, such as the gradebook/grades, but we have to be careful with grades for FERPA reasons. IT@JH will archive as much as it can, but the archives will be kept in an offline state, for accreditation purposes only. The archiving process is not entirely worked out yet.

Q: How do you give feedback to students in Canvas?
A: Assignments are graded using a tool called SpeedGrader, which is similar to the way Blackboard assignments are graded. There are also analytical tools that give you an idea of how your students are doing in the course and may help identify which students you need to contact. It is very easy to message students from different areas of a course.

Q: Will there be integration with Teams, Sharepoint, and OneDrive?
A: It is anticipated that Teams and OneDrive integration will be ready for the Fall 2022 semester.

Q: Will faculty have early access to their courses?
A: Faculty may begin working on a migrated or developmental version of their course as early as April. Official Summer 2022 and Fall 2022 courses will be available in Canvas in roughly the same timeframes as they were in Blackboard.

Q: What about archiving in Canvas?
A: It is very easy to move or reference material stored in a previous Canvas course. Course copy works much better in Canvas than in Blackboard – it is much more granular. There is also the ability to create specific modules that can be developed and shared among departments.

For additional information about the transition to Canvas, faculty can access: http://canvas.jhu.edu. There is also a mailing list faculty can join to stay informed: https://jh.qualtrics.com/jfe/form/SV_5bWaTLyFV5WJTg2

Amy Brusini, Senior Instructional Designer
Center for Educational Resources

Image Source: Lunch and Learn Logo, Canvas Logo

Surviving to Thriving: Reflections from Teaching Online

On Friday, December 11, 2020, the Office of the Provost, in conjunction with teaching and learning centers across Johns Hopkins University, sponsored a half-day virtual conference titled “Surviving to Thriving: Reflections from Teaching Online.” Faculty, graduate students, and staff came together to share ideas, challenges faced, and best practices when teaching in a virtual environment. Highlights from a few of the sessions are below:

Engaging Students in the Virtual Environment: A panel of faculty representing the School of Education, the Bloomberg School of Public Health, and the Whiting School of Engineering shared strategies for how building community, holding synchronous sessions, and using specific tools can help engage students online.

Building community in an online course helps students feel more connected with each other and can lead to more productive learning. Strategies for community building shared by faculty include:

  • Using ice breaker questions at the beginning of every class
  • Learning student names
  • Setting up sessions to get to know one another (in breakout rooms, online discussion boards, Google Sheets)
  • Offering frequent feedback in multiple formats (email, audio recordings)
  • ‘Humanizing’ situations by admitting mistakes and struggles with technology

Synchronous sessions allow students to experience learning with their peers in a real-time, interactive environment. For students nervous about speaking up during live sessions, faculty members suggested giving them the choice to use either the chat feature or the microphone. One faculty member required his students, in a seminar-style class, to ask at least one question during the semester, which helped to build confidence. Another faculty member described posting a ‘speaker list’ ahead of each class so students knew in what order they would be speaking. Not only were students more prepared, but the sessions ran more efficiently. Faculty discussed the challenges of holding synchronous sessions with students in multiple time zones. While most faculty recorded their lectures and made them available online, some offered to give their lecture twice and allowed students to choose which session to attend. Technology can be another challenge when holding synchronous sessions. Faculty suggested offering material in multiple formats to meet the needs of students who may be having technological difficulties, for example, posting slides in addition to video.

There are a variety of online tools that can be used to engage students online. Faculty in this session discussed the following:

  • Padlet – tool that allows students to collaborate synchronously or asynchronously
  • Loom – video recording application that allows students to create and send recordings
  • Flipgrid – video discussion board tool
  • Slack – communication platform used for information sharing, individual and group communication, synchronous and asynchronous collaboration (similar to Teams)

(Note: Instructors are encouraged to contact their school’s teaching and learning center before deciding to use third party tools that may or may not be supported by their institution.)

Jazzing Up Online Presentations: A panel of faculty representing the School of Education, the School of Medicine, and the Krieger School of Arts and Sciences shared strategies for how to strengthen online presentations and keep students engaged. One idea is to maintain a balance between synchronous and asynchronous sessions to help avoid Zoom fatigue. Another idea is to keep sessions interactive by using breakout rooms, videos, knowledge checks, and other active learning techniques. Modeling online navigation for students is another way to help make sure they are staying engaged. For example, if referring to a particular discussion board post, share your screen with students and navigate to the post, read it with students, and discuss together. Consider using a different tool, other than PowerPoint, for your presentation, such as Genially or Sway. Lastly, don’t be afraid to put your personality into your presentation – pets and silly bow ties can make a difference to students!

Student Perspectives: A moderated panel of four students from across the institution shared their experiences with virtual learning this past semester. When asked what had been the most challenging aspect, several students mentioned Zoom fatigue. With the many hours of online lectures they were expected to attend, students reported that it was often difficult to stay engaged. Zoom fatigue also made it a challenge to participate in outside events or clubs, since those were virtual as well. Other challenges included difficulty connecting and networking with people online and coordinating with classmates living in different time zones.

When asked what techniques they wished more faculty were using, one student mentioned following best practices of online learning to ensure consistency across courses. Another student appreciated how one of her professors asked how students were doing at the beginning of each class. This was a small class, and the student acknowledged that it probably wouldn’t be possible in a larger setting, but that it helped to build community among class members. Another consideration mentioned is to build in breaks during the lecture. Other techniques suggested were the use of collaborative assignments, to help students keep each other accountable, and breakout rooms (in Zoom) with a structured task or purpose in mind.

Students were asked about the positive aspects of virtual learning. One student from the School of Advanced International Studies (SAIS) explained how students from all three SAIS campuses were able to participate in all classes together, and how beneficial it was for students to experience the others’ perspectives. Another student found it extremely helpful that faculty would record their lectures and post them to Blackboard for unlimited viewing.  Students also greatly appreciated the creativity and support of their instructors and teaching assistants.

Other topics presented at the conference include: student wellness, assessment, accessibility, and more. The Provost’s office has made the full day of recorded sessions available here with JHED authentication.

Amy Brusini, Senior Instructional Designer
Center for Educational Resources

Image Source: Pixabay

Teaching Online: What Have We Learned?

On Friday, October 9, the Center for Educational Resources (CER) hosted an online session, “Teaching Online: What Have We Learned?” where faculty were able to share and discuss best practices based on their experiences teaching online.  Mike Reese, director of the CER and faculty member in Sociology, and Allon Brann, teaching support specialist at the CER, moderated the discussion, structuring it with a few guiding questions as outlined below:

What is something you are doing differently this semester online that you will continue to do when you are back in the classroom?

Jamie Young from Chemistry described how he is using tools such as MS Teams and Slack to build classroom community. He anticipated students feeling isolated in an online environment, so he set up spaces for casual conversation for them to communicate and get to know one another. He said it has definitely encouraged and increased conversation among students. When asked how he motivates students to participate in these environments, Young responded that he made their participation a very small part of their grade. Young and his TAs make it a point to respond right away when students post to these spaces so that students know this is an active space and that they are being heard. This level of responsiveness has also helped boost participation. Back in the classroom, Young plans to continue using these tools for informal office hours.

Rachel Sangree from Civil and Systems Engineering shared that she has been holding evening office hours and what a difference it has made in the number of students who attend.  Acknowledging that it is sometimes exhausting, she stated that she sees more students now than ever before. When we’re back in the classroom, Sangree plans to continue to hold evening office hours, but perhaps not quite as late as she offers currently.

Alison Papadakis from Psychological and Brain Sciences described how she has adapted the “think pair share” active learning strategy to an online environment. Students are split into groups and assigned breakout rooms in Zoom, then use Google Sheets to record their ideas and notes while they’re in the rooms. This allows Papadakis to monitor the progress of students without having to manually drop in to each of the rooms. She is also able to add her own comments directly to the sheets in real time as students work on them. It was noted that separate tabs are created for each group in Google Sheets, so each group has its own space to work.  Initially Papadakis was concerned that students would feel like she was ‘spying’ on them and wouldn’t like this method, but so far the feedback has been positive.  Jamie Young shared a tutorial he put together on this topic: https://docs.google.com/document/d/1uvRB38GHIKNaxQL-dN-9vpWgC43Yslssyz_jh2uPtno/edit

Francois Furstenberg from History shared how he is using the e-reader platform Perusall which allows students to collaboratively annotate their online readings. The annotations inform Furstenberg what parts of the readings students are finding interesting and are helping to shape the in-class discussions. He plans to continue using this platform when in-person classes resume. It was noted that instructors need to have copyright permission before uploading reading material to Perusall. If they have questions about obtaining copyright, they are encouraged to contact their university librarian.

David Kraemer from Mechanical Engineering mentioned that he mailed USB oscilloscope boards and a kit of devices to each of his students so they could perform “hands-on” experiments at home. He recognizes the value of these kits whether or not students are learning online, and plans to keep using them when in person classes resume.

Joshua Reiter from the Center for Leadership Education described how he adjusted his approach to assessments by breaking up large exams into more frequent quizzes throughout the semester.  Some of these quizzes were meant to be ‘fun’ quizzes for participation points, but he noticed that students were feeling pressured when taking them, defeating their purpose. Reiter changed them from individual to group quizzes, using the breakout room feature in Zoom.  Since then, he’s noticed a significant reduction in stress among the students.

 

What is something that you are still struggling with?

Several faculty members mentioned that their workload is significantly higher this semester as they do their best to recreate their lessons online.  Some feel like they are putting in three times the normal amount of preparation time and as a result, are experiencing technological overload, having to learn and keep up with so many tools. Many faculty mentioned that their students are feeling this way, too.  It was suggested that adding technology should be done in a purposeful way and that faculty should not feel compelled to use all of the available tools. Some faculty acknowledged that although it has been very challenging, they have learned a great deal about tools that are out there, and which ones seem to work better for students.

Similarly, a faculty member mentioned how difficult it is to multitask within the online environment; for example, keeping up with the chat window in Zoom while lecturing synchronously. Many faculty agreed, commenting that they feel pressured to keep up with everything going on and that it often feels like a performance. Someone commented that sharing concerns with the students helps to humanize the situation, while setting realistic expectations helps to take pressure off of instructors. One instructor mentioned how he purposely builds pauses into his lectures to allow himself time to catch up. Others mentioned that they use their TAs to monitor the chat window; if the instructor does not have a TA, he or she could ask a student.  Another general suggestion is to ask students what works best for them, instead of trying to monitor everything.

Academic integrity was another issue that came up. At least one instructor acknowledged multiple instances of students cheating since moving online, while others shared that they are concerned it may happen to them. A brief discussion followed, with instructors sharing strategies they use to try to mitigate cheating: using online monitoring tools, lowering the weight of exams, making all exams open-book/open-note, and placing more emphasis on project-based work.

Some technical questions also came up, such as how to recreate an interactive whiteboard. The responses ranged from configuring PowerPoint in a certain way to using multiple devices. Jamie Young shared a tutorial he put together that uses Open Broadcaster Software (OBS): https://docs.google.com/document/d/1JXptPGjnAOiqbpvrXJPGWDcbqE_l95C6Cm0moYpaelk/edit?usp=sharing Faculty are welcome to contact the CER for help with this and other specific technological challenges.

Anything you would like to share with others?

A few faculty members shared how they are taking advantage of the online environment. Andrew Cherlin from Sociology mentioned how much easier it is to schedule outside guests, such as authors, since there are no travel arrangements, logistics, etc. to be worked out. He has had several guests this semester already. Cherlin also described how he has taken advantage of Zoom to meet with each student individually for about fifteen minutes to check in with them and make sure they are on track. He acknowledged that this is not practical for large courses, but it has been very beneficial to those in his seminar style course.

Lori Finkelstein from Museum Studies described how being online has reshaped her assignments. She usually has students go out into the field to different museums to conduct research. This semester, students are taking a look at what museums are offering virtually and whether or not they are successful, what seems to be working, what is not working, etc.

Lester Spence from Political Science is teaching a course with instructors from Goucher College and Towson University. Students from all three schools are collaborating together as they work on group projects, something that would not necessarily be possible in a face-to-face environment.

Amy Brusini, Senior Instructional Designer
Center for Educational Resources

Image Source: Pixabay