Lunch and Learn: Generative AI – Teaching Uses, Learning Curves, and Classroom Guidelines

On Tuesday, October 3rd, the Center for Teaching Excellence and Innovation (CTEI) hosted its first Lunch and Learn of the academic year, a panel discussion titled “Generative AI: Teaching Uses, Learning Curves, and Classroom Guidelines.” The three panelists were Jun Fang, Assistant Director of the Instructional Design and Technology Team in the Carey Business School; Carly Schnitzler, KSAS instructor in the University Writing Program; and Sean Tackett, Associate Professor in the School of Medicine. The discussion was moderated by Caroline Egan, project manager in the CTEI. Mike Reese, director of the CTEI, also helped facilitate the event. 

The panelists began by introducing themselves and describing their experiences with generative AI. Jun Fang loves new technology and has been experimenting with generative AI since the technology first emerged. He has noticed that the faculty he works with generally fall into two categories when it comes to using AI: some are quite concerned about students using it to cheat and are not ready to use it, while others see a great deal of potential and are very excited to use it in the classroom. In speaking with colleagues from across the institution, Fang quickly realized that these sentiments are common among faculty in all JHU divisions. This motivated him to lead an effort to create a set of AI guidelines specifically geared toward faculty. The document contains a number of strategies for using AI, including designing engaging course activities, providing feedback on student assignments, and redesigning course assessments. The section on redesigning course assessments describes two approaches: the “avoidance approach,” which involves deliberately designing assessments that do not use AI, and the “activation approach,” which intentionally integrates AI tools into the curriculum. The document includes specific examples of many of these strategies as well as links to widely used generative AI tools. 

Fang described a recent scenario in which a faculty member was concerned that students were using ChatGPT to generate answers to online discussion board questions. To address the concern, Fang suggested the faculty member revise the questions so that they were tied to a specific reading or to a topic generated in one of his synchronous online class sessions. Another suggestion was to have students submit two answers for each question, one original and one generated by ChatGPT, and then have the students compare the two. The faculty member was not comfortable with either suggestion and ultimately made the discussion a synchronous activity rather than an asynchronous one. Fang acknowledged that everyone has a different comfort level with AI and that one approach is not necessarily better than another.

Carly Schnitzler currently teaches two introductory writing courses to undergraduates and is very open to using generative AI in her classroom. At the start of the semester, she asked students to fill out an intake survey that included questions about previous writing experiences and any technologies used, including generative AI. She found that students were reluctant to admit they had used these technologies, such as ChatGPT, for anything other than ‘novelty’ purposes because they associated the tools with cheating. After seeing the survey results, Schnitzler thought it would be beneficial for students to explore the potential use of generative AI in class. She assigned students to create standards of conduct for a first-year writing class, which involved discussing their expectations of the course, the instructor, and their peers, and how AI would fit in among those expectations. The class came up with three standards: 

  1. AI tools should support (and not distract from) the goals of the class, such as critical thinking, analytical skills, developing a personal voice, etc.  
  2. AI tools can be used for certain parts of the writing process, such as brainstorming, revising, or editing, but students must disclose that AI tools were used. 
  3. If there appears to be an over-use of or over-reliance on AI tools, the issue will be addressed through a conversation rather than disciplinary action. (Schnitzler wants students to feel safe exploring the tools without fear of repercussion.) 

This assignment comes from an open collection of cross-disciplinary assignments that use text generation technologies, mostly in a writing context. TextGenEd: Teaching with Text Generation Technologies, co-edited by Schnitzler, consists of freely accessible assignments submitted by scholars from across the nation. Assignments are divided into categories such as AI literacy, rhetorical engagements, professional writing, creative explorations, and ethical considerations. Most are designed so that students and instructors explore the technologies together, requiring very little specialized technical skill. Schnitzler noted that there is a call for new submissions twice each year and encouraged instructors to consider submitting their own assignments that use text generation AI.

Sean Tackett was initially fearful of ChatGPT when it was released last year. Reading article after article claiming that generative AI was going to “take over” pushed him to learn as much as he could about the new technology. He began experimenting with it and initially did not find it easy to use or even necessarily useful in his work with medical school faculty. However, he and some colleagues recognized potential in these tools and applied for and received a JHU DELTA grant to find ways to apply generative AI to faculty development in the medical school. Tackett described how they are experimenting with generative AI in a curriculum development course he teaches to medical school faculty. For example, one of the tasks is for faculty to learn to write learning objectives, so his team has been developing prompts that can be used to critique learning objectives. Another example is developing prompts to critique writing. Most of Tackett’s students are medical professionals who do not have a lot of time to learn new technologies, so his team is continually refining these prompts to make them as useful and efficient as possible. Tackett noted that despite their busy schedules, the faculty are generally enthusiastic about having the opportunity to use these tools.

The discussion continued with a question and answer session with audience members: 

Q: How do we transfer and integrate this knowledge to teaching assistants who help manage larger classes? What about grading?
ST: I would advocate for the potential of AI to replace a TA in terms of grading, but not in terms of a TA having a meaningful dialogue with a student. 
JF: Generative AI tools can be used to provide valuable feedback on assessments. There are a lot of tools out there to help make grading easier for your TAs, but AI can be used for the feedback piece. 

Q: How might professors provide guidelines to students to use generative AI to help them study better for difficult and complex topics?
MR: One possibility is to have AI generate quiz questions and then have students follow up by checking the accuracy of the generated questions.
CS: Using ChatGPT or another text generation tool as a reading comprehension aid has been useful for non-native English speakers. For example, pasting a paragraph from an academic article into ChatGPT and asking what it means in plain language can be helpful.
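To make these suggestions concrete, here is a minimal sketch of how such study prompts might be drafted. It is purely illustrative: the wording, topics, and function names are assumptions made for this example, not prompts used by the panelists, and the resulting text could be pasted into ChatGPT or any similar tool.

```python
# Hypothetical prompt builders for the two study aids mentioned above:
# AI-drafted practice questions that students then verify, and a
# plain-language restatement of a difficult passage.

def quiz_prompt(topic: str, n_questions: int = 5) -> str:
    """Build a prompt asking an AI tool to draft practice questions on a topic."""
    return (
        f"Write {n_questions} multiple-choice practice questions about {topic}, "
        "and mark the answer you believe is correct for each. "
        "Some answers may be wrong; they will be checked against the course readings."
    )

def plain_language_prompt(passage: str) -> str:
    """Build a prompt asking an AI tool to restate a passage in plain language."""
    return (
        "Explain the following paragraph in plain language for a non-specialist, "
        "then list any technical terms it relies on:\n\n" + passage
    )

# Example usage: print the prompts, then paste them into a chat-based AI tool.
print(quiz_prompt("enzyme kinetics"))
print(plain_language_prompt("Allosteric regulation modulates enzyme activity by..."))
```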

CE: This gets to what I call ‘prompt literacy,’ which is designing better prompts to get better answers. There is a very good series about this on YouTube from the University of Pennsylvania. Sean, what have you experienced with prompting so far, in terms of challenges and opportunities?
ST: We’re trying to put together advice on how to better prompt the system to get more refined and accurate answers. After a few iterations of prompting, we refine the prompt and put it into a template for our faculty, leaving a few ‘blanks’ for them to fill in with their specific variables. The faculty are experts in their subject areas, so they can tell whether the output is accurate. We’re in the process of collecting their output to put together best practices about what works and what does not.  
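As an illustration of the fill-in-the-blank approach Tackett describes, the sketch below shows what such a template might look like. The field names and wording are assumptions made for this example, not the actual prompts his team developed.

```python
# A hypothetical fill-in-the-blank prompt template for critiquing learning
# objectives. Faculty would supply their own values for the blanks, and the
# resulting prompt could be pasted into a generative AI tool. The template
# text is an assumption, not the grant team's actual prompt.
from string import Template

CRITIQUE_TEMPLATE = Template(
    "You are reviewing learning objectives for a $course_level course on $subject. "
    "For each objective below, say whether it is specific, measurable, and written "
    "at an appropriate level of Bloom's taxonomy, and suggest a revision:\n$objectives"
)

prompt = CRITIQUE_TEMPLATE.substitute(
    course_level="graduate medical education",
    subject="curriculum development",
    objectives="1. Understand how to write learning objectives.",
)
print(prompt)
```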

CE: What would you all like to see in terms of guidelines and best practices for AI on a web page geared towards using AI in the classroom?
Guest: And along those lines, how do we move forward with assigning research projects, knowing that these tools are available to students?
ST: I think these tools could be useful for helping students learn research skills. They could use the tools to research something, then critique the results and explain how they verified them. The tools can also be useful for generating ideas and brainstorming. Another thought is that there are a number of domain-specific generative AI databases, such as Open Evidence, which is useful in the medical field.  
CS: To Sean’s point, I think a comparative approach is useful with these tools. The tools are very good at pattern matching genre conventions, so doing comparative work within a genre could be useful.
JF: I think ChatGPT and other generative AI tools can be useful for different parts of the research process, such as brainstorming, structure, and editing. But not for something like providing or validating evidence.  

Q: As a grad student, I’m wondering how the presence of AI might force us to refine the types of questions and evaluations that we give our students. Are there ways to engineer our own questions so that they avoid the problem [of having to continually refine and update the question] in the first place?
CS: There is an assignment in our collection that talks about bringing an assignment from past to present. Again, thinking in terms of a comparative approach, ask ChatGPT the question, and then ask your students the same question and see how they compare, if there are any patterns.  I think it can be helpful to think of ChatGPT as adding another voice to the room.
JF: We have a section in the guidelines on how to redesign assessments to cope with generative AI-related issues. We suggest two approaches: the avoidance approach and the activation approach. The avoidance approach is for faculty who are not yet comfortable using this technology and want to avoid having students use it. One example of this approach is for faculty to rework their assignments to focus on a higher level of learning, such as creativity or analysis, which will hopefully reduce or eliminate the opportunity for students to use AI tools. The activation approach encourages faculty to proactively integrate AI tools into the assessment process. One example I mentioned earlier was suggesting that a faculty member rework their discussion board questions so that students submit two versions of each answer, one created by them and the other by ChatGPT, and then compare and analyze the results. 

Q: What is the ultimate goal of education? We may have different goals for different schools. Also, AI may help bridge gaps between people from different social backgrounds. In China, where I grew up, the ability to read or write strongly depends on the social status of the family you come from. So there is some discomfort using it in the classroom.
CS: I feel some discomfort also, and that’s what led to the development of the guidelines in my classroom. I posed a similar question to my students: if we have these tools that can allegedly write for us, what is the point of taking a writing class?  They responded by saying things like, “writing helps to develop critical thinking and analytical skills,” to which I added, “being here is an investment in yourself as a student, a scholar, and a thinker.” I think asking students to articulate the value of the education that they want to get is really helpful in determining guidelines for AI.
ST: Going to school and getting an education is an investment of your time. You pay now so you can be paid later. But it’s not as transactional as that. AI is already in the work environment and will become more prevalent. If we’re not preparing students to succeed in the work environment, we are doing them a disservice. We teach students to apply generative AI in their classes so they are prepared to use it in the workforce.
JF: In the business school, everything is market driven. I think education can fit into that framework as well. We’re trying to give graduates the confidence they need to do the work and meet the market’s needs. Generative AI tools have really changed the world, and they’re starting to emerge in every part of our lives. We need to help students realize that ChatGPT might be part of their education, part of their lives, and part of their work in the future as well. There are things AI can help us do, but there are still fundamentals that students need to learn. One example is calculators: we still need to learn from the beginning that 1 + 1 = 2. 
CE: This question also reminded me of asking your students, what is the ultimate purpose of a research paper? Where do they think ChatGPT should fit into the research process?  

Q: I work at the library and we’re getting lots of questions about how to detect if students are using AI. And also, how do you determine if students are relying too heavily on AI?
JF: We also get this question from our faculty. The most widely used detection tool right now is Turnitin, which is embedded in Canvas, but its accuracy is not reliable. We encourage faculty to always validate the results before accepting them. For faculty who are actively using AI in the classroom, we also encourage providing students with clear guidance and expectations on how they are allowed to use it. This may make it a little easier to determine whether they are using it appropriately.
MR: There are some other tools out there, such as GPTZero and ZeroGPT, but to Jun’s point, the difficulty is that this is different from plagiarism detection, which says this passage is copied and here is the source. These tools say there is a probability that part of the text was AI-generated, but you can’t point to a direct source. It’s up to instructors whether or not to use these tools, but consider using them to facilitate a conversation with students. In my own classes, if I suspect academic misconduct, I usually start by asking the student to explain what is happening before I make any accusations. With these tools there tends to be no hard evidence, just probabilities that something may have happened. This is definitely an area we’re all still learning about.
Guest: I was just thinking that having a conversation with students about why they are turning to the tool in the first place might prevent misconduct.  Instead of sending them to an academic misconduct committee, we could have these conversations, like Carly mentioned. Making students aware of the limitations of the tool could also be helpful.
CS: Yes, our guidelines state that I’m prioritizing conferences with students over immediate disciplinary action. I try to pre-empt any anxiety students might feel around using these tools. Designing your assignments in a way that reduces anxiety also helps. For example, I tend to design assignments that build on one another in smaller pieces throughout the semester, rather than one giant chunk all at once.  

Q: Is there any discussion around combining AI with teaching, such as generating personalized explanations of a topic? Students will have different levels of expertise and comfort with different topics.
ST: We’re trying to do this, to create a teaching aid for the future. We’re planning to use it to create assessment items.  

Amy Brusini, Senior Instructional Designer
Center for Teaching Excellence and Innovation
 

Image Source: Pixabay, Unsplash

 

Quick Tips: Alternative Assessments

Throughout the past year and a half, instructors have made significant changes to the way they design and deliver their courses. The sudden shift to being fully remote, then hybrid, and now back to face-to-face for some courses has required instructors to rethink not only the way they teach, but also the way they assess their students. Many who have previously found success with traditional tests and exams are now seeking alternative forms of assessment, some of which are described below:

Homework assignments: Adding more weight to homework assignments is one way to take the pressure off high-stakes exams while keeping students engaged with course material. Homework assignments will vary according to the subject, but they may include answering questions from a chapter in a textbook, writing a summary of a reading or topic discussed in class, participating in an online discussion board, writing a letter, solving a problem set, etc.

Research paper:  Students can apply their knowledge by writing a research paper. To help ensure a successful outcome, a research paper can be set up as a scaffolded assignment, where students turn in different elements of the paper, such as a proposal, an outline, first and second drafts, bibliography, etc. throughout the semester, and then the cumulative work at the end.

Individual or group presentations: Student presentations can be done live for the class or prerecorded ahead of time using multimedia software (e.g., Panopto, VoiceThread) that can be viewed asynchronously. Depending on the subject matter, presentations may consist of a summary of content, a persuasive argument, a demonstration, a case study, an oral report, etc. Students can present individually or in groups.

Reflective paper or journal: Reflective exercises allow students to analyze what they have learned and experienced and how these experiences relate to their learning goals. Students develop an awareness of how they best acquire knowledge and can apply these metacognitive skills to both academic and non-academic settings. Reflective exercises can be guided or unguided and may include journaling, self-assessment, creating a concept map, writing a reflective essay, etc.

Individual or group projects: Student projects may be short-term, designed in a few weeks, or long-term, designed over an entire semester or more. If the project is longer term, it may be a good idea to provide checkpoints for students to check in about their progress and make sure they are meeting deadlines. Ideas for student projects include: creating a podcast, blog, interactive website, interactive map, short film, digital simulation, how-to guide, poster, interview, infographic, etc. Depending on the circumstances, it may be possible for students to partner with a community-based organization as part of their project. Another idea is to consider allowing students to propose their own project ideas.

Online Tests and Exams: For instructors who have moved their tests online, it may be worth lowering the stakes of these assessments. Replace high-stakes midterms and finals with weekly quizzes that are weighted less heavily. More frequent assessments provide additional opportunities to give students feedback and help them reach their goals. To reduce the potential for cheating, include questions that are unique and require higher-level critical thinking. Another consideration is to allow at least some of the quizzes to be open-book.

It’s worth noting that offering students a variety of ways to demonstrate their knowledge aligns with the principles of universal design for learning (UDL). Going beyond traditional tests and exams helps to ensure that all learners have an opportunity to show what they have learned in a way that works best for them. If you’re looking for more ideas, here are a few sites containing additional alternative assessment strategies:

https://www.scholarlyteacher.com/post/alternatives-to-the-traditional-exam-as-measures-of-student-learning-outcomes

https://teaching.berkeley.edu/resources/course-design-guide/design-effective-assessments/alternatives-traditional-testing

https://cei.umn.edu/alternative-assessment-strategies

Amy Brusini, Senior Instructional Designer
Center for Educational Resources

Image Source: Pixabay

Strategies for an Inclusive Classroom

This summer, the Center for Educational Resources offered a multi-day Best Practices in University Teaching workshop for JHU faculty to learn about evidence-based teaching practices. Participants explored topics such as best practices in course design, active learning strategies, and various assessment techniques. One of the many sessions that generated a great deal of discussion was the Inclusive Pedagogy session, which addressed the importance of accommodating the needs of diverse learners in a supportive environment. The session was led by Dr. Karen Fleming, a professor in the Biophysics department who is also nationally recognized for her efforts to raise awareness of the biases and barriers faced by women in STEM. I played a small role in the presentation by providing a brief introduction and overview of Universal Design for Learning (UDL), a research-based educational framework that helps remove unnecessary barriers from the learning process.

During the session, participants were encouraged to examine their own biases by reflecting on an unconscious bias test they took just before the session. Many were clearly dismayed by their results; Fleming reassured them that we all have biases and that accepting this fact is the first step in addressing them. She then shared a real-world example of unconscious bias against women in STEM published in the Proceedings of the National Academy of Sciences. The shocking results of this study, which showed that even women faculty in STEM display a preferential bias toward male candidates over female candidates, sparked an engaging discussion. The dialogue continued as participants debriefed about a video they had watched, also before the session, featuring a teaching assistant (TA) stereotyping various students as he welcomed them to class. The video was intentionally exaggerated at times, and participants were eager to point out the “over the top” behavior exhibited by the TA. Participants were inspired to share personal experiences of bias, prejudice, and stereotyping that they have encountered in the classroom, either as students or as instructors.

Toward the end of the session, the focus shifted to thinking about strategies that would mitigate instances of biased behavior and instead encourage a more inclusive classroom environment. As a culminating exercise, we asked participants to consider the principles of UDL, as well as ideas and discussions from earlier in the session, to complete an “Inclusive Strategies Worksheet” containing concrete strategies that would make a measurable difference in terms of inclusivity in their classrooms. The participants were very thoughtful in their responses, and several of their ideas are worth sharing:

  • Administer a pre- or early-semester survey to get to know the students and build community.
  • Include a “campus climate” section in the syllabus with language expressing a commitment to respecting diverse opinions and being inclusive.
  • On the first day of class, have students create a “Community Agreement” to establish ground rules for class discussions, online discussions, and group activities. This can be revisited throughout the semester to adjust what is working/not working.
  • Acknowledge that there may be uncomfortable moments as we face mistakes and hold each other and ourselves accountable. Encourage students to “call in” when mistakes (intentional or not) occur, rather than “call out” or “cancel” so that we may learn from each other.
  • Work collaboratively with students to develop rubrics for assignments.
  • Include authors and guest speakers with varied cultures, backgrounds, and identities. Include images, readings, examples, and other course materials that are diversified. If opportunities are limited, have students do a reflective exercise on who/what is missing from the research.
  • Share content with students in multiple ways: research papers, videos, images, graphs, blog entries, etc.
  • Increase the number of active learning activities to enrich the learning experience.
  • Offer options to students: vary the types of assignments given and, when possible, allow students a choice of ways to demonstrate their knowledge.
  • Follow accessibility guidelines: ensure video/audio recordings have closed captioning and/or a transcript, for example.
  • Create opportunities for students to discuss their lived experiences in the classroom and/or on assignments.
  • Provide opportunities for students to participate anonymously without fear of judgement (e.g., using iClickers or Jamboard).
  • Conduct activities that engage students in small groups so they get to know one another. Encourage students to use these connections to identify study partners. Consider switching groups throughout the semester so students meet additional partners.

Do you have additional strategies to share? Please feel free to add them in the comments.

Amy Brusini, Senior Instructional Designer
Center for Educational Resources

Image Source: Best Practices in University Teaching Logo, Pixabay

Expanding Students’ Research Skills with a Virtual Museum Exhibit

Morgan Shahan received her PhD in History from Johns Hopkins University in 2020. While at Hopkins, she received Dean’s Teaching and Prize Fellowships. In 2019, her department recognized her work with the inaugural Toby Ditz Prize for Excellence in Graduate Student Teaching. Allon Brann from the Center for Educational Resources spoke to Morgan about an interesting project she designed for her fall 2019 course, “Caged America: Policing, Confinement, and Criminality in the ‘Land of the Free.’”

I’d like to start by asking you to give us a brief description of the final project.  What did your students do?

Students created virtual museum exhibits on topics of their choice related to the themes of our course, including the rise of mass incarceration, the repeated failure of corrections reform, changing conceptions of criminality, and the militarization of policing. Each exhibit included a written introduction and interpretive labels for 7-10 artifacts, which students assembled using the image annotation program Reveal.  On the last day of class, students presented these projects to their classmates. Examples of projects included: “Birthed Behind Bars: Policing Pregnancy and Motherhood in the 19th and 20th Centuries,” “Baseball in American Prisons,” and “Intentional Designs: The Evolution of Prison Architecture in America in the 19th and 20th Centuries.”

Can you describe how you used scaffolding to help students prepare for the final project?

I think you need to scaffold any semester-long project. My students completed several component tasks before turning in their final digital exhibits. Several weeks into the semester, they submitted a short statement outlining the “big idea” behind their exhibitions. The “big idea statement,” a concept I borrowed from museum consultant Beverly Serrell, explained the theme, story, or argument that defined the exhibition’s tone and dictated its content. I asked students to think of the “big idea statement” as the thesis for their exhibition.

Students then used the big idea to guide them as they chose one artifact and drafted a 200-word label for it. I looked for artifact labels that were clearly connected to the student’s big idea statement, included the context visitors would need to know to understand the artifact, and presented the student’s original interpretation of the artifact. The brevity of the assignment gave me time to provide each student with extensive written comments. In these comments and in conversations during office hours, I helped students narrow their topics, posed questions to help guide analysis and interpretation of artifacts, and suggested additional revisions focused on writing mechanics and tone.

Later in the semester, students expanded their big idea statements into rough drafts of the introductions for their digital exhibit. I asked that each introduction orient viewers to the exhibition, outline necessary historical context, and set the tone for the online visit. I also set aside part of a class period for a peer review exercise involving these drafts. I hoped that their classmates’ comments, along with my own, would help students revise their introductions before they submitted their final exhibit.

If I assigned this project again, I would probably ask students to turn in another label for a second artifact. This additional assignment would allow me to give each student more individualized feedback and would help to further clarify my grading criteria before the final project due date.

When you first taught this course a few years ago, you assigned students a more traditional task—a research paper. Can you explain why you decided to change the final assignment this time around?

I wanted to try a more flexible and creative assignment that would push students to develop research and analytical skills in a different format. The exhibit project allows students to showcase their own interpretation of a theme, put together a compelling historical narrative, and advance an argument. The project remains analytically rigorous, pushing students to think about how history is constructed. Each exhibit makes a claim—there is reasoning behind each choice the student makes when building the exhibit and each question he or she asks of the artifacts included. The format encourages students to focus on their visual analysis skills, which tend to get sidelined in favor of textual interpretation in most of the student research papers I have read. Additionally, the exhibit assignment asks students to write for a broader audience, emphasizing clarity and brevity in their written work.

What challenges did you encounter while designing this assignment from scratch?  

In the past I have faced certain risks whenever I have designed a new assignment. First, I have found it difficult to strike a balance between clearly stating expectations for student work and leaving room for students to be creative. Finding that balance was even harder with a non-traditional assignment. I knew that many of my students would not have encountered an exhibit project before my course, so I needed to clarify the utility of the project and my expectations for their submissions.

Second, I never expected to go down such a long research rabbit hole when creating the assignment directions. I naively assumed that it would be fairly simple to put together an assignment sheet outlining the requirements for the virtual museum project.  I quickly learned, however, that it was difficult to describe exactly what I expected from students without diving into museum studies literature and scholarship on teaching and learning.

I also needed to find a digital platform for student projects. Did I want student projects to be accessible to the public? How much time was I willing to invest in teaching students how to navigate a program or platform? After discussing my options with Reid Sczerba in the Center for Educational Resources (CER), I eventually settled on Reveal, a Hopkins-exclusive image-annotation program. The program would keep student projects private, foreground written work, and allow for creative organization of artifacts within the digital exhibits. Additionally, I needed to determine the criteria for the written component of the assignment. I gave myself a crash course in museum writing, scouring teaching blogs, museum websites, journals on exhibition theory and practice, and books on curation for the right language for the assignment sheet. I spoke with Chesney Medical Archives Curator Natalie Elder about exhibit design and conceptualization. My research helped me understand the kind of writing I was looking for, identify models for students, and ultimately create my own exhibit to share with them.

Given all the work that this design process entailed, do you have any advice for other teachers who are thinking about trying something similar?

This experience pushed me to think about structuring assignments beyond the research paper for future courses. Instructors need to make sure that students understand the requirements for the project, develop clear standards for grading, and prepare themselves mentally for the possibility that the assignment could crash and burn. Personally, I like taking risks when I teach—coming up with new activities for each class session and adjusting in the moment should these activities fall flat—but developing a semester-long project from scratch was a big gamble.

How would you describe the students’ responses to the project? How did they react to the requirements and how do you think the final projects turned out?

I think that many students ended up enjoying the project, but responses varied at first. Students expressed frustration with the technology, saying they were not computer-savvy and were worried about having to learn a new program. I tried to reassure these students by outing myself as a millennial, promising half-jokingly that if I could learn to use it, they would find it a cinch. Unfortunately, I noticed that many students found the technology somewhat confusing despite the tutorial I delivered in class. After reading through student evaluations, I also realized that I should have weighted the final digital exhibit and presentation less heavily and included additional scaffolded assignments to minimize the end-of-semester crunch.

Despite these challenges, I was really impressed with the outcome. While clicking through the online exhibits, I could often imagine the artifacts and text set up in a physical museum space. Many students composed engaging label text, keeping their writing accessible to their imaginary museum visitors while still delivering a sophisticated interpretation of each artifact. In some cases, I found myself wishing students had prioritized deeper analysis over background information in their labels; if I assigned this project again, I would emphasize that aspect.

I learned a lot about what it means to support students through an unfamiliar semester-long project, and I’m glad they were willing to take on the challenge. I found that students appreciated the flexibility of the guidelines and the room this left for creativity. One student wrote that the project was “unique and fun, but still challenging, and let me pursue something I couldn’t have if we were just assigned a normal paper.”

If you’re interested in pursuing a project like this one and have more questions for Morgan, you can contact her at: morganjshahan@gmail.com. 

For other questions or help developing new assessments to use in your courses, contact the Center for Educational Resources (cerweb@jhu.edu).

Allon Brann, Teacher Support Specialist
Center for Educational Resources

Image Source: Morgan Shahan

The New Google Sites

We’re always on the lookout for applications that instructors and their students can use to enhance course work. A previous post, We Have a Solution for That: Student Presentations, Posters, and Websites (October 6, 2017), mentioned a new version of Google Sites as having potential as presentation software that allows for easy collaboration among student team members. Today’s post delves deeper into its possibilities and use. This post is also available in PDF format as part of The Innovative Instructor articles series.

New Google Sites is an online website creation platform. It doesn’t require web development or design experience to create sites that work well on mobile devices. The New Google Sites application is included with the creation tools offered in Google Drive, making it easier to share and integrate your Google Drive content.

In 2006 Google purchased JotSpot, a software company that had been creating social software for businesses. The software acquired from that purchase was used to create the first iteration of Google Sites, now known as Classic Google Sites. Ten years later, Google launched a completely rebuilt Google Sites, which is currently being referred to as New Google Sites.

New Google Sites hasn’t replaced Classic Google Sites so much as it offers a new and different experience. The focus of New Google Sites is to increase collaboration for all team members regardless of their web development experience. It is also integrated with Google Drive so that teams working within the Google apps environment can easily associate shared content.

This new iteration of Google Sites is designed with mobile devices in mind. Users are not able to add special APIs (Application Programming Interface, which extends the functionality of an application) or edit HTML directly. This keeps the editing interface and options simple to ensure that whatever you create will work consistently across all browsers and devices. While this may seem limiting, you still have the option to use Classic Google Sites if you want a higher level of control.

[Image: Example of the New Google Sites editing interface.]

In a classroom setting, instructors are often cautious about assigning students projects that require them to learn new technical skills that aren’t directly relevant to the course content. Instructors must balance the time it will take students to achieve technical competency against the need to ensure that students achieve the course learning goals. With New Google Sites, students can focus on their content without being overwhelmed by the technology.

In addition to ease of use, collaborative features allow students to work in teams and share content. Group assignments can offer students a valuable learning experience by providing opportunities for inclusivity, exposure to diverse viewpoints, accountability through team roles, and improved project outcomes.

New Google Sites makes it easy for the casual user to disseminate new ideas, original research, and self-expression to a public audience. If the website isn’t ready to be open to the world, the site’s editor can keep it unpublished while still having the option to collaborate on it or share it with select people. This is an important feature, as student work may not be ready for a public audience or there may be intellectual property rights issues that preclude public display.

Professors here at Johns Hopkins have used New Google Sites for assignments. In the History of Science and Technology course, Man vs. Machine: Resistance to New Technology since the Industrial Revolution, Assistant Professor Joris Mercelis had students use New Google Sites for their final projects. Teams of two or three students were each asked to create a website to display an illustrated essay based on research they had conducted. Images and video were required to support their narrative arguments. Students had to provide proper citations for all materials. Mercelis wanted the students to focus on writing for a lay audience, an exercise that encouraged them to think broadly about the topics they were studying.

History of Art Professor Stephen Campbell used a single Google Site where student teams collaborated to produce an online exhibition, Exhibiting the Renaissance Nude: The Body Exposed. Each student group was responsible for supplying the materials for one of five topic pages. The content developed from this project was accessible only to the class.

In both cases, students reported needing very little assistance when editing their sites. Typically, giving an introductory demonstration and providing resources for where to find help are all students need to begin working.

Recently, Google added the ability to embed other Google Drive content in a site. This means that you can embed a form or a document on a web page to elicit responses and feedback from your audience without them having to leave the site. This level of integration further supports the collaborative nature of Google applications.

Currently, this iteration of Google Sites carries “New” in its title. There may come a time when Google drops the “New” or re-brands New Google Sites with a different name. There is no indication that Google will stop supporting Classic Google Sites, with its more advanced features.

Use of both versions of Google Sites is free and accessible using your Google Account. You can create a new site by signing into Google and going to the New Google Sites page (link provided below). You can also create a site from Google Drive’s “New” button in the creation tools menu.

It is recommended that students create a new account for class work instead of using their personal accounts. While this is an additional step, it ensures that they can keep their personal lives separate from their studies.

[Image: A Google Site as displayed on a desktop, tablet, and smartphone.]

Additional Resources:

Reid Sczerba, Multimedia Developer
Center for Educational Resources

Image sources: Google Sites logo, screenshots

Considerations for Digital Assignments

[Image: Handout on considerations for digital assignments]

My colleague in the Center for Educational Resources, Reid Sczerba, and I often consult with faculty who are looking for alternative assignments to the traditional research paper. Examples of such assignments include oral presentations, digital and print poster presentations, virtual exhibitions, using timelines and mapping tools to explore temporal and spatial relationships, blogging, creating videos or podcasts, and building web pages or websites.

Reid, who is a graphic designer and multimedia specialist, put together a handy chart to help faculty think about these assignments in advance of a face-to-face consultation with us. A PDF version of this handout is available for your convenience. The text from the chart is reprinted below.

Learning objectives
♦ Have you determined your learning objectives for this assignment? Deciding what you would like your students to learn or be able to do helps to frame the parameters of your assignment. http://www.cer.jhu.edu/ii/InnovInstruct-BP_learning-objectives.pdf

Type of assignment
♦ Will there be analysis and interpretation of a topic or topics to produce a text-based and/or visual-based project? Consider alternatives to a traditional research paper.
http://ii.library.jhu.edu/2016/04/08/lunch-and-learn-alternatives-to-the-research-paper/
♦ Will there be a need to document objects or materials for a catalog, exhibition, or repository? Defining meaningful metadata and the characteristics of research materials will be important considerations.

Access and visibility
♦ Will you want the students’ work to be made open to the public, seen just at JHU, or shared only with the class? Decide up front whether to have students’ work be public or private in order to get their consent and choose the best platform for access.
♦ Will they be working with copyrighted materials? The fair use section of the Copyright Act may provide some latitude, but not all educational uses are fair use. http://www.arl.org/focus-areas/copyright-ip

Collaboration
♦ Will you want students to work collaboratively as a class, in small groups, or individually?
Group work has many benefits but there are challenges for assessment and in ensuring that students do their fair share of the work.
http://www.cer.jhu.edu/ii/InnovInstruct-BP_MakingGroupProjectsWork.pdf
♦ Will you want the students’ work to be visible to others in the class or private to themselves or their group?
Consider adding a peer review component to the assignment to help the students think critically about their work.
http://www.cer.jhu.edu/ii/InnovInstruct-Ped_peerinstruction.pdf

Format
♦ Will you want your students to have a choice of media to express their research or will all students use the same solution?
An open-ended choice of format could allow students to play to their strengths, leading to creativity. On the other hand, too many choices can be daunting for some, and it may be challenging to assess different projects equally.
♦ What would be the ideal presentation of the student’s work?

• spatially arranged content (mapping, exhibition)
• temporally arranged content (timeline)
• narrative (website, blog)
• oral presentation
• visual presentation (poster, video)

Formats for digital assignments are not limited to this list. More than one approach can be used if the result fulfills the learning objectives for the assignment.

Some of the solutions that we have recommended to faculty in the past are Omeka, Omeka Neatline, Timeline JS, Panopto (JHU), Reveal (JHU), Google tools (Google Sites, Google Maps, Google Docs), VoiceThread (JHU), and WordPress.

*************************************************************************************************

Reid Sczerba, Multimedia Development Specialist
Center for Educational Resources

Macie Hall, Senior Instructional Designer
Center for Educational Resources

Image source: Image of the handout created by Reid Sczerba