Lunch and Learn: Innovative Grading Strategies

Logo for Lunch and Learn program showing the words Lunch and Learn in orange with a fork above and a pen below the lettering. Faculty Conversations on Teaching at the bottom.

On Thursday, February 28, the Center for Educational Resources (CER) hosted the third Lunch and Learn for the 2018-2019 academic year. Rebecca Kelly, Associate Teaching Professor, Earth and Planetary Sciences and Director of the Environmental Science and Studies Program, and Pedro Julian, Associate Professor, Electrical and Computer Engineering, presented on Innovative Grading Strategies.

Rebecca Kelly began the presentation by discussing some of the problems with traditional grading. There is a general lack of clarity about what grades actually mean, and students and faculty often view them quite differently. Faculty use grades to elicit certain behaviors from students, but those behaviors do not necessarily mean the students are learning. Kelly noted that students, especially those at JHU, tend to focus on the grade itself, aiming for a specific number rather than the learning; this often results in high levels of student anxiety, something she sees frequently. She explained that students here get few chances to fail without their grades suffering. Because every assessment counts toward the grade, every assessment is a source of stress, and there are too few opportunities for students to learn from their mistakes.

Kelly mentioned additional challenges that faculty face when grading: it is often time consuming, energy draining, and stressful, especially when haggling over points, for example.  She makes an effort to provide clearly stated learning goals and rubrics for each assignment, which do help, but are not always enough to ease the burden.

Kelly introduced the audience to specifications grading and described how she’s recently started using this approach in Introduction to Geographic Information Systems (GIS). With specifications grading (also described in a recent CER Innovative Instructor article), students are graded pass/fail or satisfactory/unsatisfactory on individual assessments that align directly with learning goals. Course grades are determined by the number of learning goals mastered, as measured by the number of assessments passed. For example, passing 20 or more assignments out of 23 would equate to an A; 17-19 assignments would equate to a B. Kelly stresses the importance of maintaining high standards; for rigor, the threshold for passing each assessment should be work at the B level or better.
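As a rough illustration, the bundle thresholds Kelly described can be expressed as a simple lookup. Note that only the A and B cutoffs come from her example; the lower bundles below are hypothetical placeholders added for illustration.

```python
def course_grade(passed: int) -> str:
    """Map the number of passed assessments (out of 23) to a course grade.

    The A (20+) and B (17-19) thresholds come from Kelly's example;
    the C and D bundles are hypothetical, for illustration only.
    """
    if passed >= 20:
        return "A"
    elif passed >= 17:
        return "B"
    elif passed >= 14:   # hypothetical C bundle
        return "C"
    elif passed >= 11:   # hypothetical D bundle
        return "D"
    return "F"

print(course_grade(21))  # A
print(course_grade(18))  # B
```

The point of the lookup is that a student's grade depends only on how many assessments cleared the bar, not on partial credit accumulated along the way.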

In Kelly’s class, students have multiple opportunities to achieve their goals. Each student receives three tokens that he/she can use to re-do an assignment that doesn’t pass, or select a different assignment altogether from the ‘bundle’ of assignments available. Kelly noted the tendency of students to ‘hoard’ their tokens and how it actually works out favorably; instead of risking having to use a token, students often seek out her feedback before turning anything in.

Introduction to GIS has both a lecture and a lab component. The lab requires students to use software to create maps that are then used to perform data analysis. The very specific nature of the assignments in this class lends itself well to the specifications grading approach. Kelly noted that students are somewhat anxious about this approach at first, but settle into it once they fully understand it. In addition to clearly laying out expectations, Kelly lists the learning goals of the course and how they align with each assignment (see slides). She also provides students with a table showing the bundles of assignments required to reach each final course grade. Additionally, she distributes a pacing guide to help students avoid procrastination.

The results that Kelly has experienced with specifications grading have been positive. Students generally like it because the expectations are very clear and initial failure does not count against them; there are multiple opportunities to succeed. Grading is quick and easy because of the pass/fail system; if something doesn’t meet the requirements, it is simply marked unsatisfactory. The quality of student work is high because there is no credit for sloppy work. Kelly acknowledged that specifications grading is not ideal for all courses, but feels the grade earned in her GIS course is a true representation of the student’s skill level in GIS.

Pedro Julian described a different grading practice that he is using, something he calls the “extra grade approach.” He currently uses this approach in Digital Systems Fundamentals, a hands-on design course for freshmen. In this course, Julian uses a typical grading scale: 20% for the midterm, 40% for labs and homework, and 40% for the final project. However, he augments the scale by offering another 20% if students agree to put in extra work throughout the semester. How much extra work? Students must commit to working collaboratively with instructors (and other students seeking the 20% credit) for one hour or more per week on an additional project.  This year, the project is to build a vending machine. Past projects include building an elevator out of Legos and building a robot that followed a specific path on the floor.
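A hedged sketch of how such a scale might be computed: the 20/40/40 weights are the ones Julian stated, but exactly how the extra 20% combines with the base score, and whether the total is capped at 100, are assumptions made here for illustration.

```python
def final_score(midterm, labs_homework, project, extra=0.0, opted_in=False):
    """Combine component scores (each on a 0-100 scale) using Julian's
    stated weights: 20% midterm, 40% labs/homework, 40% final project.

    Students who opt in to the extra weekly project can earn up to an
    additional 20%. How the extra credit combines with the base score,
    and the cap at 100, are assumptions for illustration.
    """
    base = 0.20 * midterm + 0.40 * labs_homework + 0.40 * project
    if opted_in:
        base += 0.20 * extra
    return min(base, 100.0)

# A student near the B range who commits to the extra project:
print(final_score(80, 85, 90, extra=100, opted_in=True))
```

Under these assumptions, the extra component acts as a safety net: strong work on the optional project can lift a borderline grade without lowering anyone's base score.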

Julian described how motivated students are to complete the extra project once they commit to putting in the time. Students quickly realize that they learn all sorts of skills they would not have otherwise learned, and they are very proud and engaged. Student participation in the “extra grade” option has grown steadily since Julian started using this approach three years ago: the first year there were 5-10 students who signed up, and this year there are 30. Julian showed histograms (see slides) of student grades from past semesters, illustrating how the extra grade has helped push overall grades higher. The histograms also show that it’s not just students who may be struggling with the class who are choosing to participate in the extra grade, but “A students” as well.

Similar to Rebecca Kelly’s experience, Julian expressed how grade-focused JHU students are, much to his dismay. In an attempt to take some of the pressure off, he described how he repeatedly tells his students that if they work hard, they will get a good grade; he even includes this phrase in his syllabus. Julian explained how he truly wants students to concentrate more on the learning and not on the grade, which is his motivation behind the “extra grade” approach.

An interesting discussion with several questions from the audience followed the presentations. Below are some of the questions asked and responses given by Kelly and Julian, as well as audience members.

Q: (for Julian) Some students may not have the time or flexibility in their schedule to take part in an extra project. Do you have suggestions for them? Did you consider this when creating the “extra grade” option?

Julian responded that in his experience, freshmen seem to be available. Many of them make time to come in on the weekends. He wants students to know he’s giving them an “escape route,” a way for them to make up their grade, and they seem to find the time to make it happen.  Julian has never had a student come to him saying he/she cannot participate because of scheduling conflicts.

Q: How has grade distribution changed?

Kelly remarked on how motivated the students are: this past semester she had no Cs, very few Bs, and the rest As. She stressed how important it is to make sure that the A is attainable for students, and she feels confident that she has enough experience to know what counts as an A. Every student can do it; the question is, will they?

Q: (for Kelly) Would there ever be a scenario where students would do the last half of the goals and skip the first half?

Kelly responded that she has never seen anyone jump over everything and that it makes more sense to work sequentially.

Q: (for Kelly) Is there detailed feedback provided when students fail an assignment?

Kelly commented that it depends on the assignment, but if students don’t follow the directions, that’s the feedback – to follow the directions. If it’s a project, Kelly will meet with the student, go over the assignment, and provide immediate feedback. She noted that she finds oral feedback much more effective than written feedback.

Q: (for Kelly) Could specs grading be applied in online classes?

Kelly responded that she thinks this approach could definitely be used in online classes, as long as feedback could be provided effectively. She also stressed the need for rubrics, examples, and clear goals.

Q: Has anyone tried measuring individual learning gains within a class? What skills are students coming in with? Are we actually measuring gain?

Kelly commented that specifications grading works as a complement to competency-based grading, which focuses on measuring gains in very specific skills.

Julian commented that this issue comes up in his class, with students coming in with varying degrees of experience. He stated that this is another reason to offer the extra credit: to keep things interesting for those who want to move at a faster pace.

The discussion continued among presenters and audience members about what students are learning in a class vs. what they are bringing in with them. A point was raised that if students already know the material in a class, should they even be there? Another comment questioned whether it is even an instructor’s place to determine what students already know. Additional comments were made about what grades mean, and concerns were raised about grades being used for different purposes, e.g., employers looking for specific skills, instructors writing recommendation letters, etc.

Q: Could these methods be used in group work?

Kelly responded that with specifications grading, you would have to find a way to evaluate the group. It might be possible to still score on an individual basis within the group, but it would depend on the goals. She mentioned peer evaluations as a possibility.

Julian stated that all grades are based on individual work in his class. He does use groups in a senior level class that he teaches, but students are still graded individually.

The event concluded with a discussion about how using “curve balls” – intentionally difficult questions designed to catch students off-guard – on exams can lead to challenging grading situations. For example, to ultimately solve a problem, students would need to first select the correct tools before beginning the solution process. Some faculty were in favor of including this type of question on exams, while others were not, noting the already high levels of exam stress.  A suggestion was made to give students partial credit for the process even if they don’t end up with the correct answer. Another suggestion was to give an oral exam in order to hear the student’s thought process as he/she worked through the challenge. This would be another way for students to receive partial credit for their ideas and effort, even if the final answer was incorrect.

Amy Brusini, Senior Instructional Designer
Center for Educational Resources

Image Sources: Lunch and Learn Logo, slide from Kelly presentation

Changing the Guard

After 31 years at Johns Hopkins University, 11 in the Center for Educational Resources, six years and 198 posts on The Innovative Instructor, I am retiring. The good news is that my colleague, Amy Brusini, will be taking on the mantle, not only as the new editor of this blog, but as the Senior Instructional Designer in the CER.

Amy has been an Instructional Technology Specialist in the CER for 12 years. She has a BS in Music Education from Towson University and an MS in Education from Johns Hopkins University. She is currently the CER’s Blackboard expert, providing instruction and consultations to faculty on course management, instructional design, and educational technology.

The Innovative Instructor has been a labor of love for me. I am leaving it in good hands. Happy Trails!

Macie Hall, Senior Instructional Designer
Center for Educational Resources

Image Sources: CC Reid Sczerba, Center for Educational Resources

PhysPort: Not Just for Physics Instructors

Screenshot of PhysPort home page.

While tracking down some resources for active learning this past week, I stumbled on PhysPort and wished I’d known about this site much sooner. PhysPort, formerly the Physics Education Research (PER) User’s Guide, supports “…physics faculty in implementing research-based teaching practices in their classrooms, by providing expert recommendations about teaching methods, assessment, and results from physics education research (PER). Work in PER has made enormous advances in developing a variety of tools that dramatically improve student learning of physics. Our goal is to synthesize and translate the results of this research so you can use it in your classroom today.” The thing is, many of the resources here will be valuable to faculty in any discipline and will help improve student learning in any course.

Certainly, some of the materials and examples are physics-specific; others may be more useful for STEM faculty generally. Yet, there is plenty here that will be appreciated by anyone looking for pedagogical resources, even Humanities faculty. The best thing is the emphasis on research-based strategies.

I liked the clean, clear structure of the site. There are five tabs at the top of each page for easy navigation: Home, Expert Recommendations, Teaching Methods, Assessments, and Workshops.

On the Home page there are three areas for general help—Teaching (I want to…), Assessment (I want to…), and Troubleshooting (I need help with…). Clicking on a topic of interest will take you to a page with relevant materials and resources. Expert Recommendations are essentially articles/blog posts written by PhysPort staff and guest authors to help instructors. I’ve listed some articles of general interest further down.

Teaching Methods takes you to a form where you can enter information specific to your course. You can select a subject from a drop-down list (including “any subject” to keep results more generic), as well as the level, setting, student skills you’d like to develop, and the amount of instructor effort required. You can choose the level of research validation, and exclude resources that may not be available to you, such as computers for students or tables for group work. Below the form is a list of 57 Research-Based Methods that you can browse if the form doesn’t provide you with relevant choices. Again, some of these are physics-specific, but others, like Just-in-Time Teaching, are broadly applicable. Each of the methods has tabs for Overview, Resources, Teaching Materials, and Research.

The Assessments tab is described as a place “…where you can get instant analysis of your students’ scores on research-based assessment instruments, comparisons to national averages and students like yours, recommendations for improving your teaching, and reports for tenure and promotion files, teaching portfolios, and departmental accreditation.” It is also set up with a form at the top. Scroll down to see a list of 92 Research-Based Assessments. Most of these are physics-based, but scroll to the bottom of the page for a few interactive teaching protocols that may be more generally appropriate.

The Workshops tab features video tutorials. Again, there is a mix of physics-specific and non-specific materials.

Back to my original quest for resources on active learning. Under the Expert Recommendations tab, of particular interest is a series of posts by Stephanie Chasteen, University of Colorado Boulder (June 20, 2017), on implementing active learning strategies in your classroom. These are applicable to any subject matter, not just physics or even STEM courses. Each topic covered has a section on further reading with a list of references, a general reading list, and suggested keywords for searching the literature.

PhysPort is a rich resource for all faculty. Spend a little time digging around. You should come up with some great material.

Macie Hall, Senior Instructional Designer
Center for Educational Resources

Image Source: Screenshot of PhysPort home page: https://www.physport.or


Omeka for Instruction

The following post describes Omeka, a Web-based exhibition software application, and how it was selected, installed on a local server, and is currently used at Johns Hopkins. Outside of Johns Hopkins, these processes may serve as models. Alternatives to local hosting of Omeka are also outlined.

Omeka for Instruction

Years ago, our Dean, Winston Tabb, here in The Sheridan Libraries at Johns Hopkins University requested that we perform a survey and evaluation of open source Web-based exhibition software, the kind of software that might be a useful adjunct to our brick-and-mortar exhibitions. This genre of software was, at the time, in a nascent stage. Nevertheless, our survey and evaluation included now-mature software applications such as Collective Access, Omeka, Open Exhibits, and Pachyderm. Each package was downloaded, installed, configured, and evaluated with respect to ease of installation, overall functionality, and prospect of sustainability. In the end, Omeka was our exhibition software package of choice.

What is Omeka?

Omeka is a Web-based exhibition software package written by historians for historians.  A product of the Center for History and New Media at George Mason University, Omeka was created so that those with exhibit-worthy content — most notably, historians — could click their way to a visually pleasing Web-based exhibition without the need to learn HTML, Javascript, or CSS coding.  Omeka is more than just a Webpage with some images and text, though.  It is a multi-user, Web-based tool that includes facility for user account management, for installing and configuring a host of freely-available plugins, for activating and altering themes, for adding and cataloging content items, and for taking those items and creating structured exhibitions with them.

Our Services

Shortly after settling on Omeka as our software package of choice, we decided to install it with two different uses in mind. First, we would install a central Omeka instance for use by the Exhibitions Committee of The Sheridan Libraries and University Museums. This instance would enable librarians and curators to use Omeka as either an online addition to a regular brick-and-mortar exhibition or as the venue for fully online exhibitions. As of this writing, this instance of Omeka was used in fall 2015 to host an online exhibition of materials related to the John Barth Exhibition held at the George Peabody Library. Also as of this writing, it is the intent of the Exhibitions Committee to likewise use Omeka to supplement a forthcoming exhibition on Edgar Allan Poe, again to be hosted at the Peabody Library.

The second use of Omeka would be in the classroom.  For this, we set up a separate server and began offering each instructor interested in using Omeka his or her very own Omeka instance on a per course section basis.  In this way, each section of each course using Omeka gets its own, dedicated instance, and students from each course section are sandboxed with their fellows, free and able to work together with this remarkable software package.

Typically the way this has worked is that a professor contacts technologist and librarian Mark Cyzyk in The Sheridan Libraries or staff in the Center for Educational Resources  to request the use of Omeka.  Cyzyk then sets up an instance, generates student accounts, and comes to class at least once during the semester to train the students.  He sometimes is accompanied by a subject librarian or curator who addresses subject-specific topics such as where to find appropriate images/video/audio for use in exhibits, copyright and fair use issues, proper citation practice, etc.

Courses Using Omeka

Over the past five years, the following courses have used Omeka for instruction here at Johns Hopkins:

Spring 2012.  “Literary Archive.” AS.389.359 (01)  Gabrielle Dean
Spring 2012.  “Seeing Baltimore History: Race & Community.” AS.362.306 (01) Moira Hinderer
Fall 2012.  “Modernity on Display: Technology and Ideology in the Era of World War II.” AS.140.320 (01) Robert Kargon
Spring 2013. “American Literature on Display.” AS.389.360 (01) Gabrielle Dean
Spring 2014.  “Gender in Latin American History.” AS.100.232 (01)  Norah Andrews
Spring 2014.  “Guillaume de Machaut: Exploring Medieval Authorship in the Digital Age.” AS.212.678 (01) Tamsyn Rose-Steel
Spring 2015.  “Modernism in Baltimore: A Literary Archive.” AS.389.359 (01) Gabrielle Dean
Spring 2015.  “History of Modern Medicine.” AS.140.106 (01) Jeremy Green
Spring 2016.  “Art and Science in the Middle Ages.” AS.010.403 (01) Chris Lakey
Spring 2016.  “#Digital Blackness.” AS.362.332 (01) Kim Gallon
Spring 2016.  “The Virtual Museum.” AS.389.302 (01) Jennifer Kingsley
Spring 2016.  “History of Public Health in East Asia.” AS.140.146 (01)  Marta Hanson

Alternatives for Using Omeka

If you are not at Johns Hopkins, but are interested in using Omeka, you have two choices. First, you can get your local IT shop to install it. It is a PHP application that runs on the Apache Web server with a MySQL database on the backend, and it is fairly easy and straightforward to install and configure. Second, the Omeka community offers both paid and free hosting services via the omeka.net Website. The free plan includes a single site, 500 MB of server space, 15 plugins, and 5 themes: plenty of functionality to get you started!

If you are at Johns Hopkins and are interested in using Omeka in one of your classes, please contact Mark Cyzyk, mcyzyk@jhu.edu, in The Sheridan Libraries.

*************************************************************************************************

Mark Cyzyk, Scholarly Communication Architect
Sheridan Libraries and Museums

Image sources: Omeka Logo from http://omeka.org; Lost in the Funhouse image © Sheridan Libraries and Museums


An Annotated Bibliography on College Teaching

The Tomorrow’s Professor e-Newsletter often has interesting and useful posts. Sponsored by the Stanford University Center for Teaching and Learning, Tomorrow’s Professor is edited by Richard M. Reis, Ph.D., a consulting professor in the Department of Mechanical Engineering at Stanford (see more in this Innovative Instructor post from last year). Recently Reis shared a bibliography compiled by L. Dee Fink, Ph.D., a national and international consultant in higher education, a former president of the Professional and Organizational Development Network in Higher Education, and a former director of the Instructional Development Program at the University of Oklahoma.

Stack of books in a library.

The list comprises books that have introduced major ideas in college teaching from 1990 to 2013. Fink says, “The point of this list is to illustrate that the scholars of teaching and learning are continuing to generate powerful new ideas year after year, thereby creating the possibility of enhancing the capabilities of college teachers everywhere – IF faculty members can learn about these ideas and incorporate them into their teaching.”

The ideas are shown in two ways. The first is by theme and sub-theme. The four themes are: General Perspectives on Teaching & Learning, Basic Tasks of Teaching, Dealing with Specific Teaching/Learning Situations, and Getting Better at Teaching. Under each theme are sub-themes with links (within the document) to annotated source listings arranged chronologically; these listings make up the second way in which the ideas are displayed.

For example, in the category Getting Better At Teaching, you will find Learning About Teaching & Learning with a link to Learning Communities. Clicking on the link takes you to 1998: “Learning communities, whether of students or of faculty, can lead to powerful forms of dialogue and growth.” Source: Shapiro, N. & Levine, J. Creating Learning Communities. Jossey-Bass.

Browsing the chronological listings will also be fruitful. And if your spring break is coming up, maybe you will actually have a little time to read.

Macie Hall, Senior Instructional Designer, Center for Educational Resources

Image Source: Microsoft Clip Art

Managing Teamwork with CATME

Many instructors recognize the value of having students work collaboratively on team-based assignments. Not only is it possible for students to experience a greater understanding of the subject material, but several life-long learning skills can be gained through active engagement with team members. Managing team-based assignments, however, is not something most instructors look forward to; the administrative tasks can be quite cumbersome, especially with large classes. Thankfully there is a tool to help with this process: CATME.

Logo for CATME.

CATME, which stands for ‘Comprehensive Assessment of Team Member Effectiveness,’ is a free set of tools designed to help instructors manage group work and team assignments more effectively. It was developed by a diverse group of professors with extensive teaching experience, as well as researchers and students. First released in 2005, CATME takes away much of the administrative burden that instructors face when trying to organize and manage teams, communicate with students, and facilitate effective peer evaluation.

‘Team Maker,’ one of two main parts of CATME, assists with the team creation process. First, it allows instructors to easily create and send a survey to students. The survey collects various demographic data, previously completed coursework, and student availability information. Instructors can also add their own questions to the survey if desired. Once the data are collected, instructors decide which criteria will be used to create the teams and then assign weights to each criterion. Team Maker then uses the weights in an algorithm to create the teams. Instructors are free to adjust the teams, if necessary, to their satisfaction. Once the teams are finalized, the instructor releases the results to students, who are provided with their team members’ names, email addresses, and a schedule matrix showing member availability.
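The post does not describe Team Maker’s actual algorithm, so the following is only a toy sketch of the general idea behind weighted-criteria team formation; the survey data, attribute names, and weights are all hypothetical, and this is not CATME’s method.

```python
# Toy illustration of weighted-criteria team formation -- NOT CATME's
# actual algorithm. Each student has survey-derived attributes, and the
# instructor-assigned weights control which criteria dominate the balance.

students = {
    "Ana":  {"gpa": 3.8, "avail": 5},   # hypothetical survey data
    "Ben":  {"gpa": 3.1, "avail": 2},
    "Cara": {"gpa": 3.5, "avail": 4},
    "Dev":  {"gpa": 2.9, "avail": 3},
}
weights = {"gpa": 0.7, "avail": 0.3}    # hypothetical instructor weights

def score(student):
    """Weighted composite of a student's survey attributes."""
    return sum(w * students[student][k] for k, w in weights.items())

def make_teams(names, team_size):
    """Greedy balance: rank students by composite score, then deal them
    out round-robin so each team gets a mix of high and low scorers."""
    ranked = sorted(names, key=score, reverse=True)
    teams = [[] for _ in range(len(names) // team_size)]
    for i, name in enumerate(ranked):
        teams[i % len(teams)].append(name)
    return teams

print(make_teams(list(students), team_size=2))
```

A real tool would also honor hard constraints (for example, overlapping availability or avoiding isolated demographic minorities on a team), which is where a weighted algorithm earns its keep over manual assignment.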

‘Peer Evaluation,’ the other core component of CATME, is used by students to evaluate their teammates’ performance as well as their own.  The web-based ratings page is presented on one screen, making it easy to fill out and submit results. Students select from a set of behaviors which most closely describes themselves and their peers. There is also a place where students can include confidential comments which are only seen by the instructor.  Once completed, instructors can decide when to release the evaluation results to students. Peer ratings appear anonymous to students but are identified for instructors.

Another tool included in CATME is the ‘Rater Calibration’ tool, which helps train students in the peer evaluation process. Students are asked to rate a series of fictional team members and then receive feedback about their ratings. Other tools include the ‘Student Team Training’ tool, designed to help students recognize effective team behaviors, and the ‘Meeting Support’ tool, which provides templates that students can use to plan and organize meetings, such as writing a team charter, taking minutes, etc.

To view a video demo of CATME and learn more about the product, visit the CATME website. Instructors interested in using CATME can go to https://www.catme.org/login/request to register for an account.

Amy Brusini, Course Management Training Specialist
Center for Educational Resources

Image Source: CATME logo from http://info.catme.org/

Welcome to The Innovative Instructor

After the Provost’s Gateway Sciences Initiative Symposium on Teaching Excellence in January 2012, faculty expressed interest in having an online space where ideas about innovative teaching could be collected and archived. As a resource for those teaching in any discipline, The Innovative Instructor blog will offer a variety of ideas about teaching excellence, instructional technology, and teaching as research.

The Innovative Instructor blog builds on a successful print series of the same name, which focuses on Pedagogy, Best Practices, and Technology. Blog posts will cover topics such as active learning, assessment, use of case studies in instruction, classroom management, instructional design, how to engage students, grading and feedback, collaborative learning, leading discussions, hybrid instruction, and teaching methods.

While initial posts will be written by staff members in the Center for Educational Resources (CER) and other Johns Hopkins teaching and learning centers, faculty, post docs, and graduate and undergraduate students are invited to serve as guest editors. If you have a teaching-related topic that you would like to share, please contact Macie Hall at macie.hall@jhu.edu. Or contact her if you have an issue or subject you’d like to see covered in a future post.

The CER is just one of the teaching support centers at Johns Hopkins University. Find a complete list under the Contact tab. The CER provides a variety of services for faculty in the Krieger School of Arts and Sciences and the Whiting School of Engineering. CER staff members meet with instructors to discuss digital course enhancements, manage the Technology Fellowship Program, collaborate with faculty on grant projects, and offer structured opportunities for faculty to learn about cutting edge educational innovations. The CER is also the home for the TA Training Institute and the institutional affiliation with the Center for Integration of Research, Teaching, and Learning (CIRTL). We offer a number of opportunities to prepare future faculty for teaching.

Whether you are faculty, future faculty, student, or staff interested in pedagogy, teaching with technology, or educational best practices, welcome to The Innovative Instructor.