Empowering Students through Guided Reflection

[Guest post by Pamela Sheff, Director, Center for Leadership Education, Johns Hopkins University]

Each spring, I teach a course called Culture of the Engineering Profession for the Center for Leadership Education in the Whiting School. Primarily through discussions and projects, students in this class investigate what it means to be an engineer, identify contemporary issues in engineering, and consider the ethical guidelines of the engineering profession. The majority of students in the Spring 2019 class were Chemical and Biomolecular Engineering majors, with one Mechanical Engineering student mixed in. This semester, I decided to experiment with guided reflection, a metacognitive practice that was new to many of these students.

One of the goals of the course is to help students strengthen their communication skills; therefore, I rely heavily on class discussion, including a requirement that each student lead a discussion at least once during the semester. Several times during the semester, I guided the students in reflecting not only on the quality of their discussions, but also on the culture of the classroom as a whole. I raised questions such as: What is working well? What could be changed? What values do students want in the classroom? From this reflective exercise, the students generated a rubric listing characteristics such as accountability, respect, and transparency. Every couple of weeks, I asked them to reflect on how they were doing as a group by reviewing the list.

After about six weeks, I noticed the group coming to consensus on what they felt was working in our classroom practice and what needed to change.  For example, one idea suggested by the group was to speak purposefully during discussions.  Students should not talk simply to be heard, but to move the discussion forward. Results included higher quality discussions and improved leadership skills.

The success of using a rubric to guide class discussions led me to continue using reflection to help students evaluate their major projects. We talked during class about effective project criteria, for example, and what they should look for in the posters they would see at the course-wide poster fair. The teaching assistants in the class then compiled the list of suggestions, which helped the students create strong written critiques after the fair. I talked with the class about how to assign grades to the team discussions they had been leading. Again, we worked in class to develop a list of criteria to consider, and the TAs and I developed the grading rubric. I then gave the students an opportunity to comment on the rubric before it was finalized. I also decided to allow students to grade their own projects according to the rubric. If I agreed with the grade they chose, the grade stood. If not, I modified the grade. In a class of 29 students, I only had to lower two grades. I raised three grades, in cases where students were unduly critical of their own efforts.

The results of continuous guided reflection? The projects were the best I have ever seen in this class, and I could not have been more pleased. I attribute the high quality of work to students taking ownership of the process. It pushed them to live up to the standards they defined for themselves, and in many cases, go beyond them.  Providing space for students to reflect on what they were working towards led them to act more purposefully and, in turn, allowed me to give them agency over the classroom. I am thrilled with the way this approach worked out and am planning to use it again in future semesters.

Pamela Sheff, Associate Teaching Professor and Director
Center for Leadership Education, Johns Hopkins University

Pamela Sheff is an award-winning writer and marketing communications consultant, with a wealth of experience developing marketing, public relations and communications strategies for clients ranging from start-ups to large corporate, institutional and government organizations. Now a full-time lecturer in the Center for Leadership Education, Pam has taught business communications for private companies and directed the Writing Program at Goucher College.

Image Source: Pixabay

Advising Graduate Students

[Guest post by Anne-Elizabeth Brodsky, Senior Lecturer, Expository Writing, Johns Hopkins University]

Usually teaching offers us a built-in apparatus. There’s a classroom, regular meeting times, a syllabus, an end of the semester in sight.

But advising grad students on their dissertations, or supervising them as their PI, is an entirely different sort of teaching. The familiar structures have evaporated, the final product is hard to envision, and what’s at stake is not a grade but a career.

As faculty we do our best to get it right, knowing that each grad student and each research project differs wildly.

But it’s tricky, as Drew Daniel (JHU English department) reflects in an August 2018  blog post:

“Graduate advising is intimate and intense. . . . It is a partnership but it is also structurally, fundamentally unequal. One of you is learning how to do something; one of you is advising the other on how to do that thing based on prior experience and presumed expertise. . . . The advisor must help the grad student bring something new into the world which is the student’s own and which the advisor does not themselves already completely understand.”

Given that the road ahead is unpredictable, the initial steps we take as faculty become all the more important. It’s crucial that we set up clear terms and reliable mechanisms that will buttress our students, come what may.

What Works

Consider, for example, creating an advising statement to share with prospective advisees. This can be a tangible, transparent way to set clear and mutual expectations in the advisor-advisee relationship. Read more about advising statements in the Chronicle here (October 2018), where Moin Syed (in psychology at the University of Minnesota) shares his advising statement as a Google Doc you can adapt as your own.

Leonard Cassuto, in the English department at Fordham, explains here (in the Chronicle December 2018) how he sets up dissertation writing groups. This approach structures not only the faculty-advisee relationship but also collegial relationships among grad students at different levels in the program.

Along similar lines, but in the context of lab sciences, Allison Antes (from the Center for Research and Clinical Ethics at Washington University School of Medicine) offers six key steps to strong faculty advising in a November 2018 Nature article. For instance:

Task one: put recurring one-on-one meetings with the members of your group on your calendar. Set up a notebook or spreadsheet and jot down anything you should bring up during these meetings. Set an alert for ten minutes before the appointment to decide how to approach the meeting. Does the team member need encouragement? Career guidance? Feedback on their project and direction for next steps? Are they behind on deadlines or lacking confidence?

Task two: invite people to share both complaints and highlights. Several exemplary scientists explicitly require their trainees to relate a concern or struggle at some point in one-on-one meetings. They want people to be comfortable enough to bring problems and mistakes to light so that issues can be addressed early, while they are still manageable.

Finally, in March 2019, four professors from across disciplines offer “Three research-based lessons to improve your mentoring”:

  1. Approach the power dynamic between mentor and mentee by invoking relevant research. Aspects of mentoring line up with aspects of parenting; to say this is not to infantilize students but rather to acknowledge the power difference as well as (often) the generational difference—and to avoid reinventing the wheel. Research shows the benefits of “authoritativeness, which is defined by both high expectations and high attentiveness; offering a safe haven in times of distress; and fostering a secure base to promote exploration.”
  2. Communicate your confidence in students’ abilities and potential. Again, from the research: “if students think their professors believe that only a few special people have intellectual potential, it can harm their sense of belonging and their performance.”
  3. Model a growth mindset, and “help mentees embrace failure as growth.” One of the authors, Jay J. Van Bavel, shares his unofficial bio alongside his formal one. Some faculty circulate failure CVs.

Where Hopkins Fits In

Here at Hopkins there has been significant conversation around how best to mentor graduate students, particularly since the publication of the National Academies of Science report on sexual and gender harassment in the sciences. In October of 2018, at a Women Faculty Forum event concerning the NAS report, participants (faculty, students, and staff) generated suggestions for how JHU could implement NAS’s recommendation #5: “Diffuse the hierarchical and dependent relationship between trainees and faculty.” You can read notes from that conversation here.

In November 2018, a faculty coffee hour focused solely on the faculty-trainee relationship at Hopkins produced these suggestions.

Meanwhile, there is a new PhD Student Advisory Committee, convened by Vice Provost for Graduate and Professional Education Nancy Kass. Mentorship, inclusivity, professional development, and grad student well-being are among the key topics discussed. From the Hub: “We get these amazing students, and we want them to be productive, and happy, and feel good about what they’re doing, and then be prepared to do really wonderful things afterwards,” Kass says.

As a result of this work, the Doctor of Philosophy Board just passed two new policies: The first requires PhD students and their advisors to have annual conversations about not only research progress but also professional development goals. The second requires each PhD-granting school to distribute our new mentoring guidance and to put in place at least two “supports”—such as workshops, training, mentoring mavens, mentoring awards, and so on.

Finally, Vice Provost Kass also assembled a university-wide PhD Program Directors Retreat in early May. The focus was on PhD professional development and preparedness for non-academic careers. Farouk Dey, Vice Provost for Integrative Learning and Life Design, was the keynote speaker. His overall message to faculty PhD program directors: “Try not to ask [students] ‘What do you want to do?’ Instead ask, ‘What has inspired you lately?’ ‘What action can you take to turn that inspiration into reality and how can I help you with that?’”

Anne-Elizabeth Brodsky, Senior Lecturer
Expository Writing, Johns Hopkins University

Anne-Elizabeth M. Brodsky has taught in the Expository Writing Program since 2007. In addition to teaching “Introduction to Expository Writing,” she has also taught courses on friendship, public education, and race in American literature. A former member of the JHU Diversity Leadership Council, Anne-Elizabeth now serves as co-chair of the Women Faculty Forum at Homewood.

Lunch and Learn: Strategies to Minimize Cheating (A Faculty Brainstorming Session)

On Wednesday, April 17, the Center for Educational Resources (CER) hosted the final Lunch and Learn for the 2018-2019 academic year: Strategies to Minimize Cheating (A Faculty Brainstorming Session). As the title suggests, the format of this event was slightly different from that of past Lunch and Learns. Faculty attendees openly discussed their experiences with cheating as well as possible solutions to the problem. The conversation was moderated by James Spicer, Professor, Materials Science and Engineering, and Dana Broadnax, Director of Student Conduct.

The discussion began with attendees sharing examples of academic misconduct they had identified. Examples included: copying homework, problem solutions, and lab reports; using other students’ clickers; working together on take-home exams; plagiarizing material from Wikipedia (or other sites); and using online solution guides (such as chegg.com, coursehero.com, etc.).

Broadnax presented data from the Office of the Dean of Student Life regarding the numbers of cheating incidents per school, types of violations, and outcomes. She stressed to faculty members how important it is to report incidents to help her staff identify patterns and repeat offenders. If it’s a student’s first offense, faculty are allowed to determine outcomes that do not result in failure of the course, transcript notation, or change to student status. Options include: assigning a zero to the assessment, offering a retake of the assessment, lowering the course grade, or giving a formal warning.  A student’s second or subsequent offense must be adjudicated by a hearing panel (Section D – https://studentaffairs.jhu.edu/policies-guidelines/undergrad-ethics/).

Some faculty shared their reluctance to report misconduct because of the time required to submit a report. Someone else remarked that when reporting, she felt like a prosecutor.  As a longtime ethics board member, Spicer acknowledged the burdens of reporting but stressed the importance of reporting incidents. He also shared that faculty do not act as prosecutors at a hearing. They only provide evidence for the hearing panel to consider. Broadnax agreed and expressed interest in finding ways to help make the process easier for faculty. She encouraged faculty to share more of their experiences with her.

The discussion continued with faculty sharing ideas and strategies they’ve used to help reduce incidents of cheating. A summary follows:

  • Do not assume that students know what is considered cheating. Communicate clearly what is acceptable/not acceptable for group work, independent work, etc. Clearly state on your syllabus or assignment instructions what is considered a violation.
  • Let students know that you are serious about this issue. Some faculty reported their first assignment of the semester requires students to review the ethics board website and answer questions. If you serve or have served on the ethics board, let students know.
  • Include an ethics statement at the beginning of assignment instructions rather than at the end. Research suggests that signing ethics statements placed at the beginning of tax forms rather than at the end reduces dishonest reporting.
  • Do not let ‘low levels’ of dishonesty go without following University protocol – small infractions may lead to more serious ones. The message needs to be that no level of dishonesty is acceptable.
  • Create multiple opportunities for students to submit writing samples (example: submit weekly class notes to Blackboard) so you can get to know their writing styles and recognize possible instances of plagiarism.
  • Plagiarism detection software, such as Turnitin, can be used to flag possible misconduct, but can also be used as an instructional tool to help students recognize when they are unintentionally plagiarizing.
  • Emphasize the point of doing assignments: to learn new material and gain valuable critical thinking skills. Take the time to personally discuss assignments and paper topics with students so they know you are taking their work seriously.
  • If using clickers, send a TA to the back of the classroom to monitor clicker usage. Pay close attention to attendance so you can recognize if a clicker score appears for an absent student.
  • Ban the use of electronic devices during exams if possible. Be aware that Apple Watches can be consulted.
  • Create and hand out multiple versions of exams, but don’t tell students there are different versions. Try not to re-use exam questions.
  • Check restrooms before or during exams to make sure information is not posted.
  • Ask students to move to different seats (such as the front row) if you suspect they are cheating during an exam. If a student becomes defensive, tell him/her that you don’t know for sure whether or not cheating has occurred, but that you would like him/her to move anyway.
  • Make your Blackboard site ‘unavailable’ during exams; turn it back on after everyone has completed the exam.
  • To discourage students from faking illness on exam days, only offer make-ups as oral exams. One faculty member shared that this policy significantly reduced the number of make-ups due to illness in his class.

Several faculty noted the high-stress culture among JHU students and how it may play a part in driving them to cheat. Many agreed that in order to resolve this, we need to create an environment where students don’t feel the pressure to cheat. One suggestion was to avoid curving grades in a way that puts students in competition with each other. Another suggestion was to offer more pass/fail classes. This was met with some resistance, as faculty considered the rigor required in courses that students need for medical school admission. Yet another suggestion was to encourage students to consult with their instructor if they feel the temptation to cheat. The instructor can help address the problem by considering different ways of handling the situation, including offering alternative assessments when appropriate. Broadnax acknowledged the stress, pressure, and competition among students, but also noted that these are not excuses to cheat: “Our students are better served by learning to best navigate those factors and still maintain a standard of excellence.”

Amy Brusini, Senior Instructional Designer
Center for Educational Resources

Image Source: Lunch and Learn Logo

Using Slack in the Classroom

If you aren’t already using it, chances are you have heard of the online communication platform known as Slack. Slack is a cloud-based software program used for project management, information sharing, individual and group communication, and both synchronous and asynchronous collaboration. There are free and paid plans available; the main differences between the plans are the number of messages that remain accessible (10,000 with the free plan) and the number of third-party integrations supported (10 with the free plan). What began in 2013 as a way for two offices of one company to communicate has quickly spread to workplaces worldwide as well as many classrooms.

With the number of existing communication tools already available, you may be wondering how this one differs and why you might consider using it. Slack is organized into ‘channels,’ which are like chat rooms dedicated to specific conversations. Messages posted to a channel can be seen by everyone who subscribes to that channel, or they can be directed to specific individuals and kept private. Unlike traditional chat rooms, which may be hard to follow, Slack supports threading, which allows participants to respond directly to posts within a channel without interrupting the overall flow of conversation. Slack integrates with several third-party services, such as Box, Google Drive, and Dropbox, as well as developer platforms such as GitHub and Bitbucket. It also has a powerful search feature, making it easy to find files and specific topics in cross-channel conversations.
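
For instructors comfortable with a bit of scripting, these integrations extend to Slack’s Web API, which can automate routine postings such as a weekly discussion prompt. The short Python sketch below is an illustration only: it assumes you have created a Slack app with a bot token that has the chat:write scope, and the channel name and message text are placeholders.

    # Minimal sketch: post a weekly discussion prompt to a course channel.
    # Assumes `pip install slack_sdk` and a bot token (chat:write scope)
    # stored in the SLACK_BOT_TOKEN environment variable.
    import os
    from slack_sdk import WebClient
    from slack_sdk.errors import SlackApiError

    client = WebClient(token=os.environ["SLACK_BOT_TOKEN"])

    try:
        response = client.chat_postMessage(
            channel="#week-05-discussion",  # placeholder channel name
            text="Reading prompt: post one question about this week's article by Friday.",
        )
        # Student replies can be threaded under this message's timestamp.
        print("Posted; message timestamp:", response["ts"])
    except SlackApiError as e:
        print("Slack API error:", e.response["error"])

Because the token grants posting rights in your workspace, keep it in an environment variable or other secure location rather than in the script itself.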

Slack was designed with efficiency in mind; as a result, communication tends to be succinct and streamlined. Generally speaking, participants write short, direct messages closer in style to a messaging app, without the ‘formality’ often used when composing an email. While this lack of formality may take some getting used to, many students are already accustomed to this style, which they frequently use in social media apps and when texting. Also unlike email, Slack follows more of an ‘opt-in’ model, where users can join conversations they feel are relevant and ignore those that are not. Notification settings determine how often users are alerted to new messages.

The following is a list of possible ways instructors can use Slack in the classroom:

  • Share information – Create channels for posting announcements, sharing articles, links, relevant content, etc. Students can immediately ask questions or comment on the post which could lead to a dialogue around a specific topic. This may help to engage students in the topic as well as build a sense of community in the class.
  • Manage group projects – Each group can have its own channel to collaborate, share files, and communicate with each other. Instructors can post resources for groups in their specific channels and periodically check in and offer assistance as needed.
  • Crowdsource class notes – Create a channel for students to contribute main ideas from notes taken in class. This could eventually be used to create a study guide.
  • Poll the class – Free polling tools that integrate with Slack can be used to survey students for a variety of reasons, either in real time during class or asynchronously outside of class. Polls can optionally be made anonymous.
  • Include experts ‘in the field’ – Invite subject matter experts and/or those working ‘in the field’ to Slack so they can participate in conversations and answer student questions. JHU instructor Jennifer Bernstein invites former students to stay involved in her Slack channels so that current students can benefit from the perspective of someone who has recently graduated and is now working in the medical profession.
  • Monitor student engagement – Slack provides an optional weekly summary of usage statistics, including charts and graphs showing how many messages were posted, files uploaded, etc.

If you decide to use Slack in a classroom environment, there are some considerations to keep in mind. For example, Slack is not FERPA compliant. Sensitive data such as grades and personal information should not be shared in Slack workspaces. Instructors should be clear with students about what types of conversations are appropriate for Slack and what might be better handled in an email or face-to-face. Another thing to consider is the capabilities available to members (students) who are invited to a Slack workspace. Instructors may be surprised by the permissions and features available to students (e.g., the ability to create their own channels). Therefore, it is recommended that instructors familiarize themselves with Slack’s default permissions before getting started. Finally, it may be worth noting that Slack is not a course management system (Blackboard, Canvas, etc.) and does not contain many of the features available in those systems, such as a gradebook, assignment creator, or rubrics tool. It may, however, provide an interesting alternative means of communication in relevant situations as determined by the instructor.

Amy Brusini, Senior Instructional Designer
Center for Educational Resources

Image sources: Slack logo, Phil Simon: How I Use Slack in the Classroom

Lunch and Learn: Innovative Grading Strategies

On Thursday, February 28, the Center for Educational Resources (CER) hosted the third Lunch and Learn for the 2018-2019 academic year. Rebecca Kelly, Associate Teaching Professor, Earth and Planetary Sciences and Director of the Environmental Science and Studies Program, and Pedro Julian, Associate Professor, Electrical and Computer Engineering, presented on Innovative Grading Strategies.

Rebecca Kelly began the presentation by discussing some of the problems with traditional grading. There is a general lack of clarity about what grades actually mean and how differently they are viewed by students and faculty. Faculty use grades to elicit certain behaviors from students, but eliciting those behaviors doesn’t necessarily mean students are learning. Kelly noted that students, especially those at JHU, tend to be focused on the grade itself, aiming for a specific number rather than the learning; this often results in high levels of student anxiety, something she sees often. She explained that students here get few chances to fail without their grades being negatively affected. As a result, every assessment is a source of stress because it counts toward their grade, and there are too few opportunities for students to learn from their mistakes.

Kelly mentioned additional challenges that faculty face when grading: it is often time consuming, energy draining, and stressful, especially when haggling over points. She makes an effort to provide clearly stated learning goals and rubrics for each assignment, which help, but are not always enough to ease the burden.

Kelly introduced the audience to specifications grading and described how she’s recently started using this approach in Introduction to Geographic Information Systems (GIS). With specifications grading (also described in a recent CER Innovative Instructor article), students are graded pass/fail or satisfactory/unsatisfactory on individual assessments that align directly with learning goals. Course grades are determined by the number of learning goals mastered, as measured by the number of assessments passed. For example, passing 20 or more assignments out of 23 would equate to an A; passing 17-19 assignments would equate to a B. Kelly stresses the importance of maintaining high standards; to preserve rigor, the threshold for passing an assessment should be work at a B level or better.
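
To make the bundle idea concrete, here is a minimal sketch of how passed assessments might map to course grades. Only the A and B thresholds (20 or more, and 17-19, of 23 assignments) come from Kelly’s example; the lower cutoffs and the function name are hypothetical.

    # Illustrative only: map the number of passed assessments to a course grade.
    # The A and B thresholds follow the example in the post; C/D cutoffs are invented.
    def course_grade(passed: int, total: int = 23) -> str:
        if passed >= 20:
            return "A"
        if passed >= 17:
            return "B"
        if passed >= 14:   # hypothetical cutoff
            return "C"
        if passed >= 11:   # hypothetical cutoff
            return "D"
        return "F"

    print(course_grade(21))  # A
    print(course_grade(18))  # B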

In Kelly’s class, students have multiple opportunities to achieve their goals. Each student receives three tokens that he/she can use to re-do an assignment that doesn’t pass, or select a different assignment altogether from the ‘bundle’ of assignments available. Kelly noted the tendency of students to ‘hoard’ their tokens and how it actually works out favorably; instead of risking having to use a token, students often seek out her feedback before turning anything in.

Introduction to GIS has both a lecture and a lab component. The lab requires students to use software to create maps that are then used to perform data analysis. The very specific nature of the assignments in this class lends itself well to the specifications grading approach. Kelly noted that students are somewhat anxious about this approach at first, but settle into it once they fully understand it. In addition to clearly laying out expectations, Kelly lists the learning goals of the course and how they align with each assignment (see slides). She also provides students with a table showing the bundles of assignments required to reach final course grades. Additionally, she distributes a pacing guide to help students avoid procrastination.

The results that Kelly has experienced with specifications grading have been positive. Students generally like it because the expectations are very clear and initial failure does not count against them; there are multiple opportunities to succeed. Grading is quick and easy because of the pass/fail system; if something doesn’t meet the requirements, it is simply marked unsatisfactory. The quality of student work is high because there is no credit for sloppy work. Kelly acknowledged that specifications grading is not ideal for all courses, but feels the grade earned in her GIS course is a true representation of the student’s skill level in GIS.

Pedro Julian described a different grading practice that he is using, something he calls the “extra grade approach.” He currently uses this approach in Digital Systems Fundamentals, a hands-on design course for freshmen. In this course, Julian uses a typical grading scale: 20% for the midterm, 40% for labs and homework, and 40% for the final project. However, he augments the scale by offering another 20% if students agree to put in extra work throughout the semester. How much extra work? Students must commit to working collaboratively with instructors (and other students seeking the 20% credit) for one hour or more per week on an additional project.  This year, the project is to build a vending machine. Past projects include building an elevator out of Legos and building a robot that followed a specific path on the floor.
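
As a rough illustration of the arithmetic, the sketch below combines the stated weights (20% midterm, 40% labs and homework, 40% final project) with the optional 20% for the extra project. How the extra credit is actually combined with the base score, and whether the result is capped, is not described in the post, so those details are assumptions.

    # Illustrative only: weighted course score with an optional "extra grade" project.
    # The 20/40/40 weights come from the post; adding 20% of the extra-project score
    # and capping the total at 100 are assumptions for the sake of the example.
    def final_score(midterm, labs_hw, project, extra_project=None):
        base = 0.20 * midterm + 0.40 * labs_hw + 0.40 * project
        if extra_project is not None:
            base += 0.20 * extra_project
        return min(base, 100.0)

    print(final_score(midterm=75, labs_hw=80, project=85))                    # 81.0
    print(final_score(midterm=75, labs_hw=80, project=85, extra_project=90))  # 99.0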

Julian described how motivated students are to complete the extra project once they commit to putting in the time. Students quickly realize that they learn all sorts of skills they would not have otherwise learned and are very proud and engaged. Student participation in the “extra grade” option has grown steadily since Julian started using this approach three years ago. The first year there were 5-10 students who signed up, and this year there are 30. Julian showed histograms (see slides) of student grades from past semesters in his class and how the extra grade has helped push overall grades higher.  The histograms also show that it’s not just students who may be struggling with the class who are choosing to participate in the extra grade, but “A students” as well.

Similar to Rebecca Kelly’s experience, Julian expressed how grade-focused JHU students are, much to his dismay. In an attempt to take some of the pressure off, he described how he repeatedly tells his students that if they work hard, they will get a good grade; he even includes this phrase in his syllabus. Julian explained how he truly wants students to concentrate more on the learning and not on the grade, which is his motivation behind the “extra grade” approach.

An interesting discussion with several questions from the audience followed the presentations. Below are some of the questions asked and responses given by Kelly and Julian, as well as audience members.

Q: (for Julian) Some students may not have the time or flexibility in their schedule to take part in an extra project. Do you have suggestions for them? Did you consider this when creating the “extra grade” option?

Julian responded that in his experience, freshmen seem to be available. Many of them make time to come in on the weekends. He wants students to know he’s giving them an “escape route,” a way for them to make up their grade, and they seem to find the time to make it happen.  Julian has never had a student come to him saying he/she cannot participate because of scheduling conflicts.

Q: How has grade distribution changed?

Kelly remarked on how motivated the students are; this past semester she had no Cs, very few Bs, and the rest As. She expressed how important it is to make sure that the A is attainable for students. She feels confident that she’s had enough experience to know what counts as an A. Every student can do it; the question is, will they?

Q: (for Kelly) Would there ever be a scenario where students would do the last half of the goals and skip the first half?

Kelly responded that she has never seen anyone jump over everything and that it makes more sense to work sequentially.

Q: (for Kelly) Is there detailed feedback provided when students fail an assignment?

Kelly commented that it depends on the assignment, but if students don’t follow the directions, that’s the feedback – to follow the directions. If it’s a project, Kelly will meet with the student, go over the assignment, and provide immediate feedback. She noted that she finds oral feedback much more effective than written feedback.

Q: (for Kelly) Could specs grading be applied in online classes?

Kelly responded that she thinks this approach could definitely be used in online classes, as long as feedback could be provided effectively. She also stressed the need for rubrics, examples, and clear goals.

Q: Has anyone tried measuring individual learning gains within a class? What skills are students coming in with? Are we actually measuring gain?

Kelly commented that specifications grading works as a complement to competency-based grading, which focuses on measuring gains in very specific skills.

Julian commented that this issue comes up in his class, with students coming in with varying degrees of experience. He stated that this is another reason to offer the extra credit: to keep things interesting for those who want to move at a faster pace.

The discussion continued among presenters and audience members about what students are learning in a class vs. what they are bringing in with them. A point was raised that if students already know the material in a class, should they even be there? Another comment questioned whether it is even an instructor’s place to determine what students already know. Additional comments were made about what grades mean and concerns about grades being used for different purposes, e.g., employers looking for specific skills, instructors writing recommendation letters, etc.

Q: Could these methods be used in group work?

Kelly responded that with specifications grading, you would have to find a way to evaluate the group. It might be possible to still score on an individual basis within the group, but it would depend on the goals. She mentioned peer evaluations as a possibility.

Julian stated that all grades are based on individual work in his class. He does use groups in a senior level class that he teaches, but students are still graded individually.

The event concluded with a discussion about how using “curve balls” – intentionally difficult questions designed to catch students off-guard – on exams can lead to challenging grading situations. For example, to ultimately solve a problem, students would need to first select the correct tools before beginning the solution process. Some faculty were in favor of including this type of question on exams, while others were not, noting the already high levels of exam stress.  A suggestion was made to give students partial credit for the process even if they don’t end up with the correct answer. Another suggestion was to give an oral exam in order to hear the student’s thought process as he/she worked through the challenge. This would be another way for students to receive partial credit for their ideas and effort, even if the final answer was incorrect.

Amy Brusini, Senior Instructional Designer
Center for Educational Resources

Image Sources: Lunch and Learn Logo, slide from Kelly presentation

An Evidence-based Approach to Effective Studying

Dr. Culhane is Professor and Chair of the Department of Pharmaceutical Sciences at Notre Dame of Maryland University School of Pharmacy.

If you are like me, much of your time is spent ensuring that the classroom learning experience you provide for your students is stimulating, interactive, and impactful. But how invested are we in ensuring that what students do outside of class is productive? Based on my anecdotal experience and several studies1,2,3 looking at study strategies employed by students, the answer to this question is: not nearly enough! Much like professional athletes or musicians, our students are asked to perform at a high level, mastering advanced, information-dense subjects; yet unlike these specialists, who have spent years honing the skills of their craft, very few students have had any formal training in the basic skills necessary to learn successfully. It should be no surprise to us that when left to their own devices, our students tend to mismanage their time, fall victim to distractions, and gravitate toward low-impact or inefficient learning strategies. Even if students are familiar with high-impact strategies and how to use them, it is easy for them to default back to bad habits, especially when they are overloaded with work and pressed for time.

Several years ago, I began to seriously think about and research this issue in hopes of developing an evidence-based process that would be easy for students to learn and implement. Out of this work I developed a strategy focused on the development of metacognition – thinking about how one learns. I based it on extensively studied, high-impact learning techniques, including distributed learning, self-testing, interleaving, and application practice.4 I call this strategy the S.A.L.A.M.I. method. The method is named after a metaphor used by one of my graduate school professors. He argued that learning is like eating a salami. If you eat the salami one slice at a time, rather than trying to eat the whole salami in one sitting, the salami is more likely to stay with you. Many readers will see that this analogy represents the effectiveness of distributed learning over the “binge and purge” method toward which many of our students gravitate.

S.A.L.A.M.I. is a “backronym” for Systematic Approach to Learning And Metacognitive Improvement. The method is structured around typical, daily learning experiences that I refer to as the five S.A.L.A.M.I. steps:

  1. Pre-class preparation
  2. In-class engagement
  3. Post-class review
  4. Pre-exam preparation
  5. Post-assessment review

When teaching the S.A.L.A.M.I. method, I explain how each of the five steps corresponds to different “stages” or components of learning (see Figure 1). Through mastery of the skills associated with each of the five S.A.L.A.M.I. steps, students can more efficiently and effectively master a subject area.

Figure 1: The five S.A.L.A.M.I. steps and the stages of learning to which they correspond.

Despite its simplicity, this model provides a starting point to help students understand that learning is a process that takes time, requires the use of different learning strategies, and can benefit from the development of metacognitive awareness. Specific techniques designed to enhance metacognition and learning are employed during each of the five steps, helping students use their time effectively, maximize learning, and achieve subject mastery. Describing all the tools and techniques recommended for each of the five steps would be beyond the scope of this post, but I would like to share two that I have found useful for helping students evaluate the effectiveness of their learning and make data-driven changes to their study strategies.

Let us return to our example of professional athletes and musicians: these individuals maintain high levels of performance by consistently monitoring and evaluating the efficacy of their practice as well as reviewing their performance after games or concerts. If we translate this example to an academic environment, the practice or rehearsal becomes student learning (in and out of class) and the game or concert becomes the assessment. We often evaluate students’ formative or summative “performances” with grades and written or verbal feedback. But what type of feedback do we give them to help improve the efficacy of their preparation for those “performances”? If we do give them feedback about how to improve their learning process, is it evidence-based and directed at improving metacognition, or do we simply tell them they need to study harder or join a study group in order to improve their learning? I would contend that we could do more to help students evaluate their approach to learning outside of class and their examination performance. This is where a pre-exam checklist and an exam wrapper can be helpful.

The inspiration for the pre-exam checklist came from the pre-flight checklist a pilot friend of mine uses to ensure that he and his private aircraft are ready for flight. I decided to develop a similar tool for my students that would allow them to monitor and evaluate the effectiveness of their preparation for upcoming assessments. The form is based on a series of reflective questions that help students think about the effectiveness of their daily study habits. If used consistently over time and evaluated by a knowledgeable faculty member or learning specialist, this tool can help students make sustainable, data-driven changes in their approach to learning.

Another tool that I use is called an exam wrapper. There are many examples of exam wrappers online; however, I developed my own wrapper based on the different stages or components of learning shown in Figure 1. The S.A.L.A.M.I. wrapper is divided into five sections. Three of the five sections focus on the following stages or components of learning: understanding and building context, consolidation, and application. The remaining two sections focus on exam skills and environmental factors that may impact performance. Under each of the five sections is a series of statements that describe possible reasons for missing an exam question. The student analyzes each missed question and matches one or more of the statements on the wrapper to it. Based on the results of the analysis, the student can identify the components of learning, exam skills, or environmental factors that they are struggling with and begin to take corrective action. Both the pre-exam checklist and the exam wrapper can be used to help “diagnose” the learning issues that academically struggling students may be experiencing.
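
For readers who track wrapper responses electronically, a simple tally by section can make the pattern across an exam visible at a glance. The sketch below is hypothetical: the five section names follow the wrapper described above, but the sample data and the way statements are recorded are invented for illustration.

    # Illustrative only: tally exam-wrapper statements by S.A.L.A.M.I. section.
    # Section names follow the post; the sample missed-question data are invented.
    from collections import Counter

    SECTIONS = [
        "understanding and building context",
        "consolidation",
        "application",
        "exam skills",
        "environmental factors",
    ]

    # Each missed question is matched to one or more wrapper statements,
    # recorded here simply by the section those statements belong to.
    missed_questions = {
        "Q3": ["consolidation"],
        "Q7": ["application", "exam skills"],
        "Q9": ["consolidation", "environmental factors"],
    }

    tally = Counter(section for tags in missed_questions.values() for section in tags)
    for section in SECTIONS:
        print(f"{section}: {tally.get(section, 0)}")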

Two of the most common issues that I diagnose involve illusions of learning.5 Students who suffer from the ‘illusion of knowledge’ often mistake their understanding of a topic for mastery. These students anticipate getting a high grade on an assessment but end up frustrated and confused when they receive a much lower grade than expected. Information from the S.A.L.A.M.I. wrapper can help them realize that although they may have understood the concept being taught, they could not effectively recall important facts and apply them. Students who suffer from the ‘illusion of productivity’ often spend extensive time preparing for an exam; however, the techniques they use are extremely passive. Commonly used passive study strategies include highlighting, recopying and re-reading notes, or listening to audio/video recordings of lectures in their entirety. The pre-exam checklist can help students identify the learning strategies they are using and reflect on their effectiveness. When I encounter students favoring passive learning strategies, I use the analogy of trying to dig a six-foot-deep hole with a spoon: “You will certainly work hard for hours moving dirt with a spoon, but you would be a lot more productive if you learned how to use a shovel.” The shovel in this case represents adopting strategies such as distributed practice, self-testing, interleaving, and application practice.

Rather than relying on anecdotal advice from classmates or old habits that are no longer working, students should seek help early, consistently practice effective and efficient study strategies, and remember that digesting information (e.g., a S.A.L.A.M.I.) in small doses is always more effective at ‘keeping the information down’ so it may be applied and utilized successfully later.

  1. Kornell, N., Bjork, R. The promise and perils of self-regulated study. Psychon Bull Rev. 2007;14 (2): 219-224.
  2. Karpicke, J. D., Butler, A. C., & Roediger, H. L. Metacognitive strategies in student learning: Do students practice retrieval when they study on their own? Memory. 2009; 17: 471– 479.
  3. Persky, A.M., Hudson, S. L. A snapshot of student study strategies across a professional pharmacy curriculum: Are students using evidence-based practice? Curr Pharm Teach Learn. 2016; 8: 141-147.
  4. Dunlosky, J., Rawson, K.A., Marsh, E.J., Nathan, M.J., Willingham, D.T. Improving Students’ Learning With Effective Learning Techniques: Promising Directions From Cognitive and Educational Psychology. Psychol Sci Publ Int. 2013; 14 (1): 4-58.
  5. Koriat, A., & Bjork, R. A. Illusions of competence during study can be remedied by manipulations that enhance learners’ sensitivity to retrieval conditions at test. Memory & Cognition. 2006; 34: 959-972.

James M. Culhane, Ph.D.
Chair and Professor, School of Pharmacy, Notre Dame of Maryland University