Balancing Rigor & Compassion in Assessment During Times of Disruption
During times of uncertainty or remote instruction due to disruption, final exams or assessments may feel even more important, and potentially more fraught, than usual. On the one hand, we seek to uphold the high standards of an Elon education and to ensure that our students are well prepared for future courses, internships or jobs, and everyday lives that may require the skills and knowledge we’ve worked so hard to help them develop. On the other, we know that uncertainty and fear undermine our students’ cognitive and emotional resources, that students may struggle during remote instruction to find quiet spaces and high-speed internet connections, and that both may exacerbate student mental health challenges. So, what’s a conscientious instructor to do?
Luckily, the apparent tension between rigor and compassion may be illusory. High levels of subjective stress and test anxiety are inversely correlated with both learning and performance (Gino, 2016; Friend, 1982; Musch & Bröder, 1999; Karatas et al., 2013): The more stressed out a student is while taking a final exam or assessment, the less likely they are to perform at their true capacity, and the less likely they are to experience the exam as a final opportunity to deepen their understanding of your discipline. By designing final assessments that measure our learning objectives and align with the characteristics of the future contexts in which students will need to apply their knowledge, we ensure that we are responsibly preparing our students. By creating conditions that reduce stress and test anxiety, we both create a more compassionate experience and get a better measure of student learning.
In Support of Rigor: Aligning Finals with Learning Outcomes and Authentic Future Conditions
The learning outcomes or objectives we set for our students can be a valuable guide to help us decide what we must ask of our students on their final exams. What learning outcomes have you already seen students accomplish this semester? We might include all of those again on a cumulative final to ensure that the learning has “stuck.” Or, we might instead choose to focus this semester’s finals on the learning objectives we haven’t yet assessed.
Categorizing our learning objectives on Bloom’s taxonomy (Krathwohl & Anderson, 2009; Eberly Center, n.d.) can help us make intentional decisions about which assessment types are likely to measure that type of knowledge well. For example, if your hope is that students will “remember” or “understand” facts or concepts, multiple choice, fill-in-the-blank, or matching formats are likely to fit. If you want students to “analyze,” short answer or essay questions that set up new scenarios or case studies (or ask students to identify their own) might be in order. If “create” is the goal, consider a project that challenges students to take what they’ve learned and produce some output that is meaningful in the context of your discipline and that shows you (and them) what they can do.
We might also tailor our final to replicate the authentic conditions in which students will need the knowledge and skills they have developed during our courses. In future internships or jobs, students can look up a fact or formula they don’t “remember”; instead, they may need practice using information from your course to identify relevant approaches (“analyze”) or make recommendations (“evaluate”). Professional work in many fields includes substantial work in teams, suggesting the value of collaborative finals. In downstream classes, students will need to recognize the applicability of knowledge and skills from your course to that new material; questions you ask now about the possible future uses they might envision for their learning can help prime them to do so effectively.
In the Name of Compassion (and Accurate Assessment): Reducing Stress and Test Anxiety
The following approaches provide a menu of options that you might choose from to help reduce test anxiety and the acute stress some students may experience.
Transparency
Providing clear and detailed information to students about what to expect on the final, well in advance, helps minimize their experience of uncertainty and anxiety, allowing them to focus on meaningful preparation. What is the purpose of the final, both in terms of the course learning objectives and in terms of the skills and knowledge you imagine they might develop for future academic and non-academic use (Winkelmes et al., 2016)? What will students need to do to complete the task (number of questions of which types; individual vs. group; pitfalls to avoid; and logistical and cognitive steps they should plan to follow)? By what criteria will you evaluate their success (often established by providing a rubric or list of criteria, and potentially having students evaluate an example of a similar piece of student work to understand the rubric)?
Choice and control
Allowing students some element of choice or control in the final exam can also help reduce their experience of stress and enhance their motivation (Gino, 2016; Ushioda, 2011; Skinner, 1996). Choice and control can come in many forms: Might students have the freedom to pick which questions to answer from a set of possibilities? To select the format for a final project? To take the exam at a time when their home is likely to be quiet and they are less likely to experience bandwidth limitations or interruptions? Consider mentioning elements of choice and control when describing the final ahead of time, so that students know what to expect.
Alternative grading schemes
Many different alternatives to traditional grading schemes can help lower students’ grade anxiety and allow them to focus more on learning and on demonstrating that learning accurately. Examples range from simple structural approaches, such as weighting students’ lowest assessment less heavily than their others in the final grade calculation, to specifications grading approaches that allow students to decide which grade they want and work toward an established set of specifications required to earn it (Hall, 2018). Current research suggests we might choose to avoid more traditional practices such as grading on a bell curve, which pits students against one another, intensifies the experience of stereotype threat for individuals from student populations traditionally underrepresented in higher education, and ultimately undermines students’ motivation to study (Dubey & Geanakoplos, 2010; Grant, 2016).
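To make the first structural approach concrete, here is a minimal sketch of how down-weighting the lowest assessment changes a final grade calculation. The scores and the half-weight are hypothetical, purely for illustration; any gradebook tool or spreadsheet could implement the same arithmetic.

```python
def weighted_final_grade(scores, low_weight=0.5):
    """Average assessment scores, counting the single lowest score
    at a reduced weight (a hypothetical 0.5 by default)."""
    lowest = min(scores)
    rest = list(scores)
    rest.remove(lowest)          # drop one copy of the lowest score
    total = sum(rest) + low_weight * lowest
    weight_sum = len(rest) + low_weight
    return total / weight_sum

# A student with one rough exam week: scores of 90, 85, and 60.
# Plain average: (90 + 85 + 60) / 3 ≈ 78.3
# Down-weighted: (90 + 85 + 0.5 * 60) / 2.5 = 82.0
print(round(weighted_final_grade([90, 85, 60]), 1))
```

One bad day thus costs the student a few points rather than a letter grade, which is exactly the anxiety-reducing property the structural approach aims for.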
Check your tech
In times of emergency remote learning, students may still be learning new technologies, and you might be especially selective about the technologies used for your final. If using a Moodle Quiz, preview it to ensure that any figures are clear. If your final requires students to type special characters, review available technologies for widespread accessibility and reliability (or default to having students write handwritten answers and upload photos from their phones). Colleagues at Elon’s Teaching and Learning Technologies can help you identify the best-fit technologies for your final.
A Final Note: Maintaining Academic Integrity
Especially with online finals, we might be tempted to administer multiple choice, fill-in-the-blank, matching, and short answer questions using a timed Moodle quiz to reduce the chances of honor code violations. However, setting time limits is likely to ratchet up student anxiety and diminish performance, even under the best possible conditions (that no bandwidth hiccups cause Moodle to reset, that all students requiring extra time receive it, that their home conditions are quiet enough to allow focus, and that everyone is in a time zone where taking the final at that hour makes sense). Instead, consider including an honor pledge as the first question on the assessment, reminding students of your specific expectations for this assessment and how they align with the university’s standards for academic integrity. Combined with taking the time to write your own unique assessment questions and emphasizing to students the durable knowledge and skills they will practice on the final and the ways those skills and knowledge will serve them in the future, this approach can substantially reduce any temptation to cheat.
Baddeley, A., & Hitch, G. J. (2010). Working memory. Scholarpedia, 5(2), 3015.
Dubey, P., & Geanakoplos, J. (2010). Grading exams: 100, 99, 98, … or A, B, C? Games and Economic Behavior, 69(1), 72–94.
Eberly Center (n.d.). Why should assessments, learning objectives, and instructional strategies be aligned? Retrieved from https://www.cmu.edu/teaching/assessment/basics/alignment.html.
Friend, K. E. (1982). Stress and performance: Effects of subjective work load and time urgency. Personnel Psychology, 35(3), 623–633.
Gino, F. (2016, April 14). Are you too stressed to be productive? Or not stressed enough? Harvard Business Review. Retrieved from https://hbr.org/2016/04/are-you-too-stressed-to-be-productive-or-not-stressed-enough
Grant, A. (2016). Why we should stop grading students on a curve. The New York Times.
Hall, M. (2018, April 11). What is specifications grading and why should you consider using it? The Innovative Instructor Blog, Johns Hopkins University.
Karatas, H., Alci, B., & Aydin, H. (2013). Correlation among high school senior students’ test anxiety, academic performance and points of university entrance exam. Educational Research and Reviews, 8(13), 919–926.
Krathwohl, D. R., & Anderson, L. W. (2009). A taxonomy for learning, teaching, and assessing: A revision of Bloom’s taxonomy of educational objectives. Longman.
Musch, J., & Bröder, A. (1999). Test anxiety versus academic skills: A comparison of two alternative models for predicting performance in a statistics exam. British Journal of Educational Psychology, 69(1), 105–116.
Skinner, E. A. (1996). A guide to constructs of control. Journal of Personality and Social Psychology, 71(3), 549.
Ushioda, E. (2011). Why autonomy? Insights from motivation theory and research. Innovation in Language Learning and Teaching, 5(2), 221-232.
Winkelmes, M. A., Bernacki, M., Butler, J., Zochowski, M., Golanics, J., & Weavil, K. H. (2016). A teaching intervention that increases underserved college students’ success. Peer Review, 18(1/2), 31-36.