Designing Intelligent Review Forms for Peer Assessment: A Data-Driven Approach

This evidence-based practice paper employs a data-driven, explainable, and scalable approach to the development and application of an online peer review system in computer science and engineering courses. Crowd-sourced grading through peer review is an effective evaluation methodology that (1) allows the use of meaningful assignments in large or online classes (e.g., assignments other than true/false, multiple choice, or short answer), (2) fosters learning and critical thinking when a student evaluates another's work, and (3) provides a defensible and unbiased score through the wisdom of the crowd. Although peer review is widely used, to the best of the authors' knowledge, the review form and its associated grading process have never been subjected to data-driven analysis and design. We present a novel, iterative approach that first gathers the most appropriate review form questions through intelligent data mining of past student reviews. During this process, key words and ideas are collected into positive and negative sentiment dictionaries, a flag-word dictionary, and a negate-word dictionary. Next, we refine our grading algorithm using simulations and perturbation to determine its robustness (measured by the standard deviation within a section). Using the dictionaries, we leverage sentiment extracted from review comments as a quality-assurance mechanism to generate a crowd comment "grade," which supplements the weighted average of the other review form sections. The result of this semi-automated, innovative process is a peer assessment package (an intelligently designed review form and a robust grading algorithm leveraging crowd sentiment), grounded in actual student work, that an educator can use to confidently assign and grade meaningful open-ended assignments in a class of any size.
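
The paper itself does not publish its dictionaries or scoring function; as a rough illustration of the dictionary-based comment scoring described above, the following Python sketch applies positive, negative, negate, and flag dictionaries to a single review comment. Every word list, the one-token negation rule, and the mapping onto a 0-1 scale are assumptions made here for illustration, not the authors' published method.

# Illustrative sketch of dictionary-based comment scoring. All word lists,
# the negation rule, and the 0-1 scale mapping are assumptions, not the
# paper's actual dictionaries or algorithm.

POSITIVE = {"clear", "thorough", "correct", "organized", "insightful"}
NEGATIVE = {"unclear", "missing", "wrong", "confusing", "incomplete"}
NEGATE   = {"not", "never", "no", "hardly"}          # flips the next sentiment word
FLAGS    = {"plagiarism", "copied", "offensive"}     # route the review to the instructor

def score_comment(comment: str) -> tuple[float, bool]:
    """Return (sentiment score in [0, 1], flagged?) for one review comment."""
    tokens = comment.lower().split()
    flagged = any(t in FLAGS for t in tokens)
    score, hits, negate = 0.0, 0, False
    for t in tokens:
        if t in NEGATE:
            negate = True
            continue
        polarity = 1 if t in POSITIVE else -1 if t in NEGATIVE else 0
        if polarity:
            score += -polarity if negate else polarity
            hits += 1
        negate = False  # negation applies only to the immediately following word
    if hits == 0:
        return 0.5, flagged                    # no sentiment evidence: neutral
    return (score / hits + 1) / 2, flagged     # map [-1, 1] onto [0, 1]

print(score_comment("The design is not clear and the tests are missing"))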
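
Similarly, the weighted-average grade and the perturbation-based robustness check might be sketched as follows. The section names, weights, perturbation magnitude, and trial count are all hypothetical; the abstract measures robustness as the standard deviation within a course section, which this toy version only approximates by perturbing the section scores of a single review.

import random
import statistics

# Hypothetical section weights; the paper's actual form sections and
# weights are not given in the abstract.
WEIGHTS = {"correctness": 0.4, "design": 0.3, "style": 0.1, "comments": 0.2}

def assignment_grade(section_scores: dict[str, float]) -> float:
    """Weighted average of rubric-section scores (each in [0, 1])."""
    return sum(WEIGHTS[s] * v for s, v in section_scores.items())

def robustness(section_scores: dict[str, float],
               noise: float = 0.05, trials: int = 1000) -> float:
    """Std. dev. of the grade under small random perturbations of each section."""
    grades = []
    for _ in range(trials):
        perturbed = {s: min(1.0, max(0.0, v + random.uniform(-noise, noise)))
                     for s, v in section_scores.items()}
        grades.append(assignment_grade(perturbed))
    return statistics.stdev(grades)

scores = {"correctness": 0.9, "design": 0.8, "style": 0.7, "comments": 0.85}
print(f"grade={assignment_grade(scores):.3f}  sd={robustness(scores):.4f}")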

Z Beasley, L Piegl, P Rosen
ASEE Annual Conference & Exposition, 2019