DOI: 10.1145/3231644.3231664

How much randomization is needed to deter collaborative cheating on asynchronous exams?

Published: 26 June 2018

Abstract

This paper investigates randomization on asynchronous exams as a defense against collaborative cheating. Asynchronous exams are those for which students take the exam at different times, potentially across a multi-day exam period. Collaborative cheating occurs when one student (the information producer) takes the exam early and passes information about it to other students (the information consumers) who take the exam later. Using a dataset of computerized exam and homework problems from a single course with 425 students, we identified 5.5% of students (on average) as information consumers based on their disproportionate studying of the problems that appeared on the exam. These information consumers ("cheaters") had a significant advantage (13 percentage points on average) when every student was given the same exam problem, even when its parameters were randomized for each student, but that advantage dropped to almost negligible levels (2--3 percentage points) when students were given a random problem from a pool of two or four problems. We conclude that randomization with pools of four (or even three) problems, which also contain randomized parameters, is an effective mitigation for collaborative cheating. Our analysis suggests that this mitigation is explained in part by cheating students having less complete information about larger pools.
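The pool-based randomization described above can be sketched in a few lines: each student draws one problem variant from a small pool, and that variant's numeric parameters are themselves randomized per student. The function name, template structure, and parameter ranges below are illustrative assumptions, not the authors' actual implementation:

```python
import random

def generate_exam_problem(pool, rng):
    """Pick one problem template from the pool, then randomize its parameters."""
    template = rng.choice(pool)
    params = {name: rng.randint(lo, hi)
              for name, (lo, hi) in template["params"].items()}
    return {"prompt": template["prompt"].format(**params), "params": params}

# A pool of four variants: a solution leaked by an early test-taker only
# helps the roughly 1-in-4 later students who draw that same variant, and
# per-student parameter randomization further limits verbatim answer sharing.
pool = [
    {"prompt": "Compute {a} + {b}.",  "params": {"a": (1, 9),   "b": (1, 9)}},
    {"prompt": "Compute {a} - {b}.",  "params": {"a": (10, 20), "b": (1, 9)}},
    {"prompt": "Compute {a} * {b}.",  "params": {"a": (2, 9),   "b": (2, 9)}},
    {"prompt": "Compute {a} // {b}.", "params": {"a": (10, 99), "b": (2, 9)}},
]

rng = random.Random()  # in practice, seed per student for reproducible exams
problem = generate_exam_problem(pool, rng)
print(problem["prompt"])
```

Note the design intuition this sketch makes concrete: an information consumer who learns about one variant gains nothing on the other pool members, so the expected advantage shrinks roughly in proportion to pool size.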




Published In

L@S '18: Proceedings of the Fifth Annual ACM Conference on Learning at Scale
June 2018, 391 pages
ISBN: 9781450358866
DOI: 10.1145/3231644

Publisher

Association for Computing Machinery

New York, NY, United States


Author Tags

  1. asynchronous exams
  2. collaborative cheating
  3. computerized testing
  4. problem randomization

Qualifiers

  • Research-article

Conference

L@S '18: Fifth ACM Conference on Learning @ Scale
June 26-28, 2018
London, United Kingdom

Acceptance Rates

L@S '18 Paper Acceptance Rate: 24 of 58 submissions, 41%
Overall Acceptance Rate: 117 of 440 submissions, 27%

Cited By

  • (2024) Plagiarism in the Age of Generative AI: Cheating Method Change and Learning Loss in an Intro to CS Course. Proceedings of the Eleventh ACM Conference on Learning @ Scale, 75-85. DOI: 10.1145/3657604.3662046. Online publication date: 9-Jul-2024.
  • (2024) A Generalized Framework for Describing Question Randomization. Proceedings of the 55th ACM Technical Symposium on Computer Science Education V. 2, 1736-1737. DOI: 10.1145/3626253.3635599. Online publication date: 14-Mar-2024.
  • (2024) Comparing the Security of Three Proctoring Regimens for Bring-Your-Own-Device Exams. Proceedings of the 55th ACM Technical Symposium on Computer Science Education V. 1, 429-435. DOI: 10.1145/3626252.3630809. Online publication date: 7-Mar-2024.
  • (2023) The Generation of a Large Bank of Randomized Questions in a Discrete Structures Course. Journal of Computing Sciences in Colleges 39(2), 10-18. DOI: 10.5555/3636971.3636972. Online publication date: 1-Oct-2023.
  • (2023) Who's Cheating Whom. Proceedings of the 54th ACM Technical Symposium on Computer Science Education V. 2, 1210-1211. DOI: 10.1145/3545947.3569609. Online publication date: 1-Mar-2023.
  • (2023) Beyond Question Shuffling: Randomization Techniques in Programming Assessment. 2023 IEEE Frontiers in Education Conference (FIE), 1-9. DOI: 10.1109/FIE58773.2023.10342976. Online publication date: 18-Oct-2023.
  • (2023) Issues of Question Equivalence in Online Exam Pools. Journal of College Science Teaching 52(4), 3-5. DOI: 10.1080/0047231X.2023.12290629. Online publication date: 30-Aug-2023.
  • (2022) Exam Time. Proceedings of the 53rd ACM Technical Symposium on Computer Science Education V. 2, 1138. DOI: 10.1145/3478432.3499123. Online publication date: 3-Mar-2022.
  • (2022) Are We Fair? Proceedings of the 53rd ACM Technical Symposium on Computer Science Education - Volume 1, 647-653. DOI: 10.1145/3478431.3499388. Online publication date: 22-Feb-2022.
  • (2022) Lessons Learned from Asynchronous Online Assessment Formats in CS0 and CS3. Proceedings of the 53rd ACM Technical Symposium on Computer Science Education - Volume 1, 640-646. DOI: 10.1145/3478431.3499386. Online publication date: 22-Feb-2022.
