How Does a Data-Informed Deliberate Change in Learning Design Impact Students’ Self-Regulated Learning Tactics?

Authors

Z. Chen, T. Zhang, & M. Taub

DOI:

https://doi.org/10.18608/jla.2024.8083

Keywords:

learning analytics, learning design, process mining, self-regulated learning, mastery-based learning, research paper

Abstract

The current study measures the extent to which students’ self-regulated learning tactics and learning outcomes change as a result of a deliberate, data-driven improvement in the learning design of mastery-based online learning modules. In the original design, students were required to attempt the assessment once before being allowed to access the learning material. The improved design gave students the choice to skip the first attempt and access the learning material directly. Student learning tactics were measured using a multi-level clustering and process mining algorithm, and a quasi-experimental design was implemented to remove or reduce differences in extraneous factors, including the content being covered, the time of implementation, and naturally occurring fluctuations in student learning tactics. The analysis suggests that most students who chose to skip the first attempt were effectively self-regulating their learning and were thus successful in learning from the instructional materials. Students who would have failed the first attempt were much more likely to skip it than those who would have passed it. The new design also resulted in small improvements in learning outcomes and median learning time. The study demonstrates the creation of a closed loop between learning design and learning analytics: first, using learning analytics to inform improvements to the learning design, then assessing the effectiveness and impact of those improvements.

References

Alexandron, G., Ruipérez-Valiente, J. A., Chen, Z., Muñoz-Merino, P. J., & Pritchard, D. E. (2017). Copying@scale: Using harvesting accounts for collecting correct answers in a MOOC. Computers & Education, 108, 96–114. https://doi.org/10.1016/j.compedu.2017.01.015

Azevedo, R., & Taub, M. (2020). The challenge of measuring processes and outcomes while learning from multiple representations with advanced learning technologies. In P. Van Meter, A. List, D. Lombardi, & P. Kendeou (Eds.), Handbook of learning from multiple representations and perspectives. Routledge. https://doi.org/10.4324/9780429443961-34

Barkley, A., & Coffey, B. K. (2018). An economic model of student learning. Journal of Agricultural and Applied Economics, 50(4), 503–525. https://doi.org/10.1017/aae.2018.13

Bloom, B. S. (1968). Learning for mastery. Evaluation Comment, 1(2).

Chen, Z. (2022). Measuring the level of homework answer copying during COVID-19 induced remote instruction. Physical Review Physics Education Research, 18(1), 010126. https://doi.org/10.1103/PhysRevPhysEducRes.18.010126

Chen, Z., Garrido, G., Berry, Z., Turgeon, I., & Yonekura, F. (2018). Designing online learning modules to conduct pre- and post-testing at high frequency. 2017 Physics Education Research Conference Proceedings, 26–27 July 2017, Cincinnati, OH, USA (pp. 84–87). https://doi.org/10.1119/perc.2017.pr.016

Chen, Z., Xu, M., Garrido, G., & Guthrie, M. W. (2020). Relationship between students’ online learning behavior and course performance: What contextual information matters? Physical Review Physics Education Research, 16(1), 010138. https://doi.org/10.1103/PhysRevPhysEducRes.16.010138

Craig, P., Katikireddi, S. V., Leyland, A., & Popham, F. (2017). Natural experiments: An overview of methods, approaches, and contributions to public health intervention research. Annual Review of Public Health, 38, 39–56. https://doi.org/10.1146/annurev-publhealth-031816-044327

Dunlosky, J., Baker, J. M. C., Rawson, K. A., & Hertzog, C. (2006). Does aging influence people’s metacomprehension? Effects of processing ease on judgments of text learning. Psychology and Aging, 21(2), 390–400. https://doi.org/10.1037/0882-7974.21.2.390

Fan, Y., Matcha, W., Uzir, N. A., Wang, Q., & Gašević, D. (2021). Learning analytics to reveal links between learning design and self-regulated learning. International Journal of Artificial Intelligence in Education, 31(4), 980–1021. https://doi.org/10.1007/s40593-021-00249-z

Felker, Z., & Chen, Z. (2023). Reducing procrastination on introductory physics online homework for college students using a planning prompt intervention. Physical Review Physics Education Research, 19(1), 010123. https://doi.org/10.1103/PhysRevPhysEducRes.19.010123

Foster, N. L., Was, C. A., Dunlosky, J., & Isaacson, R. M. (2017). Even after thirteen class exams, students are still overconfident: The role of memory for past exam performance in student predictions. Metacognition and Learning, 12(1), 1–19. https://doi.org/10.1007/s11409-016-9158-6

Gower, J. C. (1971). A general coefficient of similarity and some of its properties. Biometrics, 27(4), 857–871. https://doi.org/10.2307/2528823

Greene, J. A., & Azevedo, R. (2009). A macro-level analysis of SRL processes and their relations to the acquisition of a sophisticated mental model of a complex system. Contemporary Educational Psychology, 34(1), 18–29. https://doi.org/10.1016/j.cedpsych.2008.05.006

Guthrie, M. W., & Chen, Z. (2019a). Adding duration-based quality labels to learning events for improved description of students’ online learning behavior. In C. F. Lynch, A. Merceron, M. Desmarais, & R. Nkambou (Eds.), Proceedings of the 12th International Conference on Educational Data Mining (EDM2019), 2–5 July 2019, Montréal, Quebec, Canada (pp. 560–563). International Educational Data Mining Society. https://ssrn.com/abstract=3522718

Guthrie, M. W., & Chen, Z. (2019b). Comparing student behavior in mastery and conventional style online physics homework. 2019 Physics Education Research Conference Proceedings, 24–25 July 2019, Provo, UT, USA (pp. 190–195). https://www.per-central.org/items/detail.cfm?ID=15275

Hellinger, E. (1909). Neue Begründung der Theorie quadratischer Formen von unendlichvielen Veränderlichen [New foundation of the theory of quadratic forms in infinitely many variables]. Journal für die Reine und Angewandte Mathematik, 136, 210–271. http://eudml.org/doc/149313

Holm, S. (1979). A simple sequentially rejective multiple test procedure. Scandinavian Journal of Statistics, 6(2), 65–70. http://www.jstor.org/stable/4615733

Holmes, W., Nguyen, Q., Zhang, J., Mavrikis, M., & Rienties, B. (2019). Learning analytics for learning design in online distance learning. Distance Education, 40(3), 309–329. https://doi.org/10.1080/01587919.2019.1637716

Huber, K., & Bannert, M. (2023). Investigating learning processes through analysis of navigation behavior using log files. Journal of Computing in Higher Education. https://doi.org/10.1007/s12528-023-09372-3

Ifenthaler, D., Gibson, D., & Dobozy, E. (2018). Informing learning design through analytics: Applying network graph analysis. Australasian Journal of Educational Technology, 34(2), 117–132. https://doi.org/10.14742/ajet.3767

Janssenswillen, G., van Hulzen, G., Depaire, B., Mannhardt, F., & Beuving, T. (2023). processmapR: Construct process maps using event data (R package version 0.5.3) [Computer software]. https://CRAN.R-project.org/package=processmapR

Kaliisa, R., Kluge, A., & Mørch, A. I. (2020). Combining checkpoint and process learning analytics to support learning design decisions in blended learning environments. Journal of Learning Analytics, 7(3), 33–47. https://doi.org/10.18608/jla.2020.73.4

Kapur, M. (2010). Productive failure in mathematical problem solving. Instructional Science, 38(6), 523–550. https://doi.org/10.1007/s11251-009-9093-x

Koedinger, K. R., Booth, J. L., & Klahr, D. (2013). Instructional complexity and the science to constrain it. Science, 342(6161), 935–937. https://doi.org/10.1126/science.1238056

Kulik, J. A., Carmichael, K., & Kulik, C.-L. (1974). The Keller plan in science teaching: An individually paced, student-tutored, and mastery-oriented instructional method is evaluated. Science, 183(4123), 379–383. https://doi.org/10.1126/science.183.4123.379

Lancaster, A., Moses, S., Clark, M., & Masters, M. C. (2020). The positive impact of deliberate writing course design on student learning experience and performance. Journal of Learning Analytics, 7(3), 48–63. https://doi.org/10.18608/jla.2020.73.5

Lockyer, L., Heathcote, E., & Dawson, S. (2013). Informing pedagogical action: Aligning learning analytics with learning design. American Behavioral Scientist, 57(10), 1439–1459. https://doi.org/10.1177/0002764213479367

Macfadyen, L. P., Lockyer, L., & Rienties, B. (2020). Learning design and learning analytics: Snapshot 2020. Journal of Learning Analytics, 7(3), 6–12. https://doi.org/10.18608/jla.2020.73.2

Maechler, M., Rousseeuw, P., Struyf, A., Hubert, M., & Hornik, K. (2023). cluster: Cluster analysis basics and extensions (R package version 2.1.6) [Computer software]. https://CRAN.R-project.org/package=cluster

Magnus, J. R., & Peresetsky, A. A. (2018). Grade expectations: Rationality and overconfidence. Frontiers in Psychology, 8, 2346. https://doi.org/10.3389/fpsyg.2017.02346

Mangaroska, K., & Giannakos, M. (2019). Learning analytics for learning design: A systematic literature review of analytics-driven design to enhance learning. IEEE Transactions on Learning Technologies, 12(4), 516–534. https://doi.org/10.1109/TLT.2018.2868673

Motz, B. A., Bergner, Y., Brooks, C. A., Gladden, A., Gray, G., Lang, C., Li, W., Marmolejo-Ramos, F., & Quick, J. D. (2023). A LAK of direction: Misalignment between the goals of learning analytics and its research scholarship. Journal of Learning Analytics, 10(2), 1–13. https://doi.org/10.18608/jla.2023.7913

Persico, D., & Pozzi, F. (2015). Informing learning design with learning analytics to improve teacher inquiry. British Journal of Educational Technology, 46(2), 230–248. https://doi.org/10.1111/bjet.12207

Prates, M. O., Lachos, V. H., & Cabral, C. R. B. (2013). mixsmsn: Fitting finite mixture of scale mixture of skew-normal distributions. Journal of Statistical Software, 54(12), 1–20. https://doi.org/10.18637/jss.v054.i12

Rienties, B., Nguyen, Q., Holmes, W., & Reedy, K. (2017). A review of ten years of implementation and research in aligning learning design with learning analytics at the Open University UK. Interaction Design and Architecture Journal, 33, 134–154. https://doi.org/10.55612/s-5002-033-007

Rousseeuw, P. J. (1987). Silhouettes: A graphical aid to the interpretation and validation of cluster analysis. Journal of Computational and Applied Mathematics, 20, 53–65. https://doi.org/10.1016/0377-0427(87)90125-7

Saint, J., Fan, Y., Singh, S., Gašević, D., & Pardo, A. (2021). Using process mining to analyse self-regulated learning: A systematic analysis of four algorithms. Proceedings of the 11th International Conference on Learning Analytics and Knowledge (LAK ’21), 12–16 April 2021, Irvine, CA, USA (pp. 333–343). ACM Press. https://doi.org/10.1145/3448139.3448171

Saint, J., Whitelock-Wainwright, A., Gašević, D., & Pardo, A. (2020). Trace-SRL: A framework for analysis of microlevel processes of self-regulated learning from trace data. IEEE Transactions on Learning Technologies, 13(4), 861–877. https://doi.org/10.1109/TLT.2020.3027496

Schwartz, D. L., Bransford, J. D., & Sears, D. (2005). Efficiency and innovation in transfer. In J. P. Mestre (Ed.), Transfer of learning from a modern multidisciplinary perspective (pp. 1–51). Information Age Publishing.

Serra, M. J., & DeMarree, K. G. (2016). Unskilled and unaware in the classroom: College students’ desired grades predict their biased grade predictions. Memory & Cognition, 44(7), 1127–1137. https://doi.org/10.3758/s13421-016-0624-9

SoLAR. (2020). What is Learning Analytics? Society for Learning Analytics Research. https://www.solaresearch.org/about/what-is-learning-analytics/

Sonnenberg, C., & Bannert, M. (2019). Using process mining to examine the sustainability of instructional support: How stable are the effects of metacognitive prompting on self-regulatory behavior? Computers in Human Behavior, 96, 259–272. https://doi.org/10.1016/j.chb.2018.06.003

Taub, M., Azevedo, R., Bradbury, A. E., Millar, G. C., & Lester, J. (2018). Using sequence mining to reveal the efficiency in scientific reasoning during STEM learning with a game-based learning environment. Learning and Instruction, 54, 93–103. https://doi.org/10.1016/j.learninstruc.2017.08.005

Taub, M., Banzon, A. M., Zhang, T., & Chen, Z. (2022). Tracking changes in students’ online self-regulated learning behaviors and achievement goals using trace clustering and process mining. Frontiers in Psychology, 13, 813514. https://doi.org/10.3389/fpsyg.2022.813514

Van Witteloostuijn, A. (1990). Learning in economic theory: A taxonomy with an application to expectations formation. Journal of Economic Psychology, 11(2), 183–207. https://doi.org/10.1016/0167-4870(90)90003-R

Warnakulasooriya, R., Palazzo, D. J., & Pritchard, D. E. (2007). Time to completion of web-based physics problems with tutoring. Journal of the Experimental Analysis of Behavior, 88(1), 103–113. https://doi.org/10.1901/jeab.2007.70-06

Whitcomb, K. M., Guthrie, M. W., Singh, C., & Chen, Z. (2021). Improving accuracy in measuring the impact of online instruction on students’ ability to transfer physics problem-solving skills. Physical Review Physics Education Research, 17(1), 010112. https://doi.org/10.1103/PhysRevPhysEducRes.17.010112

Winne, P. H. (2017). Cognition and metacognition within self-regulated learning. In D. H. Schunk & J. A. Greene (Eds.), Handbook of self-regulation of learning and performance (2nd ed., pp. 36–48). Routledge. https://doi.org/10.4324/9781315697048

Winne, P. H., & Hadwin, A. F. (1998). Studying as self-regulated learning. In D. J. Hacker, J. Dunlosky, & A. C. Graesser (Eds.), Metacognition in educational theory and practice (pp. 227–304). Lawrence Erlbaum Associates. https://doi.org/10.4324/9781410602350-12

Winne, P. H., & Hadwin, A. F. (2008). The weave of motivation and self-regulated learning. In D. H. Schunk & B. J. Zimmerman (Eds.), Motivation and self-regulated learning: Theory, research, and applications (pp. 297–314). Lawrence Erlbaum Associates.

Wortha, F., Azevedo, R., Taub, M., & Narciss, S. (2019). Multiple negative emotions during learning with digital learning environments: Evidence on their detrimental effect on learning from two methodological approaches. Frontiers in Psychology, 10, 2678. https://doi.org/10.3389/fpsyg.2019.02678

Wüst, K., & Beck, H. (2018). “I thought I did much better”: Overconfidence in university exams. Decision Sciences Journal of Innovative Education, 16(4), 310–333. https://doi.org/10.1111/dsji.12165

Zhang, T., Taub, M., & Chen, Z. (2021). Measuring the impact of COVID-19 induced campus closure on student self-regulated learning in physics online learning modules. Proceedings of the 11th International Conference on Learning Analytics and Knowledge (LAK ’21), 12–16 April 2021, Irvine, CA, USA (pp. 110–120). ACM Press. https://doi.org/10.1145/3448139.3448150

Zhang, T., Taub, M., & Chen, Z. (2022). A multi-level trace clustering analysis scheme for measuring students’ self-regulated learning behavior in a mastery-based online learning environment. Proceedings of the 12th International Conference on Learning Analytics and Knowledge (LAK ’22), 21–25 March 2022, Online (pp. 197–207). ACM Press. https://doi.org/10.1145/3506860.3506887

Zimmerman, B. J. (2000). Attaining self-regulation: A social cognitive perspective. In M. Boekaerts, P. R. Pintrich, & M. Zeidner (Eds.), Handbook of self-regulation (pp. 13–39). Academic Press. https://doi.org/10.1016/B978-012109890-2/50031-7

Published

2024-07-25

How to Cite

Chen, Z., Zhang, T., & Taub, M. (2024). How does a data-informed deliberate change in learning design impact students’ self-regulated learning tactics? Journal of Learning Analytics, 11(2), 174–196. https://doi.org/10.18608/jla.2024.8083

Section

Research Papers