Processing and Understanding Moodle Log Data and Their Temporal Dimension
DOI: https://doi.org/10.18608/jla.2023.7867
Keywords: Learning log data, Educational log data, Moodle log data collection, Time-on-task, Temporal dimension, data and tools report
Abstract
The increased adoption of online learning environments has resulted in the availability of vast amounts of educational
log data, which raises questions that could be answered by a thorough and accurate examination of students’ online
learning behaviours. Event logs describe something that occurred on a platform and provide multiple dimensions
that help to characterize what actions students take, when, and where (in which course and in which part of the
course). Temporal analysis has been shown to be relevant in learning analytics (LA) research, and capturing
time-on-task as a proxy to model learning behaviour, predict performance, and prevent drop-out has been the
subject of several studies. In Moodle, one of the most used learning management systems, while most events are
logged at their beginning, others are recorded at their end. The duration of an event is usually calculated as the difference between the timestamps of two consecutive records, on the assumption that each log entry marks the action's starting time. When an event is instead logged at its end, the difference between the preceding record and that event yields the sum of the two durations, not the duration of the first. Moreover, in the pursuit of a better user experience, more and more online learning platform functionality is shifted to the client side, with the unintended effect of reducing the number of meaningful log entries and potentially misrepresenting student behaviour. The purpose of this study is to present Moodle's logging system, to illustrate
where the temporal dimension of Moodle log data can be difficult to interpret and how this knowledge can be used
to improve data processing. Starting from the correct extraction of Moodle logs, we focus on factors to consider
when preparing data for analyses of the temporal dimension. Given the importance of correctly interpreting log data for the LA community, we aim to initiate a discussion of this domain understanding to prevent the loss of data-related knowledge.
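The time-on-task pitfall described above can be sketched in a few lines. This is a minimal illustration, not actual Moodle schema: the event names, field names, and the `logged_at` flag are assumptions introduced for the example.

```python
from datetime import datetime, timedelta

# Hypothetical log extract: each record carries a timestamp and a flag
# indicating whether the platform wrote it at the START or the END of
# the action. Field names and values are illustrative assumptions.
logs = [
    {"event": "course_viewed",         "logged_at": "start", "ts": datetime(2023, 1, 1, 10, 0)},
    {"event": "quiz_attempt",          "logged_at": "start", "ts": datetime(2023, 1, 1, 10, 5)},
    {"event": "assignment_submitted",  "logged_at": "end",   "ts": datetime(2023, 1, 1, 10, 25)},
]

def naive_durations(records):
    """Naive time-on-task estimate: the difference between consecutive
    timestamps, assuming every record marks the START of its action."""
    return [
        (records[i]["event"], records[i + 1]["ts"] - records[i]["ts"])
        for i in range(len(records) - 1)
    ]

for event, duration in naive_durations(logs):
    print(event, duration)
```

Here the naive estimate credits the full 20 minutes between 10:05 and 10:25 to the quiz attempt. But if `assignment_submitted` was logged at its end, that interval is really the sum of the time spent on the quiz and on the assignment, so the quiz duration is overestimated. Correct processing has to branch on which events are end-logged before differencing timestamps.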
License
Copyright (c) 2023 Journal of Learning Analytics
This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.