A Partnership-Based Approach to Operationalizing Learning Behaviours Using Event Data
DOI: https://doi.org/10.18608/jla.2022.6751

Keywords: researcher–practitioner partnership, learning management systems, learning behaviours and strategies, research paper

Abstract
This paper describes a partnership-based approach to analyzing data from a learning management system (LMS) used by students in grades 6–12. The goal of the partnership was to create indicators of the ways in which students navigated digital learning activities, referred to as playlists, that comprised resources, pre-assessments, and summative assessments. To develop these indicators, the collaboration gathered school practitioners’ perspectives on desirable and undesirable student actions within and across playlists, jointly explored and made sense of LMS data, and examined the relationships between behavioural indicators and outcomes that mattered to practitioners. The approach described in this paper is intended to provide an example for future researcher–practitioner collaborations to build upon when jointly analyzing data from digital learning environments. Because playlists and LMSs are widely used in K–12 schools throughout the United States, the collaborative process described here may apply broadly across digital environments, schools, and collaborations.
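To make the kind of indicator the abstract describes concrete, the sketch below shows one way LMS event data might be turned into a student-level behavioural indicator. This is a minimal illustration, not the partnership's actual method: the event schema, column names, and the "pre-assessment first" behaviour are all assumptions introduced here for the example.

```python
# Minimal sketch (assumed schema, not the paper's actual indicators):
# derive one hypothetical behavioural indicator from LMS event logs.
# Assumes one row per event with columns student_id, playlist_id,
# event_type, and timestamp; all names are illustrative.
import pandas as pd

events = pd.DataFrame(
    {
        "student_id": [1, 1, 1, 2, 2],
        "playlist_id": ["p1", "p1", "p1", "p1", "p1"],
        "event_type": [
            "pre_assessment", "resource_view", "summative_assessment",
            "resource_view", "summative_assessment",
        ],
        "timestamp": pd.to_datetime(
            ["2022-01-03 09:00", "2022-01-03 09:20", "2022-01-04 10:00",
             "2022-01-03 11:00", "2022-01-03 11:30"]
        ),
    }
)

def took_pre_assessment_first(group: pd.DataFrame) -> bool:
    """True if the student's first event in a playlist is the pre-assessment."""
    first = group.sort_values("timestamp").iloc[0]
    return first["event_type"] == "pre_assessment"

# One indicator value per student-playlist pair, which could then be
# related to outcomes that practitioners care about.
indicator = (
    events.groupby(["student_id", "playlist_id"])
    .apply(took_pre_assessment_first)
    .rename("pre_assessment_first")
    .reset_index()
)
print(indicator)
```

In a workflow like the one the abstract outlines, indicators of this form would be defined jointly with practitioners and then examined against valued outcomes rather than chosen by analysts alone.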
License
Copyright (c) 2022 Journal of Learning Analytics
This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.