What Makes Learning Analytics Research Matter

Authors

Alyssa Friend Wise, Simon Knight, & Xavier Ochoa
DOI:

https://doi.org/10.18608/jla.2021.7647

Keywords:

impact, closing-the-loop, editorial, equity, research-theory

Abstract

The ongoing changes and challenges brought on by the COVID-19 pandemic have exacerbated long-standing inequities in education, leading many to question basic assumptions about how learning can best benefit all students. Thirst for data about learning is at an all-time high, sometimes without commensurate attention to ensuring principles this community has long valued: privacy, transparency, openness, accountability, and fairness. How we navigate this dynamic context is critical for the future of learning analytics. Thinking about the issue through the lens of JLA publications over the last eight years, we highlight the important contributions of “problem-centric” rather than “tool-centric” research. We also value attention (proximal or distal) to the eventual goal of closing the loop: connecting the results of our analyses back to improve the learning from which they were drawn. Finally, we recognize the power of cycles of maturation: using information generated about real-world uses and impacts of a learning analytics tool to guide new iterations of data, analysis, and intervention design. A critical element of context for such work is that the learning problems we identify and choose to work on are never blank slates; they embed societal structures, reflect the influence of past technologies, and have previous enablers, barriers, and social mediation acting on them. In that context, we must ask the hard questions: What parts of existing systems is our work challenging? What parts is it reinforcing? Do these effects, intentional or not, align with our values and beliefs? In the end, what makes learning analytics matter is our ability to contribute to progress on both immediate and long-standing challenges in learning, not only improving current systems, but also considering alternatives for what is and what could be.
This requires including stakeholder voices in tackling important problems of learning with rigorous analytic approaches to promote equitable learning across contexts. This journal provides a central space for the discussion of such issues, acting as a venue for the whole community to share research, practice, data and tools across the learning analytics cycle in pursuit of these goals.

References

Buckingham Shum, S., Ferguson, R., & Martinez-Maldonado, R. (2019). Human-centred learning analytics. Journal of Learning Analytics, 6(2), 1–9. https://doi.org/10.18608/jla.2019.62.1

Charleer, S., Moere, A. V., Klerkx, J., Verbert, K., & De Laet, T. (2017). Learning analytics dashboards to support adviser–student dialogue. IEEE Transactions on Learning Technologies, 11(3), 389–399. https://doi.org/10.1109/TLT.2017.2720670

Clow, D. (2012). The learning analytics cycle: Closing the loop effectively. Proceedings of the 2nd International Conference on Learning Analytics and Knowledge (LAK ’12), 29 April – 2 May 2012, Vancouver, BC, Canada (pp. 134–138). New York: ACM. https://doi.org/10.1145/2330601.2330636

Dollinger, M. (2018). Technology for the scalability of co-creation with students. In M. Campbell, J. Willems, C. Adachi, D. Blake, I. Doherty, S. Krishnan, S. Macfarlane, L. Ngo, M. O’Donnell, S. Palmer, L. Riddell, I. Story, H. Suri, ... J. Tai (Eds.), Open Oceans: Learning Without Borders. Proceedings of the 35th Annual Conference of the Australasian Society for Computers in Learning in Tertiary Education (ASCILITE 2018), 25–28 November 2018, Geelong, Victoria, Australia (pp. 346–350). Australasian Society for Computers in Learning in Tertiary Education. https://researchrepository.murdoch.edu.au/id/eprint/59155/1/ASCILITE-2018-Proceedings-Final.pdf#page=348

Ferguson, R., & Clow, D. (2017). Where is the evidence? A call to action for learning analytics. Proceedings of the Seventh International Learning Analytics & Knowledge Conference (LAK ’17), 13–17 March 2017, Vancouver, BC, Canada (pp. 56–65). New York: ACM. https://doi.org/10.1145/3027385.3027396

Gašević, D., Mirriahi, N., Long, P. D., & Dawson, S. (2014). Editorial: Inaugural issue. Journal of Learning Analytics, 1(1), 1–2. https://doi.org/10.18608/jla.2014.11.1

Kitto, K., Buckingham Shum, S., & Gibson, A. (2018). Embracing imperfection in learning analytics. Proceedings of the 8th International Conference on Learning Analytics and Knowledge (LAK ’18), 5–9 March 2018, Sydney, NSW, Australia (pp. 451–460). New York: ACM. https://doi.org/10.1145/3170358.3170413

Knight, S., Gibson, A., & Shibani, A. (2020). Implementing learning analytics for learning impact: Taking tools to task. Internet and Higher Education, 45, 100729. https://doi.org/10.1016/j.iheduc.2020.100729

Ladson-Billings, G. (2021). I’m here for the hard re-set: Post pandemic pedagogy to preserve our culture. Equity & Excellence in Education, 54(1), 68–78. https://doi.org/10.1080/10665684.2020.1863883

Maslow, A. H. (1966). The psychology of science: A reconnaissance. New York: Harper & Row.

Matz, R., Schulz, K., Hanley, E., Derry, H., Hayward, B., Koester, B., et al. (2021). Analyzing the efficacy of ECoach in supporting gateway course success through tailored support. Proceedings of the 11th International Conference on Learning Analytics and Knowledge (LAK ’21), 12–16 April 2021, Irvine, CA, USA (pp. 216–225). New York: ACM. https://doi.org/10.1145/3448139.3448160

Papert, S. (1988). A critique of technocentrism in thinking about the school of the future. In B. Sendov & I. Stanchev (Eds.), Children in the information age (pp. 3–18). Oxford, UK: Pergamon.

Pardo, A., Bartimote, K., Buckingham Shum, S., Dawson, S., Gao, J., Gašević, D., et al. (2018). OnTask: Delivering data-informed, personalized learning support actions. Journal of Learning Analytics, 5(3), 235–249. https://doi.org/10.18608/jla.2018.53.15

Prieto Alvarez, C., Martinez-Maldonado, R., & Buckingham Shum, S. (2020). LA-DECK: A card-based learning analytics co-design tool. Proceedings of the 10th International Conference on Learning Analytics and Knowledge (LAK ’20), 23–27 March 2020, Frankfurt, Germany (pp. 63–72). New York: ACM. https://doi.org/10.1145/3375462.3375476

Reeves, T. C., & Lin, L. (2020). The research we have is not the research we need. Educational Technology Research and Development, 68(4), 1991–2001. https://doi.org/10.1007/s11423-020-09811-3

Siemens, G. (2014). The Journal of Learning Analytics: Supporting and promoting learning analytics research. Journal of Learning Analytics, 1(1), 3–5. https://doi.org/10.18608/jla.2014.11.2

Wise, A. F. (2019). Learning analytics: Using data-informed decision-making to improve teaching and learning. In O. Adesope & A. G. Rudd (Eds.), Contemporary technologies in education: Maximizing student engagement, motivation, and learning (pp. 119–143). New York: Palgrave Macmillan. https://doi.org/10.1007/978-3-319-89680-9_7

Wise, A. F., Sarmiento, J. P., & Boothe, M. (2021). Subversive learning analytics. Proceedings of the 11th International Conference on Learning Analytics and Knowledge (LAK ’21), 12–16 April 2021, Irvine, CA, USA (pp. 639–645). New York: ACM. https://doi.org/10.1145/3448139.3448210

Published

2021-12-15

How to Cite

Wise, A. F., Knight, S., & Ochoa, X. (2021). What makes learning analytics research matter. Journal of Learning Analytics, 8(3), 1–9. https://doi.org/10.18608/jla.2021.7647