Synergies of Learning Analytics and Learning Design: A Systematic Review of Student Outcomes

Authors

Marion Blumenstein

DOI:

https://doi.org/10.18608/jla.2020.73.3

Keywords:

Learning design, Learning analytics, Higher education, Effect size, Analytics framework, Collaborative learning, Learning gain, Personalized learning, Online learning, Self-regulation, Student learning

Abstract

The field of learning analytics (LA) has seen a gradual shift from purely data-driven approaches to more holistic views of improving student learning outcomes through data-informed learning design (LD). Despite the growing potential of LA in higher education (HE), its benefits are not yet convincing to practitioners, particularly with respect to aligning LA data with LD toward desired learning outcomes. This review presents a systematic evaluation of the effect sizes reported in 38 key studies in pursuit of effective LA approaches to measuring student learning gain for the enhancement of HE pedagogy and delivery. Large positive effects on student outcomes were found in LDs that fostered socio-collaborative and independent learning skills. Recent trends in the personalization of learner feedback point to a need to integrate student-idiosyncratic factors to improve the student experience and academic outcomes. Finally, the key findings are developed into a new three-level framework, the LA Learning Gain Design (LALGD) model, which aligns meaningful data capture with pedagogical intentions and their learning outcomes. Suitable for various settings — face to face, blended, or fully online — the model contributes to data-informed learning and teaching pedagogies in HE.


Published

2020-12-17

How to Cite

Blumenstein, M. (2020). Synergies of Learning Analytics and Learning Design: A Systematic Review of Student Outcomes. Journal of Learning Analytics, 7(3), 13–32. https://doi.org/10.18608/jla.2020.73.3

Issue

Vol. 7 No. 3 (2020)

Section

Special Section: Learning Design and Learning Analytics