Curriculum Analytics of Course Choices:

Links with Academic Performance

Authors

DOI:

https://doi.org/10.18608/jla.2024.8095

Keywords:

curriculum analytics, learning analytics, course choices, course difficulty, research paper

Abstract

In a higher education context, students are expected to take charge of their learning by deciding both “what” to learn and “how” to learn. While the learning analytics (LA) community has seen increasing research on the “how” to learn part (i.e., methods for supporting students in their learning journey), the “what” to learn part remains underinvestigated. We present a case study of curriculum analytics applied to a dataset of 243 students in a bachelor’s program in the broad discipline of health sciences, exploring the effects of course choices on students’ academic performance. Using curriculum metrics such as grading stringency, course temporal position, and duration, we investigated how course choices differed between high- and low-performing students using both temporal and sequential analysis methods. We found that high-performing students were likely to pick an elective course of low difficulty, and these students appeared more strategic in their course choices than their low-performing peers. Low-performing students generally seemed to have made suboptimal choices when selecting elective courses; e.g., when they picked an elective course of high difficulty, they were less likely to pick a following course of low difficulty. The findings of this study have design implications for researchers, program directors, and coordinators, who can use the results to (i) update course sequencing, (ii) guide students about course choices based on their current GPA (such as through course recommendation dashboards), (iii) identify bottleneck courses, and (iv) assist higher education institutions in planning a more balanced course roadmap to help students manage their workload effectively.

References

Allen, J. M., & Smith, C. L. (2008). Importance of, responsibility for, and satisfaction with academic advising: A faculty perspective. Journal of College Student Development, 49(5), 397–411. https://doi.org/10.1353/csd.0.0033

Barr, D. A., Gonzalez, M. E., & Wanat, S. F. (2008). The leaky pipeline: Factors associated with early decline in interest in premedical studies among underrepresented minority undergraduate students. Academic Medicine, 83(5), 503–511. https://doi.org/10.1097/ACM.0b013e31816bda16

Bendatu, L. Y., & Yahya, B. N. (2015). Sequence matching analysis for curriculum development. Jurnal Teknik Industri, 17(1), 47–52. https://doi.org/10.9744/jti.17.1.47-52

Berland, M., Baker, R. S., & Blikstein, P. (2014). Educational data mining and learning analytics: Applications to Constructionist research. Technology, Knowledge and Learning, 19(1), 205–220. https://doi.org/10.1007/s10758-014-9223-7

Borchers, C., & Pardos, Z. A. (2023). Insights into undergraduate pathways using course load analytics. In Proceedings of the 13th International Conference on Learning Analytics and Knowledge (LAK 2023), 13–17 March 2023, Arlington, Texas, USA (pp. 219–229). ACM. https://doi.org/10.1145/3576050.3576081

Bouwma-Gearhart, J. L., & Hora, M. T. (2016). Supporting faculty in the era of accountability: How postsecondary leaders can facilitate the meaningful use of instructional data for continuous improvement. Journal of Higher Education Management, 31(1), 44–56. https://jhem.wpengine.com/wp-content/uploads/2019/03/JHEM-31-1-2016-rev8-23-16.pdf

Caulkins, J. P., Larkey, P. D., & Wei, J. (1996). Adjusting GPA to reflect course difficulty. The Heinz School of Public Policy and Management, Carnegie Mellon University. https://kilthub.cmu.edu/articles/Adjusting_GPA_to_Reflect_Course_Difficulty/6470981/files/11899826.pdf

Cohen, J. (2013). Statistical power analysis for the behavioral sciences (2nd edition). Elsevier Science.

Crisp, G., Nora, A., & Taggart, A. (2009). Student characteristics, pre-college, college, and environmental factors as predictors of majoring in and earning a STEM degree: An analysis of students attending a Hispanic serving institution. American Educational Research Journal, 46(4), 924–942. https://doi.org/10.3102/0002831209349460

Dawson, S., & Hubball, H. (2014). Curriculum analytics: Application of social network analysis for improving strategic curriculum decision-making in a research-intensive university. Teaching and Learning Inquiry, 2(2), 59–74. https://doi.org/10.2979/teachlearninqu.2.2.59

Dennehy, D., Conboy, K., & Babu, J. (2021). Adopting learning analytics to inform postgraduate curriculum design: Recommendations and research agenda. Information Systems Frontiers, 25, 1315–1331. https://doi.org/10.1007/s10796-021-10183-z

D’Mello, S., Lehman, B., Sullins, J., Daigle, R., Combs, R., Vogt, K., Perkins, L., & Graesser, A. (2010). A time for emoting: When affect-sensitivity is and isn’t effective at promoting deep learning. In V. Aleven, J. Kay, & J. Mostow (Eds.), International Conference on Intelligent Tutoring Systems (ITS 2010), 14–18 June 2010, Pittsburgh, Pennsylvania, USA (pp. 245–254). Springer. https://doi.org/10.1007/978-3-642-13388-6_29

D’Mello, S., Taylor, R. S., & Graesser, A. (2007). Monitoring affective trajectories during complex learning. Proceedings of the Annual Meeting of the Cognitive Science Society, 29, 203–208. https://escholarship.org/uc/item/6p18v65q

Gottipati, S., & Shankararaman, V. (2018). Competency analytics tool: Analyzing curriculum using course competencies. Education and Information Technologies, 23(1), 41–60. https://doi.org/10.1007/s10639-017-9584-3

Greer, J., Molinaro, M., Ochoa, X., & McKay, T. (2016). Learning analytics for curriculum and program quality improvement (PCLA 2016). In Proceedings of the Sixth International Conference on Learning Analytics and Knowledge (LAK 2016), 25–29 April 2016, Edinburgh, UK (pp. 494–495). ACM. https://doi.org/10.1145/2883851.2883899

Hilliger, I., Aguirre, C., Miranda, C., Celis, S., & Pérez-Sanagustín, M. (2020). Design of a curriculum analytics tool to support continuous improvement processes in higher education. In Proceedings of the 10th International Conference on Learning Analytics and Knowledge (LAK 2020), 23–27 March 2020, Frankfurt, Germany (pp. 181–186). ACM. https://doi.org/10.1145/3375462.3375489

Hilliger, I., Laet, T. D., Henríquez, V., Guerra, J., Ortiz-Rojas, M., Zuñiga, M. Á., Baier, J., & Pérez-Sanagustín, M. (2020). For learners, with learners: Identifying indicators for an academic advising dashboard for students. In C. Alario-Hoyos, M. J. Rodríguez-Triana, M. Scheffel, I. Arnedillo-Sánchez, & S. M. Dennerlein (Eds.), Proceedings of the 15th European Conference on Technology Enhanced Learning (EC-TEL 2020), 14–18 September 2020, Heidelberg, Germany (pp. 117–130). Springer. https://doi.org/10.1007/978-3-030-57717-9_9

Hilliger, I., Miranda, C., Celis, S., & Pérez-SanAgustín, M. (2019). Evaluating usage of an analytics tool to support continuous curriculum improvement. In A. Fessl & T. Z. Draksler (Eds.), EC-TEL Practitioner Proceedings 2019: 14th European Conference on Technology Enhanced Learning (EC-TEL 2019), 16–19 September 2019, Delft, Netherlands. https://ceur-ws.org/Vol-2437/paper5.pdf

Jin, W., Muriel, A., & Sibieta, L. (2011). Subject and course choices at ages 14 and 16 amongst young people in England: insights from behavioural economics. Department for Education, Government of the United Kingdom. https://www.gov.uk/government/publications/subject-and-course-choices-at-ages-14-and-16-amongst-young-people-inengland-insights-from-behavioural-economics

Kardan, A. A., Sadeghi, H., Ghidary, S. S., & Sani, M. R. F. (2013). Prediction of student course selection in online higher education institutes using neural network. Computers & Education, 65, 1–11. https://doi.org/10.1016/j.compedu.2013.01.015

Karumbaiah, S., Baker, R. B., Ocumpaugh, J., & Andres, A. (2021). A re-analysis and synthesis of data on affect dynamics in learning. IEEE Transactions on Affective Computing, 14(2), 1070–1081. https://doi.org/10.1109/TAFFC.2021.3086118

Kizilcec, R. F., Baker, R. B., Bruch, E., Cortes, K. E., Hamilton, L. T., Lang, D. N., Pardos, Z. A., Thompson, M. E., & Stevens, M. L. (2023). From pipelines to pathways in the study of academic progress. Science, 380(6643), 344–347. https://doi.org/10.1126/science.adg5406

Matcha, W., Ahmad Uzir, N., Gašević, D., & Pardo, A. (2020). A systematic review of empirical studies on learning analytics dashboards: A self-regulated learning perspective. IEEE Transactions on Learning Technologies, 13(2), 226–245. https://doi.org/10.1109/TLT.2019.2916802

Mendez, G., Ochoa, X., Chiluiza, K., & De Wever, B. (2014). Curricular design analysis: A data-driven perspective. Journal of Learning Analytics, 1(3), 84–119. https://doi.org/10.18608/jla.2014.13.6

Mervis, J. (2010). Better intro courses seen as key to reducing attrition of STEM majors. Science, 330(6002), 306. https://doi.org/10.1126/science.330.6002.306

Mor, Y., Ferguson, R., & Wasson, B. (2015). Learning design, teacher inquiry into student learning and learning analytics: A call for action. British Journal of Educational Technology, 46(2), 221–229. https://doi.org/10.1111/bjet.12273

Nawaz, S., Alghamdi, E. A., Srivastava, N., Lodge, J., & Corrin, L. (2022). Understanding the role of AI and learning analytics techniques in addressing task difficulties in STEM education. In F. Ouyang, P. Jiao, B. M. McLaren, & A. H. Alavi (Eds.), Artificial intelligence in STEM education (p. 25). Taylor & Francis.

Nawaz, S., Kennedy, G., Bailey, J., & Mead, C. (2020). Moments of confusion in simulation-based learning environments. Journal of Learning Analytics, 7(3), 118–137. https://doi.org/10.18608/jla.2020.73.9

Nawaz, S., Srivastava, N., Yu, J. H., Baker, R. S., Kennedy, G., & Bailey, J. (2020). Analysis of task difficulty sequences in a simulation-based POE environment. In I. I. Bittencourt, M. Cukurova, K. Muldner, R. Luckin, & E. Millán (Eds.), Proceedings of the 21st International Conference on Artificial Intelligence in Education (AIED 2020), 6–10 July 2020, Ifrane, Morocco (pp. 423–436). Springer. https://doi.org/10.1007/978-3-030-52237-7_34

Nawaz, S., Srivastava, N., Yu, J. H., Khan, A. A., Kennedy, G., Bailey, J., & Baker, R. S. (2022). How difficult is the task for you? Modelling and analysis of students’ task difficulty sequences in a simulation-based POE environment. International Journal of Artificial Intelligence in Education, 32(2), 233–262. https://doi.org/10.1007/s40593-021-00242-6

Ochoa, X. (2016). Simple metrics for curricular analytics. In J. Greer, M. Molinaro, X. Ochoa, & T. McKay (Eds.), Proceedings of the First Learning Analytics for Curriculum and Program Quality Improvement Workshop, co-located with the Sixth International Learning Analytics and Knowledge Conference (LAK 2016), 25 April 2016, Edinburgh, UK (pp. 20–26, Vol. 1590). https://ceur-ws.org/Vol-1590/paper-04.pdf

OECD. (2018). Education 2030: The future of education and skills. https://www.oecd.org/education/2030-project

Ognjanovic, I., Gasevic, D., & Dawson, S. (2016). Using institutional data to predict student course selections in higher education. The Internet and Higher Education, 29, 49–62. https://doi.org/10.1016/j.iheduc.2015.12.002

Pardos, Z. A., Borchers, C., & Yu, R. (2022). Credit hours is not enough: Explaining undergraduate perceptions of course workload using LMS records. The Internet and Higher Education, 100882. https://doi.org/10.1016/j.iheduc.2022.100882

Pechenizkiy, M., Trcka, N., De Bra, P., & Toledo, P. (2012). CurriM: curriculum mining. In K. Yacef, O. Zaïane, A. Hershkovitz, M. Yudelson, & J. Stamper (Eds.), Proceedings of the Fifth International Conference on Educational Data Mining (EDM 2012), 19–21 June 2012, Chania, Greece. https://educationaldatamining.org/EDM2012/uploads/procs/EDM2012_proceedings.pdf

Ruffalo Noel Levitz. (2013). National student satisfaction and priorities report. https://www.ruffalonl.com/wp-content/uploads/pdf/2013_Student_Satisfaction_Report.pdf

Salazar-Fernandez, J. P., Sepúlveda, M., Munoz-Gama, J., & Nussbaum, M. (2021). Curricular analytics to characterize educational trajectories in high-failure rate courses that lead to late dropout. Applied Sciences, 11(4), 1436. https://doi.org/10.3390/app11041436

Simanca, F., Gonzalez Crespo, R., Rodríguez-Baena, L., & Burgos, D. (2019). Identifying students at risk of failing a subject by using learning analytics for subsequent customised tutoring. Applied Sciences, 9(3), 448. https://doi.org/10.3390/app9030448

Sutton, K. L., & Sankar, C. S. (2011). Student satisfaction with information provided by academic advisors. Journal of STEM Education: Innovations and Research, 12(7). https://www.jstem.org/jstem/index.php/JSTEM/article/view/1734/1404

Uzonwanne, F. C. (2016). Rational model of decision making. In A. Farazmand (Ed.), Global encyclopedia of public administration, public policy, and governance (pp. 1–6). Springer International Publishing. https://doi.org/10.1007/978-3-319-31816-5_2474-1

Vygotsky, L. S. (2012). Thought and language. MIT Press.

Yaginuma, Y. (2017). Syllabus visualization tool based on standard curriculum. In Proceedings of the 2017 IEEE 6th Global Conference on Consumer Electronics (GCCE 2017), 24–27 October 2017, Nagoya, Japan (pp. 1–2). IEEE. https://doi.org/10.1109/GCCE.2017.8229215

Published

2024-01-22

How to Cite

Srivastava, N., Nawaz, S., Tsai, Y.-S., & Gašević, D. (2024). Curriculum analytics of course choices: Links with academic performance. Journal of Learning Analytics, 11(1), 116–131. https://doi.org/10.18608/jla.2024.8095
