From Students’ Questions to Students’ Profiles in a Blended Learning Environment

Authors

  • Fatima Harrak Sorbonne Université
  • François Bouchet Sorbonne Université
  • Vanda Luengo Sorbonne Université

DOI:

https://doi.org/10.18608/jla.2019.61.4

Keywords:

Clustering, question taxonomy, students’ behaviour, blended learning

Abstract

The analysis of students’ questions can be used to improve the learning experience of both students and teachers. We investigated questions (N = 6457) asked before class by first-year medicine/pharmacy students on an online platform used by professors to prepare their on-site Q&A sessions. Our long-term objectives are to help professors categorize these questions and to provide students with feedback on the quality of their questions. To do so, we first developed a taxonomy of questions, which we then used to automatically annotate the whole corpus. Using the K-Means algorithm over four courses, we identified students’ characteristics from the typology of the questions they asked, clustering students on the question dimensions only. We then characterized the clusters by attributes not used for clustering, such as students’ grades, attendance, and the number and popularity of the questions they asked. Two similar clusters always appeared: lower-than-average students asking popular questions, and higher-than-average students asking unpopular questions. We replicated these analyses on the same courses across different years to show that students’ profiles can be predicted online. This work shows the usefulness and validity of our taxonomy and the relevance of this approach for identifying different students’ profiles.
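The clustering step described above can be illustrated with a minimal sketch. This is not the authors' code: the question dimensions, the per-student proportions, and the pure-Python K-Means below are all hypothetical stand-ins for the paper's actual taxonomy and pipeline, shown only to make the approach concrete.

```python
# Illustrative sketch (hypothetical data and dimensions): cluster students by
# the proportion of questions they ask in each taxonomy dimension, using a
# minimal K-Means built from the standard library only.
import random
from collections import Counter

def kmeans(points, k, iters=50, seed=0):
    """Minimal K-Means; returns a cluster label for each point."""
    rng = random.Random(seed)
    centroids = rng.sample(points, k)  # initialize centroids from the data
    labels = [0] * len(points)
    for _ in range(iters):
        # Assignment step: nearest centroid by squared Euclidean distance.
        for i, p in enumerate(points):
            labels[i] = min(
                range(k),
                key=lambda c: sum((a - b) ** 2 for a, b in zip(p, centroids[c])),
            )
        # Update step: move each centroid to the mean of its members.
        for c in range(k):
            members = [p for p, lab in zip(points, labels) if lab == c]
            if members:
                centroids[c] = [sum(dim) / len(members) for dim in zip(*members)]
    return labels

# Hypothetical per-student proportions over two question dimensions,
# e.g. (course-content questions, course-organization questions).
students = [[0.9, 0.1], [0.85, 0.15], [0.8, 0.2],   # mostly content questions
            [0.2, 0.8], [0.1, 0.9], [0.15, 0.85]]   # mostly organization questions
labels = kmeans(students, k=2)
print(Counter(labels))  # two clusters of three students each
```

In the paper, each student's vector would instead hold their question counts or proportions over the taxonomy's dimensions, and the resulting clusters are then characterized by external attributes (grades, attendance, question popularity) that played no role in the clustering itself.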

Author Biographies

Fatima Harrak, Sorbonne Université

CNRS, Laboratoire d'Informatique de Paris 6

François Bouchet, Sorbonne Université

CNRS, Laboratoire d'Informatique de Paris 6

Vanda Luengo, Sorbonne Université

CNRS, Laboratoire d'Informatique de Paris 6

References

Anderson, L. W., & Krathwohl, D. R. (Eds.). (2001). A taxonomy for learning, teaching, and assessing: A revision of Bloom’s taxonomy of educational objectives. New York: Longman.

Antonenko, P. D., Toy, S., & Niederhauser, D. S. (2012). Using cluster analysis for data mining in educational technology research. Educational Technology Research and Development, 60(3), 383–398. http://doi.org/10.1007/s11423-012-9235-8

Artstein, R., & Poesio, M. (2008). Inter-coder agreement for computational linguistics. Computational Linguistics, 34(4), 555–596. http://doi.org/10.1162/coli.07-034-R2

Bergsma, W. (2013). A bias-correction for Cramér’s V and Tschuprow’s T. Journal of the Korean Statistical Society, 42(3), 323–328. http://doi.org/10.1016/j.jkss.2012.10.002

Bloom, B. S., Engelhart, M. B., Furst, E. J., Hill, W. H., & Krathwohl, D. R. (1956). Taxonomy of educational objectives. The classification of educational goals. Handbook I: Cognitive domain. New York: Longmans Green.

Bouchet, F. (2009). Characterization of conversational activities in a corpus of assistance requests. Proceedings of the 14th Student Session of the European Summer School for Logic, Language, and Information (ESSLLI 2009), Bordeaux, France (pp. 40–50).

Cao, M., Tang, Y., & Hu, X. (2017). An analysis of students’ questions in MOOCs forums. In X. Hu, T. Barnes, A. Hershkovitz, & L. Paquette (Eds.), Proceedings of the 10th International Conference on Educational Data Mining (EDM2017), 25–28 June 2017, Wuhan, China (pp. 412–413). International Educational Data Mining Society.

Chin, C., & Brown, D. E. (2000). Learning deeply in science: An analysis and reintegration of deep approaches in two case studies of grade 8 students. Research in Science Education, 30(2), 173–197. http://doi.org/10.1007/BF02461627

Chin, C., & Brown, D. E. (2002). Student-generated questions: A meaningful aspect of learning in science. International Journal of Science Education, 24(5), 521–549. http://doi.org/10.1080/09500690110095249

Chin, C., & Kayalvizhi, G. (2002). Posing problems for open investigations: What questions do pupils ask? Research in Science & Technological Education, 20(2), 269–287. http://doi.org/10.1080/0263514022000030499

Chin, C., & Osborne, J. (2008). Students’ questions: A potential resource for teaching and learning science. Studies in Science Education, 44(1), 1–39. http://doi.org/10.1080/03057260701828101

Cohen, J. (1988). Statistical power analysis for the behavioral sciences (2nd ed.). Hillsdale, NJ: Erlbaum.

Colbert, J. T., Olson, J. K., & Clough, M. P. (2007). Using the web to encourage student-generated questions in large-format introductory biology classes. CBE-Life Sciences Education, 6(1), 42–48.

Cook, C., Olney, A. M., Kelly, S., & D’Mello, S. K. (2018). An open vocabulary approach for detecting authentic questions in classroom discourse. In K. E. Boyer & M. Yudelson (Eds.), Proceedings of the 11th International Conference on Educational Data Mining (EDM2018), 16–20 July 2018, Buffalo, New York, USA (pp. 116–126). International Educational Data Mining Society.

Elgort, I., Lundqvist, K., McDonald, J., & Moskal, A. C. M. (2018). Analysis of student discussion posts in a MOOC: Proof of concept. Companion Proceedings of the 8th International Conference on Learning Analytics and Knowledge (LAK ’18), 5–9 March 2018, Sydney, NSW, Australia. SoLAR. Retrieved from https://solaresearch.org/core/companion-proceedings-of-the-8th-international-learning-analytics-knowledge-conference-lak18/

Etkina, E., & Harper, K. A. (2002). Weekly reports: Student reflections on learning. Journal of College Science Teaching, 31(7), 476.

Fritz, C. O., Morris, P. E., & Richler, J. J. (2012). Effect size estimates: Current use, calculations, and interpretation. Journal of Experimental Psychology: General, 141(1), 2–8. http://doi.org/10.1037/a0024338

Graesser, A. C., Ozuru, Y., & Sullins, J. (2009). What is a good question? In M. G. McKeown & L. Kucan (Eds.), Threads of coherence in research on the development of reading ability (pp. 112–141). New York: Guilford.

Graesser, A. C., & Person, N. K. (1994). Question asking during tutoring. American Educational Research Journal, 31(1), 104–137. http://doi.org/10.2307/1163269

Graesser, A. C., Person, N. K., & Huber, J. D. (1992). Mechanisms that generate questions. In T. Lauer, E. Peacock, & A. C. Graesser (Eds.), Questions and information systems (pp. 167–187). Hillsdale, NJ: Erlbaum. http://doi.org/10.4324/9780203763148

Harrak, F., Bouchet, F., & Luengo, V. (2017). Identifying relationships between students’ questions type and their behavior. In X. Hu, T. Barnes, A. Hershkovitz, & L. Paquette (Eds.), Proceedings of the 10th International Conference on Educational Data Mining (EDM2017), 25–28 June 2017, Wuhan, China (pp. 402–403). International Educational Data Mining Society.

Harrak, F., Bouchet, F., Luengo, V., & Gillois, P. (2018). Profiling students from their questions in a blended learning environment. Proceedings of the 8th International Conference on Learning Analytics and Knowledge (LAK ’18), 5–9 March 2018, Sydney, NSW, Australia (pp. 102–110). New York: ACM. http://doi.org/10.1145/3170358.3170389

Harper, K. A., Etkina, E., & Lin, Y. (2003). Encouraging and analyzing student questions in a large physics course: Meaningful patterns for instructors. Journal of Research in Science Teaching, 40(8), 776–791. http://doi.org/10.1002/tea.10111

Ishola, O. M., & McCalla, G. (2017). Predicting prospective peer helpers to provide just-in-time help to users in question and answer forums. In X. Hu, T. Barnes, A. Hershkovitz, & L. Paquette (Eds.), Proceedings of the 10th International Conference on Educational Data Mining (EDM2017), 25–28 June 2017, Wuhan, China (pp. 238–243). International Educational Data Mining Society.

Kim, J., Shaw, E., & Ravi, S. (2011). Mining student discussions for profiling participation and scaffolding learning. In C. Romero, S. Ventura, M. Pechenizkiy, & R. S. J. d. Baker (Eds.), Handbook of Educational Data Mining (pp. 299–310). Boca Raton, FL: CRC Press. http://doi.org/10.1201/b10274-24

Kiss, T., & Strunk, J. (2006). Unsupervised multilingual sentence boundary detection. Computational Linguistics, 32(4), 485–525. http://doi.org/10.1162/coli.2006.32.4.485

Kolb, D. A. (1984). Experiential learning: Experience as the source of learning and development. Englewood Cliffs, NJ: Prentice Hall.

Kolb, D. (1985). LSI learning style inventory: Self-scoring inventory and interpretation booklet. Boston, MA: McBer & Co.

Li, H., Duan, Y., Clewley, D. N., Morgan, B., Graesser, A. C., Shaffer, D. W., & Saucerman, J. (2014). Question asking during collaborative problem solving in an online game environment. In S. Trausan-Matu, K. E. Boyer, M. Crosby, & K. Panourgia (Eds.), Proceedings of the 12th International Conference on Intelligent Tutoring Systems (ITS 2014), 5–9 June 2014, Honolulu, HI, USA (pp. 617–618). New York: Springer. http://doi.org/10.1007/978-3-319-07221-0_80

Liu, Q., Peng, W., Zhang, F., Hu, R., Li, Y., & Yan, W. (2016). The effectiveness of blended learning in health professions: Systematic review and meta-analysis. Journal of Medical Internet Research, 18(1). http://doi.org/10.2196/jmir.4807

Marbach‐Ad, G., & Sokolove, P. G. (2000a). Can undergraduate biology students learn to ask higher level questions? Journal of Research in Science Teaching, 37(8), 854–870. http://doi.org/10.1002/1098-2736(200010)37:8<854::AID-TEA6>3.0.CO;2-5

Marbach-Ad, G., & Sokolove, P. G. (2000b). Good science begins with good questions. Journal of College Science Teaching, 30(3), 192.

Otero, J., & Graesser, A. C. (2001). PREG: Elements of a model of question asking. Cognition and Instruction, 19(2), 143–175. http://doi.org/10.1207/S1532690XCI1902_01

Pedrosa de Jesus, H., Almeida, P. C., & Watts, M. (2004). Questioning styles and students’ learning: Four case studies. Educational Psychology, 24(4), 531–548. http://doi.org/10.1080/0144341042000228889

Pedrosa de Jesus, H. P., Teixeira-Dias, J. J., & Watts, M. (2003). Questions of chemistry. International Journal of Science Education, 25(8), 1015–1034. http://doi.org/10.1080/09500690305022

Pizzini, E. L., & Shepardson, D. P. (1991). Student questioning in the presence of the teacher during problem solving in science. School Science and Mathematics, 91(8), 348–352. http://doi.org/10.1111/j.1949-8594.1991.tb12118.x

Scardamalia, M., & Bereiter, C. (1992). Text-based and knowledge-based questioning by children. Cognition and Instruction, 9(3), 177–199. http://doi.org/10.1207/s1532690xci0903_1

Sindhgatta, R., Marvaniya, S., Dhamecha, T. I., & Sengupta, B. (2017). Inferring frequently asked questions from student question answering forums. In X. Hu, T. Barnes, A. Hershkovitz, & L. Paquette (Eds.), Proceedings of the 10th International Conference on Educational Data Mining (EDM2017), 25–28 June 2017, Wuhan, China (pp. 256–261). International Educational Data Mining Society.

Supraja, S., Hartman, K., Tatinati, S., & Khong, A. W. (2017). Toward the automatic labeling of course questions for ensuring their alignment with learning outcomes. In X. Hu, T. Barnes, A. Hershkovitz, & L. Paquette (Eds.), Proceedings of the 10th International Conference on Educational Data Mining (EDM2017), 25–28 June 2017, Wuhan, China (pp. 56–63). International Educational Data Mining Society.

Watts, M., Gould, G., & Alsop, S. (1997). Questions of understanding: categorising pupils’ questions in science. School Science Review, 79(286), 57–63.

White, R. T., & Gunstone, R. F. (1992). Probing understanding. London: Falmer Press.

Published

2019-04-14

How to Cite

Harrak, F., Bouchet, F., & Luengo, V. (2019). From Students’ Questions to Students’ Profiles in a Blended Learning Environment. Journal of Learning Analytics, 6(1), 54–84. https://doi.org/10.18608/jla.2019.61.4