Beyond Cognitive Ability: Enabling Assessment of 21st Century Skills Through Learning Analytics - Call for Papers
GUEST EDITORS
- Srećko Joksimović, Teaching Innovation Unit & School of Education, University of South Australia
- Elle (Yuan) Wang, The Action Lab, EdPlus, Arizona State University
- Maria Ofelia San Pedro, ACT, Inc.
- Jason Way, ACT, Inc.
- George Siemens, Centre for Change and Complexity in Learning, University of South Australia, LINK Lab & University of Texas at Arlington
AIMS & SCOPE
Educational research is increasingly measuring attributes beyond cognitive ability (Farrington et al., 2012; Lounsbury, Sundstrom, Loveland, & Gibson, 2003; Mattern et al., 2014; McAbee, Oswald, & Connelly, 2014; Richardson, Abraham, & Bond, 2012). Consensus does not yet exist on a common term for these skills and attributes, which are variously labelled “non-cognitive skills”, “21st century competencies”, “personal qualities”, “social and emotional learning skills”, or “soft skills”. These terms capture dimensions of learning broader than academic knowledge, and contemporary literature has firmly established their relevance for success in school, work, and life in general (Barrick, Mount, & Judge, 2001; Poropat, 2009). However, a challenge remains with current assessment practices for these constructs (e.g., summative, self-reported, infrequent, subjective), as the constructs are hard to quantify and measure.
Challenges of 21st Century Skills Assessment and Ways Forward
Although various assessments have been developed to support the integration of 21st century skills into the curriculum, the challenge of providing real-time (analytics-based) insight into a learner’s current level of development of a particular skill remains. For example, in a longitudinal study, Duckworth and Seligman (2005) showed that self-discipline, measured through self-reports, parent and teacher reports, and monetary choice questionnaires, accounted for more than twice as much variance as IQ in adolescents’ academic performance. Walton and Cohen (2011) likewise showed that social belonging is a “psychological lever where targeted intervention can have broad consequences that lessen inequalities in achievement and health”.
However, while the approaches used to assess these skills and competencies work well in laboratory settings, they do not easily translate into immediately actionable insights that help teachers and learners understand learning as it unfolds. Beyond applicability at scale, the nature of the tools and methods used to assess 21st century skills (e.g., self-reports, teacher reports, performance tasks) raises further concerns about the suitability, reliability, and accuracy of the resulting measures (Duckworth & Yeager, 2015).
Various factors have been identified as potential threats to the reliability of self-reports; the most commonly highlighted include honesty, introspective ability, understanding and interpretation of particular questions, response bias, and sample control (Austin, Deary, Gibson, McGregor, & Dent, 1998; Wilcox, 2012). It is therefore not surprising that there is little consensus on which 21st century skills matter most, or on how stable these skills and competencies are within the same individual across time points and contexts (Farrington et al., 2012). Consequently, there is no common understanding of how each of these skills should be measured.
Recent research has introduced another line of evidence for the importance of 21st century skills in understanding and improving teaching and learning across diverse settings, from face-to-face classrooms to informal learning environments such as MOOCs. For example, multimodal data, such as eye gaze, facial expressions of emotion, heart rate, and electrodermal activity, have been investigated to infer mental states associated with learners' engagement (D’Mello, Dieterle, & Duckworth, 2017). Ocumpaugh and colleagues (2012) developed “automated detectors” for inferring students’ emotions and engagement in real time from trace data, opening the possibility of easily scalable measures of 21st century skills. In spite of these advances, however, work remains to be done in developing unobtrusive and objective measures of self-control, growth mindset, and the many other personal qualities that contribute to educational success.
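To make the detector-style approach concrete, the sketch below shows how a simple trace-based engagement detector might be trained and cross-validated. It is a minimal illustration only, not the method of any study cited above: the libraries (NumPy, scikit-learn), the three trace features, and the synthetic labels standing in for human observation codes are all illustrative assumptions.

```python
# Minimal sketch of a trace-based "automated detector" of engagement.
# All features, labels, and coefficients below are synthetic and
# hypothetical; real detectors are trained on human observation codes
# (e.g., BROMP field codes) aligned with logged learner actions.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(42)
n = 500  # hypothetical learning sessions

# Per-session trace features: action rate, pausing, and help-seeking.
X = np.column_stack([
    rng.poisson(6, n).astype(float),  # actions per minute
    rng.exponential(20, n),           # mean pause between actions (seconds)
    rng.beta(2, 8, n),                # share of actions that are help requests
])

# Synthetic "engaged / not engaged" labels standing in for observer codes.
logits = 0.4 * X[:, 0] - 0.05 * X[:, 1] - 3.0 * X[:, 2] - 1.5
y = (rng.random(n) < 1 / (1 + np.exp(-logits))).astype(int)

# A simple, interpretable detector; report cross-validated discrimination.
detector = LogisticRegression(max_iter=1000)
auc = cross_val_score(detector, X, y, cv=5, scoring="roc_auc")
print(f"Cross-validated AUC: {auc.mean():.2f}")
```

Once validated against human-coded ground truth, a detector of this kind can score every logged session unobtrusively, which is what makes this family of measures scalable in a way that self-reports and observation protocols alone are not.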
In addition to the challenge of measurement, another challenge is the lack of a common umbrella term for the skills and competencies that fall outside core academic skills. As numerous studies examining the importance of 21st century skills indicate, “[n]oncognitive factors are ‘noncognitive’ only insofar as they are not measured directly by cognitive tests” (Farrington et al., 2012, p. 41). Surprisingly, the lack of a common name does not necessarily introduce confusion about what constitutes the skills and competencies measured beyond content knowledge and core academic skills, as evidenced by the variety of frameworks of 21st century skills and competencies across domains (Dede, 2010). Nevertheless, we posit that there is a need to define more formally what is intended to be captured. In this context, learning analytics, as an interdisciplinary field of research and practice, has generated tremendous opportunities to advance the assessment of 21st century skills and competencies in a timely and formative manner (Gašević, Dawson, Rogers, & Gašević, 2016; Knight, Buckingham Shum, & Littleton, 2013).
What makes learning analytics pivotal in this endeavor to redefine “noncognitive” are the constant changes and advancements in learning environments and in the quality and quantity of data collected about learners and the process of learning. Contemporary learning environments that use virtual and augmented reality to enhance learning opportunities enable the design of tasks and activities that elicit learner behaviors, in either face-to-face or online contexts, that are not captured in traditional educational settings. Novel data streams that go beyond interactions recorded within learning management systems (e.g., eye tracking, accelerometers, EEG) provide a variety of personal and psychological information about learners using relatively affordable hardware sensors. Finally, novel computational methods, such as deep learning, complex network analysis, and recent advances in temporal data analytics, allow for more sophisticated analyses and for obtaining precise, timely, and formative data about learners, learning, and environments.
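As a small illustration of the network-analytic direction, the sketch below computes a simple centrality index from a discussion-forum reply network. It is a sketch under stated assumptions: the reply pairs are invented, and in-degree centrality is only one crude candidate proxy for a construct such as social connectedness or belonging, which would still require validation against established measures.

```python
# Minimal sketch: a network-analytic indicator from forum trace data.
# The reply pairs are invented; an edge u -> v means u replied to v's post.
import networkx as nx

replies = [
    ("amal", "bea"), ("bea", "amal"), ("chen", "amal"),
    ("dana", "chen"), ("amal", "dana"), ("bea", "chen"),
    ("chen", "bea"), ("dana", "amal"),
]

G = nx.DiGraph()
G.add_edges_from(replies)

# In-degree centrality: the share of peers who replied to each learner --
# one crude, unobtrusive candidate proxy for social connectedness.
centrality = nx.in_degree_centrality(G)
for learner, score in sorted(centrality.items(), key=lambda kv: -kv[1]):
    print(f"{learner}: {score:.2f}")
```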
TOPICS OF INTEREST
In this special section, we invite studies that provide objective, unobtrusive, and innovative measures (e.g., indirect measures, content analysis, or analysis of trace data) of 21st century skills, relying primarily on learning analytics methods and approaches that could expand the assessment of 21st century skills and competencies at scale. We also seek studies that go beyond examining one particular factor, skill, or behavior in isolation through traditional data collection approaches (e.g., self-reports, teacher reports, or parent reports). That is, we encourage submissions that explore combinations of factors and how these skills and competencies work together to affect immediate (e.g., course-level) and long-term (e.g., graduation, employment) learner outcomes. Finally, we aim to bring coherence to the terminology used for the skills and competencies sometimes referred to as “noncognitive skills”. In this call we refer to these constructs as 21st century skills; however, we acknowledge the limitations of this term and invite work toward a more appropriate concept that captures what these skills actually are. Contributions to this special section may address, but are not limited to, one or more of the following topics:
- Theories: What are the relevant theories and/or existing frameworks (in various disciplines such as education, psychology, or workforce) that can inform assessments of 21st century skills and competencies, their overlaps, as well as their application?
- Data sources: Where and how can data related to learners’ 21st century skills and competencies be measured and collected?
- Tools: What analytical and assessment tools are useful in analyzing 21st century skills at scale? How can we appropriately link 21st century skills assessment with cognitive assessment?
- Methods: What analytical methods have been used? What other methods can be applied? How are 21st century skills operationalized?
- Generalizability: What kinds of practices and findings generalize across domains and online learning environments?
- Applicability: How can research findings translate into actionable insights for various stakeholders (e.g., learners, instructors, administrators, investors)?
- Critical perspectives: We also welcome critiques and opinions that question the value of 21st century skills.
SUBMISSION INSTRUCTIONS
Prospective authors may contact the section editors with queries. Final submissions will be made through JLA’s online submission system at http://learning-analytics.info. When submitting a paper, select the section “Special Section: Beyond Cognitive Ability: Enabling Assessment of 21st Century Skills Through Learning Analytics”. All submissions should follow JLA’s standard manuscript guidelines and template, available on the journal website, and will undergo peer review.
TIMELINE
- Full manuscripts due: April 21, 2019
- Completion of first review round: July 2019
- Revised manuscripts due: October 2019
- Final decision notification: December 2019
- Final versions: January 2020
- Publication of special issue anticipated in 2020
REFERENCES
Austin, E. J., Deary, I. J., Gibson, G. J., McGregor, M. J., & Dent, J. B. (1998). Individual response spread in self-report scales: personality correlations and consequences. Personality and Individual Differences, 24(3), 421–438. https://doi.org/10.1016/S0191-8869(97)00175-X
Barrick, M. R., Mount, M. K., & Judge, T. A. (2001). Personality and performance at the beginning of the new millennium: What do we know and where do we go next? International Journal of Selection and Assessment, 9(1–2), 9–30. https://doi.org/10.1111/1468-2389.00160
Dede, C. (2010). Comparing frameworks for 21st century skills. 21st Century Skills: Rethinking How Students Learn, 20, 51–76.
D’Mello, S., Dieterle, E., & Duckworth, A. (2017). Advanced, analytic, automated (AAA) measurement of engagement during learning. Educational Psychologist, 52(2), 104–123. https://doi.org/10.1080/00461520.2017.1281747
Duckworth, A. L., & Seligman, M. E. P. (2005). Self-discipline outdoes IQ in predicting academic performance of adolescents. Psychological Science, 16(12), 939–944. https://doi.org/10.1111/j.1467-9280.2005.01641.x
Duckworth, A. L., & Yeager, D. S. (2015). Measurement matters: Assessing personal qualities other than cognitive ability for educational purposes. Educational Researcher, 44(4), 237–251.
Farrington, C. A., Roderick, M., Allensworth, E., Nagaoka, J., Keyes, T. S., Johnson, D. W., & Beechum, N. O. (2012). Teaching adolescents to become learners: The role of noncognitive factors in shaping school performance: A critical literature review. Chicago, IL: University of Chicago Consortium on Chicago School Research.
Gašević, D., Dawson, S., Rogers, T., & Gašević, D. (2016). Learning analytics should not promote one size fits all: The effects of instructional conditions in predicting academic success. The Internet and Higher Education, 28, 68–84. http://dx.doi.org/10.1016/j.iheduc.2015.10.002
Knight, S., Buckingham Shum, S., & Littleton, K. (2013). Epistemology, pedagogy, assessment and learning analytics. In Proceedings of the Third International Conference on Learning Analytics and Knowledge (LAK ’13) (pp. 75–84). New York, NY: ACM. https://doi.org/10.1145/2460296.2460312
Lounsbury, J. W., Sundstrom, E., Loveland, J. M., & Gibson, L. W. (2003). Intelligence, “Big Five” personality traits, and work drive as predictors of course grade. Personality and Individual Differences, 35(6), 1231–1239.
Mattern, K., Burrus, J., Camara, W., O’Connor, R., Hansen, M. A., Gambrell, J., … Bobek, B. (2014). Broadening the definition of college and career readiness: A holistic approach (ACT Research Report Series 2014-5). Iowa City, IA: ACT.
McAbee, S. T., Oswald, F. L., & Connelly, B. S. (2014). Bifactor models of personality and college student performance: A broad versus narrow view. European Journal of Personality, 28(6), 604–619. https://doi.org/10.1002/per.1975
Ocumpaugh, J., Baker, R., & Rodrigo, M. M. T. (2012). Baker-Rodrigo Observation Method Protocol (BROMP) 1.0 (Training manual version 1.0). New York, NY: EdLab.
Poropat, A. E. (2009). A meta-analysis of the five-factor model of personality and academic performance. Psychological Bulletin, 135(2), 322–338. https://doi.org/10.1037/a0014996
Richardson, M., Abraham, C., & Bond, R. (2012). Psychological correlates of university students’ academic performance: A systematic review and meta-analysis. Psychological Bulletin, 138(2), 353–387. https://doi.org/10.1037/a0026838
Walton, G. M., & Cohen, G. L. (2011). A brief social-belonging intervention improves academic and health outcomes of minority students. Science, 331(6023), 1447–1451. https://doi.org/10.1126/science.1198364
Wilcox, R. R. (2012). Introduction to Robust Estimation and Hypothesis Testing. Academic Press. Retrieved from https://books.google.com.au/books?id=zZ0snCw9aYMC