Three of my journal articles were published recently: two on learning analytics/writing analytics implementations [Learning Analytics Special Issue in The Internet and Higher Education journal], and one on a text analysis method [Educational Technology Research and Development journal] that I worked on many years ago and which has only just now been published!
Article 1: Educator Perspectives on Learning Analytics in Classroom Practice
The first article is predominantly qualitative in nature, based on interviews with instructors about their experiences using learning analytics tools such as AcaWriter, an automated writing feedback tool. It provides a practical account of implementing learning analytics in authentic classroom practice, told in the voices of educators. Details below:
Abstract: Failing to understand the perspectives of educators, and the constraints under which they work, is a hallmark of many educational technology innovations’ failure to achieve usage in authentic contexts, and sustained adoption. Learning Analytics (LA) is no exception, and there are increasingly recognised policy and implementation challenges in higher education for educators to integrate LA into their teaching. This paper contributes a detailed analysis of interviews with educators who introduced an automated writing feedback tool in their classrooms (triangulated with student and tutor survey data), over the course of a three-year collaboration with researchers, spanning six semesters’ teaching. It explains educators’ motivations, implementation strategies, outcomes, and challenges when using LA in authentic practice. The paper foregrounds the views of educators to support cross-fertilization between LA research and practice, and discusses the importance of cultivating educators’ and students’ agency when introducing novel, student-facing LA tools.
Keywords: learning analytics; writing analytics; participatory research; design research; implementation; educator
Citation and article link: Antonette Shibani, Simon Knight and Simon Buckingham Shum (2020). Educator Perspectives on Learning Analytics in Classroom Practice [Author manuscript]. The Internet and Higher Education. https://doi.org/10.1016/j.iheduc.2020.100730. [Publisher’s free download link valid until 8 May 2020].
Article 2: Implementing Learning Analytics for Learning Impact: Taking Tools to Task
The second article, led by Simon Knight, provides a broader framing of how we define impact in learning analytics. It presents a model addressing key challenges in LA implementations, grounded in our writing analytics example. Details below:
Abstract: Learning analytics has the potential to impact student learning, at scale. Embedded in that claim are a set of assumptions and tensions around the nature of scale, impact on student learning, and the scope of infrastructure encompassed by ‘learning analytics’ as a socio-technical field. Drawing on our design experience of developing learning analytics and inducting others into its use, we present a model that we have used to address five key challenges we have encountered. In developing this model, we recommend: A focus on impact on learning through augmentation of existing practice; the centrality of tasks in implementing learning analytics for impact on learning; the commensurate centrality of learning in evaluating learning analytics; inclusion of co-design approaches in implementing learning analytics across sites; and an attention to both social and technical infrastructure.
Keywords: learning analytics, implementation, educational technology, learning design
Citation and article link: Simon Knight, Andrew Gibson and Antonette Shibani (2020). Implementing Learning Analytics for Learning Impact: Taking Tools to Task. The Internet and Higher Education. https://doi.org/10.1016/j.iheduc.2020.100729.
Article 3: Identifying patterns in students’ scientific argumentation: content analysis through text mining using LDA
The third article, led by Wanli Xing, discusses the use of Latent Dirichlet Allocation (LDA), an unsupervised text mining method, to study argumentation patterns in student writing. Details below:
Abstract: Constructing scientific arguments is an important practice for students because it helps them to make sense of data using scientific knowledge and within the conceptual and experimental boundaries of an investigation. In this study, we used a text mining method called Latent Dirichlet Allocation (LDA) to identify underlying patterns in students’ written scientific arguments about a complex scientific phenomenon called the Albedo Effect. We further examined how identified patterns compare to existing frameworks related to explaining evidence to support claims and attributing sources of uncertainty. LDA was applied to electronically stored arguments written by 2472 students concerning how decreases in sea ice affect global temperatures. The results indicated that each content topic identified in the explanations by the LDA—“data only,” “reasoning only,” “data and reasoning combined,” “wrong reasoning types,” and “restatement of the claim”—could be interpreted using the claim–evidence–reasoning framework. Similarly, each topic identified in the students’ uncertainty attributions—“self-evaluations,” “personal sources related to knowledge and experience,” and “scientific sources related to reasoning and data”—could be interpreted using the taxonomy of uncertainty attribution. These results indicate that LDA can serve as a tool for content analysis that can discover semantic patterns in students’ scientific argumentation in particular science domains and facilitate teachers’ providing help to students.
Keywords: text mining, latent dirichlet allocation, educational data mining, scientific argumentation
Citation and article link: Wanli Xing, Hee-Sun Lee and Antonette Shibani (2020). Identifying patterns in students’ scientific argumentation: content analysis through text mining using Latent Dirichlet Allocation. Educational Technology Research and Development. https://doi.org/10.1007/s11423-020-09761-w.
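For readers less familiar with LDA, the sketch below illustrates the general idea of topic modelling over short student texts. It is a minimal illustration using scikit-learn, not the study's actual pipeline: the example documents, preprocessing choices, and topic count here are all placeholders (the paper analysed 2472 real student arguments and interpreted the resulting topics against the claim-evidence-reasoning framework).

```python
# Minimal LDA sketch with scikit-learn (illustrative only; the paper's
# actual corpus, preprocessing, and topic counts differ).
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

# Hypothetical student arguments; the study used 2472 real responses.
arguments = [
    "The data show sea ice decreased, so temperatures rise because less light is reflected.",
    "Temperatures rise because the ice melts and the ocean absorbs more energy.",
    "I am not sure about my answer, I guessed based on my own experience.",
]

# Convert the texts into a document-term matrix of word counts.
vectorizer = CountVectorizer(stop_words="english")
doc_term = vectorizer.fit_transform(arguments)

# Fit LDA with an illustrative number of topics (the paper identified
# five content topics and three uncertainty-attribution topics).
lda = LatentDirichletAllocation(n_components=3, random_state=0)
doc_topics = lda.fit_transform(doc_term)  # per-document topic distributions

# Inspect the top words per topic; a researcher would then interpret
# each topic, e.g. mapping it onto claim, evidence, or reasoning.
terms = vectorizer.get_feature_names_out()
for k, weights in enumerate(lda.components_):
    top_words = [terms[i] for i in weights.argsort()[-5:][::-1]]
    print(f"Topic {k}: {', '.join(top_words)}")
```

Because LDA is unsupervised, the topics it discovers carry no labels of their own; the human step of reading top words and representative documents against an existing framework (as the paper does) is what turns the output into a content analysis.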