The first colloquium online chat, this past week, was with Simon Welsh, Manager of Adaptive Teaching and Learning Services at Charles Sturt University. Simon had recommended that we read an article by Siemens and Long (2011), Penetrating the fog, beforehand. An apt title, as I definitely felt I was stumbling around in the murk of big data and learning analytics. The authors use a definition from the 1st International Conference on Learning Analytics and Knowledge:

“Learning analytics is the measurement, collection, analysis and reporting of data about learners and their contexts, for purposes of understanding and optimising learning and the environments in which it occurs.” (ibid., p. 34).

It would appear that data, evidence, and learning analytics can play a central role in ensuring educational institutions become more intentional and intelligent organizations (ibid., p. 36), allowing for data-informed practice to see what’s working and what’s not. It must “be transformative, altering existing teaching, learning, and assessment processes, academic work, and administration” (ibid., p. 38).

However, perhaps my main takeaway was the authors’ comment:

“But using analytics requires that we think carefully about what we need to know and what data is most likely to tell us what we need to know.” (ibid., p. 32).

It comes back to relevance and our purpose.

My other takeaways came from my colleagues’ questions and comments during the colloquium, such as:

  • there is a need for common vocabularies and semantics across vendors/services (Jerry)
  • people can interpret the same data in different ways (Jo Q.)
  • if you do nothing with the feedback, what is the point of it? (Jo Q.)
  • real learning only starts once the learner goes beyond the system and begins asking their own questions (Nadine)

One of my considerations during the colloquium was that data is only one aspect of the ‘big picture’: “I think perhaps that we do not totally rely on LA & AA, but need to take a holistic approach, that it is only one part”. A later reading by Sharkey (on the Blue Canary blog) validated this:

“I’ve always said that analytics complement the human decision-making process — they don’t replace it.”

The other part of this participatory learning was the sharing of resources by my colleagues, which sent me off in further directions to explore learning analytics.

Other concepts that came out of the colloquium conversation were ethics and privacy issues, trust, and student-centered learning analytics.

Beyond what was shared in the colloquium, I explored further. Rodríguez-Triana, Martínez-Monés, and Villagrá-Sobrino (2016) was an interesting read, as it delved into the world of small-scale, teacher-led innovations and considered not only higher education but also primary educational institutions. Considering most of the research appears to have looked at universities and colleges, the authors have provided a window on the limitations and constraints of learning analytics in primary settings. The major themes that arose from this research were “issues of privacy, data ownership and control, student identity, and ethical issues” (ibid., p. 44). Kruse and Pongsajapan (n.d., p. 2) state that “Analytics implementations seem to be primarily concerned with students poised to fail”. They continue by referencing the “constant language of ‘intervention’ that perpetuates an institutional culture of students as passive subjects”.

Both Rodríguez-Triana et al. (2016) and Kruse and Pongsajapan (n.d.) mention the benefit of involving stakeholders, letting them decide what information they want to share, and looking more closely at students’ own metacognitive reflection on their learning.

Kruse and Pongsajapan (n.d.) query whether learning analytics can show ‘significant learning’, as it cannot include the activity that takes place outside of the system, “with the result that only a small portion of a student’s learning and engagement is being captured” (p. 4), which is exactly what my colleague Nadine observed about her own learning.

And perhaps my parting thought (thanks to Graham C. for sharing it in the colloquium chat) comes from Mike Sharkey on the Blue Canary blog (January 18, 2016):

“In education, analytics can help you break down the problem and look at all of the pieces, but we need to rely on faculty, advisors, dedicated administrators, or the students themselves to take action and make a difference.  Remember that the next time you have a conversation about analytics in higher education.  Think about the problem you’re trying to solve.  Believe it or not, that problem isn’t “analytics”.  It’s more likely a higher level issue such as retention or effective teaching, and analytics by themselves won’t solve it for you.”

Endnote:

Shared summary by colleague moderators Nadine Bailey, Graham Clark and Jerry Leeson.

Guest Colloquium 1 summary

References:

Kruse, A., & Pongsajapan, R. (n.d.). Student-centered learning analytics. CNDLS Thought Papers. Retrieved from: https://cndls.georgetown.edu/m/documents/thoughtpaper-krusepongsajapan.pdf

Rodríguez-Triana, M. J., Martínez-Monés, A., & Villagrá-Sobrino, S. (2016). Learning analytics in small-scale teacher-led innovations: Ethical and data privacy issues. Journal of Learning Analytics, 3(1), 43-65. Retrieved from: http://epress.lib.uts.edu.au/journals/index.php/JLA/article/view/4581/5428

Sharkey, M. (2016, January 18). What analytics aren’t. Blue Canary [blog]. Retrieved from: http://bluecanarydata.com/theres-no-recipe-for-analytics/

Siemens, G., & Long, P. (2011). Penetrating the fog: Analytics in learning and education. EDUCAUSE Review, 46(5), 30-40.

Slade, S., & Prinsloo, P. (2013). Learning analytics: Ethical issues and dilemmas. American Behavioral Scientist, 57(10), 1510-1529. doi:10.1177/0002764213479366
