I’m taking an open course on Learning and Knowledge Analytics being facilitated by George Siemens. An open course is a bit like an all-you-can-eat buffet; there’s a wealth of goodies available for you to digest, but you get to decide what to put on your plate. People create learning structures based on their interests and affinities for methods of study, socializing, and expression. As part of my own approach to making sense of what I’ve been learning, I’ll be blogging about the course from time to time.
Learning and knowledge analytics in higher education
This week, I had the opportunity to listen to a talk by John Fritz, who was kind enough to share his views on learning analytics and his experiences in using LAK at the University of Maryland, Baltimore County (UMBC).
Currently, many educators are implementing LAK projects through the learning management systems (LMSs) used in their own institutions (e.g., Blackboard and Moodle). This is both a good and a bad approach, in my opinion. It’s good because the systems are designed to collect data automatically and users don’t have to do any additional programming. Common LMS use by many different institutions means that a wealth of data can be collected. However, it’s also bad because the systems are designed to collect data automatically, which means that data collection isn’t particularly hypothesis-driven (i.e., designed to prove or disprove a theory) and the systems won’t necessarily provide the data that are most useful.
Right now, the data being collected generally relate to grades and to how often users dip into an LMS. LAK approaches are not yet addressing how learners use information from the LMS and from systems outside it, or how they translate these sources of information into problem-solving capabilities (not at all an easy issue to tackle).
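To make the contrast with hypothesis-driven analysis concrete, here is a minimal sketch of what testing a specific question against LMS activity data might look like. The record format and numbers are entirely hypothetical; no actual LMS exports data in this shape.

```python
# Hypothetical example: starting from a question ("is login frequency
# associated with final grades?") rather than from whatever the LMS
# happens to log. The records below are invented for illustration.

records = [
    # (student_id, weekly_logins, final_grade_percent)
    ("s01", 2, 61.0),
    ("s02", 9, 88.0),
    ("s03", 5, 74.0),
    ("s04", 1, 55.0),
    ("s05", 7, 81.0),
]


def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length lists."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)


logins = [r[1] for r in records]
grades = [r[2] for r in records]

# The hypothesis dictates which fields we need the system to capture.
r = pearson(logins, grades)
print(f"correlation between logins and grades: {r:.2f}")
```

The point is not the arithmetic, which is trivial, but the direction of travel: the question comes first, and it tells you which data the system must capture, rather than the system's default logs dictating which questions you can ask.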
What LAK and Health Information Analytics have in common
In a bit of synchronicity, I came across a monograph this week on building a digital infrastructure for a Learning Health System (Grossman & McGinnis, 2010). The monograph was the culmination of a series of workshops sponsored by the Institute of Medicine (IOM) that tapped into the experiences of researchers, computer scientists, privacy experts, clinicians, health information technology professionals, patient advocates, health care policy makers and other stakeholders. The purpose of the workshops was to identify strategies and priorities to “build a more seamless learning enterprise that will improve the health and health care of Americans.”
I was struck by the similar challenges faced by those considering LAK and health information analytics (aka HIA, aka health informatics, aka HIT, aka ehealth). Both LAK and HIA represent fields in which the goal is to move beyond mere data collection to knowledge generation, and ultimately, to interventions. In the case of LAK, the ideal is to use data to support self-directed learners so they can make better use of educational systems and resources. In the case of HIA, the ideal is to make patients (who are learners in this system as well) the center of their own health care strategies.
In both cases, learners are generators of data, but also benefit from consuming data, provided that the data are presented in meaningful ways that lead to knowledge acquisition and ultimately to problem-solving capabilities. In both LAK and HIA fields, there are significant challenges involved in making data both transparent and meaningful. Mere access does not lead to understanding or transformative behaviors.
Some themes raised by the IOM monograph strike me as common to both LAK and HIA. There’s the need…
- To build a shared learning environment
- To leverage existing technologies
- To keep systems simple
- To merge the social and the technical
- To “weave a strong and secure trust fabric among stakeholders”
Both LAK and HIA suffer in that learners currently participate more as providers of data than as sharers of it. This lack of transparency can lead to a lack of trust and a loss of empowerment. As Grossman and McGinnis note, a critical strategy for moving forward in building a data infrastructure is to provide learners with “useful information concerning…the relevant state of evidence, and [to give] them more responsibility for utilizing this information in their own decision making” (Grossman & McGinnis, 2010).
Needs-driven data collection
The IOM monograph also points out the need to continuously evaluate the data collection system and the questions being asked. Moving from data to wisdom requires analysis not just after the data are collected but before. Without considering what questions you actually want answered, and whether those answers will in fact be useful in driving the interventions you desire, data collection provides the appearance of authority without actual authority. We all know how definitive conclusions can sound once you start associating them with numbers. In a world increasingly characterized by Twitter sound bites, it’s vital to question assumptions, to consider alternate explanations for such conclusions, and to craft better questions at the outset.
On the positive side, participants in both the instructional design and health fields tend to be pretty passionate about seeking wisdom. Developing and improving practices is an iterative process creating both opportunities and challenges, a model of “continuous improvement rather than an all-at-once effort” (Grossman & McGinnis, 2010).
Grossman, C. & McGinnis, J.M. (2010). The Digital Infrastructure for a Learning Health System: Foundation for Continuous Improvement in Health and Health Care – Workshop Summary. Institute of Medicine of the National Academies. Washington, DC: National Academies Press. Retrieved from http://www.nap.edu/catalog/12912.html
Additional references from the LAK11 course:
- Baker, R.S.J.d. & Yacef, K. (2009). The State of Educational Data Mining in 2009: A Review and Future Visions. http://www.educationaldatamining.org/JEDM/images/articles/vol1/issue1/JEDMVol1Issue1_BakerYacef.pdf
- Goldstein, P.J. (2005). Academic Analytics: Uses of Management Information and Technology in Higher Education. http://net.educause.edu/ir/library/pdf/ecar_so/ers/ers0508/EKF0508.pdf
- Elias, T. (2011). Learning Analytics: Definitions, Processes, Potential. http://learninganalytics.net/LearningAnalyticsDefinitionsProcessesPotential.pdf
- Siemens, G. (2011). Learning Analytics: Foundation for Informed Change in Higher Education. http://www.slideshare.net/gsiemens/learning-analytics-educause (recording from EDUCAUSE presentation, January 10, 2011)