
Dazzled by visualizations? Put on the X-ray glasses!

I’m taking an open course on Learning and Knowledge Analytics (aka LAK11) facilitated by George Siemens. Learning analytics is the use of data and analysis models to discover information about learning. The goal of learning analytics is to predict which practices are more effective.

Moving from data collection to visualization

This week’s topic in LAK11 is data visualization. As with any form of communication, the better you are at engaging your viewer (drawing him or her into your web of data) and the clearer you make the patterns in your data, the more likely you are to spark discussion. You’re also more likely to gather stakeholders who consider and care about the underlying questions the data collection is designed to answer.

There are many examples of data visualization:

  • Infographics are increasingly popular
  • Many Eyes offers many examples of data sets presented in graphically appealing forms
  • Gapminder allows you to view and interact with data using sliders on graphs

Data visualization in higher education

In a previous post, I described some of the analytic approaches to evaluating data obtained from learning management systems (LMSs). One of the examples raised and revisited this week is the Signals program at Purdue University (Arnold, 2010). Here, a student’s “progress” is represented using a stoplight metaphor: the student sees green, yellow, or red dots next to the name of each course she’s taking. I’m not reproducing screenshots of the dashboard here for copyright reasons, but you can view samples here and sample interventions here.

Dazzled by visualizations?

I do believe that analytics relating to student interactions with an LMS can be useful, but I have yet to see them applied or displayed in an informative way. The main problem is that the data being analyzed tend to remain the data that a third-party LMS was initially designed to measure: e.g., grades, time spent accessing the system, number of comments in forums. This information, by itself, doesn’t really tell a student what or how she’s learning.

There’s no doubt you can create striking visualizations relating to these parameters and that these visualizations can create an emotional resonance (Pretty! Or… Scary!). Visualizations may also provoke some sort of unstructured action: I need to do something, but what? I need to study more, but study what? I should check the LMS more often (not a problem; I can do that while watching television). I need to comment more. This last call to action can be useful, provided that commenting actually prompts some self-reflection and triggers some peer interaction. Still, the calls to action are all pretty vague and unformed. The feedback is also focused on the student’s interactions with the LMS, so it doesn’t capture how the student is using resources outside the LMS, to her benefit or detriment. The student may want to be self-directed, but she needs some guidance.

The call to action may also affect instructors, who are not immune to the charms of a visualization. An instructor may see green lights (Pretty!) or yellow or red lights (Warning, a student “at risk”!). The instructor receives a similarly vague call to action: I should send an email or text message and hopefully offer some constructive help… but what help? Where’s the student going wrong?

The analytics system tells the instructor that the student is falling below some threshold for grades, amount of time spent accessing the LMS, or commenting, but doesn’t really tell the instructor what the student doesn’t understand or when the student’s losing her motivation.
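
To make that limitation concrete, here’s a minimal sketch (in Python, with hypothetical metric names and cut-offs; this is an illustration, not Purdue’s actual Signals algorithm) of the kind of threshold rule such a dashboard implies: coarse LMS metrics go in, a single color comes out, and nothing in the output points to what the student misunderstands.

    # A hypothetical threshold-based "stoplight" rule; the metric names and
    # cut-offs are invented for illustration, not taken from Signals.
    def stoplight(grade_pct: float, lms_hours_per_week: float, forum_posts: int) -> str:
        """Map coarse LMS metrics to a green/yellow/red signal."""
        flags = 0
        if grade_pct < 70:             # assumed grade threshold
            flags += 1
        if lms_hours_per_week < 2:     # assumed "engagement" threshold
            flags += 1
        if forum_posts < 3:            # assumed participation threshold
            flags += 1
        return ["green", "yellow", "red", "red"][flags]

    # A student with decent grades who posts in the forums but rarely logs in
    # gets a "yellow": the signal says "do something" without saying what.
    print(stoplight(grade_pct=78, lms_hours_per_week=1.5, forum_posts=4))  # prints "yellow"

A rule like this only reports that a threshold was crossed; it has no way to say which concept the student is struggling with, which is exactly the gap the instructor still has to fill.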

The impact of a visualization on students

While instructors may recognize that a data collection system is imperfect, nascent, and experimental, it’s worth considering the impact of data visualizations on students.

A student who sees a panel of green lights may feel motivated to keep doing what she’s doing (whether or not she quite understands what about her approach triggered the green light). If the act of studying and problem-solving isn’t intrinsically satisfying, a little extrinsic motivation might have a short-term positive effect.

Alternatively, a student might feel alarmed and discouraged if her efforts are rewarded with yellow or red labels. This may be a productive call to action: studying more may help, as may interacting more with peers. But it may not be an efficient call to action. As noted above, the student can’t tell where to focus her efforts by looking at the visualization alone. And if the signals are not updated dynamically, the student gets a false sense of her progress and/or her problems.

The critical human element

In the Signals program, the visualization is not the be-all and end-all of the system. A professor sends an email and reaches out to the at-risk student (though arguably, only after some anxiety has already been triggered by the visualization). If this email leads to a personal conversation about the student’s specific study habits, challenges, and areas of difficulty, then the system becomes more worthwhile, if not a perfect paradigm. (And we hope this is what goes on at Purdue.) If the email is sent out as a semi-automated form letter, you’ve merely exacerbated the negative impact of the visualization and highlighted the fact that not all feedback is good or worthwhile; some types of feedback are, in fact, damaging.

Further, take the case of the student who stays in the green. She doesn’t get an “atta girl” that’s in any way meaningful. The system doesn’t recognize her individual interests and passions, since she’s not on the radar as trouble. Her feedback is also not particularly enriching, though it may create a transient sense of reward.

So we’re back to the fact that the human element, the interaction between students and instructors, remains a vital part of the learning system. Don’t get the impression that I’m against data collection; I think it’s important to challenge the soft spots in anecdotal stories of success and failure and to always try to do better, and data can help with this. It’s just useful to bear in mind that in some cases the emperor may not be wearing any clothes.

Put on the X-ray glasses

The mere fact that we’ve attached analytics and a striking visualization to our systems can create a false sense of action and authority. There’s an “algorithm behind the system,” so it must be discerning the influences that matter, right? If data are arranged in a visual, colorful format, the representation is even likely to spread virally. There’s an assumption that someone who’s gone to the trouble to develop an algorithm or create a pretty infographic must have done the work up front to vet the data sources and the sense-making instrument. They must have thought carefully about the assumptions that went into creating the data and then represented the most salient points, right?

It’s very likely that people have thought carefully about the assumptions, but that doesn’t mean those thoughts are foolproof or beyond further scrutiny. Further, the conclusions drawn from a visualization are equally, if not more, error-prone, because we distill information through our own personal filters of what’s important.

My call to action whenever you see a data visualization is that you:

  • examine the source of the data and how it was collected
  • ask what was included and what was dropped for the sake of clarity (whether by a human being or by the software collecting the information)
  • investigate and question the underlying assumptions behind the data collection method and visualization
  • have discussions with others to check your own assumptions and filters

Do attempt to identify patterns in data visualizations, but recognize that these patterns always need to be tested. In the microblogging universe in which we live, question every conclusion you see, no matter how many times it’s retweeted or shared. Do it in a positive, respectful way, of course, because cynicism can be equally closed-minded and facile.

Bottom line

We don’t overcome the limits of analytics by dismissing analytics; we overcome them by crafting better questions and better data collection systems, and by retaining our human insights and sense of wonder throughout the process.

Reference

Arnold, K.E. (2010). Signals: applying academic analytics. Educause Quarterly, 33(1). Retrieved from http://www.educause.edu/EDUCAUSE+Quarterly/EDUCAUSEQuarterlyMagazineVolum/SignalsApplyingAcademicAnalyti/199385