Thursday, March 28, 2013

Why Do We Need "Learning Analytics"?

http://mfeldstein.com/if-you-like-learning-could-i-recommend-analytics/

A colleague just sent this link to me; as I told her, I don't know whether to be encouraged that this argument is still being advanced, at least in some small niches of the world, or depressed that it HAS to keep being advanced over and over again. While the technical terminology used here is different, some of us have been saying such things for at least the 30 years I've been involved in higher education assessment work (apparently, I might add, to little avail). Of course, the current reality in higher education--shrinking budgets and state support, growing influence of the private sector and foundations in broad policy-setting and direction, and great advancements in all sorts of whiz-bang technological tools--creates a particularly ominous and potent confluence of forces, raising the threat/opportunity level beyond where it's been in times past; there's just something awfully seductive about "analytics" and color-coded "data dashboards" and the like, isn't there?

Part of me wants to take the traditional higher education curmudgeonly position and say "this, too, shall pass," but the other part thinks that may not work this time, and really, it's impossible to know whether that's actually true when you're in the middle of it. In any case, I also agree with Jerome that it's important we be open to adapting and using what we might learn from these ideas as we think about the complexities of learning, so we shouldn't just close our minds, cover our ears, and chant "la la la la," hoping it will all disappear if we don't listen to any of it. On the other hand, we also need to remember, as Michael Scriven reminded the field of educational evaluation some 40 years ago, "The question is not just, 'what does your machine produce,' but also 'how does your garden grow?'" Data analytics can be helpful in answering certain limited kinds of questions, but if we really want to understand the complex ecosystem of a classroom, let alone an academic program or a college, in ways that help direct us toward meaningful improvements, we need to ask and at least attempt to answer deeper qualitative questions, beyond even the "semantic data" that Jerome alludes to in his post (though that kind of data is certainly more helpful in this context than the "click stream data" he describes).

Beyond that, it's also important to remember--and to remind anyone pushing things like "learning analytics"--that we already have a robust literature about "analytics" in a learning context; we just happen to call it "formative assessment." Most of it is in the K-12 world, so outside of teacher education a lot of higher education types are blissfully unaware it exists, but it's extensive, well-supported, and ongoing. One of my favorite researchers and thinkers in this arena is Dylan Wiliam, who has keynoted at our ATL conference and worked with us on the Transition Math Project as well (he's actually speaking at the Washington Education Research Association spring meeting next month, but I understand it's sold out, so if you're interested in checking out his work, the resources on his web site will have to suffice...). Wiliam, along with his colleague Paul Black, wrote the classic piece "Inside the Black Box," which you can find on his site, but he's done a considerable amount of conceptual and empirical work since then on using classroom formative assessments to inform instruction and improve student achievement--to use his language, to "keep learning on track."
If "learning analytics" can help with that process, all the better, but let's start with the right questions and figure out what kind of tools and resources we need rather than the other way around!

 
