Thursday, March 28, 2013
Why Do We Need "Learning Analytics"?
http://mfeldstein.com/if-you-like-learning-could-i-recommend-analytics/
A colleague just sent this link to me; as I told her, I don't know whether to be encouraged that this argument is still being advanced, at least in some small niches of the world, or depressed that it HAS to keep being advanced over and over again. While the technical terminology used here is different, some of us have been saying such things for at least the 30 years I've been involved in higher education assessment work (apparently, I might add, to little avail). Of course, the current reality in higher education--shrinking budgets and state support, growing influence of the private sector and foundations in broad policy-setting and direction, and great advancements in all sorts of whiz-bang technological tools--creates a particularly ominous and potent confluence of forces, raising the threat/opportunity level beyond where it's been in times past; there's just something awfully seductive about "analytics" and color-coded "data dashboards" and the like, isn't there? Part of me wants to take the traditional higher education curmudgeonly position and say "this, too, shall pass," but the other part thinks that may not work this time, and really, it's impossible to know whether that's actually true when you're in the middle of it.

In any case, I also agree with Jerome that it's important we be open to adapting and using what we might learn from these ideas as we think about the complexities of learning; we shouldn't just close our minds, cover our ears, and chant "la la la la," hoping it will all disappear if we don't listen to any of it. On the other hand, we also need to remember, as Michael Scriven reminded the field of educational evaluation some 40 years ago, "The question is not just, 'what does your machine produce,' but also 'how does your garden grow?'" Data analytics can be helpful in answering certain limited kinds of questions, but if we really want to understand the complex ecosystem of a classroom, let alone an academic program or a college, in ways that help direct us toward meaningful improvements, we need to ask and at least attempt to answer deeper qualitative questions, beyond even the "semantic data" that Jerome alludes to in his post (though that kind of data is certainly more helpful in this context than the "click stream data" Jerome describes).

Beyond that, it's also important to remember--and to remind anyone pushing things like "learning analytics"--that we already have a robust literature about "analytics" in a learning context; we just happen to call it "formative assessment." Most of it is in the K-12 world, so outside of teacher education a lot of higher education types are blissfully unaware it exists, but it's extensive, well-supported, and ongoing. One of my favorite researchers and thinkers in this arena is Dylan Wiliam, who has keynoted at our ATL conference and worked with us on the Transition Math Project as well (he's actually speaking at the Washington Education Research Association spring meeting next month, but I understand it's sold out, so if you're interested in checking out his work, the resources on his web site will have to suffice...). Wiliam, along with his colleague Paul Black, wrote the classic piece "Inside the Black Box," which you can find on his site, but he's done a considerable amount of conceptual and empirical work since then on using classroom formative assessment to inform instruction and improve student achievement--to use his language, to "keep learning on track."
If "learning analytics" can help with that process, all the better, but let's start with the right questions and figure out what kind of tools and resources we need rather than the other way around!
Thursday, March 7, 2013
Education with purpose
Matt Yglesias over at Moneybox makes an interesting point while responding to Felix Salmon's suggestion that watching lectures on YouTube is "100% education, 0% credentialing." As Yglesias sees it, education isn't just about the learning; education signals something about one's normalcy:

"In particular, since going to college is a normal bourgeois thing to do in America in 2013, doing it indicates that you are a normal American who subscribes to normal bourgeois values. A summer intern who's just finished up her third year at Yale doesn't have any kind of particular credentials, but we know that she probably has very good SAT scores and sounds like an exceedingly normal person. A young woman who got a 1600 on her SATs and has been spending the past three years working at 7-11 and watching Open Yale Courses videos sounds like a huge weirdo. And employers seem to genuinely value that 'you're not a weirdo' factor. …"
I see and hear this sort of discourse about higher education frequently in one form or another. From my perspective as an educator, there are at least three things misguided about it, starting with Salmon's equation of watching video lectures with getting an education. While learning can certainly take place in many forms and modalities, an education implies something more structured and systematic than streaming a few YouTube videos: an education includes exposure to a breadth of knowledge, appropriate assessments, feedback, and interaction with others. To be sure, fully online degrees can provide this sort of structure, and the technology is improving to the point that meaningful interaction can take place too, but you can't exactly find that for free on YouTube just yet.

Second, I think it's important to keep in mind the purpose of an education. Indeed, if we want to render any worthwhile judgments at all with respect to our social practices, we need to keep in mind what our purposes are. This is hardly a new claim; Alasdair MacIntyre and other neo-Aristotelian thinkers have been making this kind of argument since the late 1980s.

Third, as I've argued before, the purpose of higher education is learning, and on a broad social level that learning is related to the empowerment that comes with the pursuit of knowledge and truth. Granting degrees and credentials is a means to ensure student learning, but it isn't the real purpose. In short, Yglesias gets the purpose wrong. Ideally, education isn't about signaling to the world that you're normal or that you're qualified for a job, even though those considerations may play into why some people are interested in obtaining a degree or credential. Education is about transformative learning. It's just that too often in our discourse we reduce education to an instrumental interest, which risks degrading the original purpose. I'm willing to admit that some people--including more than a few students--may disagree that the "real" purpose of education is learning, and there are different ideas as to what learning means as well. I'll need to leave it at that for now, but it's an issue I'll come back to in a future post. Our education depends on our getting the purpose(s) right.
Friday, March 1, 2013
Committee work and...student success?
I've always thought that faculty engagement is one of the primary keys to student success, and a sufficient cadre of full-time faculty seems necessary to that effort. That's not a criticism of part-time faculty, many of whom are contributing members of our campus community and excellent teachers too. Yet, on balance, part-time faculty members are often pushed for economic reasons to cobble together employment from multiple institutions, and they generally aren't expected (or compensated) to do more than classroom teaching. Full-time faculty, by contrast, are expected to be more thoroughly engaged with students and the institution; their job descriptions typically include maintaining the curriculum as well as providing service to the broader campus community. As a result of budget cuts, however, there tend to be fewer full-time faculty members to shoulder this work.

This issue recently surfaced on my campus, and I think it's telling: in one of our academic divisions, no full-time faculty member was willing to step forward to serve as chair.

To be sure, engagement in administrative and governance activities may not be the kind of engagement that leads directly to student success. I wouldn't argue that all engagement is equal, after all, and as this brief article from the Community College Times suggests, a healthy faculty culture focused on student learning and success has a significant impact on student success rates (see Valencia CC in Florida). One could debate whether serving as a department chair, on the faculty senate, or on some other campus-wide committee is related to student success. Intuitively, though, I think service to the campus is connected to the quality of learning provided to students: a faculty that is deeply invested in the institution and takes ownership of it helps to create a broader culture of engagement. That sounds right to me in theory, anyway, though I'd be curious whether anyone knows of studies or evidence to support it. If you're aware of any scholarship along these lines, drop me a line at Kenneth.lawson@skagit.edu.
6 Things You Should Say to Your Professor
From USA Today College:
Published February 19th, 2013
By Ellen Bremen, M.A.
Everything you read about speaking to professors warns that you shouldn’t tick them off, ruin their impression of you or say something to sink you further.
But should you keep your mouth shut? No!
I’ve witnessed 14-plus years of student interactions and I teach interpersonal communication. Not talking to your instructors isn’t the answer and texting is out of the question. The right words give you a shot at solving your problem and possibly getting better grades.
I’ve got two pocket themes you can use with your professor that work for many class-related situations: “Here is what I’ve done…” and “Here is what I propose…” A little background behind my recommendations:
When students have issues — late work, absences, needing help — they often come across as, “Solve it for me. Now!” Professors’ reactions? Mildly irritated to snarky to downright grizzly. Would you throw your lateness, your missed work or your confusion in your boss’ lap and say, “Fix it!”? That could cost you a job, right? You want to appear ultra-professional with your professor, even if you’ve blown something up. So use these phrases to sound proactive, rather than reactive.
• Instead of saying, “I’m so lost!” say, “I am confused about our upcoming paper. Here is what I’ve done: I read over the assignment sheet. I reviewed your examples. I am stuck on the transitions and two of my sources. Can you help with that?”
Read the full article...