It had been a while since I’d attended a New Media Consortium conference — five years, to be exact. The approximate lifespan of a rabbit — or of any interest in Second Life as an educational platform. Way back in 2007 it seemed like at least every other session was about SL, the SL whose users are now outnumbered by Farmville addicts 35-1.
This year’s NMC was considerably more interesting, not least for a fundamental tension running through the conference — and extending, I’d argue, into the dual-track courseware-for-the-masses ventures of our host, MIT. Does the university release modular, somewhat decontextualized elements of its learning programs onto the open web — or does it work up a comprehensive platform and run full courses online? OpenCourseWare or edX?
At issue at NMC, throughout, was the need to spur actual learning. We begin in childhood curious and playful, it was posited repeatedly (nobody used the phrase “trailing clouds of glory,” but it was in the air) — we begin piecing together the world in gladness, effortlessly making discovery after discovery, until the crush of the educational system smothers that organic desire to learn, wholly, implacably. Numbed by standards-based tests and rote knowledge-dumps, marooned in problem sets and lectures and test routines and settings that were stale a hundred years ago, drowning in debt and cut off from employment, the hungry sheep look up and are not fed. Cue technology.
Despite ample experience to the contrary (and, to be fair, not a little experience pointing the other way too), we keep hoping that machines will make the acquisition of information easier, more effective, better. Surely if only we could tailor knowledge into the channels of computational processing, or teach machines the quirks of our natural language, or train them better on our associative behavior, or leverage them, at least, to expose what we don’t know, we would stake out some real progress.
And yet, whenever I’m in a dystopic mood, our increasing reliance on machines to “personalize” data seems profoundly isolating — cutting us off from the complexity of unchanneled (or paranodal) experience. Pursuing networks, efficiencies, we eliminate the unquantifiable — the uncanny, the unexpected, the awesome, the murky drives that shape us so powerfully. The best stories I know about knowledge are incredibly messy, beholden to motivations and coincidences no artificial intelligence could anticipate. See Oedipus — see everything (character, fate, chance) that is necessary for him to actually see.
The press is all over the online education story. It’s official: Harvard and MIT are hitching up to birth yet another MOOC venture — massive open online courses, such an ugly acronym — to the tune of $60 million. edX thus joins Coursera, Udacity, and whatever next venture blooms by the time you read this in rolling out prestigious curricula to the masses. It will be led by the director of MIT’s Computer Science and Artificial Intelligence Laboratory (just as Udacity was co-founded by an AI innovator, the inventor of Google’s self-driving car and Google Glass).
These stories in the press rarely pursue implications of AI-driven MOOCs. Among their other aspirations, MOOCs promise to be giant learning data farms, ones that loop information back into customization and management of the learning experience. The emphasis is on “massive”: wash a giant population through your system and it refines itself. Ithaka’s recently published Barriers to Adoption of Online Learning Systems in U.S. Higher Education makes the point:
By gathering data on how thousands of students progress through a common body of material, these systems should be able to help future curriculum planners optimize the sequence and design of courses and modules.
As crowds course through your system, their activity is harnessed and fed back, making the system smarter. Just as the key to making IBM’s Watson a Jeopardy champ was supplementing its giant database with just-in-time registers of human decisions, MOOCs promise to be more than a giant pile of course material: they will monitor quizzes, ratings, and discussions; they will improve themselves.
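At its crudest, “harnessing activity and feeding it back” could mean nothing fancier than aggregating a clickstream and flagging the modules where students stumble. A toy sketch, with event names and a threshold I have invented purely for illustration:

```python
from collections import Counter

def find_trouble_spots(events, threshold=0.3):
    """Toy clickstream aggregation: flag course modules where a large
    share of logged interactions signal struggle (a wrong answer, a
    video rewind). The event format and the 0.3 threshold are invented
    for illustration; real learning-analytics pipelines are far subtler."""
    attempts = Counter()
    struggles = Counter()
    for student, module, outcome in events:
        attempts[module] += 1
        if outcome in ('wrong_answer', 'rewind'):
            struggles[module] += 1
    # A module is a trouble spot if struggles make up too large a share.
    return [m for m in attempts
            if struggles[m] / attempts[m] >= threshold]
```

Wash enough students through, the thinking goes, and lists like this tell future curriculum planners where the sequence of modules needs redesign.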
The impulse of MOOCs is no doubt altruistic: opening up the ivory tower, distributing the best teaching to the broadest possible public. But we may be many, many more failures away from understanding how that might work in practice: how the artifacts of an in-person educational institution, produced by scholars and administrators dancing to local imperatives and cut off from the full landscape of global access, translate meaningfully at scale.
Is there not, though, something poignant if not perverse about examples like this?
Online courses with thousands of students give researchers the ability to monitor students’ progress, they said, identifying what they click on and where they have trouble. Already, a researcher from the Harvard Graduate School of Education, using the M.I.T. Circuits course, found that students overwhelmingly preferred to read the handwritten notes of Professor Agarwal rather than the same notes presented on PowerPoint.
When the topic of study is mechanistic, or at least rational enough so that there are correct and incorrect choices that can be monitored and corrected by bots, these projects seem, well, natural. But when they stray into the humanities…
The edX project will include not only engineering courses, in which computer grading is relatively simple, but also humanities courses, in which essays might be graded through crowd-sourcing, or assessed with natural-language software.
Here we go. The day a computer starts grading Janie’s analysis of The Great Gatsby, say, seems very close, and very dreary. Ten points for mentioning the green light in the first five sentences; escalating points for increasing proximity of “America*” and “disillusion*”; passive construction to be scoured out and flagged. Will it matter what she thinks when she’s just talking to the machine? And yes, there are a thousand clever Facebookish ways to match up users of the system, a myriad of recommendation engines, any number of ways to set up pristine (because secure and erasable) interactive spaces.
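Played straight, that dreary rubric is almost trivially codable, which is rather the point. A toy sketch in Python; every rule, weight, and pattern below is my own invention, not anyone’s actual grading software:

```python
import re

def grade_gatsby_essay(text):
    """Satirical keyword-rubric grader. All rules and weights are
    invented for illustration; no real essay-assessment system is
    claimed to work this way (one hopes none does)."""
    score = 0
    sentences = re.split(r'(?<=[.!?])\s+', text.strip())
    # Ten points for mentioning the green light in the first five sentences.
    if any('green light' in s.lower() for s in sentences[:5]):
        score += 10
    # Escalating points the closer "America*" sits to "disillusion*".
    words = text.lower().split()
    america = [i for i, w in enumerate(words) if w.startswith('america')]
    disillusion = [i for i, w in enumerate(words) if w.startswith('disillusion')]
    if america and disillusion:
        gap = min(abs(a - d) for a in america for d in disillusion)
        score += max(0, 10 - gap)  # fewer words apart, more points
    # Passive construction scoured out and flagged (crudely: was/were + -ed).
    passives = len(re.findall(r'\b(?:was|were)\s+\w+ed\b', text.lower()))
    score -= 2 * passives
    return score
```

Whether Janie actually thought anything about Gatsby never enters the function.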