Does Tech Improve the Quality of Education?

There’s a fascinating formal debate taking place on The Economist site this week. The proposition put forward is:

This house believes that the continuing introduction of new technologies and new media adds little to the quality of most education.

Speaking to the pro side is Sir John Daniel, President and Chief Executive Officer of The Commonwealth of Learning which is “an intergovernmental organisation created by Commonwealth Heads of Government to encourage the development and sharing of open learning/distance education knowledge, resources and technologies. COL is helping developing nations improve access to quality education and training.”

I don’t know that I agree with Sir John when he says there’s a zero-sum relationship among volume, quality, and cost. Technology changes that, a lot. One of the really amazing things about using the Internet, as in the Open University model, for example, is that there’s very little marginal cost associated with opening a program to a large number of additional learners.

What’s sort of sad is that the guy arguing the con side, Dr. Robert Kozma, Emeritus Director and Principal Scientist at SRI International, has to carefully qualify his argument even to make it. He says “new technologies and new media do make a significant contribution to the quality of education, at least under certain circumstances.”

I’ve spent the last 15 years of my career facilitating the incorporation of technology into educational programs. I wouldn’t have been doing this if I didn’t think I was in fact adding value to the world. But it’s an area fraught with landmines.

The truth is it’s possible, and sadly common, to detract from the quality of a given learning opportunity by adding a lot of cognitive overhead in the form of new technology learners must master in order to access that learning opportunity! Except in programs explicitly designed to introduce technology, it’s probably a bad idea to require learners, or instructors, to master more than one new technological interface per course. And it’s really better for learning efficiency if there’s NO new technology at all to master, so that one can free up one’s brain to learn the program material being presented, without having to be on guard against looking stupid for not knowing exactly what to do.

My husband remarked to my son-the-junior-in-college and a classmate of his that really, they should savor their junior year. As juniors, they know their way around the campus. They know how to figure out what the professors want. They have mostly fulfilled their core requirements, and are free to study the subjects that interest them in depth. And they don’t yet need to devote energy to that looming question of “what’s next”. In short, they are free to concentrate on the learning tasks at hand, without all that excess overhead to manage.

I don’t know that we can reliably simulate junior year in the groves of Academe here in the corporate training world. But I do think keeping that ideal in mind might curb some of our more technophilic tendencies to throw a lot of cool new stuff into our programs without sufficient thought about where the resources to master that stuff must be taken from! If it’s worth teaching, it’s probably worth teaching in a format our learners are comfortable with. And if we’ve got a brilliant idea for a new format which we really do have reason to believe will improve the cost/quality/reach of our programming, it’s probably worth taking some time to train learners in its use apart from whatever challenging programming we’d like to put into it.