Saturday, August 08, 2009

Ten Things I (no longer) Believe About Transforming Teaching and Learning with Technology: Introduction

(revised November 23, 2009)

Working on a curriculum committee? Planning a new academic building? Are you a chief information officer? A director of a teaching center? Welcome to this discussion!

I've been writing a series of posts about how to improve (or even transform) college education. I'm 60 now, and I've begun to realize that I've changed my mind about many of the things I once believed in that area.

I first heard about an imminent technology revolution in higher education when I was a senior in high school.  Initially the change was to affect how people learned and what they learned (e.g., programming). Later the hopes also grew to include who could learn (e.g., distance learning) and a reduction in the costs of instruction.

I heard more about this coming transformation when I was in college, in graduate school, as a program officer for the Fund for the Improvement of Postsecondary Education (1978-85), as a senior program officer with the Annenberg/CPB Projects (1985-96), at the American Association for Higher Education (1996-97), and since Steve Gilbert and I founded the Teaching, Learning, and Technology Group (1998- ), a non-profit that supports over 100 colleges and universities around the world.

The buzz continues, continually changing its details. The catalyst would be mainframe engineering analysis and design programs. It would be simulations, computer-aided instruction, microcomputers, computer-aided design, videodisc, email, distributed laboratories, hypertext, Gopher servers, math tools, the Web, asynchronous learning networks, national learning infrastructure, streaming video, ePortfolios, eLearning, blended courses, course redesign, iPods, clickers, lecture capture, smart phones, digital cameras, social networking,...

Each new technology changed important elements of the direction -- students learning to program in BASIC on microcomputers would learn to think logically and creatively for themselves, videodisc would make education more visual and interactive, HyperCard would trigger a wave of hypertextual, interdisciplinary thinking...

There are common threads to this rapidly flickering vision, ideas that I've heard repeatedly over the decades:
  • Education will shift the student's role from passive to active. 
  • Education will become more self-paced. 
  • Many new kinds of students will be involved. 
  • As more of the explaining function of education becomes embodied in technological materials, the faculty role will shift from always being the "sage on the stage" toward spending more of their time coaching and managing the learning process ("a guide on the side"). 
  • Although the upfront costs will be large, there will be cost savings, too. 
  • And the exponential improvement in the power and efficiency of computer chips will help ensure that the wave of change we can already see will soon accelerate. 
Those are frequently repeated hopes.


Since 1966, most predictions I've heard for technology-driven, paradigm-shifting change in the "how" of learning have not come true: not pervasively, not yet.

For example, most of today's students still seem to do most of their learning through what instructors and textbooks explain to them. The students' job is still mainly to take notes, preparing to verify on the test that they have understood what they have been told. (Do you agree that this was, and is, the norm in colleges? If not, what was the norm 20-40 years ago? Has the norm changed?) I think the norm has changed a little, but not in the ways most people predicted years ago.

Please add your own observations (click the "comment" button below).

What claims for technology-driven change in learning did you hear, 5-10 years ago or more?

What did you think that academics would need to do in order to achieve that improvement in learning on a programmatic scale?

Have any of those hoped-for improvements in learning actually happened across your program?
Which predicted changes in learning have not yet occurred on that scale? Why not?
  • Were those goals for changes in learning actually unattainable or over-valued?
  • Were there problems with the proposed strategy for achieving those changes?
  • Was the new technology of the day inadequate to the task?
Link to: The first thing I no longer believe: to attract attention, people, and resources, be the first to buy into the hottest new technology.


  1. You left out slide projectors and strip films, oops - I meant film strips.

    Seriously, I wonder if you bought into the "transforming" hype. How about "improving" education?

    We do a much better job of educating visual learners, and simulations have opened up new learning tools for many, ...

    Have all of these smaller improvements added up to a transformation?

    That's a judgment call - but I note that a much larger proportion of our population gets diplomas.


  2. Anonymous, 9:27 AM

    Hi Steve. In the 80s I believed that synchronous network communication would encourage weak students to develop writing skills. I sadly concluded after several years that whatever skills students developed this way were not "writing" as I know and teach it, and of course the current Twitter generation is living (distant) proof of that. In the 90s I believed that asynchronous computer communication was the way to go because people could stop and think about what they were writing. Alas, on their own not much of that thinking occurred. By 2000 or so I had concluded that not computers but "communication" was the issue. It comes down to a teacher, some students, and the hard work of communicating in writing. I think teaching and learning writing was done well face to face with chalkboards and even stamped into mud. It is done better on computers, and personally I love teaching at a distance, but computers are ONLY a tool and can never replace good teaching or good learning. Diane Thompson

  3. Matt Wagner, 9:21 PM

    I was (and still am to a certain degree) a believer in the classroom flip. Our face-to-face class time was spend delivering information and application was in the form of homework done outside of the classroom. Technology has created more efficient ways to deliver information which in terms can free the classroom to become more dynamic.

    Technology is merely a tool and isn't good or bad, but what I think technology does is allow us to rethink how we teach and challenge our beliefs. For example, if I believe strongly in the importance of lecture, I may not completely change how I teach, but I may at least begin to think about ways I can be more effective in my lectures and how technology may play a part. I would say I am much more practical in my thinking about the impacts of technology. Even though there is large interest in something (like clickers during a pilot), that doesn't automatically translate to adoption. I am still excited about possibilities, but realize the avenue to adoption is much more complex than getting a demo of the newest and shiniest object.

  4. Steve-

    It seems that every generation is enamored with some new technology that is going to cause a tectonic shift in teaching and learning. Alas, we're still waiting for the promised changes. (Cuban's book Oversold & Underused is a great take on this issue.)

    If we're to learn from the past, we have to get out of the paradigm of technology-driven change. Technology will continue to march forward regardless of what we do in the educational space. In many instances we've appropriated technology and improved things--but largely in areas of administrivia. We've used technology to improve efficiency, but not learning (at least not in the kind of way Bloom envisioned when he articulated the "2 sigma" problem).

    We have to get to the real heart of the issue, one that you note in your post--we have to change the relationships, roles, expectations, and patterns of interactions between teachers and learners, learners and learners, and learners and the broader communities in which they live. Certainly there are technologies that can afford these sorts of changes, but the problems we face are still people problems, not technical problems.

  5. I like Jon's comment. In essence, that's what this whole series of blog posts is about. The things I no longer believe (the posts numbered 1-10) relate to a technology-driven view of change, while the things I now suggest (lettered A-H) relate to a particular view of what we humans need to do, and how we might use technology, among other ingredients, to get there.


What do you think?