
Geoff Johnson: Artificial intelligence augments learning, but won’t replace it


You can’t blame educators for their long history of being fascinated with futurism.

From George Orwell’s 1984 and Arthur C. Clarke’s 2001: A Space Odyssey right through to Alvin Toffler, educators have been intrigued by books claiming to provide a glimpse of what the future might hold for the kids sitting in classrooms today or, for that matter, the future of the teaching profession itself.

Unfortunately, these glimpses have often relied on dystopian visions in which robots take over everyone’s jobs and, as Toffler suggested in his 1970 bestseller Future Shock, the pace of technological change causes widespread psychological breakdown and general social paralysis.

Social-media futurologists, who seldom let facts get in the way of a good ominous prophecy, sense impending employment doom for the next generation, quoting misleading observations that imply technology is torching the economy even as it snuffs out human jobs.

Uber, they point out, is just a software tool that owns no cars, yet it is now the biggest taxi company in the world, while Airbnb, possibly closing in on being the biggest hotel company on the planet, owns no properties.

And so on.

Possibly most disturbing to Tofflerites is the emergence of technology such as IBM’s Watson, a question-answering computer system originally developed as an artificial intelligence Jeopardy game-show competitor. It is now also capable of answering questions posed in natural language and derived from a wide variety of professional knowledge bases.

Serious applications soon materialized such as health-care programs that assist doctors and nurses in making evidence-based clinical decisions.

IBM also claims that there is now a Watson application that will provide basic legal advice within seconds with 90 per cent accuracy.

Fortunately, alongside all this “robots will take over the world by 2050” futurism, saner heads are prevailing.

In a 2018 Forbes article, Erik Brynjolfsson and Daniel Rock of MIT, along with Tom Mitchell of Carnegie Mellon University, point out that the impact on jobs of machine learning, the self-programming, self-adjusting core of AI, is far from certain.

In fact, they say, few jobs can be fully automated using machine learning. While machine-learning technology can transform many jobs in the economy, full automation will be less significant than the re-engineering of processes and the reorganization of how the same tasks are accomplished.

Irving Wladawsky-Berger, who for more than 30 years influenced and shaped IBM’s innovation and technical strategy, agrees. As quoted in the 2018 Forbes article, Wladawsky-Berger suggests that while some job activities “are more susceptible to automation, others require judgment, social skills and other hard-to-automate human capabilities. … Just because some of the activities in a job have been automated, does not imply that the whole job has disappeared.”

On the contrary, he continues, “automating parts of a job will often increase the productivity and quality of workers by complementing their skills with machines and computers, as well as enabling them to focus on those aspects of the job that most need their attention.”

While researchers cannot predict how many new jobs and careers new technology will create, one study by Gartner, a leading global research and advisory firm, again as quoted by Forbes, predicts that while 1.8 million jobs will be lost by 2020, 2.3 million new ones will be created.

Even today, according to Gartner, there are a huge number of technology jobs that did not exist 10 years ago: State-of-the-art programming, data science, web security, online marketing and sales are a few examples.

“There is no reason to believe that the need for humans to create and manage new technology will decrease,” says Gartner.

The point is that while technology makes it easier than ever to query Google or calculate a math problem, educators, lawyers and doctors, along with tradespeople and both skilled and unskilled workers, will be the ones determining what new knowledge is needed to thrive in a technology-saturated workforce.

Teachers, for example, are finding that educational models that once focused on equipping students with the specific skills demanded by 21st-century occupations are now more concerned with teaching students how to learn on their own.

In that sense, evolving technology will never undermine a teacher’s role in the classroom; instead, it will augment it. As Bill Gates says: “Technology is just a tool. In terms of getting the kids working together and motivating them, the teacher is still the most important.”

Geoff Johnson is a former superintendent of schools.