“Anyone who can spell ‘artificial intelligence’ can get a job these days,” observed Dame Wendy Hall at the Times Higher Education World Academic Summit in Singapore last autumn.
What she resisted adding was the usual rejoinder that 10 years hence all those jobs will have been taken over by the technology in question.
That is how the story often goes: advances in AI are either the answer to all our problems and the basis for a world of jobs yet to be imagined, or a tech nightmare that will render humans obsolete.
Hype notwithstanding, if this is your field, you are currently in high demand, with universities and the tech industry locked in what the president of McGill University has described as a “war for talent”.
This is a big problem. Universities, after all, have something unique to offer with their ethical guidelines, multidisciplinary approaches and focus on big questions and blue-sky research.
Tech companies, on the other hand, have deep budgets and unrivalled data and computational capabilities – along with more short-term and commercially driven goals.
Both have roles to play, and it was in this context that we set out to ask the real experts – primarily professors in AI-related fields – to assess the state of AI play for higher education.
The findings of this unique survey are discussed in our cover story.
But alongside the data, it is always helpful to hear the considered view of individuals, and I recently discussed the issues with Toby Walsh, professor of AI at UNSW, Sydney.
The idea that robots are about to take over is fundamentally misguided because “robots do exactly what we tell them to do”, Walsh told me at the recent THE Research Excellence Summit.
“The more I study AI, the more respect I have for the human brain. We have extraordinary breadth of ability, adaptability, creative, social and emotional intelligence. So there’s a lot still we have to work on, and there are a lot of human strengths that I am not sure we will ever replicate in silicon.”
In Walsh’s view, AI should stand for “augmented intelligence”, in which technology is used as a tool to “amplify what we can do with our brains, just as we have with our muscles in the past”.
The real danger, he argues, is not that technology outsmarts us, but that we misuse “stupid AI”.
“I’m optimistic in the long term – investing in technology has always brought about a better quality of life – but I am quite pessimistic in the short term,” he said.
“It is going to be a very bumpy road, and we are already starting to see some of those bumps – the way in which our political discourse is being eroded by the misuse of technology, the growing inequality and discontent in society that is often driven by technological change.
“People think that technology changes society, and it does. But equally society gets to change technology, and these are things that we need to think carefully about.”
This, perhaps, points to one of the fundamental strengths and purposes of the university, and why it is a matter of grave concern that, as our survey confirms, universities are struggling to recruit and retain talent in this area.
“It’s very hard to keep hold of people, to recruit professors to educate the vast numbers of people we need with these sorts of skills,” Walsh agreed.
“We do need creative solutions and partnerships, and tech companies recognise that it’s not in their interest to leave universities to wither on the vine.
“One of our strengths is diversity of thought: the realisation that it isn’t just technologists who should be answering these questions; we need ethicists, philosophers, sociologists, economists, political scientists, historians – universities are the perfect place to have those broad conversations and to have the long-term vision that we need.”
The second great strength, he said, is that “we can entertain much more left-field, long-term ideas”.
It’s a point often made in these pages, but rarely by a computer scientist working in a field that sets any captain of industry’s or politician’s pulse racing.
The point of academic freedom is not, as in the populist imagination, that scholars can fritter away their days and taxpayer dollars on pet projects; it is that this is the environment in which the truly important, epoch-defining advances happen. History has shown this to be the case again and again.
And Walsh is adamant that it remains the case with AI: “I do talk to my colleagues in the big tech companies, and their view is very much that the big breakthroughs are not going to happen there: they are going to happen as they always have in the past, in universities.”
Print headline: Where AI’s future is forged