Here is a hypothetical dilemma. You find yourself, early in your career, at a crossroads.
Turn left, and you will walk into a job with a million-dollar signing-on bonus and a salary thereafter that is beyond most people’s dreams. You will have access to datasets on human behaviour that academic social scientists can only dream of – and virtually limitless freedom to play with them.
Turn right, and your job will come with a lot less pay and experimental freedom but with a sense of well-being: you will have a positive impact on the world. The only problem is that this will not be appreciated by everyone; some people may even start sending you death threats.
Which path do you choose?
For the most talented postgrads and postdocs in key fields, this question is not a hypothetical one.
Silicon Valley is sucking brilliant minds out of institutions such as the California Institute of Technology and the University of Cambridge faster than a Russian troll factory sucks the life out of Western democracy.
Data is the new gold, and has turned the San Francisco Bay Area into a nerd nirvana.
Not all have turned left at the crossroads, however. One research star who has resisted the allure of the tech titans is Michal Kosinski, an expert on psychometrics at Stanford Graduate School of Business.
In our cover story, we interview Kosinski, whose work has revealed the almost unbelievable insights that can be derived from individuals’ seemingly innocuous online behaviour – and the disturbing implications of this in terms of privacy, manipulation and repression.
In some instances, Kosinski’s papers have proved spectacularly controversial – for example, his demonstration that artificial intelligence can be used to identify sexual orientation from a picture of a person’s face. It is work of this sort that has provoked the death threats.
But, as Kosinski tells us, these peer-reviewed papers are “controlled explosions” designed to reduce the destructive force of similar work that is undoubtedly going on within the technology companies, not to mention government-sponsored efforts to master digital mind control.
There is a rising tide of disquiet about the extent to which technology is now shaping the world. This week’s revelations about the leak of Facebook data are a case in point.
The negative impact is not always malicious – although it can be. Kosinski’s point is that we need to know what technology is capable of if we are to understand how it is being used to influence our behaviour – and how it might be used in future.
The imbalance of power between universities, which ultimately serve the public good, and technology giants, which do not (whatever their plutocratic-technocratic founders may say), puts this public understanding in real danger.
As Kosinski says, “being in academia maximises my chances to have a positive impact on the world.” Unfortunately, he also tells us that he has “given up trying to work with computer science students, because they always leave me after three months” for the opportunities – as much intellectual as financial – of Silicon Valley.
Perhaps this is inevitable. But if so, it suggests that we need even greater permeability between universities and this most crucial of 21st-century industries.
That might mean collaboration, but ideally it would also mean more substantive traffic in both directions because, as Kosinski points out, the learning opportunities in companies such as Facebook are enormous (working there for three years, he suggests, is the equivalent of two master’s degrees and a PhD).
Academia might be a vocation, but it’s no good relying on divine intervention to make this happen.
Perhaps capitalising on the current unease about the impact that technology is having on all our lives provides a more realistic answer – an appeal to those who have got rich (and even cleverer) working in Silicon Valley’s corporate campuses to give something back on the academic ones across the road.
Rather than turning left or right, we need a third road that ensures that the best minds are working in all our interests – and keeping those who would use digital sorcery for harm on the straight and narrow.