An AI for an eye?
Sam Grace, associate director
Next year will mark the 20th anniversary of the fictional Judgment Day, the date on which Terminator 2’s Skynet becomes self-aware. Whilst James Cameron’s vision of the future has done for robots what Jaws did for sharks, 2016 has been a watershed year for Artificial Intelligence (AI).
In March we saw Google’s AlphaGo, an artificially intelligent system designed by a team of researchers at DeepMind, defeat Korean grandmaster Lee Sedol in a best-of-five series. In a game with more possibilities than you would get by playing a separate game of chess on every atom in the universe, what struck observers was not just the one-sided nature of the win, but the way some of the games were won. AlphaGo played “beautiful” moves that no human player would have made.
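The comparison can be sanity-checked with rough, widely cited estimates; the exact figures below are assumptions that vary by source, but the orders of magnitude carry the argument:

```python
# Back-of-the-envelope check of the "chess on every atom" claim.
# All three figures are approximate, commonly quoted estimates:
ATOMS_IN_UNIVERSE = 10**80   # atoms in the observable universe
CHESS_GAME_TREE = 10**120    # game-tree complexity of chess (the Shannon number)
GO_GAME_TREE = 10**360       # commonly quoted game-tree complexity of Go

# One full chess game played on every atom gives ~10**200 possibilities,
# still vanishingly small next to Go's ~10**360.
chess_on_every_atom = ATOMS_IN_UNIVERSE * CHESS_GAME_TREE

print(chess_on_every_atom < GO_GAME_TREE)  # True
```

Even granting generous error bars on every estimate, the gap of roughly 160 orders of magnitude means the conclusion does not depend on the precise numbers.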
Virtual assistants have become a part of everyday life via Siri, Cortana, Alexa and Google. But perhaps most telling has been the increasing focus on AI and the next industrial revolution. Over a year ago, the Bank of England’s Chief Economist, Andy Haldane, warned that up to 15 million jobs in Britain are at risk of being lost to increasingly sophisticated machines. According to Haldane, automation poses a risk to almost half of those employed in the UK, and a “third machine age” would hollow out the labour market, widening the gap between rich and poor.
The concern is that machines are now substituting for not just manual human tasks but cognitive ones as well, reproducing an ever wider and deeper set of human skills at a lower cost.
Forrester predicts that in the US robots will have eliminated 6% of all jobs by 2021, starting with customer service representatives and eventually extending to truck and taxi drivers. Tesla already has 1.3 billion miles of Autopilot data. Change is coming.
But what happens to the 6% or the 15 million Brits displaced from their work? Are we all doomed?
Inevitably there will be new jobs created to oversee and maintain these automated systems, but they will require an entirely different skillset. This need to continually develop and evolve one’s skills is likely to become increasingly pervasive across the workforce. In an age of disruptive technology, people have to plan for disruption. Changing jobs will become normal; training and skill upgrading will be a life-long pursuit.
Author Tom Chatfield argues we are not in competition with machines and that human beings should embrace our demotion from the centre of things:
“Our creations grow faster than we do, and may reach further. Yet we are all the more remarkable for this, if we can learn to let go of the insistence that it all still comes down to either a battle or a love affair.”
The most optimistic view comes from Jeffrey D. Sachs, a University Professor and Director of the Center for Sustainable Development at Columbia University. Professor Sachs proposes that if robots and expert systems perform all the unpleasant and humdrum work of the economy, and as long as fiscal policies ensure that everybody can share in the bounty, the result could be a 21st-century society in which we have much more time, and take more time, to learn, study, create, innovate, and enjoy and protect nature and each other.
The World Economic Forum suggests that the problem with the changing world of work is not so much the loss of opportunities caused by technology as the period of transition. And without doubt we are in a period of transition driven by AI.
The truth is that no-one knows which jobs will go and when. But I, for one, am hugely excited about the potential for AI to be a force for good.