We're clearly heading into an age of brilliant technology. Computers are already impressively good at guiding driverless cars and beating humans at chess and "Jeopardy!." As Erik Brynjolfsson and Andrew McAfee of the Massachusetts Institute of Technology point out in their book "The Second Machine Age," computers are going to be able to perform important parts of even mostly cognitive jobs, like picking stocks, diagnosing diseases and granting parole.
Certain mental skills will become less valuable because computers will take them over. Having a great memory will probably matter less. So will the ability to be a straight-A student, or to perform any mental activity that follows a set of rules.
But what skills will be valuable?
In the news business, some of those skills are already evident. Technology has rewarded sprinters (people who can recognize and alertly post a message on Twitter about some immediate event) and marathoners (people who can write large conceptual stories), but it has hurt middle-distance runners (people who write summaries of yesterday's news conference). Technology has rewarded graphic artists who can visualize data, but it has punished those who can't turn written reporting into video presentations.
The age of brilliant machines seems to reward a few traits. First, it rewards enthusiasm. The amount of information in front of us is practically infinite; so is the amount of data that can be collected with new tools. The people who seem to do best possess a voracious explanatory drive that compels them to follow their curiosity. Maybe they started with obsessive gaming sessions or marathon study sessions, but they are driven to perform long bouts of concentration, diving into bottomless oceans of information and trying to make sense of them.
The era seems to reward people with extended time horizons and strategic discipline. When Garry Kasparov was teaming with a computer to play freestyle chess, he reported that his machine partner possessed greater "tactical acuity," but he possessed greater "strategic guidance."
That doesn't seem too surprising. A computer can calculate a zillion options, move by move, but a human can provide an overall sense of direction. In a world of online distractions, the person who can maintain a long obedience toward a single goal, and who can filter out what is irrelevant to that goal, will obviously have enormous worth.
The age seems to reward procedural architects. The giant Internet celebrities didn't so much come up with ideas as with systems in which other people could express ideas: Facebook, Twitter, Wikipedia, etc. That is to say, they designed an architecture that possesses a center of gravity but allows loose networks of soloists to collaborate.
Essentialists will probably be rewarded. Any child can say, "I'm a dog" and pretend to be a dog. Computers struggle to come up with the essence of "I" and the essence of "dog," and they struggle even more to decide which parts of "I-ness" and "dog-ness" should usefully be blended if you want to pretend to be a dog.
This is important, because creativity can be described as the ability to grasp the essence of two different things and combine them into something new.
In the 1950s, the bureaucracy was the computer. People were organized into technocratic systems in order to perform information processing. But now the computer is the computer. The role of the human is not to be dispassionate, depersonalized or neutral. It is precisely the emotive traits that are rewarded: the voracious lust for understanding, the ability to grasp the gist, the empathetic sensitivity to what will attract attention and linger in the mind.
Unable to compete when it comes to calculation, the best workers will come with heart in hand.
David Brooks writes for The New York Times.