Part 1: A re-examination of value
AI is without a doubt the trendiest economic scare of the day. Technology pundits love to put forward apocalyptic scenarios about the impending takeover of the robot overlords. We are told that self-driving cars will leave millions of drivers without jobs. Lawyers are soon to follow, as ever more specialized software commoditizes expert legal advice. The AIpocalypse, it is declared, will continue to destroy job after job, wiping out 47% of the US workforce in 20 years. Scary!
Few question these dire prophecies, which is mind-boggling considering the profound ramifications of the suggested solutions. I am not as terrified of the impending disaster. But I believe the AI alarmists are making fundamental logical mistakes that are important to consider. There are a few points I would like to go over, so, to keep things easy to follow, I will divide my arguments into a series of articles. Part 1 (this article) will focus on the notion of value and its (mis)interpretation by the AI voices of doom.
Before we begin, let me clarify that I am addressing specialized, weak AI: algorithms that can drive your car, provide a medical diagnosis, etc. General AI is far away, and we will not go into its implications, which I expect to be HUGE.
The AIpocalypse Hypothesis
The proposition put forward by the people crying wolf is the following: machines will become smarter and smarter, gaining more and more specialized skills that will, in time, rival those of the humans employed to do these things. As technology develops, machines will also get cheaper and cheaper while their capabilities expand at an exponential rate, making humans seem inefficient and expensive by comparison. Unemployment will ensue as most employers choose computers over humans, forced by the unforgiving competition of the capitalist system. A spine-chilling scenario!
The doomsday scenario is followed by the oh-so-trendy advice to introduce basic income, tax the rich corporations, etc. Che Guevara policy making at its finest.
So the underlying premise is that automating the means of production of a certain service (by making computers outskill humans) will suck all the value out of it, leaving us humans with no way to contribute anything of economic worth. But the whole case rests on a very weak understanding of what qualifies as valuable.
Another argument I hear, tied to the concept of value, is that most jobs will be commoditized by machines, leaving us with nothing to do. This version of the future sounds like a dystopian Jetsons. But it is also unrealistic: it assumes a human to be nothing more than an automaton, a button pusher with a limited set of needs. If that were true, it would only be a matter of time until those needs were adequately satisfied by a machine.
A macroeconomic look at value
Baked into the argument is the idea that the means of production are solely, or mainly, responsible for the economic output of a service, measured, I assume, by the price of the item in question. It's an intuitive perspective but a very unsophisticated one.
Equating economic value with the costs of production is an idea as old as time. Even a child can grasp the concept as soon as they set up their first lemonade stand. It was first set forth by Aristotle, who proposed that goods themselves must possess some property that makes a certain quantity of one good equal to another good, say, 5 apples being as valuable as a loaf of bread. It follows that economists ought to look for the element by which certain quantities of various goods can be declared equal to each other. The stature of "the philosopher" turned his intrinsic value theory into unchallenged dogma for millennia, and classical economists like Adam Smith embraced the theory.
Karl Marx famously based his value theory on the amount of labor that went into a good. The political implication was that if the worker did not receive 100% of the final price of a good he made, he was being "exploited." The same foundational idea is shared by the experts equating AI development with Skynet, who see the price of a good as depending, first of all, on the cost of its ingredients. In this view, human labor is the main ingredient, and its scarcity is what keeps prices high enough to pay workers a salary.
That was the state of affairs until Austrian economist Carl Menger revolutionized economics by rejecting the cost-of-production theory of value. Menger's breakthrough insight was to realize that "value is… nothing inherent in goods, no property of them, but merely the importance we first attribute to the satisfaction of our needs... and in consequence carry over to economic goods as the… causes of the satisfaction of our needs" (Principles of Economics).
The notion of consumer value introduced by Menger did not, however, simplify economics. On the contrary: Menger's value is a complex, psychologically rich concept. It took the absolute value proposed by his predecessors and turned it on its head, making it relative and deeply personal. What is valuable to me can be useless to you. You may see music as little more than background noise that helps you focus on your reading. I may be an audiophile who gets shivers down my spine on hearing an amazing reproduction of my favorite song. Our friend may appreciate the atmosphere and shared experience of a live show. You will most likely listen to music for free by turning on the radio; I will get a Tidal subscription and spend thousands of dollars on audio equipment; our friend may travel the world going to festivals.
And we may all be into the same music genre, maybe even listen to the same artist... or the same song. The melody itself had the same production costs, but the value we derive from the same succession of notes differs from person to person. And that makes us pay vastly different amounts of money to satisfy our desires.
Menger's observation also means that our purchasing decisions are complex enough that it's extremely hard for any of us to justify most of them. Why do I spend huge amounts of money on vinyl records when I have access to the world's music library for free? LPs degrade pretty fast, hold only a few songs, and are hard to handle compared with the ease of use of Spotify-enabled devices. It's not the quality of the sound either: I am self-aware enough to admit that I cannot tell the difference, and in a blind test I would most likely pick the digital version. But apparently I value the ritual of turning the disc; the physical manipulation of the vinyl gives me an emotional thrill that is hard to explain. Yet I frequently pay $50 for a good album in this format. And most likely the record was mass-produced by a machine with minimal input from a human.
The ingredients of value
Consumer value might be a rather fuzzy concept, but getting close to understanding it is a key skill for anyone working to bring new products to market. It is something I have been doing my whole career, as a founder of several companies and as a product manager. What makes it hard is that, as HBR wrote in 2015, "the amount and nature of value in a particular product or service always lie in the eye of the beholder". Reid Hoffman also likes to point out that people are terrible predictors of their own tastes and preferences. He illustrates this by pointing at Mark Zuckerberg, who is a master at listening to Facebook's users while selectively ignoring their feature requests. It's why we see user revolts every time Facebook changes something, even as the engagement numbers spike.
At a very high level, customers have a few categories of needs they want a product to fulfill:
- Some are functional needs. Washing the dishes is one example. Creating a report for your boss is another.
- Some are purely social. We are social creatures and we have to take others into account as part of our value map. We may want to look trendy or be perceived as an innovator in the workplace.
- In a lot of cases we seek emotional fulfillment. We want something to thrill us or to give us a feeling of awe. It's part of the reason we share rage-inducing political articles or funny cat videos.
We are complex beings and our needs, as consumers, are complicated as well.
Functional needs represent just a small part of the characteristics we humans appreciate when making a purchasing decision. When asked to justify a recent acquisition, we may point to discrete functionalities, but in most cases these are just after-the-fact rationalizations.
Our human limitations may keep us from being able to articulate our value-judgment algorithm, but there is still hope for the product person. HBR proposes that universal building blocks of value do exist, and has identified 30 such elements of value. These elements fall into four categories: functional, emotional, life changing, and social impact. Some elements are more inwardly focused, primarily addressing consumers' personal needs.
Let's take another look at the value map. Notice how few of these elements are bare-bones functional needs that can be reduced to an algorithm (the effort-reducing ones, especially). The vast majority of the items on display speak more to the human soul than to the brain. Attributes like social affiliation, wellness, motivation, or self-actualization are quite hard to automate. And the reality is that very few products today are even remotely concerned with scoring marks in these categories.
This brings us right back to our HAL 9000 situation. Throughout our history we have struggled to nail down as many of these functional attributes as possible. We started as apes in the jungle, and it took us hundreds of thousands of years to secure basic commodities like food, water, and energy. We took some of the randomness out of production with the invention of agriculture, then automated things even more during the industrial revolution. For all of our history we have pursued simpler production, and we have mostly paid attention to the functional aspects alone.
We are now seeing glimpses of a future where some of these functional needs are easier to automate as we devise smarter and smarter algorithms, algorithms that do a job faster, better, and cheaper than a human counterpart. And that's awesome. Because if our functional needs are satisfied by robots, we can channel our energy into items that are more akin to our spiritual needs. New classes of products will be created to address some of the currently ignored attributes. As basic functionalities become commodities, their capabilities will continue to increase while competition drives prices down (in a free market with healthy competition, the price of a good trends toward its cost of production, an observation made by economist Friedrich Hayek). This abundance of functionality will make the other value building blocks more economically valuable and unlock our energy and creativity to play a higher game.
One may point out that some of these higher-value building blocks are not that important (it's all touchy-feely stuff) or that it's only a matter of time before they are automated too (by machines alone or by teams of humans and machines). Let's address this final argument before we wrap up:
It's convenient to discard the economic value of qualities like nostalgia or attractiveness. But this speaks more to our commodity-constrained past than to the future. As discussed above, until recently we struggled with the satisfaction of basic needs. We have been wired by millennia of evolutionary biology to put a high price on basic human necessities. Automation made some of these less scarce, and AI is continuing the trend, making more and more industries functionally accessible. The skeptic's position is that this level of sophistication will be good enough for most consumers. But that has never been the case. Our never-ending dissatisfaction with what we have is one of the things that makes us human. John Steinbeck famously remarked: "For it is said that humans are never satisfied, that you give them one thing and they want something more. And this is said in disparagement, whereas it is one of the greatest talents the species has and one that has made it superior to animals that are satisfied with what they have." Once we get our functional needs met (by a machine or otherwise), we will desire other items. Our economy will rearrange itself around new sources of scarcity.
As for the inevitability of automating the whole value pyramid… we may get there, but we are quite far from it. Most of these value blocks are so vague that we will spend a long time just defining them, let alone providing an algorithm that checks the box. Many of these blocks are multi-dimensional. Nostalgia, for example, is one axis we can all relate to, yet its definition depends on personal past experiences, which vary from society to society, from state to state, from person to person. It also has a biological component that cannot be ignored: my wife, for example, experiences nostalgia through smell, while I am more visual. We have a long way to go before we can automate this sort of thing.
Thank you for making it this far. As usual, I am curious to hear your thoughts (ideally focused on the argument at hand). The AI topic is broader than value theory, and I plan to continue the series by addressing other dimensions of the AIpocalypse: the fast pace of the changes, humans' ability to adapt, the reliability of the prediction models, and, finally, some policy proposals.
'Til next time!
I would like to thank Gabi Coarna, Octav Druta and Alexandra Cojocaru for helping me put this together. Your observations were extremely valuable.