Friday, November 18, 2022

Obsolescent Angst

Conversations about AI are getting harder to avoid: mostly because they are no longer meandering speculations about what might be; these days, they are reviews of what's already occurred. As a result, we are all starting to grapple with the melancholia of obsolescence. The obsolescent angst, if you will.

A couple months ago, of course, I would have waved aside the notion that machines might one day be able, say, to create an original work of visual design. "Oh come on," I would have said, "what a ludicrous sci-fi scenario. Save it for your screenplay. Never gonna happen." Now, when I draw breath to say the same thing, friends text me AI-generated images created in less than a second--all of them utterly convincing, impossible to distinguish from the work of a human intelligence, and no more acts of plagiarism than most products of the human imagination, which likewise works through a process of agglutination rather than creation ex nihilo.

So we have to accept that the AI is here and can convincingly mimic-- if nothing else-- the human creative process, in both the verbal and visual realms. The question then shifts to: how big of a problem is this?

A friend assures me it is a very big problem indeed: enough to cause him an existential crisis. "It's like living with cancer," he told me. "You can maybe forget about it for an hour or two. But it's always there. And you know that eventually it will destroy you." My friend, you see, recently took up drawing as an avocation-- a personal skill he set out to perfect, for now as a hobby, but possibly someday as a source of remuneration. He feels a bit singled out and betrayed by the universe, therefore, that it chose this precise moment-- just as he was starting to get pretty good at his craft-- to replace that very craft with machines. He knows he can keep drawing as much as he wants to regardless: the AI can't stop him. But he feels some of the meaning of it has been lost.

"The worst part of it is," he told me, "that we used to tell ourselves the future was in the arts. After the machines replace all the other jobs and we're all living at home collecting our UBI checks, we said, then we'll have all the time in the world to work on our creative projects. But now even those are being taken from us."

He has a good point; and in some ways, the irony runs even deeper than his observation suggests. It has indeed been a mainstay of utopian predictions about the future of tech for more than a century that the machines would merely eliminate "drudgery." All the "rote" muscular tasks of human life could lend themselves to being replaced by steam or motor power, leaving us with abundant free time to perfect civilization, write symphonies, etc. (because these were believed to be the one thing that could never be automated). Thus, we had Oscar Wilde foreseeing that the machines would eliminate the misery of street cleaning on a cold day; and we had John Maynard Keynes prophesying that, within his grandchildren's generation, the "economic problem" would be solved and we could all spend our time writing operas. 

If anything, though, the opposite has occurred. People are still employed cleaning streets (albeit with some mechanical assistance). Many forms of manual labor have proved stubbornly difficult to automate. The result is that home construction and transportation, say-- as Robert Gordon has pointed out-- are practiced today much as they were seventy years ago, with astonishingly little productivity growth or efficiency gain from technology to speak of. Whereas-- we are now discovering-- the symphonies, blog copy, and visual design arts may turn out to be far easier to replace with AI. We may therefore be left in a world that is the precise inversion of Keynes' and Wilde's optimistic scenario. We may instead have a future in which the machines are writing the operas, and human beings are still laying brick.

And in some way, I can't shake the feeling that this serves us right. It is a kind of poetic justice for those of us in the intelligentsia who were overly sanguine about the impact of automation on blue collar labor. Now, we will be called upon to practice what we preach. Those who were bored or tired at the thought of muscular exertion but really wanted to write screenplays, and who therefore thought that everyone would surely welcome a future of total automation of the labor market and the elimination of most jobs to be replaced by UBI checks from the government, will now have to put to the test whether they really think it is so great to be deprived of a livelihood and cast suddenly into a state of dependency, forced to bank on the hypothetical munificence of a distant government or tech firm for survival-- which is the fate they had eagerly envisioned for everyone not currently employed in a knowledge profession. 

Of course, I don't actually think that's how it will play out anyway. I am aware of and can appeal to all the history and economic theory suggesting that an AI disruption to the labor market might not be as bad as all that. After all, every wave of automation of any task prompted the same apocalyptic warnings about looming unemployment and pauperism; and every time, the opposite eventually occurred. The increase in efficiency from the technological innovation made the given industry more profitable, eventually creating more jobs, rather than fewer, in that sector. Meanwhile, it's an especially odd spectacle to see people in our current society wringing their hands over the prospect of automation eliminating all jobs from the economy, since what our society is experiencing right now is actually a slowdown in productivity growth, suggesting that we don't have enough technological innovation occurring, not that we have too much.

Besides-- remember a decade or so ago, when everyone was warning that in the wake of the Great Recession, companies would seize the opportunity of the downturn to lay people off and never bring them back, because the unemployment crisis would coincide with the development of self-driving cars and other technology that would render many current forms of human labor otiose? Well, did those warnings come true? Hardly. Self-driving cars haven't panned out too well, and employment ultimately came roaring back for reasons having nothing to do one way or the other with technology. It was simply because the government finally engaged in some good old-fashioned deficit spending of the kind that could have gotten us out of the recession far earlier if Republicans hadn't blocked it when they were out of the White House; and now indeed we have the opposite problem--there's so little unemployment that the economy has gotten overheated, setting off inflation. 

Historians looking back on our era would be puzzled, for these reasons-- I suspect-- to hear that we were worried about automation as a problem, of all things, at a time of both historically low unemployment and weak productivity growth.

For AI to take away all the jobs, therefore, it would have to be the one technological innovation in history that played out differently from all the prior innovations. It would have to be different in kind, not just in degree. It would also have to defy economic theory, which-- as stated above-- suggests that efficiency gains increase employment in a given sector over time, rather than diminishing it. It seems to me unlikely that AI fits these criteria, however uncanny and astonishing its emerging capabilities might be.

But how then would this play out? It's all well and good to say to displaced blue collar workers-- as elites have been doing for decades-- "oh, don't worry: these changes to your industry will bring more jobs in the long run." It's another story when it is our own paychecks on the line. Do we still sound so optimistic when the technological change comes for our own livelihoods and skill sets? Or do we resort to protectionism and state-enforced monopolies because, after all, people-- including us-- are more afraid of losing what we have than of missing out on what might be? 

Let's take my own industry. Freelance communications strategizing for nonprofits may not be a very large slice of the economy, but it is the one I know best; so: let's ask-- what does AI mean for me? Of course, my first reaction is sheer panic: the AI is coming for me! Much of what I currently do for the hours I bill, after all, is write blog copy. If it becomes possible to automate this form of writing effectively, that's a big chunk of time taken out of my invoice. The specter is appalling. But if we pause for a moment and apply the lessons of the history of innovation and of economic theory, we can see that this might not in fact spell the end of my or other human employment in the field after all. What it would really do is make the more rote aspects of drafting our press statements much faster-- enabling us to work more efficiently, and thereby creating demand for the higher-order strategy that goes into knowing when and where to intervene in a news cycle.

If we accept the premises of mainstream economic theory-- that desires are unlimited, for instance-- then there is no increase in efficiency that eliminates all jobs in a sector: it just shifts all the required tasks to a higher order of cognition. In theory, this is what would result from the AI revolution as well. Instead of writing the blogs, it would be as if each writer were managing a team of subcontracted writers-- telling them the key points to hit in the press release, telling them when to work on an item, selecting from the drafts they submit, and punching them up as needed to ensure quality and consistency of voice. The same would be true in the visual arts and design fields. Each artist their own workshop manager. And isn't this just a version of what the Renaissance masters were doing centuries ago anyways? They didn't draw each hand and sinew themselves-- they had a team to do that. But the central creative vision was their own.

That's what I tell myself anyways; and it seems to me a more probable outcome on balance than the most dire warnings. I think we are not likely to see an apocalyptic wave of economic displacement and unemployment in aggregate.

This does not mean, however, that we can say these economic transformations are costless. This is where I have the biggest dispute with economic theory. Because even if it's the case that AI would ultimately create more jobs in the long run, that does not mean we would welcome the way it does so. I, after all, like writing blogs-- just as my friend likes drawing. It gives us meaning and a sense of achievement. If these tasks were automated, it would be a loss. Of course, nothing that does or does not happen with AI could stop me from continuing to type out my thoughts directly on this blog, just as people can still knit or hand-weave cloth if they are so inclined. But to write only as a hobby, and not as a livelihood, would deprive me of something I value-- even if it wouldn't deprive me of food and the possibility to earn a paycheck. 

And even with regard to that paycheck, meanwhile, we have seen already that the hypothetical new jobs created by AI would force me to move up a level in the kind of tasks I perform. It's possible I might not be able to make the transition. And even if I could, others will not-- among them people who cannot survive and feed themselves during the transition period; who can't afford, that is to say, to sit around waiting for the economic "long run," because they are living hand-to-mouth. The choice to innovate and let those adapt who can is therefore a policy choice in favor of the strong and the rich, and against the poor and the weak.

So it has been with every wave of economic transformation brought about by new technology. Maybe the power loom eventually created more jobs in total, we could say. But they came too late for the Silesian weavers whom Heine and Hauptmann wrote about, whose traditional handicrafts were unable to compete with British machine-made cloth on international markets and therefore had no choice but to starve. 

Economic theory-- to the extent it acknowledges these brutal facts at all-- lumps them together under the heading of "friction"-- a euphemism on the order of "collateral damage" or "enhanced interrogation." For what the term "friction" disguises is actually generations and whole societies driven into poverty and despair by incapacity to weather an abrupt economic transition. 

People who suggest that such massive holocausts every ten years or so may not in fact be the best policy choice are not necessarily ignorant of or deluded about economic theory. They may in fact be well aware that such suffering is compatible with an eventual recalibration of the economy in a way that produces more jobs on net. But that doesn't mean they accept as inevitable and desirable the human suffering that results in the meantime. In his famous book on international relations, The Twenty Years' Crisis, E.H. Carr cites the words of a Serbian delegate to the League of Nations-- Marinkovitch-- which prove that one can understand the economic arguments in favor of greater efficiency and technological innovation perfectly-- yet still think they are not worth the cost:

The old 'things-will-right-themselves' school of economists argued that if nothing were done and events were allowed to follow their natural course from an economic point of view, economic equilibrium would come about of its own accord. That is probably true (I do not propose to discuss the point). But how would that equilibrium come about? At the expense of the weakest.  

Marinkovitch had it exactly right. The "friction" does not mean the new "equilibrium" will not eventually be reached. But that does not remove one iota of the suffering that human beings experience from that friction in the meantime. 

To return to our example above, we know that the invention of the power loom didn't eliminate all jobs in textiles. In fact, surely millions more people are employed today in the making and marketing of cloth than were in the 1840s. But that also doesn't diminish the truth of what Gerhart Hauptmann described, in his play chronicling the destruction of the Silesian weavers. Hauptmann's weavers, barely subsisting, take out their misery by exacting vengeance on the manufacturers; but as the playwright artfully shows, their employers are subject to the same economic forces. Indeed, to continue employing weavers at all they leave themselves with a razor-thin margin, since they are competing with machine-made cloth from Britain and rival mechanized domestic factories at the same time. As a result, the weavers' pay suffers. And they didn't have the option of waiting for fifty years, of learning a new discipline, or of moving to Britain and getting a new job supervising the machines. 

And if we say that the weavers were simply SOL because innovation must occur as a matter of historical necessity, and we must enter upon it regardless-- even knowing full well as we do so the human displacement and misery that will result in the near term-- then are we not engaged in a sort of Stalinism: sacrificing generations on the altar of a utopian projection that may never come to pass? Must we really say that there is no alternative? (Isaiah Berlin thought not-- and his great essays on Herzen and Tolstoy were efforts to hint at alternative paths to development that have still to be tried-- ones that build upon what's already there instead of sacrificing people to the Moloch of progress.)

I don't think it is possible or desirable to simply look at all this process of displacement and say that we should forbid innovation. To do so would require exercising a degree of control over economic life that would soon descend into authoritarianism. But I do think that the arguments in favor of innovation and improved efficiency-- while in no way wrong so far as they go-- do not always and everywhere have to win out over rival considerations. It is possible, as Karl Polanyi famously argued, for democratic processes to impose a check upon the speed and magnitude of economic development, in order to conserve some ways of life and ease the process of transformation. It is desirable and perhaps inevitable that the public will demand some slowing of innovation; not necessarily because they are oblivious of its benefits, but because human nature is limited in its capacity to weather change beyond a certain pace and scale. 

And when the choice comes down to advantaging the strong or the weak, I have no trouble at all in saying that public policy in a democratic community should favor the latter. Not because the strong don't have their rights and claims of their own, but because they are-- by definition-- better positioned to look after themselves. They can only be hindered so far. The poor, meanwhile, can be annihilated by economic changes and technological innovations that occur too rapidly. To prevent this atrocity, the state has to intervene as a counterbalancing force to the market. Or, in places where modern technological development has not yet reached, it is the task of the democratic community to figure out ways to render development compatible with current ways of life. 

Of course, the idea of "slowing down innovation" cuts against the grain of American life. Many people will simply scoff at the very phrase. But, to the extent that AI forces change on the kinds of jobs that have hitherto been relatively secure and privileged in our society, I suspect we may see a change in elites' openness to this idea. Perhaps, they will think, it is actually necessary to impose some democratic restraints on the pace of innovation. We should heed, in this regard, the words of Robert Frost. I've quoted them before, but they remain the best lines of poetry ever penned on this precise problem, and I can't help but endorse his ultimate recommendation in the final lines-- quixotic and counter-historical as it may be: 

Even while we talk, some chemist at Columbia
Is stealthily contriving wool from jute
That when let loose upon the grazing world
Will put ten thousand farmers out of sheep.
Everyone asks for freedom for himself,
The man free love, the businessman free trade [...]
Political ambition has been taught,
By being punished back, it is not free:
It must at some point gracefully refrain.
Greed has been taught a little abnegation [... So too,]
None should be as ingenious as he could,
Not if I had my say. Bounds should be set
To ingenuity for being so cruel
In bringing change unheralded[.]

