The coming "symbolic analyst" meltdown

For decades, traditional manufacturing jobs were gobbled up by automation and offshoring. This led Robert Reich to postulate a hierarchy of work in which the "symbolic analysts" – essentially, people who worked with information rather than physical stuff – sat at the top, while people who worked with actual things were at the bottom. With a remarkable lack of sympathy, journalists and politicians told coal miners and auto workers to "learn to code" as their jobs vanished.


But it turns out that the people whose jobs are most at risk from AI are, well, the coders. ChatGPT can write code, and sometimes it's pretty good code. (Sometimes it's not, but then again, you can say that about the code people write, too.) ChatGPT can write news stories, essays, and speeches, and again, they're not always gems – but neither are the human products in those areas. And the AI programs get better from one year to the next, while human beings stay pretty much the same. That being the case, we can expect these programs to become a serious threat to jobs in the near future. …

Now ChatGPT is threatening to replace people who write catalog copy, online entries, etc. Buzzfeed is replacing a lot of its writers with AI-generated text. You may joke that that's low-hanging fruit (I mean, you know, Buzzfeed), but as Clayton Christensen noted, innovation often enters a field at the bottom and gradually works its way up.

[Gulp. Time to launch a new podcast. — Ed]
